Demystifying Array.reduce in TypeScript

Explaining reduce in a way that’s understandable.

For the longest time I didn’t use array.reduce() in JavaScript. It simply didn’t make sense. Reduce was for senior engineers, not me. Why would I use it when it was simple to chain other methods like .map and .filter? Once I took the time to understand reduce, it became super helpful and one of my go-to array methods to simplify operation chaining. You can map and filter at the same time, and even turn arrays into objects!

I also have a JavaScript version of this article so you can use what you’re familiar with.

Helpful Tips

Naming

Get creative with your variable naming. Name the parameters something that makes sense to you. I’ve seen a lot of code bases that use (acc, curr) as the reduce params, which was the main reason I didn’t use it. When I started changing the names, it made a lot more sense. You’ll notice this in the examples below.

In college I had a calculus professor who replaced variables x, y, z with emoji like 😀, 👍, ❤️ and calculus made sense after that. It forced me to separate the name of a thing from its value or purpose. So if something doesn’t make sense, just call it “stuff”.
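For example, here's the same tiny reduce written twice, once with the generic (acc, curr) names and once with names that describe what's being built. It's a minimal sketch with made-up numbers, but it shows how much the names carry:

const prices = [5, 12, 8];

// Generic names: you have to hold the meaning in your head
const total = prices.reduce((acc, curr) => acc + curr, 0);

// Descriptive names: the code explains itself
const orderTotal = prices.reduce((runningTotal, price) => runningTotal + price, 0);

// total === orderTotal === 25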

Reduce is Recursive

Reduce works like a recursive function: whatever the callback returns gets passed in as the first argument (the accumulator) on the next iteration. So if one of your branches returns something other than the value you're building up, the chain breaks. I use this snippet when starting with reduce so I don't forget to return it.

array.reduce((itemsToReturn, item) => {
  // code goes here.
  return itemsToReturn;
}, [])
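
And here's a minimal sketch of the failure mode that snippet guards against (made-up data): if a code path doesn't return the accumulator, the next iteration receives undefined, and typing the accumulator lets TypeScript catch it before it runs.

// Every code path has to hand the accumulator forward.
const evens = [1, 2, 3, 4].reduce((evensSoFar: Array<number>, num) => {
  if (num % 2 === 0) {
    evensSoFar.push(num);
  }
  // Without this return, the next iteration's `evensSoFar` would be undefined,
  // and TypeScript would flag the callback for returning void instead of Array<number>.
  return evensSoFar;
}, []);

// evens = [2, 4]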

Return Type

There are several places you can put the return type of reduce, but I've found that typing the callback's first parameter (the accumulator) gives me the best developer experience. It ensures my return value is correct and that each iteration properly updates the accumulator, and then I let TypeScript infer the rest.

// The callback body goes here; it just has to return the typed accumulator.
array.reduce((itemsToReturn: Array<string>, item) => itemsToReturn, []);
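
For comparison, the same type can also live on reduce's generic parameter or on the initial value. These skeletons are equivalent; I just find the parameter annotation above the easiest to read:

// Option 2: type the generic parameter
array.reduce<Array<string>>((itemsToReturn, item) => {
  return itemsToReturn;
}, []);

// Option 3: type the initial value
array.reduce((itemsToReturn, item) => {
  return itemsToReturn;
}, [] as Array<string>);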

Examples

For each example I’ll use this same fake data set for consistency.

type Post = {
  id?: number | null | undefined;
  status?: 'published' | 'draft' | null | undefined;
  title?: string | null | undefined;
};
type Posts = Array<Post | undefined>;

const posts: Posts = [
  { id: 1, status: 'published', title: 'article one' },
  { id: 2, status: 'draft', title: 'article two' },
  {},
  { id: 4, status: null, title: 'article four' },
  { id: 5, status: 'published', title: 'article five' },
  { id: null, status: 'draft', title: null },
  { id: 7, status: 'published', title: 'article seven' },
];

Filter out empty posts and transform the resulting array

Scenario: there is another function that expects the post to have complete data and its status to be uppercase. Since our post data is possibly incomplete and the status is lowercase, we need to clean up both of those issues before passing the post data to this other function.
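
To make that concrete, the examples below assume a downstream consumer along these lines. It's hypothetical, just here so the cleanup has somewhere to go, and each example produces an array you could pass straight to it (for example validPosts.forEach(publishToFeed)):

// Hypothetical consumer: it assumes the post data is complete and the status uppercased.
function publishToFeed(post: Omit<Post, 'status'> & { status: 'DRAFT' | 'PUBLISHED' }) {
  console.log(`[${post.status}] ${post.title}`);
}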

Filter + Map

Using filter and map we can process the data in an easily readable way. We first narrow the array down to only the valid posts, then we transform all of the statuses to uppercase. While it may seem simple, this approach has to iterate over the array twice, which is not very efficient. Note that the filter callback needs to be a type guard so TypeScript knows the surviving posts are complete.

type ValidPosts = Array<Omit<Post, 'status'> & { status: 'DRAFT' | 'PUBLISHED' }>;
// Chained filter and map
const validPosts: ValidPosts = posts
  // The type guard tells TypeScript the surviving posts are complete
  .filter((post): post is Post & { id: number; title: string; status: 'published' | 'draft' } =>
    Boolean(post?.id && post.title && post.status))
  // toUpperCase() returns string, so narrow it back to the literal union
  .map(post => ({ ...post, status: post.status.toUpperCase() as 'PUBLISHED' | 'DRAFT' }));

// validPosts = [
//  { id: 1, status: 'PUBLISHED', title: 'article one' },
//  { id: 2, status: 'DRAFT', title: 'article two' },
//  { id: 5, status: 'PUBLISHED', title: 'article five' },
//  { id: 7, status: 'PUBLISHED', title: 'article seven' },
// ]

Reduce

Using reduce to solve this problem can also be readable and simple. The biggest benefit here is that it only processes the array one time. I added some comments below to show the same logic from the chained example above. They’re quite similar!

type ValidPosts = Array<Omit<Post, 'status'> & { status: 'DRAFT' | 'PUBLISHED' }>;

// Combine map and filter
// Combine map and filter
const validPosts = posts.reduce((validPostsArray: ValidPosts, post) => {
  // .filter(post => post?.id && post.title && post.status)
  if (post?.id && post.title && post.status) {
    // .map(post => ({ ...post, status: post.status.toUpperCase() }));
    // toUpperCase() returns string, so narrow it back to the literal union
    validPostsArray.push({ ...post, status: post.status.toUpperCase() as 'PUBLISHED' | 'DRAFT' });
  }

  // Whatever is returned becomes `validPostsArray` on the next iteration.
  // If it's not returned, the next item has nothing to accumulate into.
  return validPostsArray;
}, []);

// validPosts = [
//  { id: 1, status: 'PUBLISHED', title: 'article one' },
//  { id: 2, status: 'DRAFT', title: 'article two' },
//  { id: 5, status: 'PUBLISHED', title: 'article five' },
//  { id: 7, status: 'PUBLISHED', title: 'article seven' },
// ]

Return a specific number of filtered items

Filter + Map + Slice

In this example I want to get the first two published posts from the data set. Using .filter().map().slice() might be a little easier to read than reduce, but it loops over the array twice and then chops the result down to the desired size. It's not the most efficient way to process the data.

const firstTwoPublishedPosts: Array<Omit<Post, 'status'>> = posts
  // `filter` loops through all items to check for id and status
  .filter(post => post?.id && post.status === 'published')
  // `map` processes all published items
  .map(post => ({ id: post?.id, title: post?.title }))
  // `slice` trims down to the items that are wanted
  .slice(0, 2);

// firstTwoPublishedPosts = [
//   { id: 1, title: 'article one' },
//   { id: 5, title: 'article five' },
// ]

A slightly more efficient approach is to reorder the operations so that only the sliced items get processed by map, but it still performs more work than it needs to.

const firstTwoPublishedPosts: Array<Omit<Post, 'status'>> = posts
  .filter(post => post?.id && post.status === 'published')
  // `slice` trims down to the items that are wanted
  .slice(0, 2)
  // `map` now only processes the two published items
  .map(post => ({ id: post?.id, title: post?.title }));

// firstTwoPublishedPosts = [
//   { id: 1, title: 'article one' },
//   { id: 5, title: 'article five' },
// ]

Reduce

Combining the desired operations into a single reduce only loops through the data once, and it skips the processing done in the previous map once it has found the two posts it's looking for. It also eliminates the need to slice the mapped data.

// Combine map, filter, and slice
const firstTwoPublishedPosts = posts.reduce((publishedPosts: Array<Omit<Post, 'status'>>, post) => {
  // .slice(0, 2)
  if (publishedPosts.length === 2) {
    return publishedPosts;
  }

  // .filter(post => post?.id && post.status === 'published')
  if (post?.id && post.status === 'published') {

    // .map(post => ({ id: post.id, title: post.title }));
    publishedPosts.push({
      id: post.id,
      title: post.title,
    });
  }

  return publishedPosts;
}, []);

// firstTwoPublishedPosts = [
//   { id: 1, title: 'article one' },
//   { id: 5, title: 'article five' },
// ]

Transform an array into an object

forEach

When transforming arrays into objects, forEach is pretty efficient, but there are some potential pitfalls: the result object has to be declared before the loop, and it lives outside the code that processes the array, so any other code in scope can modify it (sketched after the example below).

const postStatuses: Record<number, string> = {}

posts.forEach((post) => {
  if (post?.id && post.status) {
    postStatuses[post.id] = post.status
  }
})

// postStatuses = {
//  1: 'published',
//  2: 'draft',
//  5: 'published',
//  7: 'published'
// }
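
Because postStatuses lives in the outer scope, nothing stops later, unrelated code from reaching in and changing it. A contrived sketch of the pitfall:

// Unrelated code can keep mutating the result object...
postStatuses[99] = 'archived';

// ...or quietly overwrite a key the loop already set.
postStatuses[1] = 'draft';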

Reduce

The primary benefit here is the enclosed scope: the object is built entirely inside the reduce callback, so only the reduce operation works with it while it's being assembled, and the finished result can be assigned directly to a const.

// Return an object instead of an array
const postStatuses = posts.reduce((postStatusObject: Record<number, string>, post) => {
  if (post?.id && post.status) {
    postStatusObject[post.id] = post.status;
  }

  return postStatusObject;
}, {});

// postStatuses = {
//  1: 'published',
//  2: 'draft',
//  5: 'published',
//  7: 'published'
// }
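
Once it's built, the record works as a plain lookup table, and TypeScript types each lookup as a string:

const statusOfPostSeven = postStatuses[7]; // 'published'

if (postStatuses[2] === 'draft') {
  console.log('Post 2 is still a draft.');
}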

Conclusion

Reduce is a powerful tool that can be very efficient at processing arrays, especially when dealing with large data sets. Hopefully the examples helped to explain the benefits of using reduce. It doesn’t have to be scary or out of reach! By creating custom variable names and using simple conditional logic, reduce can be a valuable addition to your developer toolkit.

Have you found other use cases for reduce? Let me know, I’d love to discuss this topic with you!