
Iterators and Generators – when are they actually useful?

 ·  ☕ 6 min read  ·  ✍️ Iskander Samatov

Iterators and generators are (somewhat) new JavaScript features introduced with ES6. When I first encountered them, I was excited to add them to my toolkit. I even wrote an article about iterators and generators, which you can check out here.

However, working as a front-end developer on multiple projects in recent years, I have yet to encounter one where iterators or generators are actively used. The counterpoint to that statement might be redux-saga, where generators are the main building block. But I don’t see that as a convincing example since, in redux-saga, the use of generators is a side effect forced by the library.

After doing more digging, I found a few use cases that go beyond simple “Hello World”-type code and are closer to real-world scenarios. Let’s start with iterators.

Why use iterators

In ES6, the iteration protocol is built around Symbol.iterator, a built-in symbol that lets you turn your custom objects into iterables. If you want to learn more about symbols and iterators, take a look at my post.

You might think: “Why go through the trouble of adding iterators to your custom objects?” You could accomplish the same thing by writing a custom utility function to iterate through the object instead. And you’re not wrong – with any programming language, there are a dozen different ways to solve any given problem.

So it’s useful to think of iterators as an elegant way to standardize iteration over your custom objects. They provide a way for custom data structures to work nicely in the larger JavaScript ecosystem.

For that reason, libraries that provide custom data structures often use them. For example, the Immutable.JS library uses iterators to provide native JavaScript iteration for its custom objects, such as Map.

As a rule of thumb, it’s worth considering iterators when you need to give a well-encapsulated custom data structure native iteration capability. I say well-encapsulated because you don’t want to expose implementation details that might confuse people unfamiliar with iterators.
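To make that concrete, here is a minimal sketch of a well-encapsulated data structure that exposes native iteration. The LinkedList class below is hypothetical (not from any library); implementing Symbol.iterator is what lets for...of, the spread operator, and Array.from consume it without knowing anything about its internal nodes:

```javascript
// A hypothetical LinkedList class that hides its node structure
// but still works with native iteration constructs.
class LinkedList {
    #head = null;
    #tail = null;

    push(value) {
        const node = { value, next: null };
        if (this.#tail) this.#tail.next = node;
        else this.#head = node;
        this.#tail = node;
        return this;
    }

    // A generator method assigned to Symbol.iterator makes the
    // whole class iterable.
    *[Symbol.iterator]() {
        let node = this.#head;
        while (node) {
            yield node.value;
            node = node.next;
        }
    }
}

const list = new LinkedList().push(1).push(2).push(3);
console.log([...list]); // [1, 2, 3] – spread works out of the box
```

Consumers never touch `#head` or `#tail`; the iterator is the only public window into the structure, which is exactly the kind of encapsulation mentioned above.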

Another interesting use for iterators, and one I’ve incorporated into my projects, is a simple utility method that converts plain old JavaScript objects into iterables:

const getIterableMap = (map) => {
    const keys = Object.keys(map)
    const iterable = {
        ...map
    }
    iterable[Symbol.iterator] = () => {
        // track the position inside each call so the object
        // can be iterated more than once
        let nextIndex = 0;
        return {
            next: () => {
                return nextIndex < keys.length ?
                    { value: map[keys[nextIndex++]], done: false } :
                    { done: true };
            }
        }
    }

    return iterable
}
const cakes = {
    butter: 'Butter Cake',
    pound: 'Pound Cake',
    sponge: 'Sponge Cake'
}

for (const cake of getIterableMap(cakes)) {
    console.log({ cake });
}

Why use generators

One of the well-known uses for generators is making it easier to write iterators. The example above with getIterableMap can be written in a more compact form:

const getIterableMap2 = (map) => {
    const iterable = {
        ...map
    }

    iterable[Symbol.iterator] = function* myGenerator() {
        for (const key in map) {
            yield (map[key])
        }
    }

    return iterable
}

Seems a lot simpler, right? If you would like to know more about the inner workings of this generator function and why it’s a perfect tool for creating iterators, check out my post.

When thinking of generators, it’s useful not to compare them to regular for loops. Instead, think of them as a powerful stream primitive built into the JavaScript language. And that leads us to the main advantage of generators – lazy evaluation.

Lazy evaluation is a robust data-flow technique. It’s especially useful for dealing with data of unknown size or with a high memory cost. With lazy evaluation, you compute the dataset on demand rather than upfront, as with traditional for loops. It results in much more efficient memory usage and can help perform operations that would otherwise freeze up the application.
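As a minimal illustration of lazy evaluation before we get to the API example: the generator below describes an infinite sequence, yet nothing is ever computed until a consumer asks for a value. The take() helper is a hypothetical utility, not part of the language:

```javascript
// An infinite, lazily evaluated sequence of square numbers.
// No value is computed until the consumer calls next().
function* squares() {
    let n = 1;
    while (true) {
        yield n * n;
        n++;
    }
}

// A hypothetical helper that pulls only the first `count` values
// out of a (potentially infinite) iterable.
function take(iterable, count) {
    const result = [];
    for (const value of iterable) {
        if (result.length === count) break;
        result.push(value);
    }
    return result;
}

console.log(take(squares(), 5)); // [1, 4, 9, 16, 25]
```

With a traditional for loop you would have to decide the size of the dataset upfront; here the consumer decides, and the producer only does as much work as was actually requested.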

Here’s an example where we are fetching movies from an API as the user scrolls down:

const fetch = require('node-fetch')

const API_BASE = 'https://movie-database-imdb-alternative.p.rapidapi.com'

async function* fetchMovies() {
    let page = 1;
    // keep fetching pages until the API stops returning results
    while (true) {
        const result = await fetch(`${API_BASE}?s=${encodeURIComponent('Avengers Endgame')}&page=${page++}`, {
            method: 'get',
        });

        const { Search } = await result.json();

        if (!Search) {
            return;
        }
        yield Search;
    }
}
const client = () => {
    const moviesGenerator = fetchMovies()
    const movies = []

    const interval = setInterval(async () => {
        const { value, done } = await moviesGenerator.next();
        if (done) {
            return clearInterval(interval)
        }
        // each `value` is a page of results, so spread it into the list
        movies.push(...value)
    }, 2000)
}

To imitate the user scrolling, I used a simple setInterval client implementation. But that doesn’t change the main point. As you can see, we don’t know the final size of the collection upfront. Instead, we fetch on demand and let the API tell us when to stop. The generator provides a convenient done property on each result to signal when we have exhausted the list.

So here are the two main criteria to consider when it comes to generators:

  • You don’t know the final size of the collection you’re iterating over. For example, when your collection comes from paginated requests and you need to keep fetching new items until the stream runs out.
  • You are potentially performing computations on huge files that might freeze up your application if you try to do them all at once. Instead, you apply the computation to chunks, one at a time, using the on-demand approach.
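The second criterion can be sketched in a few lines. This is a simplified illustration (summing a plain array rather than a real binary file): a generator yields fixed-size slices so the caller processes one chunk per iteration instead of blocking on the whole dataset at once:

```javascript
// Yield fixed-size slices of an array so the caller can process
// them one at a time instead of in a single blocking pass.
function* chunks(array, size) {
    for (let i = 0; i < array.length; i += size) {
        yield array.slice(i, i + size);
    }
}

// Stand-in for a large dataset: the numbers 0..9999.
const bigData = Array.from({ length: 10_000 }, (_, i) => i);

let total = 0;
for (const chunk of chunks(bigData, 1000)) {
    // each iteration handles only 1000 items; in a real app this
    // work could be scheduled with setTimeout or requestIdleCallback
    // to keep the UI responsive
    total += chunk.reduce((sum, n) => sum + n, 0);
}
console.log(total); // 49995000
```

Because the chunks are produced lazily, the consumer is free to yield control back to the event loop between slices, which is what keeps the application from freezing.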

Conclusion

The two examples I provided are generally not something you encounter very often when building a web app. That’s why generators still have pretty niche use cases. So it’s always worth double-checking whether you’re adding real value by including iterators or generators in your codebase. Otherwise, you might unnecessarily confuse other people working with you. As a side note, even engineers at Airbnb advise against using generators and iterators in your code.

I’m inclined to think that generators and iterators are safest to use when writing well-encapsulated code, like a utility library, since then the client doesn’t need to bother with the implementation details.
