ICS 33 Fall 2024
Notes and Examples: Using and Combining Generators


Background

Generators are a more important feature in Python than they may appear to be at first, because they provide a combination of three desirable qualities.

One thing we haven't considered yet is the extent to which generators can be combined with each other to good effect. Because each generator tends to be targeted at a very particular problem, while offering a common protocol that's the same as any other generator, we find ourselves able to construct complex logic out of simple parts that interact in powerful ways, with the values produced by one generator used by another, which might then be enriched by still another, and so on. And since the Python language and its standard library provide a number of generator functions that solve common problems, there are a lot of tools available that we won't even need to write, with third-party libraries providing still other commonly needed tools.

Becoming comfortable with using generators and combining them together is the first step toward being able to make use of these pre-existing tools, so we'll need to build those skills first.


Generator comprehensions

Not long ago, we looked in depth at a Python feature known as comprehensions, including a very brief look at something called a generator comprehension, which we stumbled upon when looking for the (non-existent) tuple comprehension. Now that we've learned about generators, though, we have the context to understand the mechanics and benefits of generator comprehensions, so we should circle back and examine them in some detail.

A generator comprehension is a comprehension that builds a generator just like the ones we've seen: it produces a sequence of values one at a time, doing only enough work to produce each value, then pausing its execution until asked for another.


>>> (x * x for x in range(3))
    <generator object <genexpr> at 0x000002CEA10F4A50>
                      # Generator comprehensions return generators, just like
                      # generator functions do.
>>> i = (x * x for x in range(3))
>>> next(i)
    0                 # Just like any other generator, these generators can be iterated.
>>> next(i)
    1
>>> next(i)
    4
>>> next(i)
    Traceback (most recent call last):
      ...
    StopIteration
                      # Just like any other generator, these generators raise StopIteration
                      # when there are no more values in the sequence.

Syntactically, generator comprehensions are built from the same parts as list comprehensions: an initial expression, a for..in clause specifying the inputs, and zero or more additional for..in or if clauses refining the behavior further. So, if you understand list comprehensions in detail, you've got a handle on generator comprehensions, too. Just as not all lists can be built with list comprehensions, not all generators can be built with generator comprehensions. But for many generators that follow commonly occurring patterns, a comprehension can be a terse and clear way to express something that would otherwise be more cumbersome to write.
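
For example, just as with list comprehensions, an if clause can be used to filter the inputs to a generator comprehension. (A quick illustration follows.)


>>> evens_squared = (x * x for x in range(10) if x % 2 == 0)
>>> list(evens_squared)
    [0, 4, 16, 36, 64]
                      # Only the even values of x make it past the if clause, so only
                      # their squares are generated.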

Why do we need generator comprehensions specifically, though? If we have list, set, and dictionary comprehensions, what benefit do generator comprehensions offer that the others don't?

The impact of lazy evaluation

Generators in Python, including the ones built for us by generator comprehensions, employ a technique known as lazy evaluation. You might be used to thinking of the word "lazy" as having a negative connotation — it's rare when a person is referred to as "lazy" that it's meant in a positive way — but when it comes to computing, the connotation is different. The goal of computing, much more often than not, is to do the work that's necessary while consuming as few resources (time, memory, etc.) as possible. One way to minimize resource consumption is only to do a piece of work when it's definitely going to be necessary. That's what lazy evaluation is all about: only calculate a result when you're sure you'll need it.

Of course, there's plenty of subtlety in the question of whether a value is really needed, so when we're depending on automated tools to decide, they'll need to apply heuristics, which essentially means that they make an educated guess about what they can't know based on the things they can. For example, Python can't know whether you really need a value to be calculated to solve a real-world problem, but it can know whether you ever attempted to use a value. If a value was never used, we can be sure it wasn't needed, even if it's less obvious whether we really needed the values we did use.

A generator operates on that principle, by only determining its values during iteration, and only when the iteration proceeds far enough to need each one. If we only ever ask a generator for its first value, it only ever calculates the first one.

When we're talking about generator comprehensions, though, there are two sides to this coin, because there are two lazily evaluated iterations happening: the generator iterates its input only as it needs each value, and we, in turn, iterate the generator only as far as we ask it to go.

How much does it cost to evaluate the expression (x * x for x in range(100000000))? To the uninitiated, the answer is a surprise — though you'll likely know the answer already.


>>> (x * x for x in range(100000000))
    <generator object <genexpr> at 0x0000025461F449E0>
              # ^^^ This result came back (more or less) immediately.

The reason it happened so fast isn't because "computers are fast" or "the miracle of modern technology." It's because almost no work has been done yet! The generator built by our comprehension will neither iterate its inputs nor calculate its results until asked. This means the act of calling next sets off a chain of events.


>>> i = (x * x for x in range(100000000))
>>> next(i)
    0      # Only now is the range asked for its first element, so it calculates
           # its first element (and no others), which is then squared and returned by
           # our generator.
>>> next(i)
    1      # The second element of the range is calculated, and its square is
           # then calculated, but that's it.

This makes range(100000000) less problematic than you might think. It's an object that describes an enormous range without actually being enormous. A call to range doesn't return a generator, but it does return an iterable whose iterator behaves quite similarly; ranges, too, are lazily evaluated when you iterate them.

What makes range(100000000) problematic is when you do something that materializes its sequence, which is to say that you actually attempt to calculate the whole thing. For example, if you wrote list(range(100000000)), you'd consume a lot of time and a lot of memory, because that would produce 100,000,000 integers and store them in a list. Or if you used range(100000000) as the input to a list, set, or dictionary comprehension instead, you'd have a similar problem, since those comprehensions consume their entire inputs and build proportionally sized data structures.
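
By contrast, asking a range questions about itself (its length, one of its elements, whether a value is in it) remains cheap, because a range can answer those questions arithmetically, without ever producing its elements. (A brief demonstration follows.)


>>> r = range(100000000)
>>> len(r)
    100000000
>>> r[50000000]
    50000000
>>> 99999999 in r
    True
           # Each of these answers came back (more or less) immediately, because none
           # of them required the range to produce all of its elements.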

Now that we understand the impact of lazy evaluation, though, we can use iterators, generators, and generator comprehensions to enable some patterns that might otherwise seem a little strange.


Infinite generators

When we first wrote a generator function, we wrote one that generated a sequence of integers, given a start and an end value for the sequence. What would have happened if we hadn't had an end value in the sequence at all? What happens if we write a generator function that produces an infinitely long sequence? Should we expect that calling it will cause it to run forever?


>>> def sequence(start = 0):
...     current = start
...     while True:
...         yield current
...         current += 1
...
>>> i = sequence(11)        # This was processed fairly instantaneously.
>>> next(i)
    11                      # So were these.
>>> next(i)
    12
>>> next(i)
    13

Having discussed lazy evaluation, we're already in tune with why this works the way it does. If generators only produce as many values as we ask for, then an infinite generator, one that produces an infinitely long sequence, is not itself problematic. Materializing that entire sequence will be problematic, though — list(sequence()) will run infinitely, or at least until it's consumed all the memory we have available — so this is a feature we'll want to use with some caution, but isn't something we'll simply avoid outright.


Combining generators

Using a generator to supply the inputs to another

How can we make use of our sequence generator function without manually iterating it? We can't materialize the entire sequence, so we won't be able to pass a sequence generator to the list constructor or use it to drive a for loop (unless it has a break or return statement in it, to get us out early somehow), but if we had a way to say "Take the first five elements of this sequence," we could materialize just those. Conceptually, "take the first five elements of this sequence" is a general enough idea that we might imagine being able to write it once, such as this function that returns a list of the first five elements from an iterable. (That iterable might, in turn, be an infinite generator, but if we only ask for its first five values, then the fact that it's infinite is harmless.)


>>> def take_five(values):
...     i = iter(values)
...     return [next(i), next(i), next(i), next(i), next(i)]
...
>>> take_five(sequence(11))
    [11, 12, 13, 14, 15]
>>> take_five(sequence())
    [0, 1, 2, 3, 4]

What if I wanted the elements in something other than a list? By being opinionated about the resulting data structure, this function is less broadly useful than it could be. "Take the first five elements of this sequence" is more general than that, so we should return a more general result. It, too, should be a generator function.


>>> def take_five(values):
...     i = iter(values)
...     yield next(i)
...     yield next(i)
...     yield next(i)
...     yield next(i)
...     yield next(i)
...
>>> list(take_five(sequence(11)))
    [11, 12, 13, 14, 15]

Let's think for a minute about how that last expression, list(take_five(sequence(11))), works.

There are three functions here: list, take_five, and sequence. They've been called in a nested fashion, which would usually mean that the innermost one would run to completion, passing its result to the middle one, which would run to completion and pass its results to the outermost one. But the altered character of generators — they do only as much work as you ask them to do — enables a very different sequence of events here, one that's both useful and common: a pipeline of results flowing from one function to the next, one value at a time.
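
We can watch that pipeline in action by instrumenting sequence with a print call, so that it announces each value as it produces it. (The traced_sequence function below is a variant introduced purely for illustration.)


>>> def traced_sequence(start = 0):
...     current = start
...     while True:
...         print(f'sequence is about to yield {current}')
...         yield current
...         current += 1
...
>>> i = take_five(traced_sequence(11))    # Nothing has been printed or calculated yet.
>>> next(i)
    sequence is about to yield 11
    11
>>> next(i)
    sequence is about to yield 12
    12
                      # Each value is calculated only at the moment take_five is asked
                      # for it, then flows through take_five to us.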

Still, take_five is a little unsatisfying. What if I wanted the first six, the first three, or the first thousand values in a sequence instead? Wouldn't the idea be similar? If so, why not make the number of values a parameter instead? "Take the first n elements of this sequence" is an even more broadly useful concept, after all.


>>> def take(count, values):
...     i = iter(values)
...     for _ in range(count):
...         yield next(i)
...
>>> list(take(5, sequence(11)))
    [11, 12, 13, 14, 15]
>>> list(take(10, sequence(1)))
    [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
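
And since take builds a generator rather than a list, its output can be handed to anything that consumes an iterable, not just the list constructor. Here are a couple of quick examples using the same definitions as above.


>>> sum(take(100, sequence(1)))
    5050
>>> max(take(5, sequence(11)))
    15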

There's one remaining issue to be dealt with: the input iterable, values, may not have enough elements in it to satisfy our request, so we want to be sure we're handling that.


>>> list(take(10, [1, 2, 3]))
    Traceback (most recent call last):
      File "<input>", line 4, in take
    StopIteration

    The above exception was the direct cause of the following exception:

    Traceback (most recent call last):
      ...
    RuntimeError: generator raised StopIteration

Unfortunately, we're not. When we exit a generator function normally (e.g., when the flow of control falls off the end of it), the generator raises a StopIteration exception. When our generator function's body itself raises a StopIteration exception, then that's actually considered a kind of failure. In our case, our fourth call to next(i) failed, because there were only three elements in the list being iterated. We can solve that problem by recognizing that situation and handling it in some way. A reasonable design for our take generator function might be to return what's there in this case; if we ask for the first ten values in a three-value sequence, we'll get only those three values.


>>> def take(count, values):
...     i = iter(values)
...     for _ in range(count):
...         try:
...             yield next(i)
...         except StopIteration:
...             break
...
>>> list(take(5, sequence(11)))
    [11, 12, 13, 14, 15]
>>> list(take(10, [1, 2, 3]))
    [1, 2, 3]

Finally, we should take note of the fact that generator functions and generator comprehensions both build generators, so we would reasonably expect to be able to combine them in ways similar to what we've seen. For the purposes of clarity, we can introduce names for our intermediate sequences — notably without burdening our performance, since storing generators in variables doesn't cause them to be materialized.


>>> from_zero = sequence(0)
>>> squares = (x * x for x in from_zero)
>>> list(take(10, squares))
    [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
               # Only ten values from sequence are generated, only ten squares
               # are calculated, and only ten elements are appended to the list.
>>> list(take(10, (x * x for x in sequence(0))))
    [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
               # We could also have written this in one expression, though these become
               # more difficult to read as they consist of more moving parts and the
               # parentheses nest more deeply.

Forwarding the entire output of one generator via another

We've seen that you can use the outputs of one generator as the inputs to another, which can be a useful way to build a pipeline that lazily evaluates multiple operations one input at a time.

Another way we can combine generators together is to forward the entire output of one generator as the output of another. For example, let's suppose that we want to build a list containing the values 0..9 followed by the values 20..24. Given our previously built sequence and take generator functions, we could write something like this.


>>> list(take(10, sequence(0))) + list(take(5, sequence(20)))
    [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 20, 21, 22, 23, 24]

This has the unfortunate characteristic of building a total of three lists: one containing the integers 0..9, another containing the integers 20..24, and a third containing their concatenation. If we could build a function that takes the two generators as parameters, we could concatenate them together within the function ourselves.


>>> def concatenate(first, second):
...     result = []
...     for value in first:
...         result.append(value)
...     for value in second:
...         result.append(value)
...     return result
...
>>> concatenate(take(10, sequence(0)), take(5, sequence(20)))
    [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 20, 21, 22, 23, 24]

But we should think more carefully about this. The idea of concatenating two iterables together — all of the values in the first, followed by all of the values in the second — is a fairly universal one, so perhaps we should be building a universal tool to solve it instead. Let's turn concatenate into a generator function, with the same two loops in it, but yielding its values instead of appending them to a list.


>>> def concatenate(first, second):
...     for value in first:
...         yield value
...     for value in second:
...         yield value
...
>>> list(concatenate(take(10, sequence(0)), take(5, sequence(20))))
    [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 20, 21, 22, 23, 24]

This works really nicely, and provides a tool we can use anywhere an iterable can be consumed. The only place in concatenate where there's some friction is in the fact that we had to write two for loops by hand — which certainly isn't the end of the world, but is noisier than it could have been.

A number of years ago (in version 3.3), Python added a statement known as yield from that provides a simple solution to this kind of problem. Within a generator function, the yield from statement yields every element from an iterable, one at a time, as though you had written a for loop to iterate it and used yield to generate each value separately.


>>> def concatenate(first, second):
...     yield from first
...     yield from second
...
>>> list(concatenate(take(10, sequence(0)), take(5, sequence(20))))
    [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 20, 21, 22, 23, 24]

(There's more to the story of yield from — and generators in general — than we've seen here, but this is good enough for us for now. If you want to read more about the design and history of yield from, PEP 380 describes it in detail. Meanwhile, we'll likely see yield from again later in the course when we talk about recursion and recursively organized data structures; that's one place where the technique really shines.)
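
To give a brief preview of that, here's a sketch of how a recursive generator function might use yield from to forward everything produced by its recursive calls. (The flatten function here is purely illustrative; we won't need anything like it for a while yet.)


>>> def flatten(nested):
...     for element in nested:
...         if isinstance(element, list):
...             yield from flatten(element)    # Forward every value the recursive call produces.
...         else:
...             yield element
...
>>> list(flatten([1, [2, [3, 4]], 5]))
    [1, 2, 3, 4, 5]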

As an aside, it's worth noting that the iterable unpacking notation that we've seen previously can also be used to assist with the particular case of building a list from two iterables, because function arguments aren't the only place where iterable unpacking is permitted. We can also perform iterable unpacking in literal lists, in which case the contents of the list are determined by the values in those iterables. Generators are iterable, so they can be unpacked just like any other iterables can.


>>> [*take(10, sequence(0)), *take(5, sequence(20))]
    [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 20, 21, 22, 23, 24]

Nonetheless, the value in having a generator function like concatenate is that it solves a broader problem. Rather than specifically solving the problem of building a list containing the contents of two iterables, it instead solves the problem of generating the contents of two iterables lazily, for whatever purpose it's required. We can use it to build a list, but we can just as well use it for many other things.
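
For example, because concatenate produces its results lazily, we can hand them directly to any function that accepts an iterable, such as str.join, without building an intermediate list ourselves first.


>>> ' '.join(concatenate(['Boo', 'is'], ['happy', 'today']))
    'Boo is happy today'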

Additionally, we should consider that the related feature of tuple-packing parameters offers an ability that's more useful in this context. Why should the concatenate function be limited to concatenating exactly two iterables together? It would be more useful if we could concatenate as many as we want. A tuple-packing parameter in our concatenate generator function gives us a natural way to make that possible.


>>> def concatenate(*sequences):
...     for sequence in sequences:
...         yield from sequence
...
>>> list(concatenate(take(7, sequence(0)), take(5, sequence(11)), take(2, sequence(17))))
    [0, 1, 2, 3, 4, 5, 6, 11, 12, 13, 14, 15, 17, 18]
>>> list(concatenate())
    []


Related functions in Python and its standard library

In your prior coursework, you likely gained some experience with parts of Python's standard library, which naturally would have led you to realize that Python has a lot of commonly used tools available already, so that we don't have to build them ourselves. If you consider the kinds of generator functions we've written here, you'll realize that they were hardly esoteric.

These are solutions to commonly occurring problems — you'll rarely write programs that don't encounter things of this nature, even if not precisely these — so they might be just the kinds of things you'd find built into the Python language or available in its standard library. And, indeed, there are many related tools that are implemented already, some of which are generator functions, others of which produce iterators (which, ultimately, are quite similar), and still others that accept iterables as arguments and evaluate them lazily.

Notable built-in functions

Even without importing any modules, a fair number of functions and types are available automatically in Python; they're referred to as built-ins. For example, Python's documentation describes its collection of Built-in Functions. At present, we're interested in functions that produce or consume iterables or generators. What's available? Below is a (non-exhaustive) collection of useful functions.


>>> list(map(lambda n: n * n, [1, 2, 3]))
    [1, 4, 9]
           # map applies a function to each element in an iterable, returning an
           # iterator that produces those results.
>>> list(filter(lambda n: n > 0, [-1, 1, -2, 2, -3, 3]))
    [1, 2, 3]
           # filter applies a function to each element in an iterable, returning an
           # iterator producing only the values for which the function returned a truthy value.
>>> any(['Boo', 'is', 'happy', 'today'])
    True
           # any takes an iterable and returns True if any of the elements are truthy,
           # or False if none of them are truthy.
>>> any(len(n) > 3 for n in ['Boo', 'is', 'happy', 'today'])
    True
           # Generator comprehensions provide a nice (and performant) way to turn any()
           # into a function that returns True if any element in an iterable has some
           # characteristic.
>>> all(['Boo', 'is', 'happy', 'today'])
    True
           # all is similar to any, except it only returns True if all of the elements in
           # the iterable are truthy.
>>> all(len(n) < 8 for n in ['Boo', 'is', 'happy', 'today'])
    True
           # Generator comprehensions can open up possibilities for all(), too.
>>> list(enumerate(['Boo', 'is', 'happy', 'today']))
    [(0, 'Boo'), (1, 'is'), (2, 'happy'), (3, 'today')]
           # enumerate returns an iterator that produces a sequence of two-element
           # tuples, where the first element is a consecutively increasing index and the
           # second element is one of the elements of the given iterable.
>>> list(zip(['A', 'B', 'C'], ['1', '2', '3']))
    [('A', '1'), ('B', '2'), ('C', '3')]
           # zip returns an iterator that produces a sequence of two-element
           # tuples, where each tuple contains an element from the first given iterable
           # and an element from the second given iterable.
>>> list(zip(['A', 'B', 'C'], ['1', '2', '3'], ['!', '%', '$']))
    [('A', '1', '!'), ('B', '2', '%'), ('C', '3', '$')]
           # zip can combine more than two iterables.

The terminology here is important: functions that return iterators calculate their results one element at a time (just like generators do), and functions that can process their iterable inputs lazily — like any or all, which might know their answer after processing only their first input element — will generally do so.
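
We can see that laziness pay off by handing any an infinite sequence, such as one built by the sequence generator function we wrote earlier; as soon as a truthy result appears, any stops asking for more.


>>> any(x > 1000 for x in sequence(0))
    True
           # Even though sequence(0) is infinite, this returned (essentially) immediately,
           # because any stopped iterating at the first value greater than 1000.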

The itertools module

Meanwhile, the itertools module in the standard library provides a number of additional tools that can be useful for building and manipulating iterators (and, therefore, generators). Those that can consume their inputs or produce their outputs (or both) lazily will do so, just like the generators we've seen so far.


>>> import itertools
>>> list(itertools.islice('Boo is happy today', 6))
    ['B', 'o', 'o', ' ', 'i', 's']
            # islice is similar to the take generator function we wrote previously,
            # returning a "slice" of an iterable.  In this case, we're asking for the first
            # six values.
>>> list(itertools.islice('Boo is happy today', 7, 12))
    ['h', 'a', 'p', 'p', 'y']
            # islice can also be given start and end indices separately.  Unlike the
            # slice notation for lists, strings, etc., negative indices are not supported.
>>> list(itertools.islice(itertools.count(0), 10))
    [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
            # count is similar to the sequence generator function we wrote previously,
            # returning an infinite sequence of consecutive integers.
>>> list(itertools.islice(itertools.repeat(17), 5))
    [17, 17, 17, 17, 17]
            # repeat produces a potentially infinite sequence of the same value repeating.
>>> list(itertools.islice(itertools.chain('Boo', 'Alex'), 7))
    ['B', 'o', 'o', 'A', 'l', 'e', 'x']
            # chain is identical to the last version of the concatenate
            # generator function that we wrote previously.
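
A couple of additional examples give a sense of what else is available there.


>>> list(itertools.takewhile(lambda x: x < 5, itertools.count(0)))
    [0, 1, 2, 3, 4]
            # takewhile produces values from an iterable only until the given function
            # returns a falsy result, then stops (even if later values would qualify).
>>> list(itertools.accumulate([1, 2, 3, 4]))
    [1, 3, 6, 10]
            # accumulate produces a running total of its input by default, though
            # other operations can be specified.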

There are many other iterator-producing and iterable-consuming functions available in the itertools module. It's worth taking a look through the documentation, so that you'll know what's available; that way, when you need one of these tools, you can simply use it, rather than building and testing it yourself.