JavaScript generators part 2: Generators and asynchronicity
10 Mar 2019
Welcome to the second part of my three-part guide to JavaScript generators. If you're new to generators and haven't yet read part 1, I'd strongly recommend starting there so the concepts discussed in this article make more sense. I'll wait for you here :)
Synchronous, but not as we know it
In the last article we saw how generators have the unique ability to pause and resume execution within a single function. We didn't really exploit that capability to do anything terribly exciting, but that certainly changes once we start to get our generators to kick off asynchronous operations.
If you're familiar with ECMAScript 2017's async/await combo, this idea of synchronous-yet-asynchronous should be pretty familiar. In fact, async/await offers an even more elegant (and more concise) way of handling asynchronous operations with synchronous syntax. We'll look more at generators vs. async/await a little later.
Recall that generators pause when they come to a yield point, and resume on the next call to next(), which may pass in a value that is received by that waiting yield. Let's apply that to an asynchronous situation.
//util to create a random number 0-9
let rand = () => Math.floor(Math.random() * 10);
//generator stuff
function* myGenerator(howMany) {
  let total = 0;
  for (let j = 0; j < howMany; j++) {
    //pause here; the timeout callback resumes us, passing in a random number
    total += yield setTimeout(() => it.next(rand()), 100);
  }
  console.log(total);
}
let it = myGenerator(5);
it.next();
There, we define a simple random number factory, and call it an arbitrary number of times from our generator, delayed by 0.1 seconds each time (thus making it asynchronous). The timeout callback sends the generated number back into our generator, adding it to the running total, until we end up with the final total.
Notice in particular how yield in this case is used as a technique rather than a means to spit out meaningful data. We yield the "value" of the timeout - the ID that setTimeout() returns - which isn't of interest for our purposes, and nothing is listening for it.
Now this is all fine, but it's not optimal. For one thing, we're sort of defeating the point of generators by including our asynchronous legwork in the generator body. What's the point of synchronous syntax if we mix it up with asynchronous syntax?
Better would be to farm out our asynchronicity, and this is a great opportunity to look at the powerful combination of generators and promises.
Generators and promises
Generators and promises merge the syntactic advantages of generators with what Kyle Simpson calls the "trustability and composability" of promises. This means that promises have a rich API, work in a highly codified, predictable way, and are thus trustworthy within a wider code sphere.
Let's get our AJAX on. Suppose we wanted to log in a user based on an entered username and password, resolve that user to an ID, then feed that ID to a second request to get data about the user.
First let's set up a helper function that handles our requests. We'll do these via fetch(), because fetch() creates and returns a promise. Our function will receive three arguments - the URL to query, an object of params, and a reference to our generator iterator. We'll assume our server always responds with JSON.
function req(url, params = null, it) {
  fetch(url, {method: 'post', body: JSON.stringify(params)})
    .then(res => res.json())
    .then(data => it.next(data)); //resume the generator, passing in the parsed response
}
Notice that it doesn't return anything. That's fine - once again we're going to use yield as a technique, not for its returned (yielded) value, and nothing is waiting to grab that yielded value from next().value. We effectively yield undefined, and instead we simply need our helper to kick off the next generator cycle once the promise resolves.
Now for our generator. We'll assume we've already grabbed the entered user/pass and stored them in variables.
function* myGenerator(user, pass) {
let data1 = yield req('/login', {user: user, pass: pass}, it);
let data2 = yield req('/data', {user_id: data1.user_id}, it);
console.log(data2); //{name: 'Dmitry', age: 36, ...}
}
let it = myGenerator(user, pass);
it.next();
Now our generator is looking the way generators are supposed to look - synchronous - because we've abstracted out the asynchronicity to a helper function, powered by the well-structured and reliable nature of promises.
Let's take a closer look at how this works. First we initiate our generator and in so doing pass it our stored user and pass, which we need to pass to the first request. We kick things off with next() and the generator runs up to the first yield point, firing off the first request and then pausing while it's in flight.
After the first request is fired, our helper method, which is responsible for the hidden-away asynchronicity, handles the response and eventually passes it back to our generator - to the yield point where execution paused - and the returned object is assigned to our data1 variable. If we suppose the server response was something like this (a hypothetical shape - user_id is the only property we go on to use):
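//hypothetical login response - user_id is the only property we rely on
{
  "user_id": 123
}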
...this means data1 stores an object of that nature.
Great, now we're free to resolve that user ID to some meaningful user data. So our generator continues its execution, and hits the second yield point. From here, it's like the first. Execution is paused, we send out another request, this time passing along the user ID we extracted from the first request.
Once again our helper handles the request, and assuming all is well passes the resultant object back into our generator, to the waiting yield, and ultimately to our data2 variable.
And that's the end of the process! Synchronous-looking code, with hidden away asynchronous abstraction. But we can go further.
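Before we do, here's a minimal, self-contained sketch of that flow which you can paste into a console and run. The fakeReq() helper is a made-up stand-in for req() that resolves canned data after a short delay instead of hitting a server, but the generator half is identical in shape to the example above.
//stand-in for req(): resolves canned data after 100ms instead of hitting a server
function fakeReq(url, params, it) {
  const cannedResponses = {
    '/login': {user_id: 123},
    '/data': {name: 'Dmitry', age: 36}
  };
  new Promise(resolve => setTimeout(() => resolve(cannedResponses[url]), 100))
    .then(data => it.next(data)); //resume the generator with the "response"
}
function* myGenerator(user, pass) {
  let data1 = yield fakeReq('/login', {user: user, pass: pass}, it);
  let data2 = yield fakeReq('/data', {user_id: data1.user_id}, it);
  console.log(data2); //{name: 'Dmitry', age: 36}
}
let it = myGenerator('someUser', 'somePass');
it.next();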
Handling multiple requests
So this is all very well. In the last example we needed the requests to proceed one after the other, in a specific order. But what if we didn't? How do we use generators to power concurrent asynchronous requests? Given their stop-start nature, won't they always do one request at a time?
Except where order is important, you'll normally want multiple asynchronous requests to happen concurrently, for the performance benefits.
Yes - unless we use some more promises magic in the shape of Promise.all(), which takes an array of promises and itself returns a promise which is resolved once all the inner promises have resolved.
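If you haven't met Promise.all() before, here's a quick standalone illustration of that behaviour (the delays and values here are arbitrary, purely for demonstration):
//three promises that resolve at different times
let p1 = new Promise(res => setTimeout(() => res('books'), 300));
let p2 = new Promise(res => setTimeout(() => res('comics'), 100));
let p3 = new Promise(res => setTimeout(() => res('games'), 200));
//the outer promise resolves only once all three have resolved,
//and the results come back in input order, not resolution order
Promise.all([p1, p2, p3]).then(results => console.log(results));
//['books', 'comics', 'games'] after ~300ms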
Suppose we want to get a bunch of products of different categories in our store. First we'll slightly modify (and simplify) our req() helper that handles the requests - we'll call it request() from here on. This time, it won't be responsible for kicking off the next generator cycle via next() (nor does it need to receive and pass along any params) - we just want it to return a promise (from fetch()).
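Here's roughly what that simplified helper might look like (a minimal sketch, assuming the same POST-and-JSON conventions as before):
//simplified helper: just return the fetch() promise; no params,
//and it no longer drives the generator itself
function request(uri) {
  return fetch(uri, {method: 'post'})
    .then(res => res.json());
}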
And now our generator:
function* myGenerator() {
let types = ['books', 'comics', 'games'],
subPromises = types.map(type => request('items/'+type));
let data = yield Promise.all(subPromises);
console.log(data);
}
let it = myGenerator();
let prom = it.next().value.then(data => it.next(data));
There are a couple of key differences this time around.
Firstly, our request() helper now returns a value - because our generator is expecting it to, when compiling the array of sub-promises that form our three product requests. Armed with this, the generator can then feed this array of promises to Promise.all() and wait for all three to complete - in any order - before moving on.
Secondly, whereas previously outside the generator we just kicked things off with it.next() and nothing was listening for the result, this time we are. We listen for the resolution of the outer promise, and then send the data (the array of responses from the sub-promises) back into our generator to the waiting yield so it can populate our data variable. From there we can console.log() it or work with it further.
And most important of all, it all still looks synchronous. We're not doing the asynchronous parts in the generator - there are no callbacks - our code is easy to read and flat.
Error handling
One of the other great things about promises is that they provide a clear, well-defined way of handling errors. This is good news for generators, which also have a clear way of handling errors.
Suppose we redid our AJAX example with jQuery AJAX or good old XMLHttpRequest - in other words, some approach other than promises. There's no clear pipeline for errors, and if one is thrown, it's not clear what that would mean for our overall flow.
With promises, we can listen for server errors by testing the response.ok property, which will be true on an HTTP response code of 2xx and false otherwise. We can then throw an error if there's a problem.
With generators, we can throw errors via iterator.throw(). Let's modify our previous example to handle errors. First we'll add in a check to our request function for any problems.
function request(uri) {
  return fetch(uri, {method: 'post'})
    .then(r => {
      if (!r.ok) throw new Error('Oh no!'); //bail out on a non-2xx response
      return r; //otherwise pass the response along so we can parse it
    })
    .then(r => r.json());
}
Then we'll wrap the yielded-requests part of our generator in a try-catch block.
function* myGenerator() {
try {
let types = ['books', 'comics', 'games'],
subPromises = types.map(type => request('items/'+type));
let data = yield Promise.all(subPromises);
console.log(data);
} catch(e) {
console.error('Something went wrong. Message: ', e);
}
}
Lastly we need to "catch" that error that our request function throws. We'll do that by chaining onto the promise yielded out of our generator by Promise.all(). One way to wire it up (a minimal sketch) is to add a catch() handler that forwards the error into the generator via it.throw(), where the try-catch above will pick it up:
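let it = myGenerator();
it.next().value
  .then(data => it.next(data)) //all good: resume the generator with the results
  .catch(e => it.throw(e)); //a request failed: throw the error into the generator, where our try-catch handles it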
Summary
The pause-resume nature of generators makes them a great fit for hiding away asynchronous operations as an implementation detail behind synchronous-looking code.
Promises are perfect bedfellows for generators, because the codified, trustworthy workflow of promises slots together nicely with the synchronous-yet-asynchronous nature of generators. Promises have well-defined error handling that can be wired directly into generators' own mechanism for error handling, namely iterator.throw().
While generators may seem at first better suited to handling one promise at a time, or cases where promise resolution order is important, they can support multiple, parallel promises via techniques such as Promise.all(), yielding out a promise wrapper which is responsible for monitoring the sub-promises fed to it.
In part 3, we'll finish up by taking generators further, looking at delegated generators, generators versus async/await, and generators as iterables.
Did I help you? Feel free to be amazing and buy me a coffee on Ko-fi!