JavaScript async/await part 1: How we got here
13 Oct 2019
It's fair to say that, in 2017, JavaScript changed forever with the arrival of the async/await combo. Finally, JavaScript supported a way of writing synchronous-looking code that handled asynchronous processes, hidden away as an implementation detail.
In this two-part guide, I'll first look at how we got here - the history of trying to solve the problem of asynchronous code resulting in messy, hard-to-maintain code - and then, in part two, at how to use the rather revolutionary async/await.
Back story 🔗
Asynchronicity has long been a big part of JavaScript. But the problem was always that it ruined the top-down, sequential look of your code and you had to go into a world of callbacks - and if you needed to do several, sequential requests, nested callbacks.
This is known as callback hell - an ever-deeper voyage into more and more closures just so you can handle discretely-ordered asynchronous operations.
Suppose we have a function which receives a URI and a callback, and performs an AJAX request. For brevity, we'll leave out its body, but it looks something like this:
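The snippet below is a sketch of that function's shape - just the signature and a placeholder comment, since the AJAX mechanics inside aren't important here:

```javascript
//receives a URI and a callback; performs an AJAX request to the URI
//and invokes the callback with the response once the request completes
function request(uri, callback) {
    //...perform the AJAX request, then call callback(response)...
}
```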
Then say we needed to do a couple of requests, the first completing before the second is begun (perhaps the first one returns some data that the second one needs to send.) We'd have to do something like this:
request('foo', function(response) {
    request('bar', function(response2) {
        //finally do something
    });
});
somethingSynchronous();
There are two problems there. Firstly, if you were new to JavaScript and understandably expected things to happen in the order you write them, you'd be forgiven for wondering why somethingSynchronous() ran before either of our requests completed. That's how we read the code flow, but it's not what happens.
Secondly, we descend into an ever-deeper level of indentation, making code hard to manage and hard to read.
As humans, we think in terms of a single, top-to-bottom flow of operations, not a tangled set of branches. And it turns out that's what JavaScript does too; it's a single-threaded language, meaning it only does one thing at a time.
It is possible to make AJAX requests synchronous, but this is bad practice; it blocks the browser from executing further lines of code, and if your request takes a long time the browser could become unresponsive.
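For the record, a synchronous request is made by passing false as the third argument to XMLHttpRequest's open() method - shown here purely as a sketch of what not to do:

```javascript
//DON'T do this - the false flag makes send() block the browser's
//main thread until the response arrives
function requestSync(uri) {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', uri, false); //false = synchronous
    xhr.send(); //blocks here until the response is in
    return xhr.responseText; //available immediately - nothing else ran
}
```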
What happens when we kick off an asynchronous request is that JavaScript tells the browser to go and fetch something, and promises to call back later, when it gets a free second between executing other, synchronous tasks.
do something
* kick off async request - JS will check later on progress *
do something
* JS checks if request complete yet - it's not *
do something
* JS checks again - now it's complete; call callback *
do something
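We can see this scheduling in action with a zero-delay timer standing in for the asynchronous request:

```javascript
let order = [];

order.push('do something');
//kick off "async work" - a zero-delay timer standing in for a request
setTimeout(() => order.push('async callback'), 0);
order.push('do something else');

//the callback runs only once all the synchronous code has finished
setTimeout(() => console.log(order), 10);
//logs ['do something', 'do something else', 'async callback']
```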
jQuery Deferreds and promises 🔗
Back in 2011, jQuery 1.5 introduced what it called deferred objects, which proved an early blueprint for the native promises that arrived four years later in ECMAScript 2015 (although the two work slightly differently).
It's safe to say they revolutionised AJAX in JavaScript. Although they still didn't let us write synchronous-looking code that did asynchronous stuff - only a change to JavaScript syntax would allow that - they did make code flow much friendlier. Suddenly, all jQuery AJAX functions implicitly returned deferred objects, which meant they could be used with the rich deferred object API.
A deferred object is created in an unresolved state; callbacks are attached to it, and it is later resolved or rejected.
let dfd = $.Deferred();
dfd.done(data => console.log(data));
request('foo.html', response => dfd.resolve(response));
Let's say we wanted to retrieve a file over AJAX the first time a function was called, but on subsequent calls return a cached version of the data.
let cache;
function getData() {
    let dfd = $.Deferred();
    if (!cache)
        $.get('file.txt', data => dfd.resolve(cache = data)); //cache it for next time
    else
        dfd.resolve(cache);
    return dfd;
}
getData().done(data => console.log(data)); //first time - from request
getData().done(data => console.log(data)); //second time - from cache
That deferred.done() method works great for such situations.
Even better, you could create a global deferred that listened for sub-deferreds to complete before doing something:
let req1 = $.get('foo.html'); //returns a deferred object
let req2 = $.get('bar.html');
let req3 = $.get('etc.html');
$.when(req1, req2, req3).then((r1, r2, r3) => {
    console.log('Responses were:', r1, r2, r3);
});
You could even chain them! Suppose we needed to do several requests where order was important:
$.get('foo.txt')
    .then(r => $.get('bar.html?data=' + r))
    .then(r => $.get('etc.html?data=' + r))
    .then(r => console.log(r));
See how we don't have to descend into callback hell, getting deeper with each request?
In 2015, JavaScript got its own version of deferreds in the shape of promises. I won't go into the ins and outs of promises here, but suffice it to say they were designed with the same goals in mind.
They have their differences; for one thing, promises are normally resolved from inside their own callback, whereas deferreds allow for resolution from outside.
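That difference is easy to demonstrate with native code - in fact, the deferred pattern can be recreated on top of a native promise (makeDeferred is my own name for this commonly-seen helper):

```javascript
//a native promise is resolved from inside its executor callback:
let promise = new Promise(resolve => {
    resolve('from the inside'); //only code in here can resolve it
});

//a deferred exposes its resolve function to the outside world; we can
//mimic jQuery's $.Deferred() with a native promise like so:
function makeDeferred() {
    let resolve;
    let promise = new Promise(res => { resolve = res; });
    return { promise, resolve };
}

let dfd = makeDeferred();
dfd.promise.then(data => console.log(data));
dfd.resolve('from the outside'); //resolved externally - deferred-style
```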
Anyway, that's the jQuery diversion. Things really started to change in 2015, however, with the arrival of generator functions.
Generator functions 🔗
Generators were the first real sign that JavaScript was actively addressing the problem of asynchronicity being synonymous with messy, hard to maintain code.
If you're unfamiliar with generators, check out my three-part guide to them.
Generators can be used with async/await when they exist in the form of asynchronous generators!
Suddenly it was possible to literally pause and resume code execution - but not in a browser-blocking way like a synchronous AJAX function would; execution was paused only inside the generator function; synchronous code outside of it would continue as normal. This was a dream-come-true for synchronous-looking-yet-actually-asynchronous code design.
The key to this is the yield keyword which, when hit, pauses execution at that point and spits out - "yields" - the value to its right. We can then feed a value back to it when we return to our generator.
function* myGen() {
    let foo = yield 5;
    yield foo;
}

let it = myGen(), //returns an iterator - doesn't execute the function!
    val1 = it.next().value, //5
    val2 = it.next(val1 * 2).value; //10
Generators can look very odd at first if you're new to them, so if the above looks bizarre, check out my guide to generators. But what we're basically doing is yielding out a value - at which point execution within the generator function (only) pauses - and then resuming it later with a further call to iterator.next(), passing in a value as we do, which becomes the value of the let foo = assignment. Finally, we yield out this value.
Knowing this, it's not hard to see how we can exploit this to hide away asynchronous behaviour. Generators work particularly well with promises, so let's hook it up to the Fetch API, which provides a promises-based means of handling AJAX requests.
Let's modify our request() function from earlier to work with a generator and the Fetch API.
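Something like this would do the trick - note that this body is my own sketch: it relies on the iterator, it, which we create below, and resumes the generator itself once the fetch completes:

```javascript
function request(uri) {
    //kick off the fetch; when it completes, resume the generator,
    //feeding the response text back in as the result of the yield
    fetch(uri)
        .then(response => response.text())
        .then(text => it.next(text));
}
```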
Now for our generator, which will get data from file 1, then pass its response to a second request to get file 2.
function* myGen() {
    let data1 = yield request('foo.txt');
    let data2 = yield request('bar.txt?id=' + data1);
    console.log(data1, data2);
}
let it = myGen(); //get iterator
it.next(); //kick things off
See how our generator looks for all the world synchronous? Suddenly we could run sequential requests without our main code flow having to handle callbacks, thanks to the stop-start code execution, and our asynchronicity is hidden away by the magic of the yield keyword and the request() function.
Summary 🔗
Right from the early days, JavaScript developers have craved synchronous-looking code that is capable of asynchronous execution. First we had synchronous AJAX requests, which are a bad idea because they block the browser.
Later came jQuery deferred objects, which went a long way to informing the eventual promises API that came out some years later. These allowed promise chaining, and provided a rich API for handling situations where resolution or rejection didn't come until a later, arbitrary point.
Generator functions were the first real breakthrough that literally allowed stop-start code execution (within the generator function), via the yield keyword, which meant we really could hide away asynchronous operations under the guise of synchronous-looking code.
And then came the async/await combo. Let's head over to part 2 to see how they work!
Did I help you? Feel free to be amazing and buy me a coffee on Ko-fi!