#100DaysOfWeb in Python Transcripts
Chapter: Days 21-24: Async Flask APIs with Quart
Lecture: Asynchronous Python execution with async and await

0:00 We've seen our synchronous version running. Now, let's improve it so we can have our code running during these downtimes
0:07 and we're going to create an asynchronous version. So we'll just duplicate that file in PyCharm and we'll look over here.
0:17 So the first thing that we need to do in order to make our code async in Python is to use what's called an event loop.
0:26 So there's going to be a little bit more stuff to orchestrate these things; main is going to get slightly more complicated.
0:31 Let's not necessarily focus on main right away; let's look at the other two pieces here and see how this works. So in order to use
0:41 the asynchronous programming model in Python we really need to use some of these asynchronous data structures
0:48 and for that reason, we're going to import asyncio. Now, over here, we have a list and getting items out of the list and putting items into the list
1:00 there's no way to sort of coordinate that and say I would like, for example, to wait until there's an item in this list and then get it out.
1:10 But you're welcome to go do other work while I'm waiting. Lists don't work that way, they're not built for that.
1:14 So what we're going to do is convert this to a queue that comes with asyncio, like that. And notice, PyCharm is saying
1:25 you said this was a list, it's no longer a list. So we'll come over here and change our little annotation. And let's look at this one, this generator
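A minimal sketch of that swap. A plain list has no way to say "wait until there's an item," but asyncio.Queue does: its put and get are awaitable (the names here are just for illustration).

```python
import asyncio

async def demo():
    # instead of data: list = []
    data: asyncio.Queue = asyncio.Queue()
    # instead of data.append(...)
    await data.put((1, "first"))
    # waits (cooperatively) until an item exists, then returns it
    return await data.get()

print(asyncio.run(demo()))  # (1, 'first')
```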
1:36 this producer bit. In order for our functions to operate in this asynchronous mode, we have to mark them as async.
1:45 We have to say this method participates in a different way; in fact, the way it's implemented when it runs is really different.
1:52 So we come over here and we say this is an async method, async def rather than say, this one which is a synchronous one.
1:59 Once we've said it's an async method then we find all the places that we are waiting. So there's going to be actually two places.
2:08 This doesn't have an append, but it has a put. And what we're putting into it is this tuple. So really, one thing is going to it
2:18 even though it may look like two. Now, PyCharm says, whoa, whoa, something is wrong here. We hover over it, and it says, you're not awaiting this.
2:25 So this put is one of these async methods itself, and it may take some time to put this in here because of thread coordination and whatnot.
2:34 So we come over here, and say we would like to pause our thread until this operation is done potentially giving it up for
2:43 some other bit of our code to run. We don't have to really do anything special we just have to say, we're willing to wait
2:49 we're willing to let Python interrupt our execution here. And then, this time.sleep well, we want a different kind of sleep
2:57 we want asyncio because this one is awaitable. Again, this is signaling to Python I would like you to run through this loop until you get here
3:07 then you can let my thread run or if you have data, just give it to me and we'll do a little print, and then we're going give up our thread's time
3:14 and let other stuff work. This one is really the important one. We're going to let other code run while we're waiting.
3:22 That's what that await has just done, and when it's done, then we go back and go to the next thing through the loop. Python handles it beautifully for us.
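A sketch of the producer described so far; generate_data is the name used in the video, but the exact payload and sleep times here are assumptions.

```python
import asyncio
import datetime
import random

async def generate_data(num: int, data: asyncio.Queue):
    for idx in range(1, num + 1):
        item = idx * idx
        # Awaitable put: we pause here until the item is enqueued,
        # letting other coroutines run in the meantime.
        await data.put((item, datetime.datetime.now()))
        # asyncio.sleep (unlike time.sleep) is awaitable, so other
        # code can run while this producer waits.
        await asyncio.sleep(random.random() / 100)
```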
3:27 Let's do the same thing here. So async def, and then notice there's a get on our queue. Now, for some reason, PyCharm doesn't detect this
3:38 but if you go to the definition for get you can see it's an async method. That means we have to await it within our async method.
3:46 So we do this, and we're going to await it. Actually, we don't even need this anymore. This didn't use to be a blocking call, but now it is.
3:55 And then, down here, asyncio.sleep, and we have to await it, PyCharm tells us, there we go. So, when this one's processing, it's busy
4:04 and then it says, hey, while I'm waiting you can go back and do other work like say, for example, generate_data.
4:11 That is what we need to do to make our methods async. Really, you just add the async keyword, and everywhere you might be waiting
4:19 you just throw in the keyword await. So I told you, this programming model is straightforward. Understanding all the nuances of it
4:27 how it's implemented, different story. We're actually going to only touch a little bit on that. Now, in order to run these async methods
4:35 there is a little bit of ceremony that we have to go through. Not all languages work this way Python does for whatever it's worth.
4:43 So what you have to do is create this thing called an async loop and basically, things within that given loop share execution time
4:51 so they're all trying to run and when one of them says, hey, I'm awaiting something the others get a chance to run
4:57 so it's like this shared cooperative multi-threading. So I'm going to create a thing called a loop, and we just call get_event_loop here.
5:07 Now, we have our data. Now, instead of directly calling these we're going to do something different.
5:12 We're going to go over here and say loop, create a task. The reason we need to create the task is we want to wait until these two operations are done
5:23 and then, our program is going to be done. So I'll call this task1 and same thing for this async method but of course, task2.
5:33 That's pretty straight-forward. And then, finally, we would like to create something that lets us say we're done, our program is done
5:42 when all of these tasks are done so when all the generation is done and all the data processing is done. So what we can do is
5:48 create a third task from the first two. The way we do that is we do what's called gathering the tasks.
5:56 So we'll say gather, and we just pass all the tasks as star args, and then we can tell our loop to run. So up to here, nothing has happened
6:04 nothing has executed, so down here, we say loop.run_until_complete(final_task). So it's a little bit more ceremony
6:13 to sort of coordinate all this stuff, but it's not too bad. However, once you get the start-up stuff done everything is golden.
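Putting the ceremony together, roughly as described; the producer and consumer bodies are simplified stand-ins for the ones built in the video, and the counts and sleeps are shortened. The video calls asyncio.get_event_loop(); new_event_loop plus set_event_loop is the equivalent explicit form on newer Python.

```python
import asyncio
import random

async def generate_data(num: int, data: asyncio.Queue):
    for idx in range(1, num + 1):
        await data.put(idx * idx)
        await asyncio.sleep(random.random() / 100)

async def process_data(num: int, data: asyncio.Queue):
    processed = 0
    while processed < num:
        await data.get()
        processed += 1
    return processed

def main() -> int:
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    data = asyncio.Queue()

    # Nothing runs yet: create_task only schedules work on the loop.
    task1 = loop.create_task(generate_data(20, data))
    task2 = loop.create_task(process_data(20, data))

    # Gather the two tasks into one final task, then run the loop
    # until that combined task is complete.
    final_task = asyncio.gather(task1, task2)
    results = loop.run_until_complete(final_task)
    loop.close()
    return results[1]  # how many items the consumer processed

print(main())  # 20
```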
6:23 So, you're ready to run it? Are you excited to see what happens? Well, first of all, hopefully, I didn't screw this up. We're going to find out.
6:30 So we're going to run this one, and now, look at this: we're processing an item, we're generating an item, processing, generating, processing
6:38 generating, processing. Now notice, it's one and then the other because they don't take very long for them to do this work.
6:46 As soon as one goes into the queue, we're processing it and because we're processing faster than they're being generated, we can keep up.
6:54 But what's really interesting here is we can actually generate more and more data and just have the one processor.
7:00 So let's go task3, I'm going to generate more data and task4, I'm going to generate more data and then, we'll just wait on those two.
7:10 So we're generating 60 pieces of data, and because of the way we wrote this, so it exits right at the end, we need to tell it how much data to process.
7:18 It would be easy enough to write this in a way that can be canceled, but we're not doing that; we're just adding them up.
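A sketch of that scaling step: several producers feeding one consumer on the same queue. Because the program exits when the expected count is reached, the consumer has to be told the total up front (3 producers × 20 items = 60, matching the narration; the function names here are illustrative).

```python
import asyncio

async def produce(n: int, q: asyncio.Queue):
    for i in range(n):
        await q.put(i)
        await asyncio.sleep(0)  # yield so other coroutines interleave

async def consume(total: int, q: asyncio.Queue):
    done = 0
    while done < total:
        await q.get()
        done += 1
    return done

def run(producers: int = 3, per_producer: int = 20) -> int:
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    q = asyncio.Queue()
    tasks = [loop.create_task(produce(per_producer, q)) for _ in range(producers)]
    tasks.append(loop.create_task(consume(producers * per_producer, q)))
    results = loop.run_until_complete(asyncio.gather(*tasks))
    loop.close()
    return results[-1]

print(run())  # 60: the single consumer kept up with all three producers
```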
7:25 Now, let's run it, and let's see this cooperative bit here. We're generating a bunch of items, and notice, let's go back up to the top here
7:33 we've generated three, and we process one and we process two, and then we started to process three but then, it turned out, we ran out of time
7:42 and this one sort of kicked in again, and these other three are running in parallel. Processed a couple, these others got to go.
7:50 Process one more, one got to go. And you can see them sort of starting to interleave here until they're all done.
7:56 We generated data faster than we could process it so we had a few backed up here in the end. Well, let's go back and tone this down a little.
8:05 Let's just put one extra; let's just put 40: two producers for every consumer. Run this again. The other thing I want you to notice.
8:16 The first time, when we ran the synchronous version, remember, the very quickest response we could get
8:23 was 10.5 seconds; that was the best. The worst one was like 30 seconds. Look at these numbers. Because we're processing them while we're waiting
8:34 we can get to them right away. A request comes in, we can get to it, because the other one's waiting on the next request to be generated, and so on.
8:40 So these are extremely small numbers because we don't have to wait for all the work to be done to start the processing.
8:47 We're doing this cooperative multi-tasking. Also, if you notice here, we have 20 seconds completion time. I think it's 30 seconds in the other one
8:54 but if we had actually generated this much work there, since we doubled it, it would actually be quite a bit more. So it's much faster
9:01 and the response time is insanely faster. This is the behavior we want to add to our web application
9:07 and with Quart and async programming, it's pretty easy.
