#100DaysOfWeb in Python Transcripts
Chapter: Days 21-24: Async Flask APIs with Quart
Lecture: Asynchronous Python execution with async and await

0:00 We've seen our synchronous version running.
0:02 Now, let's improve it so we can have
0:04 our code running during these downtimes
0:06 and we're going to create an asynchronous version.
0:10 So we'll just duplicate that file in PyCharm
0:13 and we'll look over here.
0:16 So the first thing that we need to do
0:18 in order to make our code async in Python is
0:21 to create what's called an event loop.
0:25 So there's going to be a little bit more stuff
0:27 to orchestrate these things
0:28 like, main is going to get slightly more complicated.
0:30 Let's not necessarily focus on main right away
0:33 let's look at the other two pieces here
0:36 and see how this works. So in order to use
0:40 the asynchronous programming model in Python
0:43 we really need to use
0:45 some of these asynchronous data structures
0:47 and for that reason, we're going to import asyncio.
0:52 Now, over here, we have a list
0:54 and getting items out of the list
0:57 and putting items into the list
0:59 there's no way to sort of coordinate that and say
1:01 I would like, for example, to wait until
1:04 there's an item in this list and then get it out.
1:09 But you're welcome to go do other work while I'm waiting.
1:11 Lists don't work that way, they're not built for that.
1:13 So what we're going to do is convert this to
1:16 a queue that comes with asyncio, like that.
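A minimal sketch of that difference: unlike a list, an asyncio.Queue lets you await until an item is available (the tuple contents here are just illustrative).

```python
import asyncio

# A plain list has no way to say "wait until an item arrives."
# asyncio.Queue does: put() and get() are coroutines you can await.
async def main():
    queue: asyncio.Queue = asyncio.Queue()
    await queue.put(("item", 1))   # put() must be awaited
    value = await queue.get()      # waits until something is available
    print(value)

asyncio.run(main())
```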
1:21 And notice, PyCharm is saying
1:24 you said this was a list, it's no longer a list.
1:26 So we can go ahead and
1:29 change our little annotation here
1:31 and let's look at this one, this generator
1:35 this producer bit. In order for us
1:37 to make our functions operate
1:40 in this asynchronous mode, we have to mark them as async.
1:44 We have to say, this method participates in a different way
1:48 in fact, the way it's implemented when it runs
1:50 is really different.
1:51 So we come over here and we say
1:52 this is an async method, async def
1:55 rather than say, this one which is a synchronous one.
1:58 Once we've said it's an async method
2:01 then we find all the places that we are waiting.
2:05 So there's going to be actually two places.
2:07 This doesn't have an append, but it has a put.
2:12 And what we're putting into it is this tuple.
2:15 So really, one thing is going into it
2:17 even though it may look like two.
2:18 Now, PyCharm says, whoa, whoa, something is wrong here.
2:22 We hover over it, it says, you're not awaiting this.
2:24 So this is one of these async methods itself
2:28 and it may take some time to put this in here
2:31 because of thread coordination and whatnot.
2:33 So we come over here, and say
2:35 we would like to pause our thread
2:38 until this operation is done
2:40 potentially giving it up for
2:42 some other bit of our code to run.
2:44 We don't have to really do anything special
2:46 we just have to say, we're willing to wait
2:48 we're willing to let Python interrupt our execution here.
2:52 And then, this time.sleep
2:54 well, we want a different kind of sleep
2:56 we want asyncio because this one is awaitable.
3:00 Again, this is signaling to Python
3:03 I would like you to run through this loop
3:05 until you get here
3:06 then you can let my thread run
3:08 or if you have data, just give it to me
3:09 and we'll do a little print, and then
3:11 we're going give up our thread's time
3:13 and let other stuff work.
3:15 This one is really the important one.
3:17 We're going to let other code run while we're waiting.
3:21 That's what that await has just done
3:22 and when it's done, then we go back
3:23 and go to the next thing through the loop.
3:24 Python handles it beautifully for us.
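A sketch of what this producer might look like; the names generate_data and data, and the timing values, are assumptions based on the narration, not the exact course code.

```python
import asyncio
import datetime
import random

async def generate_data(num: int, data: asyncio.Queue):
    # Each put() and sleep() is awaited, so while this producer
    # waits, other coroutines on the same event loop get to run.
    for idx in range(1, num + 1):
        item = idx * idx
        await data.put((item, datetime.datetime.now()))
        print(f"-- generated item {idx}")
        await asyncio.sleep(random.random() + .5)

asyncio.run(generate_data(3, asyncio.Queue()))
```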
3:26 Let's do the same thing here.
3:28 So async def, and then notice
3:30 there's a get on our queue.
3:34 Now, for some reason, PyCharm doesn't detect this
3:37 but if you go to the definition for get
3:39 you can see it's an async method.
3:41 That means we have to await it within our async method.
3:45 So we do this, and this is going to be an await.
3:50 Actually, we don't even need this anymore.
3:52 This didn't use to be a blocking call, but now it is.
3:54 And then, down here, asyncio
3:58 I have to await it, PyCharm tells us, there we go.
4:01 So, when this one's processing, it's busy
4:03 and then it says, hey, while I'm waiting
4:06 you can go back and do other work
4:07 like say, for example, generate_data.
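The consumer side might look roughly like this; process_data and the printed timing are assumptions based on the narration (the demo wrapper that pre-fills the queue is only here so the sketch runs on its own).

```python
import asyncio
import datetime

async def process_data(num: int, data: asyncio.Queue):
    processed = 0
    while processed < num:
        # queue.get() is a coroutine: awaiting it yields control
        # until an item is actually available.
        item = await data.get()
        processed += 1
        value, t = item
        dt = datetime.datetime.now() - t
        print(f"+++ processed value {value} after {dt.total_seconds():,.2f} sec.")
        await asyncio.sleep(.5)

async def demo():
    data = asyncio.Queue()
    for n in (1, 4, 9):
        await data.put((n, datetime.datetime.now()))
    await process_data(3, data)

asyncio.run(demo())
```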
4:10 That is what we need to do to make our methods async.
4:14 Really, you just put the async word
4:16 and everywhere you might be waiting
4:18 you just throw in the keyword await.
4:21 So I told you, this programming model is straightforward.
4:24 Understanding all the nuances of it
4:26 how it's implemented, different story.
4:28 We're actually going to only touch a little bit on that.
4:31 Now, in order to run these async methods
4:34 there is a little bit of ceremony
4:36 that we have to go through.
4:38 Not all languages work this way
4:40 but Python does, for what it's worth.
4:42 So what you have to do is create this thing called an async loop
4:44 and basically, things within that given loop
4:48 share execution time
4:50 so they're all trying to run
4:52 and when one of them says, hey, I'm awaiting something
4:55 the others get a chance to run
4:56 so it's like this shared cooperative multi-threading.
4:59 So I'm going to create a thing called a loop
5:03 and we just call get_event_loop here.
5:06 Now, we have our data.
5:07 Now, instead of directly calling these
5:10 we're going to do something different.
5:11 We're going to go over here and say loop, create a task.
5:16 The reason we need to create the task is
5:18 we want to wait until these two operations are done
5:22 and then, our program is going to be done.
5:24 So I'll call this task1
5:26 and same thing for this async method
5:29 but of course, task2.
5:32 That's pretty straightforward.
5:34 And then, finally, we would like to
5:36 create something that lets us say
5:39 we're done, our program is done
5:41 when all of these tasks are done
5:42 so when all the generation is done
5:44 and all the data processing is done.
5:46 So what we can do is
5:47 create a third task from the first two.
5:52 The way we do that is we do what's called gathering the tasks.
5:55 So we'll say gather, and we just give it all the tasks
5:58 as star args there
5:59 and then, we can tell our loop to run.
6:01 So up to here, nothing has happened
6:03 nothing has executed, so down here, we say
6:05 loop.run_until_complete(final_task).
6:10 So it's a little bit more ceremony
6:12 to sort of coordinate all this stuff, but it's not too bad.
6:16 However, once you get the start-up stuff done
6:20 everything is golden.
6:22 So, you're ready to run it?
6:23 Are you excited to see what happens?
6:25 Well, first of all, hopefully, I didn't screw this up.
6:28 We're going to find out.
6:29 So we're going to run this one
6:31 and now, look at this
6:33 we're processing an item, then generating an item
6:35 processing, generating, processing
6:37 generating, processing.
6:39 Now notice, it's one and then the other
6:42 because they don't take very long
6:44 for them to do this work.
6:45 As soon as one goes into the queue, we're processing it
6:48 and because we're processing faster
6:51 than they're being generated, we can keep up.
6:53 But what's really interesting here
6:55 we can actually generate more and more data
6:58 and just have the one processor.
6:59 So let's go task3, I'm going to generate more data
7:03 and task4, I'm going to generate more data
7:06 and then, we'll just wait on those two.
7:09 So we're generating 60 pieces of data
7:12 and just because of the way we wrote this
7:13 so that it exits right at the end
7:16 we need to tell it how much data to process.
7:17 It would be easy enough to write this
7:19 in a way that can be canceled, but we're not doing that
7:22 we're just adding them up.
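A compressed sketch of that idea: two producers feeding one consumer, which has to be told the combined count up front. All names and counts here are illustrative, not the course's exact figures.

```python
import asyncio

async def generate_data(num: int, data: asyncio.Queue):
    for idx in range(num):
        await data.put(idx)
        await asyncio.sleep(0)  # yield so other coroutines can run

async def process_data(num: int, data: asyncio.Queue):
    for _ in range(num):
        item = await data.get()
        print(f"processed {item}")

async def main():
    data = asyncio.Queue()
    producer1 = asyncio.create_task(generate_data(2, data))
    producer2 = asyncio.create_task(generate_data(2, data))
    # The single consumer exits only after processing 2 + 2 = 4 items.
    consumer = asyncio.create_task(process_data(4, data))
    await asyncio.gather(producer1, producer2, consumer)

asyncio.run(main())
```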
7:24 Now, let's run it, and see this cooperative bit here.
7:27 We're generating a bunch of items
7:29 and notice, let's go back up to the top here
7:32 we've generated three, and we process one
7:35 and we process two, and then we started to process three
7:39 but then, it turned out, we ran out of time
7:41 and this one sort of kicked in again
7:43 and these other three are running in parallel.
7:46 Processed a couple, these others got to go.
7:49 Process one more, one got to go.
7:51 And you can see them sort of
7:52 starting to interleave here until they're all done.
7:55 We generated data faster than we could process it
7:58 so we had a few backed up here in the end.
8:01 Well, let's go back and tone this down a little.
8:04 Let's just put one extra, let's just put 40
8:07 two producers for every consumer.
8:11 Run this again.
8:13 The other thing I want you to notice.
8:15 The first time when we ran the synchronous version
8:17 remember that, the synchronous version
8:20 the very quickest response we could get
8:22 was 10.5 seconds, that was the best.
8:26 The worst one was like 30 seconds.
8:27 Look at these numbers.
8:29 Because we're processing them while we're waiting
8:33 we can get to them right away.
8:34 A request comes in, we can get to it
8:36 because the other one's waiting on
8:37 the next request to be generated, and so on.
8:39 So these are extremely small numbers
8:42 because we don't have to wait for
8:44 all the work to be done to start the processing.
8:46 We're doing this cooperative multi-tasking.
8:48 Also, if you notice here
8:49 we have 20 seconds completion time.
8:52 I think it was 30 seconds in the other one
8:53 but if we had actually generated this much work there
8:55 since we did double
8:57 it would actually be even quite a bit more.
8:59 So it's much faster
9:00 and the response time is insanely faster.
9:03 This is the behavior we want to add to our web application
9:06 and with Quart and async programming, it's pretty easy.