Python for .NET Developers Transcripts
Chapter: async and await in Python
Lecture: Async for scalability

0:00 Let's start with scalability. Suppose we're running a web app, a web server, and we want to do as much processing
0:09 as possible with a minimal amount of resources. One option would be to spin up a whole bunch of threads and have each thread handle a request.
0:19 That would work, but it's actually a lot of overhead: threads take up a lot of memory, and the context switching can be high. If we do them in order,
0:27 request one, then request two, then request three, like in this example here, well, we'd have to stack them on top of each other
0:33 and that would be super slow. But as I laid out in the beginning, for web apps these requests are mostly about waiting.
0:40 I got a little bit of stuff from the request, and then, oh, I need to take this user id from the cookie and go ask the database for the user details,
0:48 or once I get the user details, I'm going to call an external API, like, you know, charge a credit card or something,
0:53 and I wait on that. Then I get it back, and I'm going to say, oh, here's the, you know, update their record, yeah, now they own this thing
1:00 in the database and wait again on that. And then I'm going to return some stuff over the network. So all of this waiting means that
1:06 we can break up our requests, and while we're waiting we can just immediately pick up the next thing. So if we look at the response time,
1:12 yeah, we're going to take however long those waiting periods take for response one to execute, but we don't have to wait
1:19 for response one to be totally done. Right at the top of request two, request one is probably already just waiting on some database or something.
1:25 So we can go ahead and just set that work aside using the async and await keywords, pick up request two, and get it going down the road
1:32 until it starts waiting on the database for something. That pretty much means the additional latency
1:37 of handling two requests versus one at a time is really, really low. Again, request three here we can respond to pretty much right away.
1:45 With just a little bit of asyncio, a little async and await, we can do a lot of magic to speed up this execution,
1:53 and we can do this even with just a single thread, without adding the overhead of many, many, many threads.
1:59 This is the fundamental idea of how Node.js is able to handle a ton of concurrent requests, and you'll see that Python takes the same approach.
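
To make that concrete, here is a minimal sketch (not code from the course) of the single-threaded idea using Python's asyncio. The handler and the I/O helpers load_user, charge_card, and save_order are hypothetical stand-ins for the database lookup, the external API call, and the database update described above, with asyncio.sleep standing in for the actual waiting.

import asyncio


async def load_user(user_id: str) -> dict:
    await asyncio.sleep(0.05)   # pretend database round trip
    return {"id": user_id, "name": "Sam"}


async def charge_card(user: dict) -> str:
    await asyncio.sleep(0.10)   # pretend external payment API call
    return "charge-ok"


async def save_order(user: dict, charge: str) -> None:
    await asyncio.sleep(0.05)   # pretend database write


async def handle_request(user_id: str) -> str:
    # Each await hands control back to the event loop, so other
    # requests can make progress while this one is waiting on I/O.
    user = await load_user(user_id)
    charge = await charge_card(user)
    await save_order(user, charge)
    return f"done for {user['name']}"


async def main():
    # Three "requests" processed concurrently on a single thread.
    results = await asyncio.gather(*(handle_request(str(n)) for n in range(3)))
    print(results)


asyncio.run(main())

Every await is a point where the event loop can switch to whichever request is ready to run, which is how the three requests overlap on one thread instead of stacking up end to end.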

