Async Techniques and Examples in Python Transcripts
Chapter: async and await with asyncio
Lecture: I/O-driven concurrency

0:00 If you haven't worked with AsyncIO before it's going to be a little bit of a mind shift. But it turns out the programming model
0:07 is actually the easiest of all the concurrent programming models that we're going to work with. So don't worry, it's actually
0:13 the easiest, newest, and one of the best ways to write concurrent programs in Python. So let's step back a minute and think about
0:23 how do we typically conceptualize or visualize concurrency? Well, it usually looks something like this. We have some kind of code running
0:31 and then we want to do multiple things at a time so we might kick off some other threads or some other processes.
0:37 And then our main thread, and all the other threads are going to run up to some point, and we're going to
0:42 just wait for that secondary extra work to be done. And then we're going to continue executing along this way. Like I said, this is typically done
0:50 with threads or multiprocessing, okay? In many languages it's only threads; in Python, because of the GIL, multiprocessing is often involved.
0:59 Now, this model of concurrent programming one thread kicking off others waiting for them to complete, this fork-join pattern this makes a lot of sense.
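This fork-join pattern might be sketched in Python like this; the worker names and delays are illustrative, not from the course:

```python
import threading
import time

def background_work(name, delay):
    # Simulate some secondary work (e.g., blocking I/O) on another thread.
    time.sleep(delay)
    print(f"{name} finished")

# Fork: the main thread kicks off some worker threads.
threads = [
    threading.Thread(target=background_work, args=(f"worker-{i}", 0.1))
    for i in range(3)
]
for t in threads:
    t.start()

# Join: the main thread runs up to this point and just waits
# for the secondary work to be done...
for t in threads:
    t.join()

# ...and then continues executing along its way.
print("main thread continues")
```

The operating system, not our code, decides when each worker actually runs.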
1:08 But in this AsyncIO world, this is typically not how it works. Typically, something entirely different happens.
1:16 So in this world, we're depending upon the operating system to schedule the threads or schedule the processes and manage the concurrency.
1:25 It's called preemptive multithreading. Your code doesn't get to decide when it runs relative to other parts of your code.
1:31 You just say I want to do all these things in parallel it's the operating system's job to make them happen concurrently.
1:37 Now, contrast that with I/O driven concurrency. So in I/O driven concurrency we don't have multiple threads.
1:44 We just have one thread of execution running along. This may be your main app, it actually could be a background thread as well
1:50 but there's just one thread managing this parallelism. Typically, when we have concurrency and multiple cores, we're actually doing
1:59 more than one thing at a time, assuming the GIL's not in play. If we could take those multiple things
2:07 we're trying to do and slice them into little tiny pieces that each take a fraction of a second or fractions of milliseconds
2:14 and then we just interweave them, one after another, switching between them, well it would feel just the same, especially if there are
2:22 waiting periods, these I/O bits. So what if we take our various tasks, the green task, pink task, blue task, and so on
2:30 and we break them up into little tiny slices and we run them a little bit here, a little bit there. So here's a task. Here's another task.
2:40 And we find ways to break them up. And these places where they break up are where we're often waiting on a database
2:46 calling a web service, talking to the file system doing anything that's an external device or system and we keep going like this.
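The idea of slicing tasks into little pieces and interweaving them on a single thread can be illustrated with plain generators; this toy round-robin scheduler and its task names are an assumption for illustration, not the course's code:

```python
def task(name, steps):
    # Each yield is a place where the task breaks its work into
    # a little slice and hands control back to the scheduler.
    for i in range(steps):
        yield f"{name} step {i}"

def run_round_robin(tasks):
    # A tiny scheduler: run one slice of each task, one after
    # another, switching between them until all are finished.
    results = []
    while tasks:
        for t in list(tasks):
            try:
                results.append(next(t))
            except StopIteration:
                tasks.remove(t)
    return results

order = run_round_robin([task("green", 2), task("pink", 2)])
print(order)  # the slices interleave: green, pink, green, pink
```

On one thread, the tasks appear to run concurrently because each one only does a small piece of work before yielding to the next.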
2:54 This type of parallelism uses no threads, adds virtually no overhead, and it still gives a feeling of concurrency, especially if we
3:03 break these things into little tiny pieces and we would've spent a lot of time waiting anyway. This is the conceptual view you should have of AsyncIO.
3:12 How do we take what would be big, somewhat slow operations and break them into a bunch of little ones and denote these places where we're
3:19 waiting on something else. In fact, Python has a keyword to say we're waiting here, we're waiting there we're waiting there, and the keyword is await.
3:28 So we're going to be programming with two new keywords, async and await, and they're going to be based on this I/O driven concurrency model,
3:33 and this you might call cooperative multithreading. It's up to our code to say: I'm waiting here, so you can go do something else;
3:41 when I get my callback from the web service, please pick up here and keep going. It's really awesome how it works
3:47 and it's actually the easiest style of parallelism that you're going to work with.
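A minimal sketch of those two keywords with asyncio; the function names and delays here are made up for illustration:

```python
import asyncio

async def fetch(name, delay):
    # await marks a place where this task is waiting on something
    # external (a database, a web service), so the event loop can
    # go run other tasks in the meantime.
    await asyncio.sleep(delay)  # stands in for the external call
    return f"{name} done"

async def main():
    # Start several tasks; one thread interleaves them at the
    # await points rather than handing them to the OS as threads.
    results = await asyncio.gather(
        fetch("green", 0.1),
        fetch("pink", 0.1),
        fetch("blue", 0.1),
    )
    print(results)

asyncio.run(main())
```

Because all three tasks spend their time waiting, they finish in roughly the time of one, even though only a single thread is running.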
