Async Techniques and Examples in Python Transcripts
Chapter: async and await with asyncio
Lecture: I/O-driven concurrency

0:00 If you haven't worked with AsyncIO before
0:02 it's going to be a little bit of a mind shift.
0:04 But it turns out the programming model
0:06 is actually the easiest of all the
0:09 concurrent programming models
0:10 that we're going to work with.
0:11 So don't worry, it's actually
0:12 the easiest, newest, and one of the best ways
0:16 to write concurrent programs in Python.
0:20 So let's step back a minute and think about
0:22 how do we typically conceptualize or visualize concurrency?
0:26 Well, it usually looks something like this.
0:28 We have some kind of code running
0:30 and then we want to do multiple things at a time
0:32 so we might kick off some other threads
0:34 or some other processes.
0:36 And then our main thread, and all the other threads
0:39 are going to run up to some point, and we're going to
0:41 just wait for that secondary extra work to be done.
0:45 And then we're going to continue executing along this way.
0:48 Like I said, this is typically done
0:49 with threads or multiprocessing, okay?
0:53 In many languages it's only threads
0:54 but in Python, because of the GIL
0:56 multiprocessing is often involved.
0:58 Now, this model of concurrent programming
1:00 one thread kicking off others
1:03 waiting for them to complete, this fork-join pattern
1:05 this makes a lot of sense.
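The fork-join pattern described here can be sketched in a few lines. This is not code from the course, just a minimal illustration with made-up worker names and delays: the main thread forks secondary threads, then joins on them before continuing.

```python
# Fork-join sketch: main thread kicks off worker threads, then
# waits (joins) for that secondary work to finish before moving on.
# Worker names and delays are illustrative only.
import threading
import time

results = []

def worker(name: str, delay: float) -> None:
    time.sleep(delay)      # stand-in for real work
    results.append(name)

# Fork: start the extra threads.
threads = [
    threading.Thread(target=worker, args=("green", 0.02)),
    threading.Thread(target=worker, args=("pink", 0.01)),
]
for t in threads:
    t.start()

# Join: the main thread blocks here until all the extra work is done.
for t in threads:
    t.join()

print(sorted(results))  # every worker has finished by this point
```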
1:07 But in this AsyncIO world, this is typically
1:11 not how it works.
1:12 Typically, something entirely different happens.
1:15 So in this world, we're depending upon the operating system
1:20 to schedule the threads or schedule the processes
1:22 and manage the concurrency.
1:24 It's called preemptive multithreading.
1:26 Your code doesn't get to decide when it runs
1:28 relative to other parts of your code.
1:30 You just say I want to do all these things in parallel
1:32 it's the operating system's job
1:34 to make them happen concurrently.
1:36 Now, contrast that with I/O-driven concurrency.
1:40 So in I/O-driven concurrency
1:41 we don't have multiple threads.
1:43 We just have one thread of execution running along.
1:46 This may be your main app, it actually could be
1:48 a background thread as well
1:49 but there's just one thread managing this parallelism.
1:53 Typically, when we have concurrency
1:55 if we have multiple cores, we're actually doing
1:58 more than one thing at a time
2:00 assuming the GIL's not in play.
2:02 We're doing more than one thing at a time.
2:04 If we could take those multiple things
2:06 we're trying to do and slice them into
2:08 little tiny pieces that each take
2:09 a fraction of a second or fractions of milliseconds
2:13 and then we just interweave them, one after another
2:16 switching between them, well
2:18 it would feel just the same, especially if there's
2:21 waiting periods, this I/O bit.
2:23 So what if we take our various tasks
2:26 green task, pink task, blue task, and so on
2:29 and we break them up into little tiny slices
2:32 and we run them a little bit here, a little bit there.
2:35 So here's a task. Here's another task.
2:39 And we find ways to break them up.
2:41 And these places where they break up
2:43 are where we're often waiting on a database
2:45 calling a web service, talking to the file system
2:47 doing anything that's an external device or system
2:51 and we keep going like this.
2:53 This type of parallelism uses no threads
2:55 adds no overhead really, and it still gives
2:59 a feeling of concurrency, especially if we
3:02 break these things into little tiny pieces
3:04 and we would've spent a lot of time waiting anyway.
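This slicing-and-interleaving idea is easy to see on a single thread with asyncio. The sketch below is not from the course; the task names and sleep delays are invented. Each `await` is one of those break points where a task yields, and the event loop runs a slice of another task in the meantime.

```python
# One thread, two tasks sliced at their await points.
# asyncio.sleep stands in for waiting on a database, web service,
# or file system; task names and delays are made up.
import asyncio

order = []

async def task(name: str, pause: float) -> None:
    for step in range(2):
        order.append(f"{name}-{step}")
        # An await point: "I'm waiting here, go run something else."
        await asyncio.sleep(pause)

async def main() -> None:
    # Run both tasks concurrently on the single event-loop thread.
    await asyncio.gather(task("green", 0.01), task("pink", 0.01))

asyncio.run(main())
print(order)  # slices of the two tasks interleaved, not run back-to-back
```

Because both tasks yield at their first `await`, the green and pink slices alternate instead of one task running to completion before the other starts.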
3:07 This is the conceptual view you should have of AsyncIO.
3:11 How do we take what would be big, somewhat slow operations
3:14 and break them into a bunch of little ones
3:16 and denote these places where we're
3:18 waiting on something else.
3:20 In fact, Python has a keyword to say
3:21 we're waiting here, we're waiting there
3:24 we're waiting there, and the keyword is await.
3:27 So we're going to be programming with two new keywords
3:28 async and await, and they're going to be based
3:30 on this I/O-driven concurrency model
3:32 and this you might call cooperative multithreading.
3:35 It's up to our code to say, I'm waiting here
3:38 so you can go do something else
3:40 when I get my callback from the web service
3:43 please pick up here and keep going.
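Here is a short sketch of those two keywords in action. It is not from the course: `call_web_service` is a hypothetical stand-in (using `asyncio.sleep` rather than a real network call), and the millisecond values are invented. The point is that three waits overlap on one thread, so the total time is roughly the longest call, not the sum.

```python
# async marks a coroutine; await marks the places where we're waiting.
# call_web_service is a hypothetical stand-in for a real service call.
import asyncio
import time

async def call_web_service(ms: int) -> int:
    await asyncio.sleep(ms / 1000)  # waiting here; other tasks run
    return ms

async def main() -> list[int]:
    # Three simulated calls, all in flight at once on a single thread.
    return await asyncio.gather(
        call_web_service(100),
        call_web_service(80),
        call_web_service(60),
    )

start = time.perf_counter()
replies = asyncio.run(main())
elapsed = time.perf_counter() - start

print(replies)  # results come back in the order the calls were listed
```

Run sequentially, the three calls would take about 240 ms; with the awaits overlapped, the whole thing finishes in roughly the 100 ms of the slowest call.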
3:45 It's really awesome how it works
3:46 and it's actually the easiest style of parallelism
3:49 that you're going to work with.