Modern APIs with FastAPI and Python Transcripts
Chapter: Modern language foundations
Lecture: Non-async web scraper
0:00
Async and await are definitely among the most exciting features added to Python in the last couple of years, and FastAPI makes them really easy to use.
0:09
It actually handles most of the juggling of the asynchronous machinery; you just have to make your code asynchronous for it to work
0:17
with it. So let me paste the program in here. It's going to have two versions,
0:21
a synchronous and an asynchronous version. We're going to start with the synchronous version. Now notice it has a couple of dependencies,
0:28
those are all listed over here. So we're just going to go into the async, sync version folder and install its requirements.
0:43
So it's using the usual characters here. It's using Beautiful Soup to work with HTML, and it's using Requests to make HTTP requests.
0:51
So what is this thing going to do, anyway? It's going to go out to "talkpython.fm" and pull up the HTML page associated
0:59
with that episode. It's going to download the HTML and use Beautiful Soup to get the header, and use that to grab the title,
1:07
okay? The way it works, just like you'd expect: it goes one at a time, through 270, 271, 272, and so on, and
1:15
it says, give me the HTML for that one, process it, print it. Easy, right? Let's go run it. Notice there it goes: one, and then the next,
1:25
and then the next. The Talk Python server is super fast, so it only takes five seconds to do that for 10 requests.
1:31
Let's run it a few times just to see where it lands. Another five seconds; pretty stable. Even just under five seconds. How fast.
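The synchronous version described above can be sketched roughly like this. The exact URL pattern and the `h1` selector are assumptions for illustration; the course's actual script may differ in the details.

```python
# A rough sketch of the synchronous scraper: fetch each episode page
# one at a time, parse it with Beautiful Soup, and print the title.
import time

import bs4       # Beautiful Soup 4
import requests

def get_html(episode: int) -> str:
    # One blocking HTTP request per episode -- this is where the time goes.
    resp = requests.get(f"https://talkpython.fm/{episode}")
    resp.raise_for_status()
    return resp.text

def get_title(html: str, episode: int) -> str:
    # Parse the page and pull the title out of the header (assumed <h1>).
    soup = bs4.BeautifulSoup(html, "html.parser")
    header = soup.select_one("h1")
    text = header.get_text(strip=True) if header else "(no title)"
    return f"{episode}: {text}"

def main():
    t0 = time.perf_counter()
    for episode in range(270, 280):   # 270, 271, 272, ... one at a time
        print(get_title(get_html(episode), episode))
    print(f"Done in {time.perf_counter() - t0:.2f} sec.")

# main()  # uncomment to run; makes live network requests
```

Each iteration blocks on `requests.get` until the full response comes back, which is why the total time is roughly ten round trips stacked end to end.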
1:43
But here, let me ask you this question: where are we spending our time? Where are we waiting for this response?
1:49
I can tell you the server response time for these pages is something like 50 milliseconds. But the ping time from here to the server
1:57
is at least 100. So we're not really just waiting on the server; we're mostly just waiting on the Internet itself,
2:04
right? Like the request making its way all the way over to the east coast of the US from the west coast,
2:09
where I am. Could we do more of that at once? The Internet's really scalable. It would be great if we could send all these requests out at once and then
2:16
just get them back as they complete. So what we're going to do is convert this from running in this traditional synchronous
2:23
way to using the new async and await language features and the libraries that we'll actually make use of later as well.
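To see why sending all the requests out at once pays off, here is a small sketch of the principle using simulated delays in place of real HTTP calls (no network involved; the 0.1-second sleeps are a stand-in for latency):

```python
# Sequential vs. concurrent waiting: ten simulated 0.1-second "network
# waits", first one at a time, then all overlapped with asyncio.gather.
import asyncio
import time

async def fake_request(episode: int) -> str:
    await asyncio.sleep(0.1)             # stand-in for network latency
    return f"Episode {episode}"

async def sequential() -> float:
    t0 = time.perf_counter()
    for episode in range(270, 280):
        await fake_request(episode)      # one at a time: ~10 x 0.1 s
    return time.perf_counter() - t0

async def concurrent() -> float:
    t0 = time.perf_counter()
    await asyncio.gather(*(fake_request(e) for e in range(270, 280)))
    return time.perf_counter() - t0      # all waits overlap: ~0.1 s total

seq = asyncio.run(sequential())
conc = asyncio.run(concurrent())
print(f"sequential: {seq:.2f} s, concurrent: {conc:.2f} s")
```

The waits overlap instead of stacking, so ten requests cost roughly one round trip; that is the effect the async version of the scraper is after.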