Modern APIs with FastAPI and Python Transcripts
Chapter: Error handling and performance
Lecture: Concept: Caching data
0:00
We saw that it takes 400 to 700 milliseconds, almost a second, for us to get the
0:05
weather report from the API.
0:07
But how frequently does that information actually change?
0:10
Is it good to get the forecast from 30 seconds ago or the current weather from
0:14
30 seconds ago? Five minutes ago?
0:15
Probably. If that's the case,
0:17
we can cache that information to allow us to respond much,
0:20
much faster, as well as to not use up our API
0:24
calls, right? You saw that with the free tier
0:26
we only get 60 calls a minute and a million a month.
0:29
So if somebody asked for the same thing a bunch of times,
0:32
let's just give them the same thing back.
0:34
And we can do that by creating a cache.
0:36
And we create this thing called the weather cache, and at the beginning of our get report,
0:39
we just say, "Do we have it saved, and is it not too old?"
0:42
If so, give them that. Otherwise, we're gonna go do the async stuff with httpx.
0:47
And before we return the weather,
0:48
let's save it in our cache,
0:50
so the next time they ask for it,
0:52
if they ask for it within an hour,
0:53
they're going to just get that one back
0:54
instead of making a new call over to the API. Again, in our weather
0:58
cache, we put this in memory.
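Here's a rough sketch of that idea in Python. The module and function names (weather_cache, get_weather, set_weather, get_report), the one-hour lifetime constant, and the OpenWeatherMap URL are illustrative assumptions rather than the course's exact code, but the shape matches what's described: check the cache at the top of get_report, only do the httpx call on a miss, and save the result before returning it.

```python
# weather_cache.py - a minimal in-memory cache keyed by city and units
# (illustrative sketch; names and details are assumptions, not the course's exact code)
import datetime
from typing import Optional, Tuple

lifetime_in_hours = 1.0
__cache: dict = {}


def get_weather(city: str, units: str) -> Optional[dict]:
    key = __create_key(city, units)
    data: Optional[Tuple[datetime.datetime, dict]] = __cache.get(key)
    if not data:
        return None

    last, weather = data
    # Too old? Report a miss so the caller refreshes it.
    age = datetime.datetime.now() - last
    if age > datetime.timedelta(hours=lifetime_in_hours):
        return None

    return weather


def set_weather(city: str, units: str, value: dict):
    __cache[__create_key(city, units)] = (datetime.datetime.now(), value)


def __create_key(city: str, units: str) -> str:
    return f"{city.strip().lower()}_{units.strip().lower()}"
```

And in the report service, the cache check comes first and the save comes last:

```python
# weather_service.py - wrap the slow httpx call with the cache
import httpx
import weather_cache

API_KEY = "your-api-key"  # assumption: substitute your real key


async def get_report(city: str, units: str = "metric") -> dict:
    # 1. Saved, and not too old? Give them that straight back.
    forecast = weather_cache.get_weather(city, units)
    if forecast:
        return forecast

    # 2. Otherwise do the async call over to the weather API.
    url = (
        "https://api.openweathermap.org/data/2.5/weather"
        f"?q={city}&units={units}&appid={API_KEY}"
    )
    async with httpx.AsyncClient() as client:
        resp = await client.get(url)
        resp.raise_for_status()
        forecast = resp.json()

    # 3. Save it before returning, so a repeat request within the hour skips the API.
    weather_cache.set_weather(city, units, forecast)
    return forecast
```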
1:00
That's not ideal, because in production you typically have multiple processes,
1:04
multiple copies of this app running over there. You'd probably put it somewhere like Redis or
1:08
a database, but even just putting it in memory is gonna help quite a bit.
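If you do move the cache out of process, the same two functions could be backed by something like Redis instead of a module-level dictionary. A minimal sketch, assuming the redis-py package and a Redis server on localhost (again, an assumption, not part of the course code):

```python
# redis_weather_cache.py - same interface as the in-memory version, shared by all workers
# (sketch assuming redis-py and a local Redis instance)
import json
from typing import Optional

import redis

lifetime_in_seconds = 60 * 60  # one hour, matching the in-memory cache
__redis = redis.Redis(host="localhost", port=6379, decode_responses=True)


def get_weather(city: str, units: str) -> Optional[dict]:
    raw = __redis.get(f"weather:{city.strip().lower()}:{units}")
    return json.loads(raw) if raw else None


def set_weather(city: str, units: str, value: dict):
    # setex stores the value with a time-to-live, so Redis expires stale entries for us.
    key = f"weather:{city.strip().lower()}:{units}"
    __redis.setex(key, lifetime_in_seconds, json.dumps(value))
```

Because every worker process talks to the same Redis server, all the copies of the app share one cache and agree on how old each entry is.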