Effective PyCharm Transcripts
Chapter: Performance and profiling
Lecture: Profiling the slow app
0:00 Now that you're familiar with this program, it's time to profile it. But one quick hang-up here that I just have
0:07 to tell you about. There was a bug in one part of the profiler visualization in the latest version of PyCharm.
0:13 Not the EAP but the actual shipping one. So I had to come back here and roll back to 2021.1.3.
0:20 I'm sure by the time you get this course it'll be fixed; it's already been addressed. The only reason I mention this is that, for whatever reason, the
0:28 layout of my toolbar changed. So beware, it will probably be back over here for you. Now, what we want to do is profile this.
0:36 So you're familiar with running the program and you're familiar with debugging it. But if we go over here, we can click this and say Profile.
0:44 So long as you have a run configuration, you can just press this button and it will run and use Python's cProfile to
0:52 figure out where we're spending our time. Let's go. So there it ran like usual. But now we get this cool report; let's check it out over here.
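What PyCharm is doing under that button can be sketched with the standard library's cProfile module directly; the `compute` function below is a stand-in for a slow function, not the course's actual code:

```python
import cProfile
import pstats

def compute() -> int:
    # Stand-in for a slow function; not the course's actual code
    return sum(i * i for i in range(200_000))

# Profile one call, the way PyCharm wraps a run configuration
profiler = cProfile.Profile()
profiler.enable()
compute()
profiler.disable()

# Print the five most expensive calls, like PyCharm's statistics view
stats = pstats.Stats(profiler)
stats.sort_stats("cumulative").print_stats(5)
```

PyCharm builds its statistics table and call graph from exactly this kind of cProfile data.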
1:06 It shows you where you're spending your time, and it uses this Own Time column by default.
1:11 What that means is: if a function is being called by another function, how much time are you spending at that level of the call stack?
1:20 It doesn't tell you how much time that function takes to run, because if it calls
1:24 other functions itself, it ignores the time spent down in those other functions. I find this to be not a very helpful metric.
1:32 So what I want to know is how long the function took to run, including things it might have had to do while it was running.
1:40 So that's this column here. We're going to sort by it, and now over here we've got the interesting things.
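That difference, own time (`tottime` in cProfile's terms) versus total time including callees (`cumtime`), shows up clearly with two toy functions; this is a minimal sketch, not the course's code:

```python
import cProfile
import pstats
import time

def inner() -> None:
    time.sleep(0.05)   # all of the real work happens down here

def outer() -> None:
    inner()            # outer's own time is tiny, but its cumulative
                       # time includes everything inner() does

profiler = cProfile.Profile()
profiler.enable()
outer()
profiler.disable()

stats = pstats.Stats(profiler)
# Sorting by "cumtime" surfaces outer(); sorting by "tottime" would
# bury it, because outer() spends almost no time at its own level
stats.sort_stats("cumtime").print_stats("inner|outer")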
1:45 We've got program, obviously this one took the longest because that's the start up.
1:50 We've got main and we've got go, and then we start getting into the interesting
1:54 things. The pieces that we worked on: we have compute analytics. Remember, up here we have search, get records,
2:04 and compute analytics. So here's our compute analytics. That one is the slowest: we spent 3.3 seconds there, 68% of the time.
2:15 We have our search, where we spent 353 milliseconds, only 7% of the time, and get records, the database
2:23 one, was the third, where we spent 1.2 seconds, or 23% of the time. So this is helpful. And you can even do cool things, like let's go over
2:31 to this function: if I double-click it, it takes us right there, which is fantastic.
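Those percentages also set the ceiling on any optimization: a function taking 7% of the run can never buy you more than a 7% speedup. A quick sketch with the numbers read from this profile:

```python
# Times read from the profiler report in the lecture
total = 4.9      # whole run, in seconds
search = 0.353   # time spent in search

# Even if search became instant, this is the best we could do
best_case = total - search
ceiling = search / total

print(f"best case: {best_case:.2f}s, at most {ceiling:.1%} faster")
# → best case: 4.55s, at most 7.2% faster
```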
2:36 And if you right-click, you can Navigate to Source or Show on Call Graph. Let's check out the call graph.
2:42 This is actually probably the most useful when you're first trying to understand what the performance means. See how useful and meaningful...
2:52 No, wait, this is not very helpful. We've got to zoom in. Okay, let's zoom in, and we'll go over here and find our way to
2:59 the bottom. So notice there's all this stuff around find and load and frames and whatnot that's not really relevant to us.
3:09 We can't control that; that's just Python starting up. So we want to start with: when is our script running? Here
3:16 we can come over and see that we're going to run program, and notice it's 4.9 seconds of total time, but 0% of its own time.
3:25 Same for main and same for go. These are basically just orchestrating those three functions.
3:30 Right? And again, they don't take much of their own time either. They're just calling a bunch of things that themselves take time.
3:36 Notice the colors: red means really slow, relatively speaking. Green means pretty fast, relatively speaking. And we're still spending a second here,
3:45 but it's still better than the five. And again, green. So what we can do is quickly get a look here and say, all right,
3:53 well, where are we spending our time? What is slow? We should probably start right there. That one is the slowest, so if we could optimize it,
4:02 it's going to have the biggest impact. For example, could we make search instant? If somehow we could make ping time on the internet go away,
4:11 if we could make the Talk Python server return instantly to us, what benefit would we get from working on this function?
4:19 353 milliseconds out of 4,800. So we could go from 4.8 seconds down to 4.5 seconds. So no matter how much we work on this function,
4:31 the program is never going to get more than, you know, 7% faster. Right? So you really want to think about,
4:36 you know, where can I spend my time, and where is it meaningful to try to improve this? A lot of times people work on code and they say,
4:43 well, this could be faster if we do some crazy things to really optimize it. That makes the code harder to maintain and understand,
4:50 but it will make it faster. Just remember: you're only ever going to get a 7.2% improvement here. So also think about maintenance, readability,
4:59 all those kinds of things. All right. So use this to initially guide where you want to go, and again we
5:06 can click over here and navigate to the source. Boom, that's probably what we should be working on first. Another thing to notice: here's our result,
5:15 this course_6.pstat. That means it's the sixth time I ran it, because I was fiddling around with the previous version when it had the bug.
5:23 I'm going to put that one on the left, and then we'll run it again. We've made no changes yet, but we will, and when we run it after making those
5:31 changes we're going to get another one. So we'll be able to do things like come over here and say, well, let's sort
5:36 this one like that and sort that one like that, and then we can just cycle between them. Right? That's one way we can kind of get a sense
5:45 for how we're making improvements: how was it before, and how was it after? I'm going to make a point to leave this course_6.pstat over here so that
5:54 I can have it as a reference as we improve throughout this chapter.
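That before-and-after comparison can also be done in code with pstats, since PyCharm's snapshots are ordinary cProfile dump files. The two runs below are simulated, and the file names are made up, not PyCharm's actual snapshot names:

```python
import cProfile
import pstats

def work(n: int) -> int:
    return sum(i * i for i in range(n))

# Simulate a "before" run and a faster "after" run, saving each
# as a .pstat dump, the same format PyCharm's snapshots use
for filename, n in (("before.pstat", 400_000), ("after.pstat", 100_000)):
    profiler = cProfile.Profile()
    profiler.enable()
    work(n)
    profiler.disable()
    profiler.dump_stats(filename)

# Load both snapshots back and compare total runtimes
before = pstats.Stats("before.pstat")
after = pstats.Stats("after.pstat")
print(f"before: {before.total_tt:.4f}s  after: {after.total_tt:.4f}s")
```

Keeping the old snapshot around, as the lecture does, is what makes this kind of side-by-side check possible after each optimization.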