Mastering PyCharm Transcripts
Chapter: Performance and profiling
Lecture: Profiling the slow app

0:02 We saw this program is not especially fast,
0:04 it takes maybe five seconds or something like that
0:08 and we would like it to be faster, if that is at all possible.
0:12 So what we're going to do is we're going to use the profiling tools
0:17 to ask where time is spent
0:20 and if we want to just profile the entire application
0:22 that's probably the easiest thing to do,
0:24 we can go over here and push this button,
0:28 to me this looks a little bit like a CD, but it's really a speedometer type thing
0:33 so once you have a run configuration
0:35 this can be for unit tests or for regular apps, or web apps even,
0:40 what you can do is click here to run that with profiling, so let's do it.
0:48 So it started the cProfile profiler, which is probably the better of the two profilers,
0:53 it did what it does and then it saved a snapshot here
0:56 based on the project name and then it exited
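For reference, outside the IDE you can produce a similar snapshot with cProfile directly; this is only a minimal sketch, and the module name program and the snapshot filename are assumptions, not what PyCharm does exactly:

    import cProfile

    # Run the app under cProfile and save a snapshot we can open later.
    # Roughly equivalent to: python -m cProfile -o program.pstat program.py
    cProfile.run("import program; program.main()", "program.pstat")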
0:59 and immediately what came up is this statistics page here,
1:03 so if we look at the statistics,
1:06 what we have is the number of times the function was called
1:09 so for example, this raising something to a power was called 600 thousand times
1:14 that's pretty intense, right, this is probably our machine learning thing
1:18 in terms of time though, that didn't take that long,
1:21 120 milliseconds, that is some time.
1:25 There's some other things going on here,
1:28 learn was called one time but it took half a second,
1:30 let's see, our socket connect took a little while, but it's only 4% of the time
1:35 so you can go through here and, I would say, sort these
1:39 and then go and find what's interesting,
1:41 so own time, this means the time spent only in this function itself,
1:45 not the things it calls. I find this to be super helpful sometimes,
1:49 but usually what I'd like to know is how long it takes for this function to run
1:54 overall, not just the time spent at that level in the call stack,
1:57 and how long did it run in total, that is this one.
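As a rough mapping, PyCharm's own time and total time columns correspond to pstats' tottime and cumulative sorts on a saved snapshot; a small sketch, with the snapshot filename assumed:

    import pstats

    stats = pstats.Stats("program.pstat")           # snapshot name is an assumption
    stats.sort_stats("tottime").print_stats(10)     # own time: inside the function only
    stats.sort_stats("cumulative").print_stats(10)  # total time: includes everything it calls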
2:01 So program.py took 1.6 seconds,
2:04 main took that long, okay, we called go
2:08 and that's really all that main was doing, it was calling go.
2:11 So here we have compute analytics; we had three things:
2:15 we were doing a search, we were doing get records,
2:19 and we were doing compute analytics,
2:21 so it looks like this analytics thing is the slowest
2:23 and then, we have learn over here
2:27 and down here we have get records I think that's what we were calling it
2:31 so you can sort of see the relative breakdown
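To make that breakdown concrete, the call structure being described looks roughly like this; a hypothetical sketch, not the actual course demo code:

    def search():             # goes out over the network, so most of its time is in the socket
        ...

    def get_records():        # opens a connection and reads rows
        ...

    def learn():              # the expensive machine-learning-style loop
        ...

    def compute_analytics():  # the slowest of the three; nearly all of its time is in learn()
        learn()

    def go():
        search()
        get_records()
        compute_analytics()

    def main():               # own time is ~0: all it does is call go()
        go()

    if __name__ == "__main__":
        main()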
2:34 and this is really helpful, we can do things like let's say I would like
2:37 if you just double-click it, I guess, I meant to click once,
2:41 but if you click once you can say Navigate to Source
2:43 and it'll take you right to that function,
2:46 okay this one is probably the worst one that we control
2:48 actually well, this one is.
2:52 It's spending almost all of its time in learn down here, which is where we were.
2:57 So we can think about how we might be able to make this better
3:00 we could think about the algorithm here
3:03 now that's one way to look at it, and this is okay
3:07 but I don't really like it that much,
3:09 I guess it depends on how complex things are.
3:11 The other thing you can look at is the call graph
3:13 and the call graph is awesome, this does not look awesome, does it?
3:17 So let's try to zoom in a little bit
3:19 so we can get something more meaningful, so if we come down here
3:22 there's a bunch of junk that's going to be in here
3:24 that has nothing to do with us
3:26 so like here you can see this is all the module loading startup time from Python,
3:29 we can't make that any faster, that's CPython,
3:32 that is what it is, if you want that to be faster
3:34 you need to use PyPy or Cython or some other runtime,
3:39 maybe you could somehow pre-compile, get some .pyc files
3:45 but that's generally out of our control,
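If the import-time cost really did matter, one thing you can do is pre-compile the bytecode ahead of time; a minimal sketch, with the directory path assumed:

    import compileall

    # Writes .pyc files ahead of time so the first run skips bytecode compilation;
    # it does not speed up the import machinery or the code itself.
    compileall.compile_dir("my_project", quiet=1)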
3:47 this though, program calling main, calling go, calling these three functions,
3:52 this is where it's interesting.
3:54 Notice the colors, this is red, this is like as bad as it gets
3:58 and this is slightly less bad, just slightly
4:01 because something else is happening in the startup
4:04 and then this is among these three, not too bad so it's lighter green,
4:08 this is yellow because it's maybe
4:10 3 times as bad as the other stuff on that level, and so on
4:14 so we can go through and actually see what's happening,
4:16 like this one is actually going out and calling get on requests
4:20 which is going over the socket,
4:22 this one is calling learn and read data
4:25 and those are both pretty bad it turns out;
4:28 and this one's calling get records and the real slow part of get records is
4:32 creating the connection here,
4:34 actually I think I remember, I wanted this to be a little bit slower
4:37 so let me go over here, we can navigate to the source
4:40 and let's actually make this part, which we'll talk about in a second, a little slower,
4:44 so one more profiling run here with the call graph,
4:51 it always starts out like that, so we'll just zoom in,
4:58 give it time to rearrange itself a little bit, here we go,
5:04 so now we're spending some time on read query
5:06 and get row as well as connecting,
5:09 so these all kind of have their own issues
5:11 so how are we going to fix it?
5:14 Well, we can go through and optimize,
5:16 let's say I would start with the worst thing,
5:19 is there a way to optimize this for example,
5:21 is there a way to optimize that,
5:24 so I think we'll just start here, this is the slowest thing that we have control of,
5:27 this is slow but it's only slow because it's just literally calling these, right
5:31 and you can see right here its own time is zero milliseconds, but total it's that.
5:36 Similarly, here these are calling things that are slow.
5:39 Alright, so the goal is, armed with this information
5:42 and having this loaded up here, let's say this is going to be the starting point
5:48 we'll have this one and we could make some changes
5:50 and we'll have another one of these show up
5:53 and we can do a quick comparison and see how things are working.
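To do that before/after comparison outside the IDE, a rough equivalent is loading both saved .pstat snapshots with pstats; the filenames here are assumptions:

    import pstats

    # Print the top five functions by total (cumulative) time for each snapshot
    # so the before-and-after hot spots can be compared side by side.
    for label, path in (("before", "program_before.pstat"),
                        ("after", "program_after.pstat")):
        print(f"--- {label} ---")
        pstats.Stats(path).sort_stats("cumulative").print_stats(5)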