-
Hey! Thanks for this profiler, it looks very useful, if I can figure out how to use it :) I have a short script that, in short, generates a bunch of data and then plots it. Depending on how much I generate, memory use can be many gigabytes, so I'd like to profile it and find out when and where I have dataframes hanging around from function calls that I could delete once they're not needed anymore, e.g. after I've dumped them to a file. However, running it with Fil, I get this: the light pink on the left is the plotting calls, but what does the rest of the graph mean? The TL;DR version of my code is:
(I've installed Fil into the same conda env I use for the script, if that matters.)
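Roughly, the pattern I'm aiming for is: generate a frame, dump it to disk, and then ideally get rid of it before the plotting runs. Something like the sketch below (generate_data/dump are made-up names here, not my real code):

```python
import gc

import numpy as np
import pandas as pd


def generate_data(n):
    # Stand-in for the real data generation.
    return pd.DataFrame(np.random.random((n, 10)))


def dump(df, path):
    df.to_csv(path, index=False)


def main():
    df = generate_data(1_000_000)
    dump(df, "data.csv")
    # Once the data is safely on disk, drop the reference so the
    # multi-gigabyte frame can be freed instead of lingering while
    # the plotting code runs.
    del df
    gc.collect()
    # ... plotting would continue here ...


if __name__ == "__main__":
    main()
```

I run the whole thing with fil-profile run script.py.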
-
In other words, it's the memory from importing NumPy. Is there a third stack if you scroll to the right, not shown in the screenshot? If not, something strange (a bug?) is happening, because importing NumPy is not 2 GB worth of RAM, so where is the rest of the memory?
-
Oh, you're right, there are more columns to the right, very thin ones. I removed the plotting stuff to make it easier to see (in the attached SVG), but as you say it still looks like NumPy for whatever reason uses the 2 GB. In my mind it should be the … Oh, I forgot to mention: I'm running Fil on Ubuntu through WSL2, maybe that causes some issue?
-
So I went through this a bit more systematically, with new conda environments etc. Also tested on macOS. All results below use the example code from the documentation: https://pythonspeed.com/fil/docs/fil/trying.html Not sure if I'm doing something wrong, but the results are definitely not as easy to interpret as in the documentation example :)

WSL on Win10 / Ubuntu 20.04
Created a new conda environment and installed fil-profiler and numpy. That gets me Fil version … Ran the example: …

macOS 11.6.3
Installed Fil into a new conda env like above, which gets me the same … Just to make sure it's not mamba doing things in slightly different ways than conda, I also created a new env and installed with plain conda commands. But that gave the exact same SVG result (double-checked the path to …).
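For reference, the kind of script I mean is roughly like this (written from memory as a stand-in, not copied verbatim from the docs page); I save it as fil-example.py and run fil-profile run fil-example.py:

```python
import numpy as np


def make_big_array():
    # ~400 MB of float64 zeros.
    return np.zeros((1024, 1024, 50))


def make_two_arrays():
    arr1 = np.zeros((1024, 1024, 10))  # ~80 MB
    arr2 = np.ones((1024, 1024, 10))   # ~80 MB
    return arr1, arr2


def main():
    arr1, arr2 = make_two_arrays()
    del arr1
    big = make_big_array()


main()
```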
-
I've added a NumPy-specific workaround to the latest release (2022.3.0). More broadly, tracking how much of an mmap() is actually used (since by default no RAM is used until you start writing to it, allowing sparse usage) is tracked in #308.
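To make the "sparse usage" point concrete, here's a small standalone snippet (it uses psutil for the measurement, which has nothing to do with Fil's own tracking): the mmap()-backed allocation is made up front, but resident memory only grows once the pages are actually written.

```python
import numpy as np
import psutil

proc = psutil.Process()


def rss_mib():
    # Resident set size: physical RAM currently used by this process, in MiB.
    return proc.memory_info().rss / (1024 * 1024)


print(f"baseline:       {rss_mib():7.0f} MiB")

# ~2 GiB of float64 zeros: the allocation happens immediately,
# but no physical pages are touched yet.
arr = np.zeros(256 * 1024 * 1024)
print(f"after np.zeros: {rss_mib():7.0f} MiB")  # usually barely higher

# Writing to the array faults the pages in, and RSS jumps by roughly 2 GiB.
arr[:] = 1.0
print(f"after writing:  {rss_mib():7.0f} MiB")
```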