The availability of huge amounts of data is a defining characteristic of modern computing. With the advent of algorithms with excellent scalability properties, the bottleneck often shifts from computing time to memory availability.
I present memory_profiler, a Python module for monitoring the memory consumption of a process, as well as for line-by-line analysis of memory usage in Python code.
- Line-by-line report of memory usage for Python code.
- Monitoring of memory usage for an arbitrary process from Python.
- Pure Python, single file, easy installation from PyPI.
- Works on both Windows and UNIX (requires the psutil module on Windows).
The line-by-line analysis of memory consumption is specified in a fashion similar to line_profiler: functions to be profiled are decorated with @profile. To print the report, the script using the profiled function is run as usual, passing the option "-m memory_profiler" to the Python interpreter. Full instructions and examples are available from the project page. As output, the relevant lines are printed, each prefixed with the memory consumption measured after that line was executed:
Line #    Mem usage    Line Contents
====================================
     3                 @profile
     4     14.19 MB    def my_func():
     5     14.27 MB        a = np.zeros((100, 100))
     6     21.91 MB        b = np.zeros((1000, 1000))
     7     98.20 MB        c = np.zeros((10000, 1000))
     8     98.20 MB        return a, b, c
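A script that would be profiled this way can be sketched as follows. To keep the sketch dependency-free, the numpy arrays of the example above are replaced by bytearray allocations of the same sizes, and a no-op fallback decorator is provided so the script also runs when memory_profiler is not installed:

```python
# Sketch of a script one might run as "python -m memory_profiler script.py".
# If memory_profiler is not installed, fall back to a no-op decorator so
# the script still executes unmodified.
try:
    from memory_profiler import profile
except ImportError:
    def profile(func):
        return func

@profile
def my_func():
    # Allocations of increasing size (bytearrays stand in for the numpy
    # arrays used in the example output above).
    a = bytearray(100 * 100)
    b = bytearray(1000 * 1000)
    c = bytearray(10000 * 1000)
    return a, b, c

if __name__ == "__main__":
    my_func()
```

When memory_profiler is installed, each call to the decorated function prints a line-by-line report like the one shown above.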
The module is also capable of periodically fetching the memory usage of an arbitrary process (Python or not). This makes it possible to plot the memory usage of a process as a function of time, as done in this blog post to compare the memory consumption of two QR decompositions.
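The underlying technique of periodic sampling can be illustrated with the following Linux-only sketch, which reads the resident set size of a process from /proc at fixed intervals (the function name `sample_rss` is hypothetical, not part of memory_profiler's API; the module abstracts this behind psutil or ps for portability):

```python
import os
import time

def sample_rss(pid, interval=0.1, samples=5):
    # Illustrative, Linux-only sketch of periodic memory sampling: read
    # the resident set size (VmRSS) from /proc/<pid>/status at fixed
    # intervals and collect the readings.
    readings = []
    for _ in range(samples):
        with open("/proc/%d/status" % pid) as f:
            for line in f:
                if line.startswith("VmRSS:"):
                    readings.append(int(line.split()[1]))  # value in kB
                    break
        time.sleep(interval)
    return readings  # pair each reading with a timestamp to plot over time
```

Plotting such readings against the sampling times yields the memory-over-time curves mentioned above.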
The module is pure Python and is distributed as a single file, with no dependencies on external modules. Only an implementation of the Unix ps command is necessary, which is already available in any Linux distribution.
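The kind of ps invocation the module can rely on when psutil is absent might look like the following sketch (the helper `rss_via_ps` is hypothetical and for illustration only, not memory_profiler's actual code):

```python
import os
import subprocess

def rss_via_ps(pid):
    # Ask the Unix ps command for the resident set size of a single
    # process; "rss=" suppresses the header, "-p" selects the pid.
    # ps reports the value in kilobytes.
    out = subprocess.check_output(["ps", "-o", "rss=", "-p", str(pid)])
    return int(out)
```

Because ps is specified by POSIX, this approach works across Unix-like systems without any third-party dependency.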