
Python multiprocessing in sage script - is memory freed after subprocess terminates?

I am trying to run some rather memory-heavy calculations in a Sage script (version 9.0, with Python 3.8.5).

Basically, my program finds estimates for a given parameter set that satisfy a certain requirement. So I iterate over multiple parameter sets and run the estimates (which use Sage) for each set until the termination condition is fulfilled.

To benefit from multiple cores, I use Python's multiprocessing module and run the estimates in multiple subprocesses.
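Schematically, my setup looks something like this (`estimate` is just a stand-in for the actual Sage-based computation, and the parameter sets are made up):

```python
# Minimal sketch of the setup: run the estimate for each parameter
# set in a pool of worker subprocesses.
from multiprocessing import Pool

def estimate(params):
    # placeholder for the memory-heavy Sage computation
    return sum(x * x for x in params)

if __name__ == "__main__":
    parameter_sets = [(1, 2, 3), (4, 5, 6), (7, 8, 9)]
    with Pool(processes=4) as pool:
        results = pool.map(estimate, parameter_sets)
    print(results)  # one result per parameter set
```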

However, I noticed that after a process terminates, the memory it allocated is not released again (at least not back to the system). So when I run multiple parameter searches in one program, my main memory eventually fills up. Apparently, Sage has a memory leak somewhere. I use a submodule (not my own) to do the estimates, so I won't be able to fix that any time soon.

I'm interested in how Sage handles (Python) subprocesses:

  1. Is a new Sage instance spawned for every subprocess?
  2. If not, is there a way to do so? Then I could make use of Python's way of freeing up all memory (after a process terminates, all the memory it used is returned to the system).

Something like that would be kind of nice.
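To make the idea concrete, here is a minimal sketch of the behaviour I am hoping for (`estimate` again stands in for my real Sage computation): with `maxtasksperchild=1`, the pool retires each worker after a single task, so the operating system reclaims all of that worker's memory when it exits. (Additionally, the `"spawn"` start method would start each worker as a brand-new interpreter instead of a fork of the parent, if that is what a fresh Sage instance requires.)

```python
# Sketch of forcing a fresh worker process per task: maxtasksperchild=1
# makes the Pool replace each worker after it finishes one task, so the
# OS reclaims the worker's memory on exit even if the task "leaked".
from multiprocessing import Pool

def estimate(params):
    # stand-in for a memory-heavy Sage computation
    big = list(range(10**6))       # simulate a large allocation
    return len(big) + sum(params)  # the allocation dies with the worker

if __name__ == "__main__":
    parameter_sets = [(1, 2), (3, 4)]
    with Pool(processes=2, maxtasksperchild=1) as pool:
        print(pool.map(estimate, parameter_sets))
```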