
Using Sage's save and load methods leads to high memory usage

I am using Sage to store and load big matrices, and I am monitoring the program's memory usage with htop in a separate terminal window. (Is there a better way to watch how Sage uses memory?)

Apparently, the software uses far more memory than expected when saving and loading objects. Does anyone know what is going on and how to work around it?
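As an aside on the monitoring question: one way to check memory from inside the session itself, rather than eyeballing htop, is a small helper using only Python's standard library. This is a minimal sketch, assuming a Unix system (`ru_maxrss` is reported in kilobytes on Linux and in bytes on macOS):

```python
import resource
import sys

def memory_mb():
    """Peak resident set size of this process, in MB."""
    rss = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    # ru_maxrss is in kilobytes on Linux, in bytes on macOS.
    if sys.platform == 'darwin':
        rss //= 1024
    return rss / 1024.0

print(memory_mb())
```

Calling `memory_mb()` before and after each `save`/`load` step gives the same numbers as htop's RES column, without switching windows.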


Here is some example code to demonstrate it. The comments show the memory used by the system at each step:

    # 211 MB
    A = Matrix.random(RealField(200)['u','v','t'], 200)
    # 291 MB (+ 80 MB)
    save(A, 'A')
    # 458 MB (+ 168 MB)
    # A.sobj has size 8.55 MB on disk

After closing Sage and starting a new session:

    # 211 MB
    A = load('A')
    # 360 MB (+ 149 MB)

So, if the matrix A occupies 80 MB, why is almost double that amount used when saving it? And I don't know how to get that memory back without closing the Sage session. Also, why does loading A again use almost double its size in memory?
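Part of what htop reports may just be CPython's allocator holding on to freed memory: once the interpreter grows its heap, it rarely returns pages to the operating system, so resident memory can stay high even after the temporary pickling buffers are logically freed. A minimal sketch of at least forcing a collection pass (the `gc` module is standard Python and available inside Sage):

```python
import gc

# Run a full garbage-collection pass; returns the number of
# unreachable objects found. This frees cyclic garbage, but the
# resident memory shown by htop may not shrink, because CPython
# usually keeps freed pages for reuse instead of returning them
# to the operating system.
freed = gc.collect()
print(freed)
```

If `gc.collect()` does not help, that would suggest the memory is genuinely reachable (or held by the allocator) rather than uncollected garbage.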


I've found a related question on Stack Overflow:

http://stackoverflow.com/questions/20294628/using-pythons-pickle-in-sage-results-in-high-memory-usage

But using the open/write approach suggested there leads to the same behavior here...
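For reference, the idea behind that approach is to stream the pickle straight into the file instead of first building the whole serialized byte string in memory. A minimal sketch in plain Python, using a list as a stand-in for the matrix (the Sage-specific `dumps`/`save` are replaced here by the standard `pickle` module):

```python
import os
import pickle
import tempfile

data = list(range(100000))  # stand-in for a large object
path = os.path.join(tempfile.gettempdir(), 'A.pkl')

# pickle.dump() writes to the file object incrementally, so the
# full serialized byte string never has to exist in memory at once
# (unlike pickle.dumps() followed by f.write()).
with open(path, 'wb') as f:
    pickle.dump(data, f, protocol=pickle.HIGHEST_PROTOCOL)

with open(path, 'rb') as f:
    restored = pickle.load(f)

print(restored == data)
```

Even so, the pickler still has to traverse the whole object graph, which is presumably why the peak usage I see is the same either way.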