
problem loading large file

I have a file which was generated and saved from a Sage notebook worksheet:

save(G, 'G4D200')

where G is a list of roughly 20 million vectors, and the resulting file `G4D200.sobj` is about 370 MB.

When I try to load this file later to do some analysis

new_G = load('/path_of_file.../G4D200.sobj')

my computer goes deep into swap (with the Python process eating up all the memory) and essentially freezes. I have done exactly the same thing with smaller files (< 10 MB) with no issues. I've tried waiting over 30 minutes and nothing seems to happen. Is this possibly a memory issue? Something with Sage? Is there a workaround? I'm running the Sage notebook on Fedora Linux, on an older laptop with 8 GB of RAM.
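One workaround I've been considering (a minimal sketch in plain Python, not Sage-specific; the function names `save_in_chunks` / `load_in_chunks` are my own, not a Sage API) is to split the list into several smaller pickle files, so that no single save or load has to materialize the whole 370 MB object at once:

    import pickle

    def save_in_chunks(data, prefix, chunk_size=1_000_000):
        """Save a large list as several smaller pickle files; return their paths."""
        paths = []
        for i in range(0, len(data), chunk_size):
            path = f"{prefix}_{i // chunk_size}.pkl"
            with open(path, "wb") as f:
                pickle.dump(data[i:i + chunk_size], f,
                            protocol=pickle.HIGHEST_PROTOCOL)
            paths.append(path)
        return paths

    def load_in_chunks(paths):
        """Load the chunked pickles one at a time and concatenate them."""
        data = []
        for path in paths:
            with open(path, "rb") as f:
                data.extend(pickle.load(f))
        return data

Would something along these lines help, or is there a more idiomatic Sage way to handle objects of this size?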

Thanks!