Problem loading large file
I have a file which was generated and saved from a Sage notebook worksheet:
save(G,'G4D200')
where `G` is a list of roughly 20 million vectors, and the file `G4D200.sobj` is about 370 MB.
When I try to load this file later to do some analysis
new_G = load('/path_of_file.../G4D200.sobj')
my computer goes deep into swap memory (with Python eating it all up) and basically freezes. I have done the exact same thing with smaller files (under 10 MB) with no issues. I've tried waiting over 30 minutes and nothing seems to happen. Is this possibly a memory issue? Something with Sage? Is there a workaround? I'm running the Sage notebook on Fedora Linux, on an older laptop with 8 GB of RAM.
Thanks!
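
One possible workaround, sketched here under the assumption that `G` can be rebuilt or is still in memory: split the list into smaller pieces and save each piece as its own `.sobj` file, so that no single `load` ever has to materialize everything at once. The chunk size and the `G4D200_part####` naming below are arbitrary choices for illustration, not anything Sage requires.

    # Sketch: save G in several smaller .sobj files instead of one huge one.
    # chunk_size and the file-name pattern are assumptions; adjust as needed.
    chunk_size = 10**6  # about one million vectors per file
    for i in range(0, len(G), chunk_size):
        save(G[i:i + chunk_size], 'G4D200_part%04d' % (i // chunk_size))
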
Is there any update on this matter? I am having the same problem here (the generator was suggested by @slelievre so that the data could be processed further).
After 7 days, I got the 1.84 GB file (renamed and uploaded by me; no virus, just a SageMath-format file): https://ufile.io/kakhi
After downloading the file, we load it with `load`.
However, it is too large for my RAM. Is there any solution? We might have to split the data before calling `load`. Besides that, a better way to save the data would also be appreciated.
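
Combining the two ideas above (splitting the data before `load`, and processing it with a generator) might look like the sketch below. It assumes the data was saved as several smaller files named `G4D200_part*.sobj`, as in the earlier sketch; the helper name `iter_vectors` is made up for illustration. Only one chunk is held in RAM at a time.

    import glob

    def iter_vectors(pattern='G4D200_part*.sobj'):
        """Yield the saved vectors one by one, keeping only one chunk in memory."""
        for fname in sorted(glob.glob(pattern)):
            chunk = load(fname)   # Sage's load(); reads a single chunk file
            for v in chunk:
                yield v
            del chunk             # free the chunk before loading the next file

    # Example: count the vectors without ever holding the whole list in memory.
    # total = sum(1 for v in iter_vectors())
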