problem loading large file

asked 6 years ago

Daniel L

I have a file which was generated and saved from a Sage notebook worksheet:

save(G,'G4D200')

where G is a list of roughly 20 million vectors, and the file 'G4D200.sobj' is about 370 MB.

When I try to load this file later to do some analysis

new_G = load('/path_of_file.../G4D200.sobj')

my computer goes deep into swap memory (with python eating it all up) and basically freezes. I have done the exact same thing with smaller files (<10 MB) with no issues. I've tried waiting over 30 minutes and nothing seems to happen. Is this possibly a memory issue? Something with Sage? Is there a workaround? I'm running the Sage notebook on Fedora Linux, on an older laptop with 8 GB of RAM.

Thanks!
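One possible direction, sketched here under the assumption that G can be sliced like an ordinary Python list: save it in smaller chunks so that later analysis loads one piece at a time. The chunk size and file names below are illustrative, not taken from the original worksheet.

chunk_size = 10**6  # assumed chunk size: one million vectors per file
for i in range(0, len(G), chunk_size):
    save(G[i:i + chunk_size], 'G4D200_part%03d' % (i // chunk_size))

Later, each piece can be loaded and analysed on its own, so memory use stays bounded by a single chunk:

import glob
for fname in sorted(glob.glob('G4D200_part*.sobj')):
    chunk = load(fname)
    # ... analyse this chunk, then let it be garbage-collected ...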


Comments

Is there any update on this? I am having the same problem here (the generator below was suggested by @slelievre for further processing):

v = MatrixSpace(GF(2), 3, 6)
rank = 6
im = identity_matrix(3)
# generator of index triples c whose stacked block matrix has full rank
g3 = (c for c in Combinations(len(v), 3) if block_matrix(3, 2, [[im, v[c[0]]], [im, v[c[1]]], [im, v[c[2]]]]).rank() >= rank)
g3_ = []
for g in g3:
    g3_.append(g)
save(g3_, 'g3_.sobj')

After 7 days, I got a 1.84 GB file (renamed and uploaded by me; no virus, just a SageMath format file): https://ufile.io/kakhi

After downloading the file, we load it:

g3 = load('g3_.sobj')

However, it is too large for my RAM. Is there any solution? We might have to split it before calling "load". Besides that, a better way to save the data would be appreciated.
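A sketch of such a split, with an assumed chunk size of 10**5 entries and illustrative file names: instead of building g3_ entirely in memory, write the generator's output to disk in chunks as it is produced.

chunk, part = [], 0
for g in g3:
    chunk.append(g)
    if len(chunk) == 10**5:  # assumed chunk size
        save(chunk, 'g3_part%04d' % part)
        chunk, part = [], part + 1
if chunk:
    save(chunk, 'g3_part%04d' % part)

Each g3_part*.sobj file can then be loaded and processed separately, so the whole 1.84 GB of data never has to sit in RAM at once.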

imnvsh ( 6 years ago )