Save/load huge dictionary

asked 2019-03-04 17:08:10 +0200

imnvsh

updated 2019-03-04 17:56:07 +0200

I have a huge dictionary, about 50 GB. After generating this dictionary, I have no memory left. I still ran SageMath's standard save(mydict, 'myfile'); however, the save runs almost forever.

What should I do? Storing it in multiple files is also fine with me. I really need to be able to load it again in the future.

Maybe another approach would help. Besides the dictionary above, I have another huge, redundant dictionary mydict2, and I tried del mydict2 to free some extra memory for the SageMath save above; however, memory usage stays the same as before calling del mydict2. I guess its keys are still stored in memory. I do not need the keys from mydict2, but its values are used in mydict.
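To illustrate why del mydict2 frees so little here, a small hypothetical sketch (the names and sizes stand in for the real dictionaries): del removes mydict2's references, but any value object also referenced from mydict stays alive.

```python
# Hypothetical small-scale illustration: values shared between two dicts
# are kept alive by the surviving dict even after `del` on the other.
value = [0] * 10          # stands in for one of mydict2's big values
mydict = {(0, 0): value}  # mydict reuses the same list object
mydict2 = {"key": value}

del mydict2  # drops mydict2's references (and its keys), not the shared values

# The shared list is still reachable through mydict, so its memory is
# not returned; only mydict2's own keys and dict overhead are freed.
print(mydict[(0, 0)] is value)  # → True
```

So if the values dominate the 50 GB, deleting mydict2 can only ever reclaim the space taken by its keys and the dict structure itself.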


Comments

If you work with something that big, you should use a database instead (there are many Python libraries available). It makes no sense to load the whole dictionary into memory.

vdelecroix ( 2019-03-04 18:04:03 +0200 )

@vdelecroix Thank you for your response. Could you give me an example? Or something that best matches SageMath, based on your experience? I see this one supported in SageMath, but I am not really sure about its quality: http://doc.sagemath.org/html/en/refer...

imnvsh ( 2019-03-04 18:42:32 +0200 )

The tool you want depends on what the keys and values of your dictionary are. The simplest database is sqlite3. Alternatively, you can use the SageMath module that is based on sqlite, though I would rather go with sqlite3 directly.

vdelecroix ( 2019-03-04 18:56:01 +0200 )
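A minimal sketch of the sqlite3 approach suggested above, assuming the dictionary's entries can be written row by row (the file name, table name, and JSON encoding of the values are illustrative choices, not anything SageMath prescribes):

```python
import json
import sqlite3

# Store dictionary entries as rows so the whole dict never has to sit
# in memory at once; use a file path like "mydict.db" on real data.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE kv (k1 INTEGER, k2 INTEGER, v TEXT, PRIMARY KEY (k1, k2))"
)

mydict = {(1, 2): [10, 20], (3, 4): [30, 40]}  # toy stand-in for the real dict
conn.executemany(
    "INSERT INTO kv VALUES (?, ?, ?)",
    ((k1, k2, json.dumps(v)) for (k1, k2), v in mydict.items()),
)
conn.commit()

# Later, look up a single entry without loading anything else
row = conn.execute("SELECT v FROM kv WHERE k1=? AND k2=?", (1, 2)).fetchone()
print(json.loads(row[0]))  # → [10, 20]
```

On disk, the INSERTs can be issued while the dictionary is being generated, so the 50 GB never needs to exist as one in-memory object at all.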

Thank you @vdelecroix.

Side information: my dictionary has keys that are tuples (int, int) and values that are lists [int]*2110.

imnvsh ( 2019-03-06 13:34:16 +0200 )
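Given that stated format, a hedged sketch tailored to it: keys (int, int) map naturally onto a two-column integer primary key, and each fixed-length list of 2110 ints can be packed into a compact binary BLOB with the array module (the helper names put/get and the 64-bit "q" typecode are assumptions for illustration).

```python
import array
import sqlite3

N = 2110  # stated value length

# Use a real file path instead of ":memory:" for persistent 50 GB data.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE kv (k1 INTEGER, k2 INTEGER, v BLOB, PRIMARY KEY (k1, k2))"
)

def put(key, values):
    """Store one (int, int) -> [int]*N entry, packing the list as bytes."""
    k1, k2 = key
    conn.execute(
        "INSERT OR REPLACE INTO kv VALUES (?, ?, ?)",
        (k1, k2, array.array("q", values).tobytes()),  # "q" = signed 64-bit
    )

def get(key):
    """Fetch one entry back as a list of ints, or None if absent."""
    row = conn.execute("SELECT v FROM kv WHERE k1=? AND k2=?", key).fetchone()
    return list(array.array("q", row[0])) if row else None

put((5, 7), list(range(N)))
conn.commit()
print(len(get((5, 7))))  # → 2110
```

Packing each value as one BLOB of 2110 * 8 bytes keeps the database small and the round trip fast, compared with pickling or JSON-encoding every list.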