
Array of graphs consuming too much memory?

asked 2013-04-22 04:20:57 +0200

Gordon

I want to work in Sage with a large array of small graphs.

To create this array, I load a text file that has 100000 lines of the following form:

gs = []
gs.append(Graph([(0,1,0),(0,1,1),(0,1,2),(0,1,3),(2,3,4),(2,3,5),(2,3,6),(4,5,7),(4,5,8),(4,5,9),(0,1,10), (0,1,11),(0,1,12),(0,1,13),(0,1,14)]))

The text file occupies only 14 MB of disk space, but when it is imported into Sage, the memory usage of the process jumps by nearly 800 MB; at intermediate stages of the import it even climbs above 5 GB before dropping back down. (Presumably this is due to some memory-allocation strategy that keeps doubling the space required?)
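For what it's worth, Python lists do over-allocate as they grow, which can be observed with a small plain-Python sketch like the one below (the byte size grows in discrete jumps rather than per element, though on its own that would not account for a 5 GB spike):

```python
import sys

lst = []
sizes = set()
for _ in range(1000):
    lst.append(None)
    # getsizeof reports the list object's allocated size, not its contents
    sizes.add(sys.getsizeof(lst))

# Far fewer distinct sizes than appends: the list grows in jumps.
print(len(sizes) < 100)  # prints True
```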

Clearly I am doing something wrong, because an array of 100000 small graphs should be no trouble at all for a modern computer, but I don't know what the mistake is or how to fix it.



Loading all the graphs at the same time can sometimes cause that. If not all of them are needed at once, iterating through them one at a time might be another option.
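For example, a generator along these lines reads one graph per line instead of holding all 100000 at once (a sketch: the file path and the per-line edge-list format are assumptions, and outside Sage it parses with `ast.literal_eval` where you would instead build a `Graph`):

```python
import ast

def iter_edge_lists(path):
    """Yield one edge list per line, so only one graph is in memory at a time."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line:
                # In Sage: yield Graph(ast.literal_eval(line))
                yield ast.literal_eval(line)

# Usage sketch:
# for g in iter_edge_lists("graphs.txt"):
#     process(g)
```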

fidbc ( 2013-04-22 12:16:42 +0200 )

1 Answer


answered 2013-04-22 04:56:01 +0200

Nathann

Would it help if you were to put this at the head of your file:

import gc

and then to add the line

gc.collect()

after every 1000th append statement? It explicitly calls Python's garbage collector.
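A sketch of what that loop might look like (plain Python so it runs outside Sage; the sample lines are made up, and `ast.literal_eval` stands in for building a `Graph` from each line):

```python
import ast
import gc

# Hypothetical stand-in for the real text file: one edge list per line.
lines = ["[(0,1,0),(0,1,1),(2,3,4)]"] * 5000

gs = []
for i, line in enumerate(lines):
    edges = ast.literal_eval(line)  # in Sage: gs.append(Graph(edges))
    gs.append(edges)
    if (i + 1) % 1000 == 0:
        gc.collect()  # explicitly run the garbage collector every 1000 appends

print(len(gs))  # prints 5000
```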



