Array of graphs consuming too much memory?

I want to work in Sage with a large array of small graphs.

To create this array, I load a text file containing 100000 lines of the following form:

gs = []
gs.append(Graph([(0,1,0),(0,1,1),(0,1,2),(0,1,3),(2,3,4),(2,3,5),(2,3,6),(4,5,7),(4,5,8),(4,5,9),(0,1,10),(0,1,11),(0,1,12),(0,1,13),(0,1,14)]))
gs.append(Graph([(0,1,0),(0,1,1),(0,1,2),(0,1,3),(2,3,4),(2,3,5),(2,3,6),(4,5,7),(4,5,8),(4,5,9),(0,1,10),(0,1,11),(0,1,12),(0,1,13),(0,2,14)]))
gs.append(Graph([(0,1,0),(0,1,1),(0,1,2),(0,1,3),(2,3,4),(2,3,5),(2,3,6),(4,5,7),(4,5,8),(4,5,9),(0,1,10),(0,1,11),(0,1,12),(0,1,13),(2,3,14)]))

The text file occupies only 14 MB of disk space, but when it is imported into Sage, the memory usage of the process jumps by nearly 800 MB. At intermediate stages of the import it even climbs to over 5 GB before dropping back down again. (Presumably this is due to some memory-allocation strategy that repeatedly doubles the amount of memory reserved?)
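As a side note on the doubling guess: CPython lists do over-allocate, though the growth factor is closer to ~12.5% than a doubling, and the bulk of the memory here is more likely the per-Graph object overhead than the list itself. A minimal sketch showing the over-allocation steps:

```python
import sys

# CPython lists grow their capacity geometrically, so most appends
# reuse already-reserved space and only a few trigger a reallocation.
lst = []
sizes = []
for _ in range(1000):
    lst.append(None)
    sizes.append(sys.getsizeof(lst))

# Count how often the reported size actually jumped.
jumps = sum(1 for a, b in zip(sizes, sizes[1:]) if b > a)
print(f"{len(lst)} appends caused only {jumps} reallocations")
```

The list's own footprint is therefore modest; the transient multi-gigabyte spike during import is more plausibly the parser plus the cost of constructing 100000 Graph objects at once.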

Clearly I am doing something wrong: an array of 100000 small graphs should be no trouble at all for a modern computer. But I don't know what the mistake is or how to fix it.
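One workaround, sketched below, is to keep only the raw edge lists in memory and build each Graph on demand instead of materializing 100000 Graph objects up front. This assumes a hypothetical data file (here called "graphs.txt") that stores just the edge tuples per line rather than full `gs.append(Graph([...]))` statements; the parsing is plain Python, with the Sage-specific step left as a comment:

```python
import ast
import io

def edge_lists(lines):
    """Lazily yield one edge list per line.

    Assumes each line holds just the edge tuples,
    e.g. "(0,1,0),(0,1,1),(2,3,4)".
    """
    for line in lines:
        line = line.strip()
        if line:
            # Parse the tuples safely without exec/eval.
            yield list(ast.literal_eval(f"[{line}]"))

# In Sage, one would then construct, use, and discard one graph at a time:
#   for edges in edge_lists(open("graphs.txt")):
#       g = Graph(edges, multiedges=True)
#       ...  # process g, then let it be garbage-collected

# Demo on an in-memory file standing in for the real data:
sample = io.StringIO("(0,1,0),(0,1,1),(2,3,4)\n(0,1,0),(0,2,14)\n")
first = next(edge_lists(sample))
print(first)  # -> [(0, 1, 0), (0, 1, 1), (2, 3, 4)]
```

Whether this fits depends on the workload: it trades the up-front Graph construction cost for a per-use cost, which only helps if the graphs are processed one (or a few) at a time.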