Runaway memory usage in Sage 5.0?

asked 2012-07-25 14:52:05 +0200 by Martin Malandro

updated 2015-01-13 20:48:19 +0200 by FrédéricC

Hi,

I am running Sage 5.0 on Windows 7 (as it is the latest Windows version available), and my code is crashing after a couple of hours of computation. Downgrading to Sage 4.8 fixes the problem. I'm not sure exactly where the issue is, so I will describe what I'm doing in as much detail as possible.

I am using the algorithm described in this paper:

http://www.springerlink.com/content/1...

to build a database of the lattices of order $n$ up to isomorphism. I am up to $n=12$ so far, and my goal is to reach $n=15$. The program works by generating the lattices of order $n+1$ from the lattices of order $n$.

As such, I am using lots of Posets and LatticePosets. At any point during the code's execution, Sage should not need to hold more than a thousand or so posets on $\leq 15$ nodes in memory, and not much else besides these posets. My code takes as input the lattices of order $n$ and writes the lattices of order $n+1$ to a file as it generates them. I am running Sage 5.0 in VirtualBox with 4 processors and 1500MB RAM allocated.
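
To make the structure concrete, the main loop is organized roughly as in the sketch below. The function and file names are placeholders for illustration, not my actual code:

    # Schematic sketch only -- function and file names are placeholders.
    def extensions(L):
        # Placeholder: the real code implements the algorithm from the paper,
        # returning the lattices of order n+1 built from the lattice L.
        return []

    # Previously computed lattices of order n (placeholder file name).
    lattices_n = load('lattices_12.sobj')

    with open('lattices_13.txt', 'w') as out:
        for L in lattices_n:
            for M in extensions(L):
                # Write each new lattice as soon as it is found, so the output
                # never has to be held in memory all at once.
                out.write(str(M.cover_relations()) + '\n')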

My code uses the @parallel decorator on one function. With this, the overall memory usage of my system climbs rapidly from what it was before (X) to X+1500MB, and after a few hours one of the return values from the parallelized function comes back as 'NO DATA' (instead of what I expected, which is a short list of posets), which tells me something went wrong. If I remove the @parallel decorator and just call my function on single inputs instead of lists of inputs, the memory usage of my system again rises rapidly to X+1500MB, and after a few hours the entire Sage virtual machine simply shuts down.
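
For context, the parallel part is set up roughly like this (again, the function names and the input list are placeholders). Calling a @parallel-decorated function on a list of inputs yields ((input, keywords), output) pairs, and the output is the string 'NO DATA' when the forked worker process dies, which is the failure I see under 5.0:

    # Sketch of the parallel setup -- names are placeholders; this assumes
    # a Sage session, where the parallel decorator is already available.
    @parallel(ncpus=4)
    def process_batch(L):
        # Return the (short) list of posets produced from the lattice L.
        return extensions(L)

    for (args, kwds), result in process_batch(batch_of_lattices):
        if result == 'NO DATA':
            print('worker failed on input', args)
        else:
            record(result)   # placeholder for the code that writes results out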

However, if I downgrade to Sage 4.8 and dedicate 4 processors and only 1250MB RAM to VirtualBox, I can use the @parallel decorator and my code runs stably for hours and eventually completes, without my system ever going over X+1000MB memory usage.

Does anyone have any idea what's going on here? Is Sage 5.0 caching all of the lattices of order $n+1$ that I'm generating along the way and eventually running out of memory or something?
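
In case it helps with the diagnosis, here is a small check (just a diagnostic sketch using Python's gc module, not part of my algorithm) that I can run periodically in the non-parallel version to see whether poset objects are actually accumulating:

    import gc

    def count_live_posets():
        # Count the objects currently tracked by the garbage collector
        # whose type name mentions 'Poset'.  If Sage 5.0 were caching
        # every lattice I generate, this count should keep climbing.
        gc.collect()
        return sum(1 for obj in gc.get_objects()
                   if 'Poset' in type(obj).__name__)

If the count keeps climbing across iterations even though I only keep a thousand or so posets alive myself, that would point to something caching them behind the scenes; if it stays flat, the memory must be going somewhere else.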


Comments

If you can post the code (or a relevant portion of it), someone might be able to comment. It depends a lot on how the lattices are being computed, how they are stored, read, cached, etc.

benjaminfjones ( 2012-07-25 16:09:01 +0200 )

I'm not sure I could post a relevant portion because I'm not sure where the issue is. The code itself is kind of long. I would be happy to send the code (and the relevant input files) to anyone willing to take a look at it.

Martin Malandro ( 2012-07-25 17:13:14 +0200 )

It might be a good idea to post to the sage-combinat mailing list, where it would receive wider attention from those who presumably know a lot more about this area of the code: https://groups.google.com/forum/?fromgroups#!forum/sage-combinat-devel

Jason Grout ( 2012-07-30 11:48:22 +0200 )