
Sagenb long (graph) calculations silently terminating

To recapitulate the title, I've been having quite a few different sagenb runs silently terminate lately: the green sidebar just goes away and the full_output.txt link appears, but the results are clearly incomplete. Just wondering if anyone knows why this might be happening, or could suggest a remedy.

I considered the possibility of a memory limit issue, but processing items from generators one at a time rather than building large lists seems to have no impact. I have no tracebacks to post, as whatever is stopping the execution is doing so without raising an exception. By printing intermediate results, I've found that the execution doesn't stop deterministically at the same point, though certain runs will consistently be cut off before completion.

Here is a sample which never seems to complete. In this case, I'm cataloguing the automorphism groups of cubic graphs, but can't seem to get through the 18-vertex case. Thanks in advance for any help!


    import collections
    lglist = collections.Counter()

    for k in range(4, 19):
        lglist.clear()
        # geng options: connected (-C) cubic (-d3 -D3) graphs on k vertices
        gen = graphs.nauty_geng(str(k) + " -C -d3 -D3")
        while True:
            try:
                g = gen.next()
            except StopIteration:
                break
            if g.vertex_connectivity() >= 3:
                a = [k, g.vertex_connectivity()]
                try:
                    b = g.automorphism_group().group_id()
                except RuntimeError:
                    # group_id() can fail for large groups; record the order instead
                    b = [g.automorphism_group().cardinality(), 0]
                a = a + b
                lglist.update((tuple(a),))

    for item in lglist.items():
        print item
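For reference, the one-item-at-a-time generator pattern I'm using, as a minimal sketch in plain Python (with a hypothetical stand-in generator instead of graphs.nauty_geng, so it runs outside Sage):

```python
import collections

def fake_graph_stream(n):
    # Hypothetical stand-in for graphs.nauty_geng(...): yields items lazily,
    # so only one item is ever held in memory at a time.
    for i in range(n):
        yield i

counts = collections.Counter()
gen = fake_graph_stream(10)
while True:
    try:
        g = next(gen)  # pull one item; no large list is ever built
    except StopIteration:
        break
    # Tally a tuple key for each item, as lglist.update(...) does above
    counts.update(((g % 3,),))

print(sorted(counts.items()))
```

The point is that memory stays flat even for the 18-vertex case, since the counter only grows with the number of distinct keys, not the number of graphs.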
