Memory problem: splitting large conjugacy class into parts
I'm trying to do a calculation on certain conjugacy classes in a large (Weyl) group, let's say:
    W = WeylGroup(["E",8])
    w = W.an_element()
    for v in W.conjugacy_class(w):
        if v.length() < 12:
            if v.reduced_word()[0] == 1:
                print v.reduced_word()
A typical conjugacy class of such a W has a size in the millions. When I run this on my computer, it stops after about 100000 iterations and prints
    Error, reached the pre-set memory limit
    (change it with the -o command line option)
(I'm not sure where to pass this -o option, but raising the limit wouldn't suffice anyway.)
I don't know exactly how these conjugacy classes are implemented (via GAP, I think), but I was hoping I could simply resume the calculation using itertools.islice, or e.g.
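For an ordinary Python iterator, islice does skip a prefix lazily, without ever building a list; a minimal sketch with a stand-in generator (the generator below is a placeholder, not the actual Weyl group iterator, and whether this helps here depends on whether the memory is being consumed on the GAP side rather than the Python side):

```python
from itertools import islice

def class_elements():
    # stand-in for iterating over W.conjugacy_class(w);
    # yields elements one at a time rather than as a list
    for i in range(500000):
        yield i

# lazily skip the first 100000 items and take the next 100000;
# the skipped items are still generated, but discarded, not stored
chunk = islice(class_elements(), 100000, 200000)
first = next(chunk)  # the 100001st element
```

Note that islice must still crunch through the skipped prefix on every run; it only avoids holding it in memory.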
    for v in W.conjugacy_class(w)[100000:200000]:
        if v.length() < 12:
            if v.reduced_word()[0] == 1:
                print v.reduced_word()
However, this doesn't seem to work. I'm guessing it first tries to build the entire list W.conjugacy_class(w)[100000:200000] (after crunching through the first 100000 elements), in a less efficient way than the plain loop, taking up even more memory than before.
Is there a way around this? Perhaps the iteration can somehow be set up as a queue (in GAP!?) so that it takes up little memory?
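Independent of how GAP enumerates the class, the Python-side footprint can be kept small by streaming the iterator and retaining only the matches; a sketch with placeholder data (plain lists standing in for v.reduced_word() values; the function name is made up for illustration):

```python
def short_words_starting_with_1(words, max_length=12):
    # stream the input, yielding only words that pass the two
    # tests from the loop above; nothing else is retained
    for word in words:
        if len(word) < max_length and word[0] == 1:
            yield word

# placeholder data standing in for reduced words of class elements
data = [[1, 2, 1], [2, 1], [1] * 15, [1, 3]]
matches = list(short_words_starting_with_1(data))
```

Since the results you keep (short words beginning with 1) should be few, only they accumulate in memory; each rejected element can be garbage-collected as soon as the loop moves on.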
print with no parentheses? Still not using Python 3?