ASKSAGE: Sage Q&A Forum (https://ask.sagemath.org/questions/). Copyright Sage, 2010. Some rights reserved under a Creative Commons license. Mon, 22 Nov 2010 04:52:00 +0100

large groebner basis calculations
https://ask.sagemath.org/question/7714/large-groebner-basis-calculations/

I know there are a number of different packages in Sage capable of computing Groebner bases, but I don't have much experience with these tools. I am wondering which option, and specifically which routine, might be most suitable for very large (50,000 generators) but relatively sparse systems (polynomials of degree less than 4, with 4 or fewer terms, over 64 variables). I am interested in bases and prime decomposition in relation to classifying low-dimensional Frobenius algebras. I am currently using groebner and the primdec library from Singular in Sage, but I am up against significant performance limits.
Also, does anyone have any recommended methods of pre-processing sparse sets of generators prior to calling groebner? And are there any packages capable of exploiting multi-processor architectures?

Wed, 29 Sep 2010 21:25:07 +0200

Answer by mhampton:
I think Singular is your best bet right now in Sage. There are a lot of options you can try beyond the default behavior, but it's hard to know what will be best for a given system. For some things I've worked with, the facstd command in Singular was very helpful; you can do something like:
myideal = myring.ideal(generators)                  # build the ideal in your polynomial ring
fstd = singular(myideal).facstd()                   # Singular's factorizing Groebner algorithm
bases = [[myring(f) for f in fs] for fs in fstd]    # convert the results back to Sage polynomials
where, of course, you would have to define the generators and the ring yourself. Also, you probably already know this, but using different term orders can have an enormous effect on the running time.
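To see concretely why the term order matters, here is a small pure-Python illustration (independent of Sage; the monomials and key functions are made up for the example) comparing lexicographic with degree-reverse-lexicographic ordering of exponent tuples:

```python
def lex_key(exp):
    """Lexicographic order key: compare exponents left to right."""
    return exp

def degrevlex_key(exp):
    """Degree reverse lexicographic key: compare total degree first,
    then reversed exponents with signs flipped."""
    return (sum(exp), tuple(-e for e in reversed(exp)))

# Three monomials in variables (x, y, z), given as exponent tuples:
# x^2, x*y, z^3
monomials = [(2, 0, 0), (1, 1, 0), (0, 0, 3)]

by_lex = sorted(monomials, key=lex_key, reverse=True)       # x^2 > x*y > z^3
by_drl = sorted(monomials, key=degrevlex_key, reverse=True)  # z^3 > x^2 > x*y
```

The leading terms the algorithm works with change completely between the two orders (z^3 is smallest under lex but largest under degrevlex), which is why the same system can be easy under one order and intractable under another.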
I don't think there is anything parallelizable in Sage for Groebner bases at the moment.
Another thing I sometimes find helpful with big systems is to compute Groebner bases of subsets of the generators. This is trivial to parallelize, and sometimes merging those partial results is faster than doing everything at once. It can also give you some insight into the structure of the system.
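That divide-and-merge strategy can be sketched in plain Python. Here `gb` is a stand-in callable I am assuming for a real Groebner-basis routine (e.g. a call into Singular); threads are used for the concurrent part, which pays off when the real work blocks on an external process:

```python
from concurrent.futures import ThreadPoolExecutor

def chunked(gens, n):
    """Split the generator list into n roughly equal subsets."""
    return [gens[i::n] for i in range(n)]

def parallel_groebner(gens, gb, n_jobs=4):
    """Compute bases of generator subsets concurrently, then merge.

    `gb` is a placeholder for a real Groebner-basis routine; any
    callable mapping a list of generators to a reduced list fits
    the sketch.  Each subset is processed in its own thread, then
    the partial bases are concatenated and reduced once more.
    """
    with ThreadPoolExecutor(max_workers=n_jobs) as pool:
        partial = list(pool.map(gb, chunked(gens, n_jobs)))
    merged = [p for basis in partial for p in basis]
    return gb(merged)   # one final pass over the merged partial bases
```

Whether the final merge pass is cheaper than one monolithic computation depends entirely on the system, so this is a heuristic to try, not a guaranteed win.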
Fri, 01 Oct 2010 18:50:48 +0200

Answer by Michael Brickenstein:
Hi!
Sorry, I did not notice the message before. There is no single 'best variant' for large systems. The ugly answer is that you have to try several variants and the most important options, at least redTail.
You can also try several orderings. Usually the best ordering is dp (degree reverse lexicographic), but sometimes you can arrange the variables in a smart block ordering structure, which makes things much easier.
Maybe you can also try slimgb, which has had good results in the past for large systems, but systems with this number of variables are really hard. Having 50,000 equations, however, is awesome and improves the chances of computing the GB. The core algorithms std and slimgb both implement their own preprocessing.
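Since the question also asked about pre-processing sparse generator sets before handing them off, here is a minimal structural-cleanup sketch in plain Python. The polynomial representation (a frozenset of (monomial, coefficient) pairs) is a toy stand-in, not Singular's actual data structure or preprocessing:

```python
def preprocess(gens):
    """Cheap structural cleanup for a sparse generating set:
    drop zero polynomials, remove exact duplicates, and sort by
    term count so the sparsest generators come first (a common
    heuristic before handing the system to a Groebner engine).

    Each generator is modeled as a frozenset of (monomial, coeff)
    pairs -- a toy stand-in for a real polynomial type.
    """
    seen = set()
    cleaned = []
    for g in gens:
        if not g or g in seen:   # skip zero polynomials and duplicates
            continue
        seen.add(g)
        cleaned.append(g)
    return sorted(cleaned, key=len)   # sparsest (fewest terms) first
```

With 50,000 generators of at most 4 terms each, this kind of cleanup costs essentially nothing compared to the basis computation itself, so it is worth doing even if the engine repeats some of it internally.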
Regarding primary decomposition of such big systems: this is more difficult than Groebner bases, so you can essentially forget about it unless there are some special conditions. In fact, when your ideal is this overdetermined and you might only have a few solutions with multiplicity 1, the prime decomposition is equivalent to solving. I think you did not mention the characteristic.
Cheers,
Michael