Calling singular.invariant_ring

I am trying to use Singular through SageMath to compute the invariant ring of a finite group. Here is a minimal example that reproduces the error:

singular.lib('finvar.lib')
R = singular.ring(3, '(x0,x1)', 'dp')
g = singular.matrix(2,2,'1,0,1,1')
singular.invariant_ring(g)

I saved this code as test.sage and ran sage test.sage (Debian 9, sagemath installed using apt, with an existing installation of Singular), and got the following error (I quote only the last few lines):

  File "/usr/lib/python2.7/dist-packages/sage/interfaces/singular.py", line 653, in eval
    raise SingularError('Singular error:\n%s'%s)
TypeError: Singular error:
   ? assign not impl.
   ? error occurred in or before STDIN line 16: `        return(P,S);`

However, the corresponding Singular code (or at least what I believe to be the corresponding code)

LIB"finvar.lib";
ring R = 3,(x0,x1),dp;
matrix g1[2][2]= 1,0,1,1;
matrix P, S=invariant_ring(g1);

runs properly. What is the problem? Instead of the last line in my Sage code, I tried

P, S = singular.invariant_ring(g)

and

[P, S] = singular.invariant_ring(g)

but this did not help. I also get the same error when I try the code at the Sage Cell Server.
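For completeness: one workaround I have been considering is to bypass the Python-level handling of the two return values entirely and send the raw Singular commands through the interface with singular.eval, then fetch the two matrices by name afterwards. This is only a sketch (the names R, g1, P, S are my own choices, mirroring my Singular session above), and I have not confirmed that it avoids the "assign not impl." error:

singular.lib('finvar.lib')
singular.eval('ring R = 3,(x0,x1),dp;')
singular.eval('matrix g1[2][2] = 1,0,1,1;')
# let Singular itself do the multiple assignment
singular.eval('matrix P, S = invariant_ring(g1);')
# pull the named matrices back into Sage-side wrapper objects
P = singular('P')
S = singular('S')
print(P)
print(S)

If this is in fact the intended way to handle Singular procedures that return several objects, I would still like to know why the direct singular.invariant_ring(g) call fails.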