# Calling singular.invariant_ring

I am trying to use Singular through SageMath to compute the invariant ring of a finite group. Here is a minimal example that triggers the error:

```python
singular.lib('finvar.lib')
R = singular.ring(3, '(x0,x1)', 'dp')
g = singular.matrix(2, 2, '1,0,1,1')
singular.invariant_ring(g)
```


I saved this code as `test.sage` and ran `sage test.sage` (Debian 9, SageMath installed via apt, alongside an existing Singular installation), and got the following error (quoting only the last few lines):

```
  File "/usr/lib/python2.7/dist-packages/sage/interfaces/singular.py", line 653, in eval
    raise SingularError('Singular error:\n%s'%s)
TypeError: Singular error:
   ? assign not impl.
   ? error occurred in or before STDIN line 16:         return(P,S);
```


However, the corresponding (or so I believe) Singular code

```
LIB "finvar.lib";
ring R = 3,(x0,x1),dp;
matrix g1 = 1,0,1,1;
matrix P,S = invariant_ring(g1);
```


runs properly. What is the problem? Instead of the last line in my Sage code, I tried

```python
P, S = singular.invariant_ring(g)
```


and

```python
[P, S] = singular.invariant_ring(g)
```


but neither helped. I get the same error when running the code on the Sage Cell Server.


This looks like a bug to me.

The problem is that SageMath does not understand the return type of Singular's `invariant_ring` (a pair of matrices). You can work around it as follows:

```python
sage: from sage.interfaces.singular import SingularElement
sage: SingularElement(singular, 'list', 'invariant_ring(%s)' % g.name(), False).sage()
[[            x0 x0^2*x1 - x1^3], ]
```
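If you need the primary and secondary invariants as separate objects, here is a minimal sketch of how the workaround might be extended, assuming (unverified) that the Singular `list` returned by `invariant_ring` holds the two matrices `P` and `S` in that order and that the interface supports 1-based indexing into Singular lists:

```python
# Sketch, not a tested recipe: build the Singular list without converting it,
# then pull out its two entries (Singular lists are 1-indexed).
from sage.interfaces.singular import SingularElement

result = SingularElement(singular, 'list',
                         'invariant_ring(%s)' % g.name(), False)
P = result[1]  # assumed: matrix of primary invariants
S = result[2]  # assumed: matrix of secondary invariants
```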


The problem has been reported as trac ticket #28386.


Thank you. I had begun to suspect that the issue was SageMath not understanding the return type, since `singular.primary_invariants(g)` returned the correct result.