Adjoint of symbolic matrix fails due to type conversion

Trying the following code:

var('x')
Matrix([[sqrt(x),x],[1,0]]).adjoint()

I get an error message:

TypeError: no conversion of this rational to integer

This seems very strange to me. Computing the adjoint of any square matrix should be possible without requiring the entries to be rational, integer, or otherwise special: the adjugate is built from cofactors, i.e. sums and products of entries, with no division anywhere. Over the symbolic ring in particular, it should always be possible to write the minors as expressions, perhaps without any evaluation or simplification in cases where doing more might cause trouble.
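To illustrate that only minors and signs are needed, here is a minimal cofactor-based adjugate, sketched in SymPy (which Sage ships) so it stays self-contained. The helper name `adjugate_by_cofactors` is mine, not a Sage or SymPy API:

```python
from sympy import Matrix, sqrt, symbols

def adjugate_by_cofactors(A):
    """Classical adjoint via cofactors: only minors (sums/products), no division."""
    n = A.rows
    # Entry (i, j) of the adjugate is the (j, i) cofactor,
    # i.e. the transposed cofactor matrix.
    return Matrix(n, n, lambda i, j: (-1) ** (i + j) * A.minor(j, i))

x = symbols('x')
A = Matrix([[sqrt(x), x], [1, 0]])
adj = adjugate_by_cofactors(A)
# adj == Matrix([[0, -x], [-1, sqrt(x)]])
```

Since no division occurs, this works even for matrices that are singular at some (or all) points.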

Surprisingly, computing the inverse works without issues. This is despite the fact that inversion requires division, and thus entails the danger of division by zero, whereas computing the adjoint does not. So one workaround might be multiplying the inverse by the determinant. But this will fail for singular matrices, where computing the adjoint should still work. It might also leave behind divisions within the entries, which can be difficult to get rid of without making other radical changes to the form of the expressions involved. In a more complicated example I tried, simplify left the division in place and simplify_full split square roots in a rather unreadable fashion.
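Another possible workaround, assuming going through SymPy is acceptable for your use case: SymPy's built-in adjugate is cofactor-based, so it handles this matrix without any type-conversion or invertibility issue. A minimal sketch:

```python
from sympy import Matrix, sqrt, symbols

x = symbols('x')
A = Matrix([[sqrt(x), x], [1, 0]])

# SymPy computes the classical adjoint directly from cofactors,
# so no division (and no inverse) is involved.
adj = A.adjugate()
# adj == Matrix([[0, -x], [-1, sqrt(x)]])
```

The result satisfies A * adj == det(A) * I, with no leftover divisions to simplify away.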

Is there a valid explanation for the above behavior, or is this a bug?
Is there a better way to work around this problem?