Bruno's answer is a quick answer to the question as it was originally stated.
The new question should be asked in a separate place, not as an obscure partial comment, since it introduces a complication into a special case of the "similar matrices" problem.
Here is a solution "to the comment". It would not fit as a comment, so it is posted as an answer.
First of all, let us write down the two matrices so that a human can parse them by eye:
j = I
F = QQbar
A = matrix( F, 6, [  2, -1, -j,  0,  0,  0,
                    -1,  2, -1,  0,  0,  0,
                     j, -1,  3, -1,  0,  0,
                     0,  0, -1,  2, -1,  0,
                     0,  0,  0, -1,  2, -1,
                     0,  0,  0,  0, -1,  1 ] )
B = matrix( F, 6, [  2, -1,  0, -j,  0,  0,
                    -1,  2, -1,  0,  0,  0,
                     0, -1,  2, -1,  0,  0,
                     j,  0, -1,  3, -1,  0,
                     0,  0,  0, -1,  2, -1,
                     0,  0,  0,  0, -1,  1 ] )
They are of course similar, as we can quickly test:
sage: A.is_similar(B)
True
But the plain request
sage: A.is_similar(B, transformation=1)
takes too long to be executed. Why?! We will see soon.
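For the impatient: rather than interrupting by hand, one can put a time bound on the attempt with Sage's alarm mechanism. This is only a sketch, and the ten-second limit is an arbitrary choice:
try:
    alarm( 10 )                 # give up after about ten seconds
    ok, T = A.is_similar( B, transformation=True )
    cancel_alarm()
    print "Got a transformation, is_similar says", ok
except AlarmInterrupt:
    print "is_similar( B, transformation=True ) did not finish within ten seconds"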
Some useful information is as follows:
sage: A.is_diagonalizable()
True
sage: B.is_diagonalizable()
True
sage: A.minpoly() == A.charpoly()
True
sage: B.minpoly() == B.charpoly()
True
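Taken together, these two facts say that each matrix has six distinct eigenvalues: diagonalizability makes the minimal polynomial squarefree, and it equals the characteristic polynomial. This will be used again in part (2) below. As a sketch, one can also check it directly:
p = A.charpoly()
print "Is the characteristic polynomial squarefree?", p.gcd( p.derivative() ).degree() == 0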
So a solution to the problem can be found by getting the Jordan normal forms, and the transformations.
Let us see.
We compute $T_A, T_B$ and the Jordan forms $J_A, J_B$ so that:
$$ T_A\; J_A = A\cdot T_A\ ,$$
$$ T_B\; J_B = B\cdot T_B\ .$$
Code:
JA, TA = A.jordan_form( transformation=1 )
JB, TB = B.jordan_form( transformation=1 )
print "Is JA == JB?", JA==JB
This gives:
Is JA == JB? True
So let us define:
J = JA
Let us test the equalities:
print "TA*J == A*TA is", TA*J == A*TA
print "TB*J == B*TB is", TB*J == B*TB
which gives:
TA*J == A*TA is True
TB*J == B*TB is True
Then from $J=T_A^{-1}AT_A=T_B^{-1}BT_B$ we get immediately
$$ A = (T_AT_B^{-1})\cdot B\cdot (T_BT_A^{-1})\ ,$$
so it seems that we only need to ask for one or both matrices
TB * TA.inverse()
TA * TB.inverse()
and we are done. Well,
first of all, inverses of $6\times 6$ matrices with entries in QQbar
are hard to compute.
We simply type
TA.inverse()
and start to wait. So here is the problem. At this stage we can use either mathematics or a better (programmatic) framework to solve it.
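Before following either route, we can at least convince ourselves numerically that the formula is the right one, by pushing everything into CDF, where inverses are cheap. This is only a floating point sanity check, not a proof:
TAc = TA.change_ring( CDF )
TBc = TB.change_ring( CDF )
U   = TAc * TBc.inverse()       # numerical candidate for T_A * T_B^(-1)
print ( A.change_ring( CDF )*U - U*B.change_ring( CDF ) ).norm()   # expected: numerically tiny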
(1) Using mathematics
In order to avoid taking inverses, we use the following trick: we take the Jordan normal form of BB, the transpose of B. Let $B'$ denote mathematically the transposed matrix of $B$.
The equality
$$ T_{B'}\; J = B'\; T_{B'} $$
can be transposed; since $J$ is diagonal here we have $J^T=J$, and $(B')^T=B$, so writing $T'_{B'}$ for the transpose of $T_{B'}$ we get
$$ T'_{B'}\; B = J\; T'_{B'} \ .$$
Here, $J=J_B=J_{B'}$ is the same Jordan normal form. Then from
$$ J=T_A^{-1}AT_A=T'_{B'}B(T'_{B'})^{-1} $$
we see that the matrix
$ S = T_A\cdot T'_{B'} $
is our friend. Code for this path:
BB = B.transpose()
JA , TA = A .jordan_form( transformation=1 )
JBB, TBB = BB.jordan_form( transformation=1 )
print "Is JA == JBB?", JA == JBB # yes...
# J = JA
S = TA * TBB.transpose()
print "A*S - S*B is the following matrix:"
print A*S - S*B
And we get:
Is JA == JBB? True
A*S - S*B is the following matrix:
[0.?e-12 + 0.?e-12*I 0.?e-12 + 0.?e-12*I 0.?e-12 + 0.?e-12*I 0.?e-12 + 0.?e-12*I 0.?e-11 + 0.?e-11*I 0.?e-12 + 0.?e-12*I]
[0.?e-12 + 0.?e-12*I 0.?e-12 + 0.?e-12*I 0.?e-12 + 0.?e-12*I 0.?e-12 + 0.?e-12*I 0.?e-11 + 0.?e-11*I 0.?e-12 + 0.?e-12*I]
[0.?e-12 + 0.?e-12*I 0.?e-12 + 0.?e-12*I 0.?e-12 + 0.?e-12*I 0.?e-12 + 0.?e-12*I 0.?e-11 + 0.?e-11*I 0.?e-12 + 0.?e-12*I]
[0.?e-12 + 0.?e-12*I 0.?e-12 + 0.?e-12*I 0.?e-12 + 0.?e-12*I 0.?e-11 + 0.?e-11*I 0.?e-11 + 0.?e-11*I 0.?e-11 + 0.?e-11*I]
[0.?e-11 + 0.?e-11*I 0.?e-11 + 0.?e-11*I 0.?e-11 + 0.?e-11*I 0.?e-11 + 0.?e-11*I 0.?e-11 + 0.?e-11*I 0.?e-11 + 0.?e-11*I]
[0.?e-12 + 0.?e-12*I 0.?e-12 + 0.?e-12*I 0.?e-12 + 0.?e-12*I 0.?e-11 + 0.?e-11*I 0.?e-11 + 0.?e-11*I 0.?e-11 + 0.?e-11*I]
But again we have to wait too long for
print "Is A*S == S*B?", A*S == S*B   # looooooooong time...
So even recognizing whether some algebraic numbers are zero is too expensive in QQbar.
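If one only wants reassurance rather than an exact certificate, the same numerical trick as before helps: push A, S, B into CDF and look at the size of the residual. Again a floating point sketch only:
a_c = A.change_ring( CDF )
b_c = B.change_ring( CDF )
s_c = S.change_ring( CDF )
print ( a_c*s_c - s_c*b_c ).norm()   # expected: numerically tiny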
(2) Using linear algebra
Note that a base change matrix realizing the Jordan normal form is determined only
up to multiplication by an invertible diagonal matrix. (The diagonal matrices form the centralizer of $J$: they commute with $J$, and since $J$ has six distinct eigenvalues nothing else does.)
This is so because the columns of such a matrix, eigenvectors for the corresponding eigenvalues on the diagonal of $J$, are determined only up to a non-zero
multiplicative constant.
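To see this ambiguity concretely, we can rescale the columns of the matrix TA from above by any invertible diagonal matrix and still have a valid transformation. A small sketch over QQbar (the check has the same cost as the TA*J == A*TA test above):
D = diagonal_matrix( QQbar, [ 1, 2, 3, 4, 5, 6 ] )
# D commutes with the diagonal J, so (TA*D)*J == TA*(J*D) == (TA*J)*D == (A*TA)*D == A*(TA*D)
print "Does TA*D also work?", (TA*D)*J == A*(TA*D)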
Now let $X$ be a $6\times 6$ matrix whose entries are unknowns.
We want to solve the linear system $AX=XB$. Since $A$ and $B$ are similar with six distinct eigenvalues, the solution space is a vector space of dimension $6$,
and that is exactly what the following code finds.
The basis of this vector space consists of six matrices, which the code prints.
Each linear combination of them delivers a matrix intertwining $A$ and $B$; the invertible ones conjugate $A$ into $B$. The code:
F.<J> = QuadraticField( -1 )
A = matrix( F, 6, [  2, -1, -J,  0,  0,  0,
                    -1,  2, -1,  0,  0,  0,
                     J, -1,  3, -1,  0,  0,
                     0,  0, -1,  2, -1,  0,
                     0,  0,  0, -1,  2, -1,
                     0,  0,  0,  0, -1,  1 ] )
B = matrix( F, 6, [  2, -1,  0, -J,  0,  0,
                    -1,  2, -1,  0,  0,  0,
                     0, -1,  2, -1,  0,  0,
                     J,  0, -1,  3, -1,  0,
                     0,  0,  0, -1,  2, -1,
                     0,  0,  0,  0, -1,  1 ] )
a = A.change_ring( QQbar )
b = B.change_ring( QQbar )
R = range(6)
xvars = var( ','.join( [ 'x%s%s' % (j,k) for j in R for k in R ] ) )
X = matrix( 6, 6, xvars )
eqs = [ ( A*X )[j,k] == (X*B)[j,k] for j in R for k in R ]
sol = solve( eqs, xvars, solution_dict=1 )[0]
S = matrix( 6, 6, [ sol[ xvar ] for xvar in xvars ] )
newvars = list( set( [ v for j in R for k in R for v in S[j,k].variables() ] ) )
# extract from S these new variables
RT = range(len(newvars)) # this is range(6)
T = [ matrix( 6, 6, [ QQbar( S[j,k].coefficient( newvars[t] ) ) for j in R for k in R ] )
      for t in RT ]
for t in RT:
    U = T[t]
    print ( "T[%s] is the following matrix:\n%s\ndet( T[%s] ) = %s\nThe relation A*T[%s] == T[%s]*B is %s\n"
            % ( t, U, t, U.det(), t, t, a*U == U*b ) )
And the results are:
T[0] is the following matrix:
[ I + 1 0 I + 1 1 0 I]
[ -I I + 1 1 0 I + 1 0]
[ 0 -I + 1 0 I + 2 0 1]
[ -I + 1 -2*I I + 1 I 2 1]
[-2*I + 1 I + 1 -I 2 I + 1 2]
[ 0 -I + 1 2 1 2 I + 2]
det( T[0] ) = 16*I + 112
The relation A*T[0] == T[0]*B is True
T[1] is the following matrix:
[ 0 0 0 0 I I + 1]
[ 0 0 0 0 1 I + 1]
[ 0 0 0 1 1 1]
[ -I 0 1 1 1 1]
[ -I -I + 1 1 1 1 1]
[-I + 1 -I + 1 -I + 1 1 1 1]
det( T[1] ) = 4*I - 4
The relation A*T[1] == T[1]*B is True
T[2] is the following matrix:
[ -1 0 I 0 -I + 1 0]
[ 0 I - 1 0 1 0 -I]
[ 0 0 0 -1 0 -I]
[ 0 -I + 1 -2 -1 0 -I]
[ 1 -2 -I 0 -I - 1 0]
[ -2 0 0 -I 0 -1]
det( T[2] ) = -32*I - 32
The relation A*T[2] == T[2]*B is True
T[3] is the following matrix:
[ I 1 I 0 0 0]
[ 1 I 1 0 0 0]
[ 0 1 0 I + 1 0 0]
[ -I 1 I 0 I + 1 0]
[ 1 -1 1 I + 1 0 I + 1]
[ 0 1 0 0 I + 1 I + 1]
det( T[3] ) = 8*I + 8
The relation A*T[3] == T[3]*B is True
T[4] is the following matrix:
[ 2 0 1 0 -I 1]
[ 0 2 -I 0 1 -I]
[ 0 -I 1 -I + 1 0 -I]
[ -1 -I - 1 2 1 -2*I + 1 0]
[-2*I - 1 I + 1 -I - 1 -2*I + 1 1 -I + 1]
[ 0 -2*I - 1 -I 0 -I + 1 -I + 2]
det( T[4] ) = 212*I + 280
The relation A*T[4] == T[4]*B is True
T[5] is the following matrix:
[ 1 0 2 0 -1 -I]
[ -I 2 0 0 -I -1]
[ 1 -I 0 -I + 1 0 -1]
[ 0 -I - 1 1 I -I 0]
[-I - 1 I + 1 -1 -I I -I + 1]
[ I -1 0 0 -I + 1 1]
det( T[5] ) = -40*I + 36
The relation A*T[5] == T[5]*B is True
On the free market I could maybe sell only T[1]; nobody would buy any other intertwiner, too complicated...
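As a final remark, the same solution space can be obtained without symbolic variables at all, by vectorizing: with the column-stacking vec, the identity $\mathrm{vec}(AX-XB)=(I\otimes A - B^T\otimes I)\,\mathrm{vec}(X)$ turns $AX=XB$ into an ordinary kernel computation over $\mathbf{Q}(i)$. A sketch, reusing F, A, B from the last code block:
Id = identity_matrix( F, 6 )
M  = Id.tensor_product( A ) - B.transpose().tensor_product( Id )   # 36 x 36 matrix over F
K  = M.right_kernel()
print "Dimension of the solution space:", K.dimension()            # expected: 6
# matrix(...) fills row by row, while vec stacks columns, hence the final transpose
intertwiners = [ matrix( F, 6, 6, list(v) ).transpose() for v in K.basis() ]
print "Do all basis matrices intertwine A and B?", all( A*X == X*B for X in intertwiners )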
See also https://trac.sagemath.org/ticket/22397 (which still needs work before it can be included in Sage).