ASKSAGE: Sage Q&A Forum - RSS feed
https://ask.sagemath.org/questions/
Q&A Forum for Sage
Copyright Sage, 2010. Some rights reserved under creative commons license.
Thu, 02 Nov 2017 20:54:36 +0100

How to find the similarity matrix between two similar matrices
https://ask.sagemath.org/question/39355/how-to-find-the-similarity-matrix-between-two-similar-matrices/

Suppose I have two matrices A = (1 0; 1 0) and B = (0 0; 1 1), where A and B are two-by-two matrices, and suppose we know that A and B are similar. Then there exists an invertible matrix P such that A = PBP^(-1). How can one compute the matrix P in Sage?

Tue, 31 Oct 2017 21:39:44 +0100

Comment by FrédéricC:

See also https://trac.sagemath.org/ticket/22397 (that still needs work to be included in Sage).

Wed, 01 Nov 2017 21:57:02 +0100

Answer by B r u n o:
sage: A = matrix(QQ,[[1,0],[1,0]])
sage: B = matrix(QQ,[[0,0],[1,1]])
sage: b, P = A.is_similar(B, transformation=True)
sage: b
True
sage: P.inverse() * B * P == A
True
For more, you can use `A.is_similar?` to get:
Docstring:
Return "True" if "self" and "other" are similar, i.e. related by a
change-of-basis matrix.
INPUT:
* "other" -- a matrix, which should be square, and of the same
size as "self".
* "transformation" -- default: "False" - if "True", the output
may include the change-of-basis matrix (also known as the
similarity transformation). See below for an exact description.
[...]
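For this 2×2 example, the transformation can even be checked without Sage. The identity `P.inverse() * B * P == A` is equivalent to B·P = P·A, and one valid solution is P = [[1, -1], [0, 1]], found by solving the linear system B·P = P·A by hand (`is_similar` may return a different P). A minimal pure-Python sketch with exact rational arithmetic:

```python
from fractions import Fraction as F

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def inv2(P):
    # inverse of a 2x2 matrix via the adjugate formula
    det = P[0][0] * P[1][1] - P[0][1] * P[1][0]
    assert det != 0, "P must be invertible"
    return [[ P[1][1] / det, -P[0][1] / det],
            [-P[1][0] / det,  P[0][0] / det]]

A = [[F(1), F(0)], [F(1), F(0)]]
B = [[F(0), F(0)], [F(1), F(1)]]
P = [[F(1), F(-1)], [F(0), F(1)]]   # one solution of B*P == P*A

# same check as Sage's  P.inverse() * B * P == A
assert matmul(inv2(P), matmul(B, P)) == A
print("ok")
```

Working over `fractions.Fraction` keeps every comparison exact, which is what makes the final equality test reliable.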
Wed, 01 Nov 2017 10:04:21 +0100

Comment by rewi:

Thank you very much. Will you please see this problem:
A = matrix(QQbar,6,[2,-1,-I,0,0,0,-1,2,-1,0,0,0,I,-1,3,-1,0,0,0,0,-1,2,-1,0,0,0,0,-1,2,-1,0,0,0,0,-1,1])
B = matrix(QQbar,6,[2,-1,0,-I,0,0,-1,2,-1,0,0,0,0,-1,2,-1,0,0,I,0,-1,3,-1,0,0,0,0,-1,2,-1,0,0,0,0,-1,1])
I have shown that these two matrices are similar, but I have not been able to find that invertible matrix P explicitly. Please see.

Wed, 01 Nov 2017 13:34:27 +0100

Comment by B r u n o:

Right, with your example, the computation hangs. My investigation is as follows:
- To compute the transformation matrix $P$, the function computes the transformation matrices $P_A$ and $P_B$ associated to the Jordan forms of $A$ and $B$.
- Then, it returns $P_B\cdot P_A^{-1}$.
- $P_A^{-1}$ is computed using the echelon form of $P_A|I$ ($P_A$ augmented with the identity matrix).
- Given the echelon form, it checks whether the lower right coefficient of the *left part* is $1$ (to check whether the matrix is indeed invertible).
- In your case, this coefficient is `1.000000000000000? + 0.?e-28*I`. To check whether it is $1$, it checks whether the coefficient minus $1$ is $0$.
- This coefficient minus $1$ is `0.?e-18 + 0.?e-28*I`.

Thu, 02 Nov 2017 12:00:59 +0100

Comment by B r u n o:

- Now the "coefficient minus $1$", `0.?e-18 + 0.?e-28*I`, is stored as both an approximate value and an exact representation.
- The approximate value is given as a `ComplexIntervalFieldElement`: it is a rectangle in which the element is guaranteed to be. In your case, it is a tiny rectangle whose center is almost on zero. SageMath first tries to prove the element nonzero by refining the rectangle and excluding zero from it. Of course here it does not work since the element is zero!
- Now it uses the exact representation to prove the element is exactly zero.

Thu, 02 Nov 2017 12:21:16 +0100

Comment by B r u n o:

- The problem is that the element in your case is only known as a *binary expression*, not by its minimal polynomial, for instance. That is, SageMath knows it as a formal expression built from other elements of `QQbar` using additions, multiplications and inversions, and the basic building blocks are algebraic numbers defined by some minimal polynomial. To prove the element to be zero, one needs to prove that this huge expression is $0$, exactly. This is very time consuming.
- Actually what happens is that in the computation of (mainly) the echelon form, when two algebraic numbers had to be multiplied, the result was simply defined as "the product of the operands", and the same for additions and so on. The idea is that this way, the echelon form is computed quite fast.

Thu, 02 Nov 2017 12:32:33 +0100

Comment by B r u n o:

Now finally, you can do all the computations by hand, simply not checking that $P_A$ is indeed invertible:
sage: A = matrix(QQbar,6,[2,-1,-I,0,0,0,-1,2,-1,0,0,0,I,-1,3,-1,0,0,0,0,-1,2,-1,0,0,0,0,-1,2,-1,0,0,0,0,-1,1])
sage: B = matrix(QQbar,6,[2,-1,0,-I,0,0,-1,2,-1,0,0,0,0,-1,2,-1,0,0,I,0,-1,3,-1,0,0,0,0,-1,2,-1,0,0,0,0,-1,1])
sage: _, PA = A.jordan_form(transformation=True)
sage: _, PB = B.jordan_form(transformation=True)
sage: PAaug = PA.augment(identity_matrix(6))
sage: PAinv = PAaug.echelon_form()[:, 6:]
sage: P = PB * PAinv

Thu, 02 Nov 2017 12:37:26 +0100

Answer by dan_fulea:
The answer of [B r u n o](https://ask.sagemath.org/users/8630/b-r-u-n-o/) is a quick answer to the question as it was stated.

The new question should be asked in a different place, not as an obscure partial comment, since it comes with a complication in a special problem inside a "similar" problem.

Here is a solution "to the comment". It would not fit as a comment, so it is an answer.
First of all, let us write the two matrices so that a human can parse them by eye:
j = I
F = QQbar
A = matrix( F, 6, [ 2,-1,-j,0,0,0,
-1,2,-1,0,0,0,
j,-1,3,-1,0,0,
0,0,-1,2,-1,0,
0,0,0,-1,2,-1,
0,0,0,0,-1,1 ] )
B = matrix( F, 6, [ 2,-1,0,-j,0,0,
-1,2,-1,0,0,0,
0,-1,2,-1,0,0,
j,0,-1,3,-1,0,
0,0,0,-1,2,-1,
0,0,0,0,-1,1 ] )
They are of course similar, as we can quickly test:
sage: A.is_similar(B)
True
But the plain request
sage: A.is_similar(B, transformation=1)
takes too long to be executed. Why? We will see soon.
Some useful information is as follows:
sage: A.is_diagonalizable()
True
sage: B.is_diagonalizable()
True
sage: A.minpoly() == A.charpoly()
True
sage: B.minpoly() == B.charpoly()
True
So a solution to the problem can be found by getting the Jordan normal forms, **and** the transformations.
Let us see.
We compute $T_A,T_B$ so that we have:
$$ T_A\; J = A\cdot T_A\ .$$
$$ T_B\; J = B\cdot T_B\ .$$
Code:
JA, TA = A.jordan_form( transformation=1 )
JB, TB = B.jordan_form( transformation=1 )
print "Is JA == JB?", JA==JB
This gives:
Is JA == JB? True
So let us define:
J = JA
Let us test the equalities:
print "TA*J == A*TA is", TA*J == A*TA
print "TB*J == B*TB is", TB*J == B*TB
which gives:
TA*J == A*TA is True
TB*J == B*TB is True
Then from $J=T_A^{-1}AT_A=T_B^{-1}BT_B$ we get immediately
$$ A = (T_AT_B^{-1})\cdot B\cdot (T_BT_A^{-1})\ ,$$
so it seems that we only need to ask for one or both matrices
TB * TA.inverse()
TA * TB.inverse()
and we are done. Well,
first of all, inverses of $6\times 6$ matrices with entries in `QQbar` are hard to compute.
We simply type
TA.inverse()
and start to wait. So here was the problem. At this stage we can involve either mathematics or a better (programmatic) framework to solve the problem.
**(1) Using mathematics**
In order to avoid taking inverses, we use the following trick. We take the Jordan normal form of `BB`, the transpose of `B`. Let $B'$ denote mathematically
the transposed matrix of $B$.
Transposing the equality
$$ T_{B'}\; J = B'\; T_{B'} $$
we get
$$ T'_{B'}\; B = J'\; T'_{B'} = J\; T'_{B'} \ .$$
Here, $J=J_B=J_{B'}$ is the same Jordan normal form, and $J'=J$ since $J$ is diagonal. Then from
$$ J=T_A^{-1}AT_A=T'_{B'}B(T'_{B'})^{-1} $$
we see that the matrix
$ S = T_A\cdot T'_{B'} $
is our friend. Code for this path:
BB = B.transpose()
JA , TA = A .jordan_form( transformation=1 )
JBB, TBB = BB.jordan_form( transformation=1 )
print "Is JA == JBB?", JA == JBB # yes...
# J = JA
S = TA * TBB.transpose()
print "A*S - S*B is the following matrix:"
print A*S - S*B
And we get:
Is JA == JBB? True
A*S - S*B is the following matrix:
[0.?e-12 + 0.?e-12*I 0.?e-12 + 0.?e-12*I 0.?e-12 + 0.?e-12*I 0.?e-12 + 0.?e-12*I 0.?e-11 + 0.?e-11*I 0.?e-12 + 0.?e-12*I]
[0.?e-12 + 0.?e-12*I 0.?e-12 + 0.?e-12*I 0.?e-12 + 0.?e-12*I 0.?e-12 + 0.?e-12*I 0.?e-11 + 0.?e-11*I 0.?e-12 + 0.?e-12*I]
[0.?e-12 + 0.?e-12*I 0.?e-12 + 0.?e-12*I 0.?e-12 + 0.?e-12*I 0.?e-12 + 0.?e-12*I 0.?e-11 + 0.?e-11*I 0.?e-12 + 0.?e-12*I]
[0.?e-12 + 0.?e-12*I 0.?e-12 + 0.?e-12*I 0.?e-12 + 0.?e-12*I 0.?e-11 + 0.?e-11*I 0.?e-11 + 0.?e-11*I 0.?e-11 + 0.?e-11*I]
[0.?e-11 + 0.?e-11*I 0.?e-11 + 0.?e-11*I 0.?e-11 + 0.?e-11*I 0.?e-11 + 0.?e-11*I 0.?e-11 + 0.?e-11*I 0.?e-11 + 0.?e-11*I]
[0.?e-12 + 0.?e-12*I 0.?e-12 + 0.?e-12*I 0.?e-12 + 0.?e-12*I 0.?e-11 + 0.?e-11*I 0.?e-11 + 0.?e-11*I 0.?e-11 + 0.?e-11*I]
But we have to wait again too long for
print "Is A*S == S*B?", A*S == S*B # looooooooong tie...
So even recognizing if some numbers are zero...
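The refinement loop that B r u n o described in the comments can be mimicked by a toy model in plain Python (an illustrative sketch, not Sage's actual `ComplexIntervalFieldElement` code): the element is known only through an interval of some width around its exact value, each refinement halves the width, and zero can be excluded only when the true value is nonzero.

```python
from fractions import Fraction

def refine_to_exclude_zero(value, width=Fraction(1), max_steps=60):
    """Try to prove `value` nonzero by interval refinement.

    The interval [value - width, value + width] models what interval
    arithmetic knows about the element; each step halves the width,
    mimicking a precision increase.  Returns (proved_nonzero, steps_used).
    """
    for step in range(max_steps):
        if value - width > 0 or value + width < 0:
            return True, step          # zero excluded from the interval
        width /= 2
    return False, max_steps            # gave up: the element may be zero

print(refine_to_exclude_zero(Fraction(1, 10**9)))  # tiny but nonzero: succeeds
print(refine_to_exclude_zero(Fraction(0)))         # exactly zero: never succeeds
```

For an element that is exactly zero, no amount of refinement helps, which is why Sage must fall back on the exact (and expensive) symbolic representation.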
**(2) Using linear algebra**
Note that a base change matrix to get the Jordan normal form is determined only
up to multiplication with some diagonal matrix. (The diagonal matrices form the centralizer of $J$: they commute with $J$, of course, and $J$ has distinct eigenvalues.)
This is so since its columns, eigenvectors for the corresponding eigenvalues in the Jordan matrix, are determined only up to a (non-zero)
multiplicative constant.
Now let us imagine $X$ is a matrix of shape $6\times 6$, each entry an unknown.
We want to solve the **linear** system $AX=XB$. It is natural to get a vector space of solutions of dimension $6$.
This is exactly what the following code does.
The basis of this vector space is built out of six matrices, and the code prints them.
Each linear combination of them delivers a matrix intertwining $A$ and $B$. The code:
F.<J> = QuadraticField( -1 )
A = matrix( F, 6, [ 2,-1,-J,0,0,0,
-1,2,-1,0,0,0,
J,-1,3,-1,0,0,
0,0,-1,2,-1,0,
0,0,0,-1,2,-1,
0,0,0,0,-1,1 ] )
B = matrix( F, 6, [ 2,-1,0,-J,0,0,
-1,2,-1,0,0,0,
0,-1,2,-1,0,0,
J,0,-1,3,-1,0,
0,0,0,-1,2,-1,
0,0,0,0,-1,1 ] )
a = A.change_ring( QQbar )
b = B.change_ring( QQbar )
R = range(6)
xvars = var( ','.join( [ 'x%s%s' % (j,k) for j in R for k in R ] ) )
X = matrix( 6, 6, xvars )
eqs = [ ( A*X )[j,k] == (X*B)[j,k] for j in R for k in R ]
sol = solve( eqs, xvars, solution_dict=1 )[0]
S = matrix( 6, 6, [ sol[ xvar ] for xvar in xvars ] )
newvars = list( set( [ v for j in R for k in R for v in S[j,k].variables() ] ) )
# extract from S these new variables
RT = range(len(newvars)) # this is range(6)
T = [ matrix( 6, 6, [ QQbar( S[j,k].coefficient( newvars[t] ) ) for j in R for k in R ] )
for t in RT ]
for t in RT:
U = T[t]
print ( "T[%s] is the following matrix:\n%s\ndet( T[%s] ) = %s\nThe relation A*T[%s] == T[%s]*B is %s\n"
% ( t, U, t, U.det(), t, t, a*U == U*b ) )
And the results are:
T[0] is the following matrix:
[ I + 1 0 I + 1 1 0 I]
[ -I I + 1 1 0 I + 1 0]
[ 0 -I + 1 0 I + 2 0 1]
[ -I + 1 -2*I I + 1 I 2 1]
[-2*I + 1 I + 1 -I 2 I + 1 2]
[ 0 -I + 1 2 1 2 I + 2]
det( T[0] ) = 16*I + 112
The relation A*T[0] == T[0]*B is True
T[1] is the following matrix:
[ 0 0 0 0 I I + 1]
[ 0 0 0 0 1 I + 1]
[ 0 0 0 1 1 1]
[ -I 0 1 1 1 1]
[ -I -I + 1 1 1 1 1]
[-I + 1 -I + 1 -I + 1 1 1 1]
det( T[1] ) = 4*I - 4
The relation A*T[1] == T[1]*B is True
T[2] is the following matrix:
[ -1 0 I 0 -I + 1 0]
[ 0 I - 1 0 1 0 -I]
[ 0 0 0 -1 0 -I]
[ 0 -I + 1 -2 -1 0 -I]
[ 1 -2 -I 0 -I - 1 0]
[ -2 0 0 -I 0 -1]
det( T[2] ) = -32*I - 32
The relation A*T[2] == T[2]*B is True
T[3] is the following matrix:
[ I 1 I 0 0 0]
[ 1 I 1 0 0 0]
[ 0 1 0 I + 1 0 0]
[ -I 1 I 0 I + 1 0]
[ 1 -1 1 I + 1 0 I + 1]
[ 0 1 0 0 I + 1 I + 1]
det( T[3] ) = 8*I + 8
The relation A*T[3] == T[3]*B is True
T[4] is the following matrix:
[ 2 0 1 0 -I 1]
[ 0 2 -I 0 1 -I]
[ 0 -I 1 -I + 1 0 -I]
[ -1 -I - 1 2 1 -2*I + 1 0]
[-2*I - 1 I + 1 -I - 1 -2*I + 1 1 -I + 1]
[ 0 -2*I - 1 -I 0 -I + 1 -I + 2]
det( T[4] ) = 212*I + 280
The relation A*T[4] == T[4]*B is True
T[5] is the following matrix:
[ 1 0 2 0 -1 -I]
[ -I 2 0 0 -I -1]
[ 1 -I 0 -I + 1 0 -1]
[ 0 -I - 1 1 I -I 0]
[-I - 1 I + 1 -1 -I I -I + 1]
[ I -1 0 0 -I + 1 1]
det( T[5] ) = -40*I + 36
The relation A*T[5] == T[5]*B is True
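The same $AX = XB$ approach works for the original 2×2 question, and there it is small enough to carry out in plain Python over the rationals. The following sketch uses a hand-rolled reduced row echelon form and nullspace (not Sage's solver) to find a basis of the space of intertwiners for the matrices from the original post:

```python
from fractions import Fraction as F

def nullspace(M):
    """Basis of the nullspace of M over the rationals, via RREF."""
    M = [row[:] for row in M]
    rows, cols = len(M), len(M[0])
    pivots = []
    r = 0
    for c in range(cols):
        piv = next((i for i in range(r, rows) if M[i][c] != 0), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        M[r] = [v / M[r][c] for v in M[r]]
        for i in range(rows):
            if i != r and M[i][c] != 0:
                M[i] = [a - M[i][c] * b for a, b in zip(M[i], M[r])]
        pivots.append(c)
        r += 1
    basis = []
    for free in range(cols):
        if free in pivots:
            continue
        v = [F(0)] * cols
        v[free] = F(1)
        for row, pc in zip(M, pivots):
            v[pc] = -row[free]
        basis.append(v)
    return basis

A = [[F(1), F(0)], [F(1), F(0)]]
B = [[F(0), F(0)], [F(1), F(1)]]

# one linear equation per entry (i, j) of A*X - X*B = 0;
# unknown x_{r,c} has index 2*r + c
eqs = []
for i in range(2):
    for j in range(2):
        row = [F(0)] * 4
        for r in range(2):
            for c in range(2):
                row[2 * r + c] += (A[i][r] if c == j else F(0)) \
                                  - (B[c][j] if r == i else F(0))
        eqs.append(row)

basis = nullspace(eqs)
for v in basis:
    X = [[v[0], v[1]], [v[2], v[3]]]
    AX = [[sum(A[i][k] * X[k][j] for k in range(2)) for j in range(2)] for i in range(2)]
    XB = [[sum(X[i][k] * B[k][j] for k in range(2)) for j in range(2)] for i in range(2)]
    assert AX == XB
print(len(basis), "basis intertwiners found")
```

Any linear combination of the basis matrices with nonzero determinant then serves as the similarity transformation, exactly as in the $6\times 6$ computation above.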
On the free market I would maybe be able to sell only `T[1]`; nobody would buy any other intertwiner, too complicated...

Thu, 02 Nov 2017 11:19:37 +0100

Comment by B r u n o:

I have two questions:
1. It seems that your *trick* of taking the transpose is general: could we modify `is_similar` to compute the transformation matrix using your trick? It would presumably be more efficient!
2. Do you think that taking the inverse of $6\times 6$ matrices over `QQbar` *has to* be costly, or is it simply due to SageMath's current implementation?

Thu, 02 Nov 2017 14:22:47 +0100

Comment by dan_fulea:

1. Yes, the trick should work in the generic case. (There is also a point in between: one needs a match on the diagonal of the J-matrices. The trick works only if J is diagonal; otherwise the standard-form blocks are switched to the other side of the diagonal...)
2. It should be relatively simple, in principle, to invert a matrix: Gauss elimination against the initial identity matrix takes $N$ elimination steps, for which we need the $N$ pivot elements. So we need a quick check whether an element is zero. I think this is the drawback when working "uncontrolled" in `QQbar`: an algorithm to decide immediately whether an element is zero is needed, and that is all that is needed. All other computations seem to work well. I also tried to work in the Galois closure. No chance...

Thu, 02 Nov 2017 20:54:36 +0100
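The point in item 2 can be made concrete with a toy Gauss-Jordan inversion in plain Python over the rationals (an illustrative sketch, not Sage's code). With `fractions.Fraction` the pivot search's exact zero test is trivial; for lazily represented `QQbar` elements, exactly this test is what can become extremely expensive.

```python
from fractions import Fraction as F

def invert(M):
    """Gauss-Jordan inversion over the rationals.

    Each of the n elimination steps needs a pivot, and finding one
    requires an exact zero test on candidate entries.  That test is
    trivial for Fraction, but it is the expensive step for lazily
    represented algebraic numbers.
    """
    n = len(M)
    # augment M with the identity matrix, as in the PA|I computation above
    aug = [[F(v) for v in row] + [F(int(i == j)) for j in range(n)]
           for i, row in enumerate(M)]
    for col in range(n):
        # pivot search: one exact zero test per candidate entry
        piv = next((r for r in range(col, n) if aug[r][col] != 0), None)
        if piv is None:
            raise ValueError("matrix is singular")
        aug[col], aug[piv] = aug[piv], aug[col]
        aug[col] = [v / aug[col][col] for v in aug[col]]
        for r in range(n):
            if r != col and aug[r][col] != 0:
                aug[r] = [a - aug[r][col] * b for a, b in zip(aug[r], aug[col])]
    return [row[n:] for row in aug]

M = [[2, 1, 0], [1, 3, 1], [0, 1, 2]]
Minv = invert(M)
# verify M * Minv == I
prod = [[sum(F(M[i][k]) * Minv[k][j] for k in range(3)) for j in range(3)]
        for i in range(3)]
assert prod == [[F(int(i == j)) for j in range(3)] for i in range(3)]
print("ok")
```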