How quickly minimizing $M*x-v$ (numerically)?
https://ask.sagemath.org/question/10643/how-quickly-minimizing-mx-v-numerically/

Let $v$ be a vector in $R^m$ and let $M$ be a matrix from $R^n$ to $R^m$, where $m > n$ and both are large.
I want to compute a vector $x$ in $R^n$ such that the norm of $M*x-v$ is minimal.
One way is to compute the projection $w$ of $v$ onto the image of $M$.
To do so, we can compute the projection matrix $p$ onto the image of $M$, as follows:
MTGS = M.transpose().gram_schmidt()[0]  # orthogonalization only, not orthonormalization
l = MTGS.rank()
U = []
for i in range(l):
    u = MTGS[i] / MTGS[i].norm()  # normalize each orthogonal row
    U.append(list(u))
N = matrix(l, m, U)        # rows of N: an orthonormal basis of the image of M
p = N.transpose() * N      # orthogonal projection onto the image of M
Then:
w=p*v
x=M.solve_right(w)
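For illustration (this is not part of the original post), the projection construction above can be checked numerically with NumPy, using QR factorization in place of Gram–Schmidt; the random $M$ and $v$ below are placeholder data:

```python
import numpy as np

# Placeholder data: M is 5x3 (m > n), v in R^5.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 3))
v = rng.standard_normal(5)

# Orthonormal basis of the column space of M via QR (the numerical
# analogue of orthonormalized Gram-Schmidt).
Q, _ = np.linalg.qr(M)   # Q: 5x3 with orthonormal columns
P = Q @ Q.T              # orthogonal projection onto the image of M
w = P @ v                # projection of v onto the image of M

# Since w lies in the image of M, M x = w has an exact solution,
# and that x minimizes ||M x - v||.
x = np.linalg.lstsq(M, w, rcond=None)[0]
```

Here `P` plays the role of `p` and the final line plays the role of `M.solve_right(w)`.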
This vector $x$ minimizes the norm of $M*x-v$, but this method is very expensive in time, because it computes $p$ and $w$, while I just need $x$.
> Is there another method, less expensive in time, for computing $x$ ?
**Remark** : I'm ok with numerical methods.
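For what it's worth, the quantity above is the standard linear least-squares problem, and the minimizer $x$ can be computed directly without ever forming $p$ or $w$: $x$ solves the normal equations $M^T M x = M^T v$, which is what dedicated numerical routines solve (via QR or SVD). A sketch with NumPy, on placeholder random data, assuming a purely numerical answer is acceptable per the remark:

```python
import numpy as np

# Placeholder data: m > n, both moderately large.
rng = np.random.default_rng(1)
m, n = 200, 50
M = rng.standard_normal((m, n))
v = rng.standard_normal(m)

# One call: minimizes ||M x - v|| directly, skipping the explicit
# m x m projection matrix entirely.
x, residuals, rank, sv = np.linalg.lstsq(M, v, rcond=None)

# Equivalent normal-equations form (fine when M has full column rank,
# though less numerically stable than lstsq for ill-conditioned M):
x_ne = np.linalg.solve(M.T @ M, M.T @ v)
```

Both computations return the same minimizer; `lstsq` avoids squaring the condition number of $M$.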
Sébastien Palcoux, Mon, 21 Oct 2013 20:03:37 +0200
https://ask.sagemath.org/question/10643/