How to quickly minimize the norm of M∗x−v (numerically)?
Let v in R^m and let M be a matrix from R^n to R^m, with m > n both large.
I want to compute a vector x in R^n such that the norm of M∗x−v is minimal.
One way is to compute the projection w of v onto the image of M.
To do so, we can first compute the projection matrix p onto the image of M, as follows:
MTGS = M.transpose().gram_schmidt()[0]  # Gram-Schmidt orthogonalization (not orthonormalization)
l = MTGS.rank()
U = []
for i in range(l):
    u = MTGS[i] / MTGS[i].norm()  # normalize each orthogonal row
    U.append(list(u))
N = matrix(l, m, U)       # rows of N: an orthonormal basis of the image of M
p = N.transpose() * N     # orthogonal projection matrix onto the image of M
Then:
w=p*v
x=M.solve_right(w)
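For readers without Sage at hand, the same projection pipeline can be sketched in NumPy (the sizes m, n and the random data below are placeholders, not from the question); here np.linalg.qr plays the role of the Gram-Schmidt step:

```python
import numpy as np

m, n = 200, 10                     # placeholder sizes with m > n
rng = np.random.default_rng(0)
M = rng.standard_normal((m, n))    # stand-in for the matrix M
v = rng.standard_normal(m)         # stand-in for the vector v

# Orthonormal basis of the image (column space) of M, via reduced QR
# instead of explicit Gram-Schmidt; Q has orthonormal columns.
Q, _ = np.linalg.qr(M)
p = Q @ Q.T                        # m x m projection matrix onto the image of M
w = p @ v                          # projection of v onto the image of M
x = np.linalg.lstsq(M, w, rcond=None)[0]  # analogue of M.solve_right(w)
```

Since w lies in the image of M, the system M·x = w is consistent, and the resulting x also minimizes the norm of M∗x−v for the original v.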
This vector x minimizes the norm of M∗x−v, but this method is very expensive in time, because it computes p (an m×m matrix) and w, while I only need x.
Is there another, less expensive method for computing x?
Remark: I'm fine with numerical methods.
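For comparison, one standard numerical route (a sketch, not taken from the question) is to skip p and w entirely and hand the problem to a dedicated least-squares solver such as np.linalg.lstsq, which never forms the projection matrix:

```python
import numpy as np

m, n = 200, 10                   # placeholder sizes with m > n
rng = np.random.default_rng(1)
M = rng.standard_normal((m, n))  # stand-in for the matrix M
v = rng.standard_normal(m)       # stand-in for the vector v

# Minimize ||M x - v|| directly; no m x m projection matrix is built.
x, residuals, rank, sing_vals = np.linalg.lstsq(M, v, rcond=None)
```

When M has full column rank, this x is also the unique solution of the normal equations M^T·M·x = M^T·v, which involve only an n×n system and are cheaper still for small n (at some cost in numerical conditioning).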