
A lazier solution: use R's extensive statistical libraries. Run this:

# Data matrix in Sage
M = matrix([[1, 2], [3.45, 4], [6, 5], [4, 3]])
# Build an R data frame from the two columns of M
RData = r.data_frame(x=M.columns()[0].list(),
                     y=M.columns()[1].list())
# Do the regression in R
Rlm = r.lm(r.formula("y~x"),
           data=RData)
# Compute the relevant summary statistics
Rsum = r.summary(Rlm)
# Regression coefficients: R returns the intercept first, then the slope
b, a = Rlm._sage_()['DATA']['coefficients']['DATA']
# Coefficient of determination
R_squared = Rsum._sage_()['DATA']['r.squared']

Then:

sage: R_squared
0.821935737833981
sage: a
0.568813659400679
sage: b
1.44516065541505
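As a sanity check, the same three numbers can be recomputed in plain Python (no Sage or R needed) from the closed-form least-squares formulas; this is only a cross-check sketch, not part of the R-based recipe above:

```python
# Same data as the Sage matrix M above
xs = [1, 3.45, 6, 4]
ys = [2, 4, 5, 3]

n = len(xs)
mx = sum(xs) / n                      # mean of x
my = sum(ys) / n                      # mean of y
sxx = sum((x - mx)**2 for x in xs)    # sum of squares of x
syy = sum((y - my)**2 for y in ys)    # sum of squares of y
sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))

a = sxy / sxx                         # slope
b = my - a * mx                       # intercept
r_squared = sxy**2 / (sxx * syy)      # coefficient of determination

print(a, b, r_squared)
```

The printed values agree with the R output above (a ≈ 0.5688, b ≈ 1.4452, R² ≈ 0.8219).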

Yes, in this simple case, that's heavier than @tolga's solution. But the same principle applies to any problem having a solution in R, not just the coefficient of determination of a single-regressor linear regression.

HTH,