Here are two approaches using scipy.optimize routines.
def f(x):
    # sum of squared residuals; minimizing this drives the system toward a solution
    return sum(r**2 for r in g(x))

def g(x):
    # residuals of the four equations; a solution of the system makes all four zero
    return [x[0]*x[1]*x[2]*x[3] - x[0]*x[1] - x[0]*x[2] - x[0]*x[3] - x[1]*x[2] - x[1]*x[3] + 2*x[0] + 2*x[1] - 448,
            -x[0]*x[1]*x[2] - x[0]*x[1]*x[3] - x[0]*x[2]*x[3] - x[1]*x[2]*x[3] + 3*x[0] + 3*x[1] + 2*x[2] + 2*x[3] + 452,
            x[0]*x[1] + x[0]*x[2] + x[0]*x[3] + x[1]*x[2] + x[1]*x[3] + x[2]*x[3] - 159,
            -x[0] - x[1] - x[2] - x[3] + 21]
First, we try conjugate gradient minimization of the sum of squares.
from scipy.optimize import fmin_cg
#conjugate gradient minimization of the sum of squares
ans = fmin_cg(f, [2, 4, 7, 8])
print(ans)
# value of the sum of squares at the returned point
print(f(ans))
# value of each residual at the returned point
print(g(ans))
Second, we try truncated Newton conjugate gradient minimization of the sum of squares, with bounds on the variables.
#truncated Newton Conjugate Gradient minimization
from scipy.optimize import fmin_tnc
ans=fmin_tnc(f,[2,4,7,8],bounds=[(-10,10),(-10,10),(-10,10),(-10,10)],approx_grad=True)
print(ans)
# value of the sum of squares at the returned point
print(f(ans[0]))
# value of each residual at the returned point
print(g(ans[0]))
I've seeded these with your suggested initial values. I'm not sure the results really solve your equations, given the values of the individual residuals at the reported minimizers, but you should be able to modify the code above to get what you need.
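If the minimizers above leave the residuals far from zero, another option is to hand g directly to scipy.optimize.fsolve and treat this as a root-finding problem rather than a minimization. Here is a minimal sketch (not part of the original answer), reusing the g defined above and the same starting point:

from scipy.optimize import fsolve

# attempt to find x with g(x) = 0, starting from the suggested initial guess
root = fsolve(g, [2, 4, 7, 8])
print(root)
# the residuals should all be close to zero if fsolve actually converged
print(g(root))

Whether this finds a solution still depends on the starting point, since fsolve only locates a single root near the initial guess.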