# Minimization without constraints gives an error in multiple dimensions

I'm trying to use minimize(func, x0, gradient=None, hessian=None, algorithm='default', verbose=False, **args), supplying the gradient function myself. But I'm getting the following error:

sage: f = lambda x : (x*y)**4
sage: g = lambda x : 4*y*(x*y)**3
sage: y = random_vector(RR, 5)
sage: x = random_vector(RR,5)

ValueError: The truth value of an array with more than one element is ambiguous. Use a.any() or a.all()


What am I doing wrong? The gradient returns a vector (correctly so). Shouldn't this work?



minimize wants a real-valued objective function, but the codomain of the lambda function that you provide is not the real numbers, because its output contains a symbolic variable:

sage: x,y = var('x,y')
sage: f = lambda x : (x*y)**4
sage: f(4.56)
432.373800960000*y^4


Also, that function f is not defined over R^5, so you can't start the minimization algorithm at such a point.
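Incidentally, the ValueError in the question is raised by NumPy, not by Sage itself: one common way it arises is when the optimizer (Sage's minimize delegates to scipy.optimize) tries to treat an array-valued result as a single number. A minimal plain-Python sketch of that failure mode:

```python
import numpy as np

a = np.array([1.0, 2.0])

# Asking for the truth value of a multi-element array is ambiguous,
# which is exactly the ValueError reported in the question.
try:
    bool(a)
except ValueError as e:
    print(e)
```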

You are right; it should have been as below, so that the domain of x is the one I want:

sage: y = random_vector(RR, 5)
sage: f = lambda x : y.dot_product(x)**4
sage: g = lambda x : 4*y*y.dot_product(x)**3
sage: x = random_vector(RR,5)


Thanks, I'll try it again like this.
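Since Sage's minimize delegates to scipy.optimize under the hood, the corrected setup can also be sanity-checked in plain NumPy/SciPy. A sketch, assuming the objective (y·x)^4 with its analytic gradient 4y(y·x)^3:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
y = rng.standard_normal(5)

# Scalar-valued objective over R^5 and its gradient (a vector in R^5).
f = lambda x: float(np.dot(y, x) ** 4)
g = lambda x: 4 * y * np.dot(y, x) ** 3

x0 = rng.standard_normal(5)
res = minimize(f, x0, jac=g)
print(res.fun)  # minimum value, close to 0
```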


minimize passes numpy.ndarray objects to your function, so you might have to turn them into Sage vectors to make it work:

sage: y = random_vector(RR, 5)
sage: f = lambda x: y.dot_product(vector(x))**4
sage: x = random_vector(RR, 5)
sage: minimize(f, x)
(-1.0930705321328256, 0.5286798390975018, -0.2851428750737046, -0.9759363717052219, -0.6610153608303511)
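The need for vector(x) can be seen by inspecting what the optimizer actually hands to your callback. A plain SciPy sketch (again, Sage's minimize delegates to scipy.optimize) showing that the argument arrives as a numpy.ndarray, which has no dot_product method:

```python
import numpy as np
from scipy.optimize import minimize

seen = []

def f(x):
    seen.append(type(x))       # record the argument type on each call
    return float(np.sum(x ** 2))

minimize(f, np.zeros(3))
print(seen[0])                 # <class 'numpy.ndarray'>
```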


Is there a way to use sum([x, y]) for x = var('x'), with x and y in R^5?

Sorry, I don't understand what you mean. You seem to prefer def and lambda functions; in that case, you don't need the symbolic variables from x = var('x').


Did you take a look at the examples in the documentation of the function minimize?

sage: x,y = var('x,y')
sage: start = random_vector(RR,2)
sage: minimize((x*y)**4, start)
(0.03970299255032317, -0.16999160099351518)


The documentation says that if you provide a symbolic expression, it computes the gradient for you. You only need to provide a starting point whose dimension matches the number of variables of the expression.
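The same convenience exists one level down in SciPy: when no jac is supplied, scipy.optimize.minimize approximates the gradient numerically (by finite differences, rather than symbolically as Sage does). A sketch of the analogous call, with the (x*y)^4 objective written as a plain Python function:

```python
import numpy as np
from scipy.optimize import minimize

# No jac= argument: SciPy falls back to a finite-difference gradient,
# much as Sage differentiates a symbolic expression for you.
res = minimize(lambda x: (x[0] * x[1]) ** 4, np.array([0.5, -0.5]))
print(res.x)  # a point where the product x[0]*x[1] is near 0
```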


I did look at the documentation, yes. But my example above should also work, and I think I'm still able to compute gradients by hand. Even if there is an error in it, I don't think it's the one being raised by Sage. Moreover, minimize should work for a function that is not symbolic.

The documentation says the input function should be "either a symbolic function or a Python function whose argument is a tuple with n components". So you can use a def function:

sage: def f(t):
....:     x,y = t
....:     return (x*y)**4
sage: minimize(f, random_vector(RR, 2))
(1.0161974686309019, 1.0838297482246177e-12)


or a lambda function as follows:

sage: minimize(lambda x: (x[0]*x[1])^4, random_vector(RR, 2))
(-5.0337249611079625e-11, -0.2698426886953712)