# Can Sage do symbolic optimization?

Anonymous

I know Sage has many functions for numerical optimization.

Now I am wondering whether Sage can do symbolic optimization as well.

For example, if I wanted to maximize a function using numerical methods, I could do this:

f(x,y) = -(x * log(x) + y * log(y))
minimize(-f,[0.1, 0.1])


Which gives the answer:

(0.367880005621,0.367880005621)


Is there a function that can give the solution as $(\frac{1}{e}, \frac{1}{e})$ or do I have to compute the stationary points and perform the necessary substitutions myself?



The particular case is readily solved with Sage:

sage: stationary_points = lambda f : solve([gi for gi in f.gradient()], f.variables())
sage: f(x,y) = -(x * log(x) + y * log(y))
sage: stationary_points(f)
[[x == e^(-1), y == e^(-1)]]
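For readers working outside Sage, roughly the same computation can be sketched in plain Python with SymPy; this is an assumed equivalent, not part of the original answer, using SymPy's `diff` and `solve`:

```python
import sympy as sp

# Restrict symbols to the positive domain where log is defined
x, y = sp.symbols('x y', positive=True)
f = -(x * sp.log(x) + y * sp.log(y))

# Stationary points: solve grad f = 0 symbolically
stationary = sp.solve([sp.diff(f, x), sp.diff(f, y)], [x, y], dict=True)
print(stationary)  # [{x: exp(-1), y: exp(-1)}]
```

This recovers the same exact answer $(e^{-1}, e^{-1})$ that Sage's `solve` finds.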


More generally, $\nabla f(x_1,\ldots, x_n) = 0$ is a necessary and sufficient condition for optimality provided that $f$ is twice continuously differentiable and convex (see e.g. Boyd and Vandenberghe, *Convex Optimization*, Ch. 9.1, page 457); in this example $-f$ is convex, so the stationary point is the global maximizer of $f$. In this setting, we can use stationary_points as above, but in general the solve function will fail to find explicit solutions. Indeed, from that book:

"In a few special cases, we can find an optimal solution by analytically solving the optimality equation, but usually the problem must be solved by an iterative algorithm."
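To verify that the stationary point really is a maximum and not merely stationary, one can check the second-order condition symbolically. A sketch using SymPy (the `hessian` helper and the `is_negative_definite` matrix property are assumptions about the SymPy API, not part of the original answer):

```python
import sympy as sp

x, y = sp.symbols('x y', positive=True)
f = -(x * sp.log(x) + y * sp.log(y))

# The Hessian is diag(-1/x, -1/y): negative definite for x, y > 0,
# so f is strictly concave on the positive orthant and the stationary
# point (1/e, 1/e) is the unique global maximizer.
H = sp.hessian(f, (x, y))
H_at_opt = H.subs({x: sp.exp(-1), y: sp.exp(-1)})
print(H_at_opt.is_negative_definite)  # True
```

Checking definiteness of the Hessian is exactly the concavity hypothesis the quoted optimality condition relies on.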
