Can Sage do symbolic optimization?

asked 2017-06-24 11:27:53 +0100 by anonymous user

I know Sage has many functions for numerical optimization.

Now I am wondering whether Sage can do symbolic optimization as well.

For example, if I wanted to maximize a function using numerical methods, I could do this:

f(x,y) = -(x * log(x) + y * log(y))
minimize(-f, [0.1, 0.1])

Which gives the answer:

(0.367880005621,0.367880005621)

Is there a function that can give the solution as $(\frac{1}{e}, \frac{1}{e})$ or do I have to compute the stationary points and perform the necessary substitutions myself?


1 Answer


answered 2017-06-24 19:50:57 +0100 by mforets (updated 2017-06-24 19:54:11 +0100)

The particular case is readily solved with Sage:

sage: # stationary points: solve grad(f) == 0 for the variables of f
sage: stationary_points = lambda f: solve(list(f.gradient()), f.variables())
sage: f(x,y) = -(x * log(x) + y * log(y))
sage: stationary_points(f)
[[x == e^(-1), y == e^(-1)]]
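
As a small follow-up, here is a sketch (it assumes the same f and session as above, and uses solve's solution_dict=True option together with subs and simplify_full) of substituting the stationary point back into f to recover the maximum value symbolically:

sage: sols = solve(list(f.gradient()), f.variables(), solution_dict=True)
sage: f(x, y).subs(sols[0]).simplify_full()   # value of f at the stationary point
2*e^(-1)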

More generally, $\nabla f(x_1,\ldots, x_n) = 0$ is a necessary and sufficient condition for optimality provided that $f$ is twice continuously differentiable and convex (see e.g. Boyd and Vandenberghe, Convex Optimization, Ch. 9.1, page 457). In this setting we can use stationary_points as above, but in general the solve function will fail to find explicit solutions. Indeed, from that book:

"In a few special cases, we can find an optimal solution by analytically solving the optimality equation, but usually the problem must be solved by an iterative algorithm."

