
Non-linear optimization with no derivatives

asked 3 years ago by johng23

I'm trying to minimize a convex function which is a sum of max terms (each max of linear functions is convex, and so is their sum). Is there a Sage function that does this?

For example, I'd like to minimize

$$\max(r_0+s_0,\,0) + \max(r_0+s_1,\; r_1+s_0,\; 0) + \max(r_0+s_2,\; r_1+s_1,\; r_2+s_0,\; 0) + \max(r_1+s_2,\; r_2+s_1,\; 0) + \max(r_2+s_2,\; 0)$$

subject to some linear constraints in terms of r_i and s_i.

I saw minimize_constrained, but it requires the function being minimized to have a derivative, and my function is not differentiable (the max terms have no built-in derivative).


1 Answer


answered 3 years ago by Max Alekseyev, updated 3 years ago

Introduce a new variable for each max, say $m_i$, together with constraints saying that $m_i$ is greater than or equal to each argument of that max. Then in the objective function replace each max with the corresponding $m_i$. For your example we get:

$$\begin{cases}
m_1 \ge r_0+s_0, \quad m_1 \ge 0\\
m_2 \ge r_0+s_1, \quad m_2 \ge r_1+s_0, \quad m_2 \ge 0\\
m_3 \ge r_0+s_2, \quad m_3 \ge r_1+s_1, \quad m_3 \ge r_2+s_0, \quad m_3 \ge 0\\
m_4 \ge r_1+s_2, \quad m_4 \ge r_2+s_1, \quad m_4 \ge 0\\
m_5 \ge r_2+s_2, \quad m_5 \ge 0\\
m_1+m_2+m_3+m_4+m_5 \;\longrightarrow\; \min
\end{cases}$$

Since all constraints here are linear, such a problem can be solved via MILP - https://doc.sagemath.org/html/en/refe...
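In Sage this reformulation can be set up directly with the MixedIntegerLinearProgram class (there are no integer variables here, so the solver is really handling a plain LP). Below is a minimal sketch, not part of the original answer; the box constraints $-1 \le r_i, s_i \le 1$ are hypothetical stand-ins for the asker's actual linear constraints:

```python
# Minimal sketch of the LP reformulation in Sage.
p = MixedIntegerLinearProgram(maximization=False)
r = p.new_variable(real=True)   # may take negative values
s = p.new_variable(real=True)
m = p.new_variable(real=True)   # m[i] replaces the i-th max term

# Argument lists of the five max terms from the question
# (the extra argument 0 is covered by the m[i] >= 0 constraint).
terms = [
    [r[0] + s[0]],
    [r[0] + s[1], r[1] + s[0]],
    [r[0] + s[2], r[1] + s[1], r[2] + s[0]],
    [r[1] + s[2], r[2] + s[1]],
    [r[2] + s[2]],
]
for i, args in enumerate(terms):
    p.add_constraint(m[i] >= 0)
    for a in args:
        p.add_constraint(m[i] >= a)

# Hypothetical placeholders for the asker's linear constraints:
for i in range(3):
    p.add_constraint(r[i] >= -1); p.add_constraint(r[i] <= 1)
    p.add_constraint(s[i] >= -1); p.add_constraint(s[i] <= 1)

p.set_objective(m[0] + m[1] + m[2] + m[3] + m[4])
print(p.solve())          # optimal objective value
print(p.get_values(r))    # optimal r_i
print(p.get_values(s))    # optimal s_i
```

At an optimum each $m_i$ equals its max term, because minimizing the objective pushes each $m_i$ down onto the largest of its lower bounds, so the LP optimum coincides with the optimum of the original piecewise-linear problem.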

