# Non-linear optimization with no derivatives

I'm trying to optimize a convex (or concave?) function which is a sum of max functions. Is there a function that does this?

For example, I'd like to minimize

max(r_0 + s_0, 0) + max(r_0 + s_1, r_1 + s_0, 0) + max(r_0 + s_2, r_1 + s_1, r_2 + s_0, 0) + max(r_1 + s_2, r_2 + s_1, 0) + max(r_2 + s_2, 0)

subject to some linear constraints in terms of r_i and s_i.

I saw minimize_constrained, but it requires the function being minimized to have a defined derivative, which this function does not have.



Introduce a new variable for each $\max$, say $m_i$, together with constraints that $m_i$ is $\geq$ each argument of that $\max$. Then, in the objective function, replace each $\max$ with the corresponding $m_i$. For your example we get:

$$\begin{cases} m_1\geq r_0 + s_0\\ m_1\geq 0\\ m_2 \geq r_0 + s_1\\ m_2\geq r_1 + s_0\\ m_2\geq 0\\ \dots\\ m_1 + m_2 + m_3 + m_4 + m_5\ \rightarrow\ \min \end{cases}$$ Since all constraints here are linear and no integer variables are needed, this is in fact a linear program, which can be solved with Sage's MixedIntegerLinearProgram - https://doc.sagemath.org/html/en/refe...
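Here is a sketch of this reformulation as a runnable example. Since the question doesn't specify the linear constraints on $r_i$ and $s_i$, I've made up an illustrative one ($r_0 + s_0 \geq 1$, with all $r_i, s_i \in [-1, 1]$) just so the optimum isn't trivially $0$; swap in your own. It uses SciPy's `linprog` rather than Sage so it runs as plain Python, but the same constraints translate directly to Sage's `MixedIntegerLinearProgram`.

```python
import numpy as np
from scipy.optimize import linprog

# Variable layout: x = [r0, r1, r2, s0, s1, s2, m1, m2, m3, m4, m5]
n_r, n_s, n_m = 3, 3, 5
n = n_r + n_s + n_m

# Arguments of each max term, as (i, j) meaning r_i + s_j.
# The trailing "0" argument of each max is covered by the bound m_k >= 0.
max_args = [
    [(0, 0)],                   # max(r0+s0, 0)
    [(0, 1), (1, 0)],           # max(r0+s1, r1+s0, 0)
    [(0, 2), (1, 1), (2, 0)],   # max(r0+s2, r1+s1, r2+s0, 0)
    [(1, 2), (2, 1)],           # max(r1+s2, r2+s1, 0)
    [(2, 2)],                   # max(r2+s2, 0)
]

# Encode m_k >= r_i + s_j as r_i + s_j - m_k <= 0.
A_ub, b_ub = [], []
for k, args in enumerate(max_args):
    for (i, j) in args:
        row = np.zeros(n)
        row[i] = 1.0                # r_i
        row[n_r + j] = 1.0          # s_j
        row[n_r + n_s + k] = -1.0   # -m_k
        A_ub.append(row)
        b_ub.append(0.0)

# Illustrative side constraint (an assumption, not from the question):
# r0 + s0 >= 1, written as -r0 - s0 <= -1.
row = np.zeros(n)
row[0], row[n_r + 0] = -1.0, -1.0
A_ub.append(row)
b_ub.append(-1.0)

# Objective: minimize m1 + ... + m5 (r, s have zero cost).
c = np.zeros(n)
c[n_r + n_s:] = 1.0

# r_i, s_i free in [-1, 1]; m_k >= 0.
bounds = [(-1, 1)] * (n_r + n_s) + [(0, None)] * n_m

res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub, bounds=bounds)
print(res.fun)  # optimal value; here 1.0, forced by m1 >= r0 + s0 >= 1
```

In Sage itself the same model would be built with `p = MixedIntegerLinearProgram(maximization=False)`, real variables from `p.new_variable()`, one `p.add_constraint(...)` per row above, and `p.set_objective(...)`, as described in the linked reference.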
