
Defining constraint equations for minimize_constrained

asked 2011-12-07 18:50:50 +0200

rtrwalker

I'm trying to think of a way to do constrained optimization where both the objective function that I want to minimise and my constraint function are calculated within the same overarching function. Let's say I have a function like the following:

def complicated_function(x, y, z):
    # ...some lengthy and complicated calculations...
    return f, g  # f: value to minimise; g: constraint value (feasible when g >= 0)

Is there a way to minimize f by varying x, y, z subject to g >= 0?

Looking at the documentation for sage.numerical.optimize.minimize_constrained, it looks as if I have to define each of my constraint equations individually, with the variables passed as a tuple. I could maybe wrap my complicated_function up like this:

def funcObjective(x):
    f, g = complicated_function(x[0], x[1], x[2])  # full evaluation, keep only f
    return f

def funcConstraint(x):
    f, g = complicated_function(x[0], x[1], x[2])  # full evaluation again, keep only g
    return g

But that would mean complicated_function gets called multiple times on every optimization iteration, which seems highly inefficient, and the problem would only get worse with more constraints. Any ideas on how to define the constraints without re-evaluating complicated_function?

On my internet wanderings I found the recently released (June 6, 2011) pyOpt 1.0 (journal article), which at first glance looks well suited to the problem. I see OpenOpt is an experimental package for Sage, but I'm not sure whether OpenOpt is suitable; the pyOpt documentation is, at first glance, clearer. Is there any chance pyOpt could be made an optional Sage package? It is published under the GNU Lesser General Public License.


2 Answers


answered 2011-12-07 20:37:23 +0200

DSM

updated 2011-12-08 15:23:19 +0200

FWIW, I've used OpenOpt for a few things and had some success. I haven't had as much luck with the scipy optimizers, but YMMV.

One easy way to avoid the cost of multiple evaluations is to take advantage of the cached_function decorator:

def complicated(x, y, z):
    f = (x*2 - y*z)**2
    g = x + y - 9*z + 4
    sleep(0.10)   # stand-in for an expensive computation
    return f, g

@cached_function
def complicated_cached(x, y, z):
    f = (x*2 - y*z)**2
    g = x + y - 9*z + 4
    sleep(0.10)   # stand-in for an expensive computation
    return f, g

# or complicated_cached = cached_function(complicated)

def get(fn):
    a = minimize_constrained(lambda x: fn(*x)[0],   # objective: first return value
                             lambda x: fn(*x)[1],   # constraint: second return value
                             [1, 1, 1])             # initial guess
    return a

which can produce a significant speedup. (I put in the sleeps to make a single call take longer; originally I timed multiple runs, but the caching meant I wasn't measuring what I wanted to.)

sage: time a = get(complicated)
Time: CPU 0.06 s, Wall: 7.67 s
sage: time b = get(complicated_cached)
Time: CPU 0.03 s, Wall: 3.83 s

UPDATE:

When numpy gets involved, caching the function is a little trickier, because the cache needs an immutable (hashable) key. We can get around the problem by coercing the argument to a tuple:

def func(p):
    f = -p[0] - p[1] + 50
    c_1 = p[0] - 45
    c_2 = p[1] - 5
    c_3 = -50*p[0] - 24*p[1] + 2400
    c_4 = -30*p[0] - 33*p[1] + 2100
    return f, c_1, c_2, c_3, c_4

func_cached = CachedFunction(func)
func_wrap = lambda x: func_cached(tuple(x))  # tuple(x) gives a hashable cache key

def get2(fn):
    a = minimize_constrained(lambda x: fn(x)[0],
                             [lambda x: fn(x)[1], lambda x: fn(x)[2],
                              lambda x: fn(x)[3], lambda x: fn(x)[4]],
                             [2, 3])
    return a

which produces:

sage: time get2(func)
Time: CPU 0.10 s, Wall: 0.10 s
sage: time get2(func_wrap)
Time: CPU 0.02 s, Wall: 0.02 s
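
(As an aside on why the tuple coercion is needed: the optimizer hands the current point to the callbacks as a numpy array, and numpy arrays are mutable and therefore unhashable, so they can't serve as dictionary keys for the cache. A minimal illustration in plain Python:)

import numpy

p = numpy.array([2.0, 3.0])   # the kind of argument the optimizer passes in
try:
    hash(p)                   # mutable, hence unhashable
except TypeError as e:
    print(e)                  # "unhashable type: 'numpy.ndarray'"

print(hash(tuple(p)))         # an immutable tuple hashes fine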

answered 2011-12-09 13:55:01 +0200

rtrwalker

Thanks @DSM. I've tried to generalize your solution with the following:

def get3(fn, constraint_nums, x0, iobj=0):
    """
    Constrained minimization of a function whose output includes both the
    objective function and the constraint equations.

    Arguments:
        fn - function that returns a tuple containing the objective function
             and the constraint equations

        constraint_nums - tuple or list containing the index positions of the
             constraint equations in the output of fn

        x0 - initial guess

        iobj - index position of the objective function in the output of fn.
             default = 0
    """
    a = minimize_constrained(lambda x, j=iobj: fn(x)[j],
                             [lambda x, j=i: fn(x)[j] for i in constraint_nums],
                             x0)
    return a

Running get3(func_wrap, range(1,5), [2,3]) gives the correct (45.0, 6.25). I had some trouble with the list comprehension building the constraint lambdas. If I used [lambda x: fn(x)[i] for i in constraint_nums], the [i] wasn't hard-coded into each lambda: every lambda closed over the same variable i, which was left at its final value once the comprehension finished, so all of my "constraints" were really just the last constraint equation. Binding the index as a default argument, [lambda x, j=i: fn(x)[j] for i in constraint_nums], captures the current value of i at definition time and seems to work (see the sketch below).
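
Here is a minimal, self-contained illustration of that closure behaviour in plain Python (nothing Sage-specific is assumed):

# Every lambda closes over the same variable i, which ends up at its
# final value (2) once the comprehension finishes:
fns = [lambda x: x + i for i in range(3)]
print([f(0) for f in fns])          # [2, 2, 2]

# A default argument is evaluated at definition time, so each lambda
# captures its own copy of the current value of i:
fns = [lambda x, j=i: x + j for i in range(3)]
print([f(0) for f in fns])          # [0, 1, 2]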

Two further questions:

1. Is there a way to determine the number of values returned by a function without actually evaluating it? If there were, I could further generalise get3 by giving constraint_nums a default value.

2. What actually gets cached when I use CachedFunction? Is everything in the function stored, or just the output values? When should I be using fn.clear_cache()?
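
For question 2, here is what I see when poking at the object directly (this assumes, as the reference manual suggests, that a CachedFunction keeps its stored results in a .cache dictionary; the exact key format may vary between Sage versions):

func_cached = CachedFunction(func)
func_cached((2, 3))             # first call: evaluates func and stores the result
func_cached((2, 3))             # second call: answered straight from the cache

# Only the mapping from arguments to returned values is stored;
# no intermediate state from inside func is kept.
print(len(func_cached.cache))   # 1 entry, keyed by the argument tuple

func_cached.clear_cache()       # discard all stored results (useful if func
print(len(func_cached.cache))   # or its inputs change) -- now 0 entries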
