Assuming that the b's in your problem have given numerical values, let's try minimize_constrained:

# number of variables 
k = 5

# generate some random data
b_linear = random_vector(RDF, k)
b_quadratic = random_vector(RDF, k)
b_const = random_vector(RDF, 1)
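
If your b's are already known, you can skip the random data and enter them directly; the rest of the code stays the same. The numbers below are made up purely for illustration:

# or plug in your own coefficients (made-up values, for illustration only)
b_linear = vector(RDF, [2.0, -1.0, 0.5, 3.0, -0.25])
b_quadratic = vector(RDF, [1.0, 2.0, 0.5, 1.5, 1.0])
b_const = vector(RDF, [4.0])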

Then, minimize_constrained takes essentially three arguments: the objective function (a symbolic expression or a standard Python function), the constraints, given either as a list of functions that must be ≥ 0 or, as in this case, as a list of (lower, upper) bound tuples, and an initial point.

# optimization variables
x = [SR.var('x' + str(i)) for i in range(k)]

# constraints
cons = [(1, 9)] * k

# cost function
func = b_const[0] + sum([b_linear[i]*x[i] for i in range(k)]) \
       + sum([b_quadratic[i]*x[i]^2 for i in range(k)])

# initial point
x0 = [5]*k

# solve
xopt = minimize_constrained(func, cons, x0, algorithm='l-bfgs-b')

# show optimal value and optimal point
func({x[i] : xopt[i] for i in range(k)}), xopt

The output looks like: (3.4200707906802412, (1.0, 1.0, 4.3222000559500335, 1.0, 1.0)).
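
For completeness, the ≥ 0 constraint form mentioned above would look roughly as follows. This is only a sketch: each constraint is written as a Python function of the point p that must be nonnegative on the feasible set (this is the form shown in the Sage documentation), and since 'l-bfgs-b' only applies to bound constraints, the algorithm argument is simply omitted here.

# the same box constraints 1 <= x_i <= 9, written as functions that must be >= 0
# (the default argument i=i captures the current index in each lambda)
cons_geq = [lambda p, i=i: p[i] - 1 for i in range(k)] \
         + [lambda p, i=i: 9 - p[i] for i in range(k)]

xopt_geq = minimize_constrained(func, cons_geq, x0)

Up to solver tolerances, this should give essentially the same minimizer as the bounds version above.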