
# Double indexed variables in linear programming

The following linear programming code works perfectly.

nlg = 4  # number of constraints
mlg = 6  # number of variables
Alg = matrix(nlg, mlg, [0, 0, 0, 1, 1, 1,
                        1, 0, 0, 1, 0, 0,
                        0, 1, 0, 0, 1, 0,
                        1, 1, 3, 0, 0, 0
                        ])  # the coefficients
blgmin = [1, 0, 0, 1]  # lower bounds for the constraints
blgmax = [1, 0, 0, 3]  # upper bounds for the constraints (oo = infinity)
show(LatexExpr('A = '),Alg)
show(LatexExpr('bmin = '),blgmin)

show(LatexExpr('bmax = '),blgmax)
lg = MixedIntegerLinearProgram(maximization=False, solver="GLPK")  # create the program

x = lg.new_variable(integer=True, indices=[0..mlg-1])  # the new variables are x[0] ... x[5]
Blg = Alg * x  # the linear function for the constraints
lg.set_objective(x[0])  # set the objective
# Build the constraints with their bounds
for i in range(nlg):
    lg.add_constraint(Blg[i], min=blgmin[i], max=blgmax[i])
for i in range(mlg-1):
    lg.set_binary(x[i])
lg.show()
lg.solve()
xx = lg.get_values(x)
show(xx)


But now, for another problem, I would like to define

1) variables with other names

2) double indexed variables

Is this possible?


## 1 answer


The primary resource for these questions is the documentation: http://doc.sagemath.org/html/en/refer... As explained there, the indexing of variables (that is, how you refer to the indices in your code) is very flexible, and you can use multi-indices as you like: the interface keeps track of the indices you use and maps them to an internal index. The printing of variables is more restricted: it looks like variables always print in the form <name>_<index>.
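The index bookkeeping described above can be sketched in plain Python. This is a hypothetical illustration, not Sage's actual implementation: each variable family accepts arbitrary hashable keys (including tuples), and maps each new key to the next internal integer index, which is why everything prints as x_0, x_1, ... regardless of the keys you chose.

```python
class VarFamily:
    """Hypothetical sketch of multi-index bookkeeping: arbitrary
    hashable keys (including tuples) are mapped, on first use, to
    consecutive internal indices."""
    def __init__(self):
        self._index_of = {}  # user key -> internal index

    def __getitem__(self, key):
        if key not in self._index_of:
            self._index_of[key] = len(self._index_of)
        return "x_%d" % self._index_of[key]

X = VarFamily()
print(X[3])      # first key used -> x_0
print(X[1, 2])   # a double index is just a tuple key -> x_1
print(X[3])      # reused key maps back to x_0
```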

You can use variables with other names, but presently this seems to be relatively limited:

sage: p = MixedIntegerLinearProgram(solver='GLPK')
sage: x = p.new_variable(real=True, nonnegative=True, name='X')
sage: y = p.new_variable(integer=True, nonnegative=True, name='Y')
sage: x[1],x[2],y[1],y[2]
(x_0, x_1, x_2, x_3)


As the documentation says:

• "name" -- string. Associates a name to the variable. This is only useful when exporting the linear program to a file using "write_mps" or "write_lp", and has no other effect.

so it looks like, at present, the name argument is essentially a no-op outside of file export. In other words, there is excellent support for *formulating* linear programming problems with rich variable names and indexing, but the printing focuses on the linear-algebra representation of the problem, where your variables are just entries in a vector. This could presumably be changed, but it might require discussion of whether the benefits are worth the resulting break in backwards compatibility.
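For completeness, here is a sketch of where the name does show up, assuming the `write_lp` export method mentioned in the documentation (the file path is purely illustrative):

```
sage: p = MixedIntegerLinearProgram(solver='GLPK')
sage: x = p.new_variable(nonnegative=True, name='X')
sage: p.add_constraint(x[1] + x[2] <= 3)
sage: p.write_lp('/tmp/problem.lp')  # the exported file uses the name 'X'
```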

From a teaching point of view, there is an argument that it helps people to understand that variables are just entries in a vector (and are therefore naturally labeled x_0, x_1, x_2, ...), but printing that stays a little closer to the commands used to input the problem would also be nice; especially given that x_0, x_1, ... are NEVER appropriate names for referring to the MIP variables in a Sage session.

So ... DEFINING the things you want seems to be fully supported, but having them PRINT in a way that reflects it seems not to be.
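One workaround on the printing side: since `get_values` returns a dict keyed by the indices you actually used, you can pretty-print the solution yourself. A minimal plain-Python sketch (the helper name `pretty_solution` is hypothetical):

```python
def pretty_solution(name, values):
    """Format a solution dict keyed by user indices (ints or tuples)
    as name[i] = v lines, independent of the internal x_i names."""
    lines = []
    for key in sorted(values, key=str):
        idx = ",".join(map(str, key)) if isinstance(key, tuple) else str(key)
        lines.append("%s[%s] = %s" % (name, idx, values[key]))
    return "\n".join(lines)

# e.g. the kind of dict lg.get_values(x) returns
sol = {1: 0.0, (1, 2): 3.0}
print(pretty_solution("X", sol))
```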


## Comments

To complete your answer, we can currently get:

sage: p.add_constraint(x[0] + x[1] + y[1] - y[2] == 0)
sage: p.add_constraint(x[0] + y[1,2] <= 3)
sage: p.show()
Maximization:

Constraints:
0.0 <= X[1] + Y[1] - Y[2] + X[0] <= 0.0
X[0] + Y[(1, 2)] <= 3.0
Variables:
X[1] = x_0 is a continuous variable (min=0.0, max=+oo)
X[2] = x_1 is a continuous variable (min=0.0, max=+oo)
Y[1] = x_2 is an integer variable (min=0.0, max=+oo)
Y[2] = x_3 is an integer variable (min=0.0, max=+oo)
X[0] = x_4 is a continuous variable (min=0.0, max=+oo)

(2019-11-18 01:49:47 -0500)


## Stats

Asked: 2019-11-11 23:41:44 -0500

Seen: 77 times

Last updated: Nov 15 '19