Hi,
I am using GLPK in Sage 6.9 to solve a MILP problem, and I am using solver_parameter('timelimit') to terminate the solve after a specified time. According to http://doc.sagemath.org/html/en/reference/numerical/sage/numerical/mip.html in the SageMath documentation, "timelimit" defines the maximum time spent on a computation, measured in seconds. However, I am a bit confused by this definition, because:
I am solving the same problem with the same timelimit value for different input values (generated randomly from a uniform distribution), and I print the start time (when the input is given) and the end time (when the value is returned) for each run. I have noticed that the duration between the start and end times differs when I give different inputs. A minimal sketch of my setup is below.
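This is roughly what I am doing (the model, variable names, and random data below are only placeholders standing in for my actual problem, and the 60-second limit is just an example value):

    import time
    from random import random

    # placeholder model: my real MILP is larger, this only shows the setup
    p = MixedIntegerLinearProgram(solver="GLPK")
    x = p.new_variable(integer=True, nonnegative=True)
    data = [random() for _ in range(50)]          # stands in for my uniform random inputs
    p.set_objective(sum(c * x[i] for i, c in enumerate(data)))
    p.add_constraint(sum(x[i] for i in range(len(data))) <= 100)

    p.solver_parameter("timelimit", 60)           # stop GLPK after 60 seconds

    start = time.time()                           # start time (giving input)
    p.solve()
    end = time.time()                             # end time (returning value)
    print(end - start)                            # this duration varies between runs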
I am confused because I expected the durations to be the same, given the defined timelimit.
Can someone please tell me why the durations differ even though I am solving the same problem with the same timelimit? Is it related to the size of the input, since that is the only difference between the runs?
Cheers, Aissan