Execution time vs. Computation time in MILP

Hi,

I am using GLPK in Sage 6.9 to solve a MILP problem, and I use solver_parameter('timelimit') to terminate the solver after a specified time. According to the SageMath documentation, "timelimit" defines the maximum time spent on a computation, measured in seconds. However, I am a bit confused by this definition, for the following reason:

I am solving the same problem with the same timelimit value for different input values (generated randomly from a uniform distribution), and I print the execution time of each run. I have noticed different execution times for different inputs, ranging from 1 minute and 5 minutes to even 10 minutes.
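For reference, here is a minimal sketch of how I set the time limit and measure the execution time. The variables, constraint and objective below are only placeholders for illustration; my real model and its randomly generated input are much larger.

    import time
    from sage.all import MixedIntegerLinearProgram        # not needed inside the Sage shell
    from sage.numerical.mip import MIPSolverException

    # Placeholder MILP -- the real model and its random input are much larger.
    p = MixedIntegerLinearProgram(solver="GLPK")
    x = p.new_variable(integer=True, nonnegative=True)
    p.add_constraint(x[0] + 2*x[1] <= 100)
    p.set_objective(3*x[0] + 5*x[1])

    # Ask GLPK to stop after (at most) 10 seconds.
    p.solver_parameter("timelimit", 10)

    start = time.time()
    try:
        p.solve()
    except MIPSolverException as e:
        # Raised when the solver stops without an optimal solution,
        # e.g. when the time limit is reached first.
        print("solver stopped early: %s" % e)
    print("wall-clock execution time: %.2f s" % (time.time() - start))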

Can someone please tell me how the computation time is calculated for timelimit? Why do the execution times differ even though I solve the same problem with the same timelimit? Is it related to the size of the input, since that is the only difference between the runs? Is there any relation between the computation time and the execution time?

Cheers, Aissan