How am I using definite_integral wrong?

I'm attempting to evaluate the definite integral of a symbolic function, and I'm getting a type error that I don't understand.

Here's my script:

from sage.symbolic.integration.integral import definite_integral

### random variables
xi = var('xi'); assume(xi >= 0)
tau, eta = var('tau', 'eta'); assume(tau > 0); assume(eta > 0)
p1 = var('p1'); assume(p1 >= 0); assume(p1 <= 1)

### expressions
h(p1, alpha, beta) = p1^(1/3 - 1) * (1 - p1)^(1/3 - 1)
k(p1, xi) = (1 - exp(-p1 * xi)) * exp(-p1 * xi)
hk = h * k

### evaluate
print(hk)
print(type(hk))
definite_integral(hk, p1, 0, 1)

The call to print(type(hk)) returns <class 'sage.symbolic.expression.Expression'>, which is what I expect. However, the call to definite_integral(hk, p1, 0, 1) raises a lengthy error that ends with:

TypeError: cannot coerce arguments: no canonical coercion from Callable function ring with arguments (p1, alpha, beta, xi) to Symbolic Ring

I'm not sure what's going on with the types here, and I'd like to understand it so I can get this working and avoid similar mistakes in the future.
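For comparison, here is a stripped-down example that does integrate without complaint. My guess (which may well be wrong) is that the difference between a callable symbolic function and a plain symbolic expression is what matters here:

from sage.symbolic.integration.integral import definite_integral

# a callable symbolic function, and the plain expression obtained by evaluating it
f(x) = x^2
expr = f(x)

print(parent(f))      # something like: Callable function ring with argument x
print(parent(expr))   # Symbolic Ring
definite_integral(expr, x, 0, 1)   # 1/3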

Thanks in advance, A beginner