1 | initial version |

Define a Python function rather than a symbolic function.

```
def g(x):
    return sum(frac(j*p/q), j, 0, x)
```

2 | No.2 Revision |

Define a Python function rather than a symbolic function.

Define p and q:

```
p, q = 33, 21
```

Define the function:

```
def g(x):
    return sum(frac(j*p/q) for j in range(x+1))
```

Use it:

```
sage: g(10)
31/7
```

3 | No.3 Revision |

There are two ways to get the value you want.

One is to define a Python function rather than a symbolic function.

Define p and q:

```
p, q = 33, 21
```

Define the function:

```
def g(x):
    return sum(frac(j*p/q) for j in range(x+1))
```

Use it:

```
sage: g(10)
31/7
```
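
As a cross-check outside Sage, the same sum can be computed in plain Python with `fractions.Fraction`; here `frac` is a small hypothetical helper reimplementing the fractional part `x - floor(x)`, which is what Sage's `frac` computes for rationals:

```python
from fractions import Fraction

def frac(x):
    # fractional part of a rational: x - floor(x)
    # (integer // is floor division, and Fraction denominators are positive)
    return x - (x.numerator // x.denominator)

def g(x, p=33, q=21):
    # sum of the fractional parts of j*p/q for j = 0, 1, ..., x
    return sum(frac(Fraction(j * p, q)) for j in range(x + 1))

print(g(10))  # prints 31/7
```

This agrees with the Sage result `31/7` above.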

The other way, with `g` defined as in the question (after declaring `j` as a symbolic variable), is to factor the result.

```
sage: j = SR.var('j')
sage: g(x) = sum(frac(j*p/q), j, 0, x)
sage: g(10).factor()
31/7
```

Copyright Sage, 2010. Content on this site is licensed under a Creative Commons Attribution Share Alike 3.0 license.