# Are finite field computations slow?

I ran the following in Sage 4.5.3 on a MacBook Pro (running OS X):

p = 5
pow = 2
k = GF(5 ** pow, 'z')
f(x, y) = y^3 + y + 4*x
def count_pts():
    retval = 0
    for a in k:
        for b in k:
            if f(a, b) == 0:
                retval = retval + 1
    return retval + 1

count_pts()


It took around 10-20 seconds to return the correct answer of 26. With pow = 1 it terminated much more quickly. The loop is only over 625 pairs of elements of GF(25); is there some reason this program runs so slowly? (And can the performance be improved?)
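For reference, the count itself can be reproduced without Sage at all. A minimal pure-Python sketch, assuming the representation GF(25) = GF(5)[z]/(z^2 - 3) (3 is a non-square mod 5, so z^2 - 3 is irreducible; this choice of modulus is mine, not necessarily Sage's default for GF(25)):

```python
# Pure-Python sketch (no Sage): model GF(25) = GF(5)[z]/(z^2 - 3),
# with elements stored as pairs (a0, a1) meaning a0 + a1*z.
P = 5

def add(u, v):
    return ((u[0] + v[0]) % P, (u[1] + v[1]) % P)

def mul(u, v):
    # (a0 + a1*z)(b0 + b1*z), reduced using z^2 = 3
    a0, a1 = u
    b0, b1 = v
    return ((a0 * b0 + 3 * a1 * b1) % P, (a0 * b1 + a1 * b0) % P)

def f(x, y):
    # y^3 + y + 4*x evaluated in GF(25)
    y3 = mul(mul(y, y), y)
    return add(add(y3, y), mul((4, 0), x))

elems = [(a0, a1) for a0 in range(P) for a1 in range(P)]
count = sum(1 for x in elems for y in elems if f(x, y) == (0, 0))
print(count + 1)  # 25 affine solutions, plus 1 as in count_pts -> 26
```

This runs in well under a second, which suggests the slowness is not inherent to looping over 625 pairs.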



You are not using Sage in the right way here. You defined f(x, y) over the Symbolic Ring, but then you do the computations f(a, b) over the finite field, so at every step Sage has to coerce everything into the finite field. You should use a PolynomialRing so that your variables and constants are coerced into the finite field at the very outset.

Try the following code. It is fast.

f(x,y) = y^3 + y + 4*x
print "Original polynomial's base ring:", f.base_ring()
k = GF(25, 'a')
F.<x,y> = PolynomialRing(k)  # x, y are now polynomial generators, not symbolic variables
f = y^3 + y + 4*x
print "New base ring:", f.base_ring()
count = 0
for a in k:
    for b in k:
        if not f.subs(x=a, y=b):
            count += 1
print count


The output of the above (it prints 25, the number of affine solutions; the question's count_pts returns retval + 1, giving 26):

Original polynomial's base ring: Symbolic Ring
New base ring: Finite Field in a of size 5^2
25
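The cost of repeated coercion can be seen in plain Python too. A toy analogy (my own illustration, not Sage internals), where converting values inside the loop stands in for coercing into the Symbolic Ring on every call:

```python
# Toy analogy for coercion overhead (plain Python, not Sage internals):
# the "slow" version converts its operands on every evaluation, while the
# "fast" version converts the 25 loop values once, up front.
from fractions import Fraction

elems = list(range(25))  # stand-ins for the 25 elements of k

def total_percall():
    # like calling a symbolic f(a, b): conversion happens per evaluation
    s = Fraction(0)
    for a in elems:
        for b in elems:
            s += Fraction(b) ** 3 + Fraction(b) + 4 * Fraction(a)
    return s

def total_upfront():
    # like defining f over the right ring: convert once, then just compute
    fe = [Fraction(e) for e in elems]
    s = Fraction(0)
    for a in fe:
        for b in fe:
            s += b ** 3 + b + 4 * a
    return s

# both compute the same total; the second does 25 conversions instead of 1875
```

Same answer either way, but the per-call version pays the conversion tax on each of the 625 evaluations.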


Not an answer, but it doesn't fit in a comment box and I think it's relevant. With:

def test1():
    count = 0
    for a in k:
        for b in k:
            if not f.subs(x=a, y=b):
                count += 1
    return count

def test2():
    count = 0
    for a in k:
        for b in k:
            if not f(a, b):
                count += 1
    return count

def test3():
    count = 0
    for a in k:
        for b in k:
            if not (b^3 + b + 4*a):
                count += 1
    return count


we get

sage: timeit('test1()')
25 loops, best of 3: 24.6 ms per loop
sage: timeit('test2()')
5 loops, best of 3: 64.1 ms per loop
sage: timeit('test3()')
125 loops, best of 3: 2.73 ms per loop


So, surprisingly, subs is quite a bit faster than "polynomial evaluation" via f(a, b), and, unsurprisingly, writing the expression out explicitly is faster than either.
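A rough plain-Python parallel of the test2-vs-test3 gap (an analogy of mine, not Sage's actual call path): going through a generic callable pays a Python-level call per point, while the inlined expression does not.

```python
# Plain-Python sketch: callable indirection vs. an inlined expression.
k25 = range(25)  # stand-in for iterating over GF(25)

f = lambda a, b: (b ** 3 + b + 4 * a) % 5

def count_via_callable():
    # one Python function call per point, like test2's f(a, b)
    return sum(1 for a in k25 for b in k25 if f(a, b) == 0)

def count_inlined():
    # the expression written out in the loop body, like test3
    return sum(1 for a in k25 for b in k25 if (b ** 3 + b + 4 * a) % 5 == 0)
```

Timing these with timeit shows the inlined version winning; the counts agree regardless.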

The subs method gives back a result in a ring that is a little too large for our purposes, though (this is as documented):

sage: parent(f.subs(x=k(1),y=k(1)))
Multivariate Polynomial Ring in x, y over Finite Field in a of size 5^2
sage: parent(f(k(1),k(1)))
Finite Field in a of size 5^2


This is quite confusing and inconsistent. For symbolic rings it is advised to use subs, and then when it comes to polynomial rings one can evaluate just like that?

sage: f(x, y) = x^2 + y   # symbolic function
sage: f(1, 2)
3
sage: g = x^2 + y         # symbolic expression
sage: g(1, 2)
...
DeprecationWarning: Substitution using function-call syntax and unnamed arguments is deprecated and will be removed from a future release of Sage; you can use named arguments instead, like EXPR(x=..., y=...)
See http://trac.sagemath.org/5930 for details.
3
sage: F.<x,y> = PolynomialRing(RR)
sage: h = x^2 + y         # polynomial
sage: h(1, 2)
3.00000000000000

Yes, there is a good reason for that. For polynomials there is an explicit order of the variables: in R.<x,y> we have that f(a, b) evaluates at x=a, y=b, while in S.<y,x> we have that g(a, b) evaluates at y=a, x=b. Since there is no room for ambiguity, we can safely use mathematical notation. In SR there is no such implicit variable order (it is not even completely evident which variables occur in an expression: is F = 0.0*x + 1.0*y in one variable or two?). There is a "callable" form of symbolic expressions, but those are different from normal SR elements. Finally, the fact that for (at least some) polynomials f.subs(...) works and produces (by default, at least) an element of the same polynomial ring is also in step with what happens in SR ...
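The fixed-generator-order point can be mimicked in plain Python. A hypothetical ToyPoly class (purely illustrative, not Sage's implementation) that matches positional arguments against a variable order declared at construction time:

```python
# Hypothetical sketch: positional evaluation is unambiguous once the
# variable order is fixed at construction, as in R.<x,y> vs S.<y,x>.
class ToyPoly:
    def __init__(self, varnames, func):
        self.varnames = varnames  # declared generator order
        self.func = func

    def __call__(self, *args):
        # positional args are matched to the declared order, then passed by name
        return self.func(**dict(zip(self.varnames, args)))

expr = lambda x, y: y ** 3 + y + 4 * x

f = ToyPoly(('x', 'y'), expr)  # like R.<x,y>
g = ToyPoly(('y', 'x'), expr)  # like S.<y,x>

print(f(1, 2))  # x=1, y=2 -> 2^3 + 2 + 4*1 = 14
print(g(1, 2))  # y=1, x=2 -> 1^3 + 1 + 4*2 = 10
```

The same positional call gives different values depending on the declared order, which is exactly why SR, having no such order, refuses unnamed substitution.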