Factorial(10000000) Performance between Ubuntu and Mac OS X on Sage 7.5.1

asked 2017-02-14 02:46:56 +0100

On an Ubuntu 6.10 platform with a quad-core Core i7, running Sage 7.5.1, I tried

time a=factorial(10000000)

It completed in about 3 seconds.

On a 2015 15-inch MacBook Pro with a 5th-generation quad-core Core i7, the same command took about 5 seconds.

The same 3-second timing was observed even when Ubuntu ran in a VM on the same Core i7 processor.
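
For a more robust comparison between the two machines, it helps to repeat the measurement and keep the best run, so that one-off background load does not skew a single timing. Below is a minimal sketch along those lines, assuming it is run under Sage's Python on both systems; the helper name best_time is just for illustration.

    # Time factorial(10^7) several times and keep the fastest run,
    # so a single noisy measurement does not dominate the comparison.
    import time
    from sage.all import factorial

    def best_time(n=10**7, repeats=3):
        best = float('inf')
        for _ in range(repeats):
            t0 = time.time()
            factorial(n)
            best = min(best, time.time() - t0)
        return best

    print("best of 3 runs: %.2f s" % best_time())

Running the identical script on both machines keeps the measurement methodology the same, so any remaining gap points at the installations rather than at how the timing was taken.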

Comments

Well, just use GNU/Linux!

tmonteil (2017-02-15 10:41:54 +0100)

Thanks. I should have been more specific: I am surprised by this performance difference, but I do not think it is a Mac OS X vs GNU/Linux problem as such, since gcc- or Xcode-compiled standalone C/C++ programs give similar performance on similarly equipped Mac and Linux machines. I think the issue might be with the Sage environment on Mac OS X.
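
One way to make that comparison less noisy is Sage's built-in timeit, which runs the statement several times and reports the best timing, for example:

    timeit('a = factorial(10^7)')

If the Mac installation is still consistently slower there, the gap is more likely in the libraries Sage bundles on that platform than in measurement noise.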

nrsaxena (2017-02-16 17:42:57 +0100)
Maybe. It is easier to use GNU/Linux than to fix Mac OS X. Moreover, any fix would likely apply only to a specific version of Mac OS X. Of course, you are very welcome to propose a fix.

vdelecroix (2017-03-10 01:15:17 +0100)