Factorial(10000000) Performance between Ubuntu and Mac OS X on Sage 7.5.1

asked 2017-02-13 19:46:56 -0600


On Ubuntu 6.10 on a quad-core Core i7 platform, running Sage 7.5.1, I tried

time a=factorial(10000000)

It completed in about 3 seconds.

The same command on a 2015 MacBook Pro 15in with a quad-core 5th-generation Core i7 took about 5 seconds.

The same 3-second time was observed even when Ubuntu ran in a VM on the same Core i7 processor.
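For anyone wanting to separate Sage overhead from raw big-integer speed, a comparable standalone measurement can be done in plain Python. This is only a sketch (the helper name is mine); note that CPython's `math.factorial` may be slower than Sage's, which, as far as I know, delegates to optimized big-integer library code:

```python
import math
import time

def time_factorial(n):
    """Return (factorial(n), elapsed seconds) using CPython's
    built-in big-integer factorial."""
    start = time.perf_counter()
    result = math.factorial(n)
    return result, time.perf_counter() - start

# A modest size by default; raise n to 10_000_000 to reproduce the
# benchmark from the question (expect a longer wait than in Sage).
result, secs = time_factorial(100_000)
print(f"factorial(100000) has {result.bit_length()} bits, took {secs:.3f} s")
```

Running this on both machines outside of Sage would show whether the gap is in the Sage build or in the platform itself.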



Well, just use GNU/Linux!

tmonteil (2017-02-15 03:41:54 -0600)

Thanks. I should have been more specific: I am surprised by this performance difference, but I do not think it is a Mac OS X vs. GNU/Linux problem. Standalone C/C++ programs compiled with gcc or Xcode give similar performance on similarly equipped Mac and Linux machines, so I suspect the issue lies with the Sage environment on Mac OS X.
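One way to make such comparisons less sensitive to background load on either machine is to repeat the measurement and keep the best of several runs. A plain-Python sketch (the size `n` here is deliberately smaller than in the question so repetitions stay cheap):

```python
import math
import timeit

# Best-of-3 timing reduces noise from other processes; a single run
# can easily be skewed on a laptop.
n = 100_000
best = min(timeit.repeat(lambda: math.factorial(n), number=1, repeat=3))
print(f"best of 3 runs for factorial({n}): {best:.3f} s")
```

If the best-of-several times still differ by the same ratio on the two systems, the gap is unlikely to be measurement noise.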

nrsaxena (2017-02-16 10:42:57 -0600)