Factorial(10000000) Performance between Ubuntu and Mac OS X on Sage 7.5.1
On Ubuntu 16.10, on a quad-core Core i7 platform running Sage 7.5.1, I tried
time a=factorial(10000000)
It completed in around 3 seconds.
The same command on a 2015 15-inch MacBook Pro with a 5th-generation quad-core Core i7 took about 5 seconds.
The same roughly 3-second time was observed even when Ubuntu was run in a VM on the same Core i7 processor.
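If it helps to rule out one-off system noise, Sage's built-in timeit can average over several runs; a minimal sketch (the repeat count here is arbitrary):

    # Run the factorial a few times and report the best time in seconds,
    # so a single slow run does not skew the Ubuntu-vs-OS-X comparison.
    timeit('a = factorial(10000000)', number=1, repeat=5, seconds=True)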
Well, just use GNU/Linux!
Thanks. I should have been more specific: I am surprised by the size of this performance difference. I do not think the problem is related to Mac OS X vs. GNU/Linux as such, since standalone C/C++ programs compiled with gcc or Xcode give similar performance on similarly equipped Mac and Linux machines. I think the issue might be with the Sage environment on Mac OS X.
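One rough way to narrow that down (a sketch; the operand size is picked arbitrarily to be of the same order as factorial(10000000), which has roughly 6.6 * 10^7 digits): a factorial of this size spends most of its time in big-integer multiplication, so timing a single large product on both machines should show whether the gap lies in the GMP/MPIR library bundled with Sage rather than in Mac OS X generally.

    # Build two large random integers once (about 10^8 bits each), then
    # time only their product; a Linux/OS X gap here would point at the
    # GMP/MPIR build shipped with Sage rather than at the OS.
    a = ZZ.random_element(2^100000000)
    b = ZZ.random_element(2^100000000)
    timeit('c = a * b', number=1, repeat=5)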
Maybe. It is easier to use GNU/Linux than to fix Mac OS X. Moreover, any fix would likely apply only to a specific version of Mac OS X. Of course, you are very welcome to propose a fix.