I am trying to get the sum of 1 + 2 + ... + 1000000000, but I'm getting funny results in PHP and Node.js.
PHP
$sum = 0;
for ($i = 0; $i <= 1000000000; $i++) {
    $sum += $i;
}
printf("%s", number_format($sum, 0, "", "")); // 500000000067108992
Node.js
var sum = 0;
for (var i = 0; i <= 1000000000; i++) {
    sum += i;
}
console.log(sum); // 500000000067109000
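As a side note, here is a sketch of the same loop written with BigInt literals (available in Node.js 10.4+), which keep integer arithmetic exact; it is much slower for a billion iterations, but it does print the expected value.
// Same loop using BigInt; exact integer arithmetic, but slow for 10^9 iterations.
var sumBig = 0n;
for (var j = 0n; j <= 1000000000n; j++) {
    sumBig += j;
}
console.log(sumBig.toString()); // 500000000500000000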
The correct answer can be calculated with the formula
1 + 2 + ... + n = n(n+1)/2
which gives 500000000500000000, so I decided to try another language.
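For a quick sanity check of that closed form, a throwaway Node.js snippet using BigInt evaluates it without any loss of precision:
// n(n+1)/2 evaluated with BigInt, so the result is exact.
var n = 1000000000n;
console.log((n * (n + 1n) / 2n).toString()); // 500000000500000000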
Go
var sum, i int64
for i = 0; i <= 1000000000; i++ {
    sum += i
}
fmt.Println(sum) // 500000000500000000
But it works fine! So what is wrong with my PHP and Node.js code?
Perhaps this is a problem with interpreted languages, and that's why it works in a compiled language like Go? If so, would other interpreted languages such as Python and Perl have the same problem?