
0 votes
907 views
in Technique by (71.8m points)

math - Avoiding problems with JavaScript's weird decimal calculations

I just read on MDN that one of the quirks of JS's handling of numbers, because everything is stored as "double-precision 64-bit format IEEE 754 values", is that when you do something like .2 + .1 you get 0.30000000000000004 (that's what the article says, but I get 0.29999999999999993 in Firefox). Therefore:

(.2 + .1) * 10 == 3

evaluates to false.
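For reference, a typical console session (Node.js or a modern browser) shows the imprecision directly; the extra digits are what the 64-bit double actually stores:

console.log(0.1 + 0.2);              // 0.30000000000000004
console.log((0.2 + 0.1) * 10);       // 3.0000000000000004
console.log((0.2 + 0.1) * 10 === 3); // false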

This seems like it would be very problematic. So what can be done to avoid bugs due to the imprecise decimal calculations in JS?

I've noticed that if you do 1.2 + 1.1 you get the right answer. So should you just avoid any kind of math that involves values less than 1? Because that seems very impractical. Are there any other dangers to doing math in JS?

Edit:
I understand that many decimal fractions can't be stored exactly in binary, but the way most other languages I've encountered appear to deal with the error (and the way JS seems to handle numbers greater than 1) feels more intuitive, so I'm not used to this, which is why I want to see how other programmers deal with these calculations.



1 Reply

0 votes
by (71.8m points)

1.2 + 1.1 may be ok but 0.2 + 0.1 may not be ok.

This is a problem in virtually every language in use today. The problem is that 1/10 cannot be represented exactly as a binary fraction, just as 1/3 cannot be represented exactly as a decimal fraction.
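For illustration, printing more digits shows what actually gets stored (this is standard IEEE 754 behaviour, the same in every engine):

(0.1).toFixed(20) // "0.10000000000000000555"
(0.3).toFixed(20) // "0.29999999999999998890"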

The workaround is to round to only the number of decimal places that you need, and then either work with the resulting strings, which compare exactly:

(0.2 + 0.1).toFixed(4) === 0.3.toFixed(4) // true

or convert them back to numbers afterwards:

+(0.2 + 0.1).toFixed(4) === 0.3 // true

or use Math.round:

Math.round(0.2 * X + 0.1 * X) / X === 0.3 // true

where X is some power of 10, e.g. 100 or 10000, depending on the precision you need.
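To avoid repeating that everywhere you could wrap it in a small helper; this is just a sketch of the same trick, and the name roundTo is mine, not a built-in:

// Rounds a value to the given number of decimal places using the
// power-of-ten trick above. The result is still a binary double,
// so it is only as "exact" as the literal 0.3 itself is.
function roundTo(value, decimals) {
  const factor = Math.pow(10, decimals); // the "X" above, e.g. 100 or 10000
  return Math.round(value * factor) / factor;
}

roundTo(0.2 + 0.1, 4) === 0.3 // true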

Or you can use cents instead of dollars when counting money:

const cents = 1499; // $14.99

That way you only work with integers and you don't have to worry about decimal and binary fractions at all.
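For example (the variable names and the 8% tax rate here are made up for illustration):

// Keep all money amounts in integer cents and only format at the edges.
const priceCents = 1499;                        // $14.99
const taxCents = Math.round(priceCents * 0.08); // hypothetical 8% tax, rounded to a whole cent
const totalCents = priceCents + taxCents;       // 1619
(totalCents / 100).toFixed(2);                  // "16.19"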

2017 Update

The situation with representing numbers in JavaScript may be a little more complicated than it used to be. It used to be the case that we had only one numeric type in JavaScript: the double-precision 64-bit IEEE 754 value described above.

This is no longer the case: not only are there more numerical types in JavaScript today, more are on the way, including a proposal to add arbitrary-precision integers to ECMAScript, and hopefully arbitrary-precision decimals will follow.
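For what it's worth, that integer proposal has since shipped as BigInt (ES2020). A minimal sketch in a modern engine; note that BigInt only covers integers and does not help with decimal fractions:

10n ** 30n + 1n          // 1000000000000000000000000000001n (arbitrary-precision integer)
Number.MAX_SAFE_INTEGER  // 9007199254740991 - the limit for exact integers in a plain number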


