Why Doesn't 0.1 + 0.2 Equal 0.3 in JavaScript?
Ever heard that JavaScript is bad at maths?
If not, you're about to find out why some people think so.
There's this classic head-scratcher: if you add 0.1 and 0.2, the answer isn't exactly 0.3.
Yup, you read that right.
It's one of those "huh?" moments in programming that catches a lot of folks off guard. So, what's the deal?
JavaScript, like many languages, follows a standard for number crunching called IEEE 754.
That's just a fancy way of saying it stores every number in a specific format (a 64-bit double-precision float), which mostly works great. But when it comes to floating-point numbers (those with decimals), things get a bit wobbly.
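You can peek at that wobble yourself by asking for more digits than JavaScript normally prints. The snippet below just uses the standard toFixed method; the long decimals are what the underlying 64-bit doubles look like, rounded to 20 places:

console.log((0.1).toFixed(20)); // "0.10000000000000000555"
console.log((0.2).toFixed(20)); // "0.20000000000000001110"
console.log((0.3).toFixed(20)); // "0.29999999999999998890"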
So, when you happily type 0.1 + 0.2 in JavaScript, expecting a neat 0.3, the language goes, "Here's 0.30000000000000004!"
Why?
It boils down to how these numbers are represented in the machine's brain: neither 0.1 nor 0.2 can be stored exactly in binary, and converting them to binary and back introduces tiny errors. Those errors are usually no biggie, but they're enough to notice in cases like (0.1 + 0.2) === 0.3.
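You can see both the surprising sum and the failed comparison straight in the console:

console.log(0.1 + 0.2);           // 0.30000000000000004
console.log((0.1 + 0.2) === 0.3); // false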
This might sound small, but imagine dealing with transactions or precise measurements. Suddenly, that tiny difference could become a big deal.
But don't worry; there are ways to sidestep this quirk.
Here's a simple fix: round the sum to a manageable number of decimal places, so you don't end up scratching your head over unexpected digits. A quick snippet to show what I mean:
const sum = (0.1 + 0.2).toFixed(2); // toFixed rounds and returns a string
console.log(sum); // Now it says: "0.30"
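One thing to keep in mind: toFixed hands you back a string, which is great for display but not for further maths. If what you really need is to compare two floating-point results, a common pattern is to check whether they sit within a tiny tolerance of each other instead of using ===. Here's a minimal sketch of that idea; nearlyEqual is just an illustrative name, and Number.EPSILON works fine as the tolerance for small values like these:

// Sketch: treat two numbers as "equal" if they differ by less than Number.EPSILON
function nearlyEqual(a, b) {
  return Math.abs(a - b) < Number.EPSILON;
}

console.log(nearlyEqual(0.1 + 0.2, 0.3)); // true
console.log(nearlyEqual(0.1, 0.2));       // false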
Just knowing about this little oddity saves you from potential headaches and arms you with the know-how to tackle number precision like a pro.
So, the next time you add decimals in JavaScript and the numbers seem off, remember: it's not you; it's just JavaScript being JavaScript.