Sunday 23 October 2016

math - Why does adding two decimals in JavaScript produce a wrong result?

Why does JS screw up this simple math?





console.log(.1 + .2)  // 0.30000000000000004

console.log(.3 + .6) // 0.8999999999999999





The first example is greater than the correct result, while the second is less. ???!! How do you fix this? Do you have to always convert decimals into integers before performing operations? Do I only have to worry about addition (* and / don't appear to have the same problem in my tests)?
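(Aside, since this trips a lot of people up: JavaScript numbers are IEEE-754 binary doubles, and decimals such as 0.1, 0.2 and 0.3 have no exact binary representation, so the values actually being added are the nearest representable doubles. A minimal sketch of how to see that, using only standard Number and Math methods:

console.log(0.1 + 0.2 === 0.3);                              // false: the stored values really differ
console.log((0.1 + 0.2).toPrecision(20));                    // 0.30000000000000004441
console.log((0.3).toPrecision(20));                          // 0.29999999999999998889
console.log(Math.abs((0.1 + 0.2) - 0.3) < Number.EPSILON);   // true: the error is tiny

Multiplication and division round the same way; the error is just sometimes small enough to disappear in the default formatting, and sometimes not, e.g. 0.1 * 0.2 prints as 0.020000000000000004.)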



I've looked in a lot of places for answers. Some tutorials (like shopping cart forms) pretend the problem doesn't exist and just add values together. Gurus provide complex routines for various math functions or mention JS "does a poor job" in passing, but I have yet to see an explanation.
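For what it's worth, the two workarounds that come up most often are (a) doing money math in scaled integers (cents) and (b) rounding or comparing with a small tolerance instead of expecting exact decimal results. A rough sketch of both; addMoney and nearlyEqual are made-up helper names, not anything built in:

// Work in integer cents so the addition itself is exact, then scale back once at the end.
function addMoney(a, b) {
  return (Math.round(a * 100) + Math.round(b * 100)) / 100;
}

console.log(addMoney(0.1, 0.2));   // 0.3
console.log(addMoney(0.3, 0.6));   // 0.9

// Treat two doubles as equal when they differ by less than a tiny tolerance.
function nearlyEqual(a, b, eps = Number.EPSILON) {
  return Math.abs(a - b) < eps;
}

console.log(nearlyEqual(0.1 + 0.2, 0.3));   // true

For display only, (0.1 + 0.2).toFixed(2) also works, but keep in mind it returns a string, not a number.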

