The Number() function returns incorrect values for some arguments, like this:
Number('10000000712224641') returns 10000000712224640
Number('10000000544563531') returns 10000000544563532
I tested this on Firefox, Chrome, IE and Node.js. Why is this happening?
JavaScript uses 64-bit IEEE-754 binary floating point to store all numbers, like double in C# and Java. There isn't a separate type for storing integers. (The actual implementation may use optimizations to avoid always performing arithmetic this way, but from an end-user perspective the results will always be as if every number were treated as a 64-bit binary floating-point value.)
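A minimal sketch (runnable in any browser console or in Node.js) illustrating that integers and fractional values share that single floating-point type:

    // Every JavaScript number is a 64-bit IEEE-754 double.
    console.log(typeof 42);    // "number"
    console.log(typeof 42.5);  // "number" - same type as an integer
    console.log(1 === 1.0);    // true - there is no distinct integer type
    console.log(0.1 + 0.2);    // 0.30000000000000004 - binary rounding of decimal fractions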
That means only 52 bits are available to store the significand, with the remaining bits used for the exponent and sign. With normalization (an implicit leading 1 bit), you can effectively store values with 53 significant bits of precision. Beyond 2^53 - 1 (which is the value 9007199254740991 quoted in other answers), the distance between "adjacent" representable numbers is greater than 1, so you can't store all integers exactly.
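A short sketch of that boundary, using one of the values from the question:

    // 2^53 - 1 is exposed as Number.MAX_SAFE_INTEGER.
    console.log(Number.MAX_SAFE_INTEGER);      // 9007199254740991
    // Above 2^53, adjacent doubles are at least 2 apart, so odd integers get rounded.
    console.log(9007199254740992 + 1);         // 9007199254740992 - the + 1 is lost
    console.log(Number('10000000712224641'));  // 10000000712224640 - nearest representable double
    console.log(Number.isSafeInteger(10000000712224641)); // false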