I'm trying to display a number as a percent by using _.round and then multiplying the result by 100. For some reason, the multiplication messes up the precision. Here's what it looks like:
var num = 0.056789,
    roundingPrecision = 4,
    roundedNum = _.round(num, roundingPrecision),
    percent = (roundedNum * 100) + '%';
console.log(roundedNum); // 0.0568
console.log(percent); // 5.680000000000001%
Why is 0.000000000000001 being added to the number after multiplying by 100?
This is due to the fact that JavaScript numbers are represented internally as binary floating-point values (IEEE 754 doubles) with limited precision: a decimal fraction like 0.0568 has no exact binary representation, so the tiny rounding error becomes visible once you multiply by 100.
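You can reproduce the effect directly in the console, without lodash (a minimal demonstration, assuming any standard JavaScript engine):

console.log(0.0568 * 100);        // 5.680000000000001
console.log(0.1 + 0.2);           // 0.30000000000000004
console.log((0.1 + 0.2) === 0.3); // false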
See also "Is floating point math broken?"
Is floating point math broken?
Which got the answer:
To get the correct outcome in your case, you need to round after all the arithmetic:
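A minimal sketch of that approach, assuming lodash is loaded as _ (multiply first, then round the final value):

var num = 0.056789,
    // Convert to a percentage first, then round to 2 decimal places,
    // so the floating-point error from the multiplication is rounded away.
    percent = _.round(num * 100, 2) + '%';

console.log(percent); // 5.68%

Rounding as the last step means any tiny representation error introduced by the intermediate multiplication is discarded instead of being displayed.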