Why is 0.1+0.2 not equal to 0.3 in most programming languages?
It comes down to precision: computers do not store these numbers in decimal, but in binary floating point (IEEE 754). In math, 0.1 is the rational number 1/10. Its denominator has the prime factor 5, which is not a power of 2, so in binary 0.1 is an infinitely repeating fraction (0.0001100110011...), just as 1/3 is repeating in decimal. The same goes for 0.2 and 0.3. Each literal is therefore rounded to the nearest representable binary value, and the small rounding errors in 0.1 and 0.2 do not exactly cancel the rounding error in 0.3, so 0.1 + 0.2 evaluates to 0.30000000000000004 rather than 0.3.
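
You can see this for yourself. Here is a minimal sketch in Python (the same behaviour shows up in any language that uses IEEE 754 doubles); the Decimal conversions reveal the exact binary values actually stored for each literal:

```python
from decimal import Decimal
import math

# The doubles nearest to 0.1 and 0.2 add up to a value slightly above
# the double nearest to 0.3, so a direct equality check fails.
print(0.1 + 0.2)         # 0.30000000000000004
print(0.1 + 0.2 == 0.3)  # False

# Inspect the exact values the binary literals really hold:
print(Decimal(0.1))      # 0.1000000000000000055511151231257827...
print(Decimal(0.2))      # 0.2000000000000000111022302462515654...
print(Decimal(0.3))      # 0.2999999999999999888977697537484345...

# Practical fixes: compare with a tolerance, or do exact decimal arithmetic.
print(math.isclose(0.1 + 0.2, 0.3))                       # True
print(Decimal("0.1") + Decimal("0.2") == Decimal("0.3"))  # True
```

The usual advice follows from this: never compare floating-point results with `==`; either test that the difference is within a small tolerance, or use a decimal/rational number type when exact decimal results matter (for example, when handling money).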