It has to do with numeric precision – how many significant digits the particular language and data type you’re using can store and use in comparisons. In some languages and data types it can be as few as about 7 digits (the SQL float data type at its lowest precision, for instance), so this would also evaluate as true:
0.15 == 0.15000001
Anything beyond the last digit of precision is ignored. So even though you’re typing the above, the computer effectively sees it like this:
0.15 == 0.1500000
And those are obviously the same, so the result is True.
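You can see the same effect in practice with a minimal sketch in Python (assuming NumPy is available, since its float32 is a single-precision type with roughly 7 significant digits, while ordinary Python floats are 64-bit doubles with about 15–16):

import numpy as np

# As 64-bit doubles, the tiny difference survives, so this is False.
print(0.15 == 0.15000001)                           # False

# As 32-bit floats, both literals round to the same stored value, so this is True.
print(np.float32(0.15) == np.float32(0.15000001))   # True

# The value actually stored for both single-precision numbers:
print(float(np.float32(0.15)))                      # 0.15000000596046448

The last line also shows that the stored value isn’t exactly 0.15 either – the computer keeps the nearest value it can represent, and anything finer than that precision simply disappears.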