As a rule, between any two distinct real numbers there must be another real number. There is no number between .999… and 1, so they are the same number, simply two ways of writing it. If the kid is the curious type, it might be pretty interesting to point out that numbers are conceptually separate from their representations. Most simply: ½ = .5, but there’s also binary or hexadecimal, or even more exotic forms (like p-adics or continued fractions). The numbers themselves sorta “exist” out there in the aether as abstract objects that aren’t tied to any particular notation.
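A quick Python sketch can make both points concrete: the same number shows up under several notations, and summing the geometric series behind .999… with exact fractions lands exactly on 1 (the variable names here are just for illustration):

```python
from fractions import Fraction

# One abstract number, several representations.
n = 10
print(bin(n))        # binary:      0b1010
print(hex(n))        # hexadecimal: 0xa
print((0.5).hex())   # the float one-half in hex notation

# .999… is shorthand for 9/10 + 9/100 + 9/1000 + ...
# Using exact fractions, the geometric series formula a / (1 - r)
# with a = 9/10 and r = 1/10 gives exactly 1 -- no rounding involved.
total = Fraction(9, 10) / (1 - Fraction(1, 10))
print(total)         # prints 1
```

The point of using `Fraction` rather than floats is that the arithmetic is exact, so the result really is the integer 1, not something merely close to it.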