The weird feeling we get only arises because we usually don't think about what 0.999 … actually IS. "It just has infinitely many 9's". What does that actually mean?
If you write 0.999 … down, does it get more 9's as we speak? In that case, any equation containing it is wrong, because its value changes all the time. You can't work with that. It's like saying "this section of the river has 10 fish": that statement can never stay right for long, because the number of fish changes all the time, so eventually there may be more than 10 fish.
So is it a fixed amount of 9's? No, that's nonsense. We can't say that "infinitely many 9's" means there is some fixed, finite number of 9's.
So the notion of "infinitely many 9's" doesn't actually make sense on its own. No matter how we picture it, we get clear logical issues. If we want to do math with it, we need to assign it a value that stays *fixed* and which doesn't "change as we speak". There are two important observations for this task:
(1) 0.999 … is *always* less than or equal to 1.
(2) 0.999 … is bigger than *any* number below 1 (because any number below 1 is eventually surpassed by 0.9, 0.99, 0.999, 0.9999, etc.)
So IF 0.999 … is equal to any *fixed* number, the only candidate is 1. That's why mathematicians defined 0.999 … = 1.
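Here's a quick sketch of those two observations (Python, purely illustrative, using exact fractions so no rounding sneaks in):

```python
from fractions import Fraction

# Partial sums 0.9, 0.99, 0.999, ... computed as exact fractions.
# Each one stays below 1 (observation 1), and the gap to 1 shrinks
# past any fixed positive amount (observation 2), so 1 is the only
# fixed value that fits both.
s = Fraction(0)
for n in range(1, 8):
    s += Fraction(9, 10**n)
    print(f"{n} nines: partial sum = {s}, gap to 1 = {1 - s}")
```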
How to explain it practically to a kid? I suppose it depends on how old the kid is. The basic idea is that a number can have more than one name, just like a person can: Dwayne, Mr. Johnson, and The Rock all refer to the same person.
If the kid is older and has more understanding of maths, you could ask them about fractions: 1/2 = 2/4 = 356/712 = 0.5. We're used to the idea that you can have multiple equivalent fractions. There's nothing in the rules that says you can't have multiple equivalent decimals too. It's just not as common, and less likely to crop up.
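If you want to demo that, Python's `fractions` module (just a throwaway illustration) treats equivalent fractions as literally the same number:

```python
from fractions import Fraction

# Equivalent fractions are the same number under different names.
print(Fraction(1, 2) == Fraction(2, 4) == Fraction(356, 712))  # True
print(float(Fraction(356, 712)))                               # 0.5
```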
You can make any repeating decimal you want by doing the following:
Choose a repeating part; I'll take 1001. Divide your repeating part by a number made of nines, with as many nines as your repeating part has digits. So for me that would be 1001/9999. This equals 0.100110011001…
The reverse is also true, so 0.333… is 3/9 = 1/3
And 0.999… = 9/9 = 1
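You can sanity-check the trick with exact arithmetic (Python here; the precision setting is arbitrary):

```python
from decimal import Decimal, getcontext
from fractions import Fraction

getcontext().prec = 24
print(Decimal(1001) / Decimal(9999))  # 0.100110011001100110011001...
print(Fraction(3, 9))                 # 1/3  (the reverse direction)
print(Fraction(9, 9))                 # 1    -- nine over nine IS the whole number
```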
You’ve seen the proof, but I never really liked it until someone told me: “find a number between 0.999… and 1”. That’s the real evidence to me. There is no number between them, so they have to be the same number.
Number between 1 and 2? 1.1.
Number between 1 and 1.1? 1.01
Etc
Between any two different rational numbers there are always infinitely many more rational numbers. The rationals are called dense because of this.
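To make the density claim concrete, here's a small sketch (Python; the helper name is made up): if two numbers differ, their average sits strictly between them, so the only way for there to be *no* number between 0.999… and 1 is for them to be the same number.

```python
from fractions import Fraction

# If a != b, their average lies strictly between them.
def between(a, b):
    return (a + b) / 2

a, b = Fraction(999, 1000), Fraction(1)
for _ in range(5):
    m = between(a, b)
    print(m)   # always finds a new number, as long as a and b differ
    a = m
```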
Sorry for any non-technical aspects of this explanation, I’m a physicist, not a mathematician.
0.999… is considered equal to 1 in mathematics because, in the decimal number system, it is a different way to represent the same quantity as 1. In other words, 0.999… and 1 are two different decimal representations of the same number, which is exactly 1.
One way to understand this is through fractions. You can express 1 as a fraction: 1/1. Now, if you divide 1 by 3, you get the fraction 1/3, which in decimal form is 0.333… (repeating). If you multiply 0.333… (repeating) by 3, you get 0.999… (repeating), which is equivalent to 1/1 or simply 1.
So, mathematically, 0.999… and 1 are two different notations for the same value, and that's why they are considered equal.
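The same argument in exact arithmetic (Python's `Fraction`, which stores 1/3 exactly instead of as a truncated decimal):

```python
from fractions import Fraction

third = Fraction(1, 3)   # the exact value that 0.333... stands for
print(third * 3)         # 1 -- so 3 x 0.333... (= 0.999...) is exactly 1
```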
One way to define what a number actually **is**, is to create something called a Dedekind Cut. In simple terms, I can "slice" the number line at a certain point, put all the points on the line to the left in one collection and all the points on the line to the right in another collection. Then, I can say that a number is "equal to" the collection running off to the left.
The set which defines 0.999… has precisely the same points in it as does the set that defines 1, so they are the same number.
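Here's a toy version of that idea (Python; the function names are invented, and the finite loop stands in for "below *some* truncation 0.9, 0.99, 0.999, …"):

```python
from fractions import Fraction

# A rational q is in the cut for 0.999... if it lies below some finite
# truncation 0.9, 0.99, 0.999, ...; it is in the cut for 1 if q < 1.
def in_cut_of_repeating_nines(q, max_digits=200):
    return any(q < Fraction(10**n - 1, 10**n) for n in range(1, max_digits))

def in_cut_of_one(q):
    return q < 1

samples = [Fraction(1, 2), Fraction(99, 100), Fraction(1), Fraction(3, 2)]
print(all(in_cut_of_repeating_nines(q) == in_cut_of_one(q) for q in samples))  # True
```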
Not a math expert, but here is how I understand it: you are using two different counting systems to represent the same thing.
Imagine you have an apple and cut it into 3 pieces. You call each piece 1/3. Put all 3 pieces back together and you get the whole apple, or 3/3. Nothing is different between the 3 pieces and the whole apple you started with.
Now you decide to convert this fraction to a decimal. The problem is that a decimal (in base 10) is just the fraction 1/10 for the first digit, 1/100 for the second digit, and so on. Why we use 10 is arbitrary, probably because humans have 10 fingers.
The point is we don't have to use 10; it is more or less a random number that humans made up. If we dropped a digit (or finger) and counted in base 9, this conversion would come out nice and clean (1/3 would be exactly 0.3 in base 9), but the same problem would show up in other places.
So what happens is you try to convert 1/3 = X/10, and no whole number X fits. So we add a zero: 1/3 = X/100, and it still doesn't work, no matter how many zeros you add to the denominator. 1/3 = X/10000000000 still doesn't give a whole number. So you are left with 0.333… repeating forever to represent 1 slice of apple and 0.999… to represent the whole apple.
All you were trying to do is count 1/3 of an apple using only powers of the fraction 1/10, which never comes out exactly right.
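Here's why it never fits, in a few lines (Python; the loop bound is arbitrary): 10, 100, 1000, … are never divisible by 3, so the division always leaves a remainder.

```python
# 10**n is never divisible by 3, so X/10**n can never equal 1/3 exactly.
for n in range(1, 8):
    x, r = divmod(10**n, 3)
    print(f"1/3 ~ {x}/{10**n}, remainder {r}")  # remainder is always 1
```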
Imagine you have “1”.
If you subtract a tiny bit from 1, you have 0.9
If you subtract a tinier bit from 1, you have 0.99
You can subtract tinier and tinier bits, getting 0.9999…
Until finally the bits are so small that they're actually nothing at all. When you take away nothing from 1, you still have 1.
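In exact numbers (a small Python sketch), the "bit" still missing after n nines is exactly 1/10^n, and the only amount smaller than all of those bits is nothing at all:

```python
from fractions import Fraction

# After n nines, the bit still missing from 1 is exactly 1/10**n.
for n in (1, 2, 5, 10):
    nines = Fraction(10**n - 1, 10**n)    # 0.9, 0.99, 0.99999, ...
    print(f"{n} nines -> bit left: {1 - nines}")
```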