Normal just means that every digit appears roughly the same fraction of the time as all the others. For example, if pi is normal in base 10, then in its infinite decimal expansion there would be approximately the same number of 1s as 2s, 3s, and so on, with every digit from 0 to 9 appearing about 10% of the time. It would be *not normal* if, for example, there were twice as many 3s as 7s or something like that.
Based on the expansions we've calculated (pi has been computed out to trillions of digits by now), we can conjecture that a number is normal, but it's very hard to *prove*. Maybe after the trillionth digit there suddenly aren't any more 7s. We assume that won't happen, but no one has been able to prove it yet.
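You can check the digits we *do* have empirically. Here's a rough sketch of what that looks like, assuming Python with the mpmath library (my choice for generating digits, not something from the original answer):

```python
from collections import Counter
from mpmath import mp  # arbitrary-precision math; assumed installed (pip install mpmath)

# Ask mpmath for roughly 100,000 decimal digits of pi.
mp.dps = 100_005                                    # working precision, with a few guard digits
digits = str(mp.pi).replace(".", "")[:100_000]      # "31415926535..." -> first 100,000 digits

counts = Counter(digits)
total = len(digits)

for d in "0123456789":
    share = counts[d] / total
    print(f"digit {d}: {counts[d]:>6} times  ({share:.2%})")
```

Each digit comes out close to 10%, which is exactly what "looks normal so far" means. But that's evidence, not a proof, because it says nothing about the infinitely many digits you haven't checked.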