Not a mathematician here at all so perhaps my question is phrased incorrectly.
Let’s say through thorough testing in reality, we can verify with certainty that Pi is correct up to 5 decimal places:
3.14159
For the computers calculating Pi to ever more digits, how do they validate that each newly calculated digit is correct and cannot be otherwise?
Its decimal expansion may be infinite and non-repeating, but it’s still computable.
We have multiple ways of creating sequences of numbers that have been proven to approximate π better and better as the sequence progresses. (https://en.wikipedia.org/wiki/Pi has several examples).
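One of the simplest such sequences (not the one real record-setting computations use, just an illustration) is the Leibniz formula, π/4 = 1 − 1/3 + 1/5 − 1/7 + …. It converges slowly, but the error after n terms is provably smaller than the next term, so we always know how close we are:

```python
def leibniz_pi(terms):
    # Partial sum of the Leibniz series: pi/4 = 1 - 1/3 + 1/5 - ...
    # The error of the partial sum is bounded by the first omitted term,
    # so after n terms the result is within 4/(2n+1) of pi.
    total = 0.0
    for k in range(terms):
        total += (-1) ** k / (2 * k + 1)
    return 4 * total

approx = leibniz_pi(1_000_000)  # guaranteed within about 2e-6 of pi
```

The point isn’t speed: it’s that the approximation comes with a proven error bound, so the digits that have stopped changing are known to be correct.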
We can check a simpler irrational number: √2.
We know that √2 multiplied by itself gives 2, right? That’s kind of its definition.
Now, we know 1² = 1 < 2 and 2² = 4 > 2, so 1 < √2 < 2
And we keep going, 1.4² = 1.96, 1.5² = 2.25.
And we keep adding decimal places. At each step we can find two numbers that sandwich √2 ever more tightly, so we know each new digit is the correct one.
There are similar processes for π. The exact process is (very very much) more complicated, but the principle remains the same.