Not a mathematician here at all so perhaps my question is phrased incorrectly.
Let’s say that through thorough real-world testing we can prove with certainty that Pi is correct up to 5 decimal places:
3.14159
For the computers that calculate Pi to ever more digits, how do they validate that the newly calculated digits are correct and cannot be otherwise?
It depends on the method you use to calculate it.
Some methods use upper and lower bounds, and if both bounds agree on their first 100 digits, the actual value of pi must also begin with those 100 digits.
A simpler example is √2. It’s also irrational, so its decimal expansion is infinite. 1.414² < 2 < 1.415², so 1.414 < √2 < 1.415. This means the digits 1.414 are definitely correct.
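To make the squeeze idea concrete, here is a minimal sketch in Python (my illustration, not from the answer): it bisects an interval known to contain √2, and any digits shared by both bounds are guaranteed correct. `Decimal` is used so the bounds aren’t distorted by float rounding.

```python
from decimal import Decimal, getcontext

getcontext().prec = 30  # work with 30 significant digits

def sqrt2_bounds(iterations=60):
    """Return (lo, hi) with lo < sqrt(2) < hi via bisection."""
    lo, hi = Decimal(1), Decimal(2)  # 1^2 < 2 < 2^2
    for _ in range(iterations):
        mid = (lo + hi) / 2
        if mid * mid < 2:
            lo = mid  # mid is still below sqrt(2)
        else:
            hi = mid  # mid is at or above sqrt(2)
    return lo, hi

lo, hi = sqrt2_bounds()
# Every leading digit that lo and hi share is a certainly-correct
# digit of sqrt(2); after 60 halvings the interval is ~1e-18 wide.
```

The same logic applies to pi: an algorithm that produces a shrinking interval around pi certifies every digit on which both endpoints agree.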
For other methods you might be able to calculate the error in some way and get an upper bound on it. If the error is less than 0.00000000001, you know you have at least 10 correct decimal places.
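As a small worked example of an error bound (my choice of series, not what record computations use): the Leibniz series for pi alternates in sign, so the truncation error is at most the magnitude of the first omitted term.

```python
def pi_with_error_bound(n_terms):
    """Approximate pi via the Leibniz series 4 * sum((-1)^k / (2k+1)).

    Because the series is alternating with decreasing terms, the
    truncation error is bounded by the first omitted term.
    """
    total = sum((-1) ** k / (2 * k + 1) for k in range(n_terms))
    approx = 4 * total
    error_bound = 4 / (2 * n_terms + 1)  # first omitted term
    return approx, error_bound

approx, err = pi_with_error_bound(1_000_000)
# err is about 2e-6 here, so roughly the first 5 digits after the
# decimal point are guaranteed; more terms shrink the bound.
```

The Leibniz series converges far too slowly for serious digit hunting, but the principle is the same one used by faster formulas: a provable error bound tells you exactly how many digits you can trust.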