Not a mathematician here at all, so perhaps my question is phrased incorrectly.
Let's say that through thorough real-world testing, we can verify with certainty that pi is correct up to 5 decimal places:
3.14159
For the computers that calculate pi to an ever greater number of digits, how do they validate that newly calculated digits are correct and cannot be otherwise?
The formulas for calculating pi come with error bounds that shrink as you calculate more. If you’re partially calculating it and you have 3.1416… with an error bound of 0.0001, then you know that pi is between 3.1415 and 3.1417. You’ve now guaranteed that the first 4 digits are 3.141. If you calculate more to shrink the error bound further, then you can be sure of more decimal places in the same way.
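As a minimal sketch of this idea, here is the (slowly converging) Leibniz series for pi, which is an alternating series, so the truncation error is at most the size of the first omitted term. That gives a guaranteed interval around the partial sum:

```python
# Sketch: computing pi with a guaranteed error bound, using the Leibniz
# series pi = 4*(1 - 1/3 + 1/5 - 1/7 + ...). For an alternating series
# with shrinking terms, the truncation error is at most the first
# omitted term, so we get a certified interval containing pi.
def pi_with_bound(n_terms):
    s = 0.0
    sign = 1.0
    for k in range(n_terms):
        s += sign * 4.0 / (2 * k + 1)
        sign = -sign
    error_bound = 4.0 / (2 * n_terms + 1)  # size of the first omitted term
    return s, error_bound

approx, bound = pi_with_bound(200_000)
lo, hi = approx - bound, approx + bound
print(f"pi is certainly between {lo:.7f} and {hi:.7f}")
```

Real record-setting computations use much faster formulas (and exact arithmetic rather than floats), but the principle is the same: the interval is guaranteed, and every digit on which the two endpoints agree is certain.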
The only difficulty is with the digits 9 and 0, because 0.99999 is extremely close to 1.00000, even though every digit differs. But there are independent proofs that pi is irrational, so its decimal expansion cannot end in 9s (or 0s) repeating forever. So if we ever hit a long string of 9s, we just keep calculating for longer and eventually they will stop, letting the interval's endpoints agree again.
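The 9s-and-0s caveat can be illustrated with a small (hypothetical) helper that extracts the digits pinned down by an interval, i.e. the leading digits the two endpoints agree on:

```python
# Sketch (hypothetical helper): which digits are pinned down by an
# interval [lo, hi]? Only the leading digits that lo and hi agree on.
def agreed_prefix(lo, hi, places=10):
    a = f"{lo:.{places}f}"
    b = f"{hi:.{places}f}"
    prefix = ""
    for ca, cb in zip(a, b):
        if ca != cb:
            break
        prefix += ca
    return prefix

print(agreed_prefix(3.14158, 3.14160))  # prints "3.141" -- those digits are certain
print(agreed_prefix(0.99999, 1.00001))  # prints "" -- no digit is certain yet
```

The second call shows the problem: even with a tiny error bound, an interval straddling 0.99999…/1.00000… certifies no digits at all until further calculation pushes the whole interval to one side.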