Not a mathematician here at all, so perhaps my question is phrased incorrectly.
Let's say that, through thorough real-world testing, we can prove with certainty that Pi is correct up to 5 decimal places:
3.14159
For the computers that keep calculating Pi to ever more digits, how do they validate that the newly calculated digits are correct and cannot be otherwise?
Mathematician here:
Some numbers are really easy to write out and work with: whole numbers and fractions, for example.
Pi is more difficult to write out, but still easy to work with. By this I mean we have methods of approximating pi using sums of fractions, and for each method mathematicians can prove a maximum possible error that depends on how many terms you've calculated. The more terms you compute, the smaller that maximum error becomes. By making the maximum error small enough, you can verify that a certain number of decimal places in your approximation are definitely correct.
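As a rough illustration (not what record-setting programs actually use, since they rely on much faster formulas), here's a small Python sketch using the Leibniz series, where the error after n terms is provably no bigger than the next term:

```python
# Sketch: approximate pi with the Leibniz series 4*(1 - 1/3 + 1/5 - 1/7 + ...).
# Because the series alternates and its terms keep shrinking, the error after
# n terms is at most the size of the first omitted term, 4 / (2n + 1).
# That provable "maximum error" is what lets you say how many decimal places
# of the approximation are guaranteed correct.

def pi_with_error_bound(n_terms: int) -> tuple[float, float]:
    approx = 0.0
    for k in range(n_terms):
        approx += 4.0 * (-1) ** k / (2 * k + 1)
    max_error = 4.0 / (2 * n_terms + 1)  # size of the first omitted term
    return approx, max_error

approx, bound = pi_with_error_bound(2_000_000)
print(f"approximation:     {approx}")
print(f"guaranteed within: {bound}")  # about 1e-6, so ~5 decimal places are trustworthy
```

The real record computations use the same idea with formulas that converge enormously faster, and they typically double-check by recomputing some digits with a second, independent formula.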