Not a mathematician here at all so perhaps my question is phrased incorrectly.
Let’s say that through thorough real-world testing we can verify with certainty that pi is correct to 5 decimal places,
3.14159
For the computers calculating pi to ever more digits, how do they validate that the newly calculated digits are correct and cannot be otherwise?
pi divided by 4 is 0.78539816339…
You can approximate this value as 1 – 1/3. It’s 0.666, a bad approximation, so let’s add 1/5. That gives us 0.8666, closer but too high. Let’s subtract 1/7. That gives us 0.7238, too low now, so we add 1/9…
This sequence, **1 – 1/3 + 1/5 – 1/7 + 1/9 – 1/11 + 1/13…**, is called the Gregory-Leibniz series, and each time you add or subtract the next fraction with an odd denominator, the running total gets a little closer to pi/4. Mathematicians have proven this conclusively (it’s a consequence of the alternating series test, whose proof is too long to give here).
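The hand calculation above can be sketched in a few lines of Python, printing each partial sum as terms are added:

```python
# Partial sums of the Gregory-Leibniz series: 1 - 1/3 + 1/5 - 1/7 + 1/9
# Each step alternates above and below pi/4, getting closer every time.
partials = []
total = 0.0
for k in range(5):
    total += (-1) ** k / (2 * k + 1)  # denominators 1, 3, 5, 7, 9
    partials.append(total)
    print(f"after {k + 1} term(s): {total:.4f}")
```

The printed values match the ones worked out above: 0.6667, then 0.8667, then 0.7238, and so on, bracketing pi/4 ≈ 0.7854 ever more tightly.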
To calculate the next (several) decimals of pi, a computer can simply add the next number in the series. Real record-setting computations don’t use the Leibniz formula (it converges far too slowly: millions of terms per handful of digits), but they use the same basic idea with much faster-converging series, where each new term provably tightens the bound on pi.
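As a minimal sketch of that idea in Python (using the slow Leibniz series, not the formulas actual record computations use):

```python
def leibniz_pi(terms: int) -> float:
    """Approximate pi as 4 * (1 - 1/3 + 1/5 - 1/7 + ...)."""
    total = 0.0
    for k in range(terms):
        total += (-1) ** k / (2 * k + 1)
    return 4 * total

# A million terms only pins down about the first 6 digits, 3.14159...
print(leibniz_pi(1_000_000))
```

The slow convergence here is exactly why real pi computations switch to faster series: the principle (summing provably convergent terms) is the same, only the formula is better.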
Similar functions exist to calculate other constants as well.
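For instance (my illustration, not from the answer above), Euler’s number e has its own well-known series, e = 1/0! + 1/1! + 1/2! + …, which converges very quickly:

```python
import math

def series_e(terms: int) -> float:
    """Approximate e by summing 1/k! for k = 0, 1, 2, ..."""
    return sum(1 / math.factorial(k) for k in range(terms))

# Just 20 terms already agree with math.e to double precision.
print(series_e(20))
```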