If a number like Pi is infinite, how do we know each newly calculated decimal is valid?

Not a mathematician here at all, so perhaps my question is phrased incorrectly.

Let’s say that, through thorough testing in reality, we can prove with certainty that Pi is correct up to 5 decimal places:

3.14159

As for the computers that calculate Pi to an endless degree, how do they validate that each newly calculated value is correct and cannot be otherwise?

36 Answers

Anonymous

When a computer approximates pi, what it’s typically doing is evaluating an infinite sum that equals pi. Here’s [one example](http://www.mathmavericktutor.com/wp-content/uploads/2016/03/PI-Sum1-1024x175.png) of such a sum, the Leibniz series 1 − 1/3 + 1/5 − 1/7 + ⋯ (you would just multiply your final answer by 4 to get pi). Computers obviously can’t add up infinitely many terms, so instead they just add a bunch of terms. A sum like this adds smaller and smaller numbers as it goes, so there will always be a *finite* point in the sum where we know our error is less than, for example, 1/1000. Knowing we can’t be more than 0.001 away from pi pins down the leading digits: a digit is guaranteed once the remaining error is too small to carry across a digit boundary. So people just tell the computer to run until it is within their accepted error bound, and that tells them that every digit of their approximation, up to their magnitude of error, is accurate.
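
Here is a minimal Python sketch of that run-until-the-error-bound-is-met idea, using the Leibniz series linked above. The function name and error bound are just illustrative, and this is nothing like how record-setting computations actually work; it only shows the principle. For an alternating series whose terms shrink toward zero, the error of a partial sum is at most the first omitted term, which is exactly what the loop condition checks.

```python
# Approximate pi with the Leibniz series:  pi/4 = 1 - 1/3 + 1/5 - 1/7 + ...
# For an alternating series with shrinking terms, the error of a partial sum
# is at most the first omitted term, so we stop as soon as that term
# (times 4, since we scale the whole sum by 4) drops below our bound.

def approximate_pi(error_bound: float) -> float:
    total = 0.0
    k = 0
    while 4.0 / (2 * k + 1) > error_bound:   # next term could still exceed the bound
        total += (-1) ** k / (2 * k + 1)     # add term k of the series
        k += 1
    return 4.0 * total

print(approximate_pi(0.001))  # guaranteed to be within 0.001 of pi
```

The Leibniz series converges very slowly (about 2,000 terms for an error bound of 0.001), which is why real computations use much faster-converging formulas, but the validation logic is the same: the formula comes with a provable error bound, so every digit inside that bound is known to be correct.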
