Not a mathematician here at all, so perhaps my question is phrased incorrectly.
Let’s say that through thorough real-world testing we can verify with certainty that pi is correct to 5 decimal places:
3.14159
For the computers that calculate pi to ever more digits, how do they validate that the newly calculated digits are correct and cannot be otherwise?
Note that calculating more digits of pi doesn’t have practical value. It’s now a way to show technical chops: how the program is written, the hardware used to perform the calculation, or whether a new algorithm is faster than others.
NASA doesn’t use more than 15 digits of pi (including the 3 in front of the decimal point) in their most precise calculations: [https://www.jpl.nasa.gov/edu/news/2016/3/16/how-many-decimals-of-pi-do-we-really-need/](https://www.jpl.nasa.gov/edu/news/2016/3/16/how-many-decimals-of-pi-do-we-really-need/)
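As for validating new digits: record computations are typically cross-checked by recomputing with a second, independent method. One well-known tool for this is the Bailey–Borwein–Plouffe (BBP) formula, which can compute the hexadecimal digit of pi at an arbitrary position without computing any of the digits before it, so a handful of positions in a new record can be spot-checked independently. A minimal sketch in Python (the function name and the small tail-term cutoff are my own choices, and float precision limits this toy version to modest positions):

```python
def bbp_hex_digit(n):
    """Return the hex digit of pi at position n after the point (0-indexed),
    using the BBP formula: pi = sum_k 16^-k * (4/(8k+1) - 2/(8k+4) - 1/(8k+5) - 1/(8k+6))."""
    def series(j, n):
        # Fractional part of sum_k 16^(n-k) / (8k+j), using modular
        # exponentiation so the head of the series never overflows.
        s = 0.0
        for k in range(n + 1):
            s = (s + pow(16, n - k, 8 * k + j) / (8 * k + j)) % 1.0
        # A few tail terms (k > n) are enough: they shrink by 16x each step.
        for k in range(n + 1, n + 16):
            s += 16 ** (n - k) / (8 * k + j)
        return s % 1.0

    x = (4 * series(1, n) - 2 * series(4, n)
         - series(5, n) - series(6, n)) % 1.0
    return "%x" % int(x * 16)

# pi in hex is 3.243F6A88..., so the digits at positions 0-5 are 2,4,3,F,6,A.
print("".join(bbp_hex_digit(n) for n in range(6)))
```

A full-scale verification would do this at a few random positions deep inside the new computation and compare against the digits the main algorithm (e.g. Chudnovsky) produced there; agreement at independent spot-checks is strong evidence the whole run is correct.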