Not a mathematician here at all so perhaps my question is phrased incorrectly.
Let’s say that, through thorough real-world testing, we can verify with certainty that Pi is correct to 5 decimal places:
3.14159
For the computers that keep calculating Pi to ever more digits, how do they validate that the newly calculated digits are correct and cannot be otherwise?
I’ll use an example that helped my brain comprehend it. Say we are trying to split 10 into 3 even parts.
Well, we know it can’t be 4, 4, 4 because that would add up to more than 10. It also can’t be 3, 3, 3 because that would be under. Therefore it must be 3.x, 3.x, 3.x.
What should x be? Well, we know it can’t be 3.4, 3.4, 3.4 because that would be more than 10, and it can’t be 3.3, 3.3, 3.3 because that would be under. Therefore it must be 3.3x, 3.3x, 3.3x.
As you can probably tell, you can do this over and over and over: 1/3 of 10 also has an infinitely long decimal expansion. Obviously computers can do this kind of calculation much more quickly and easily than we can, but there is a way to work out each decimal; it’s just a little different from the example I’ve given.
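To make the idea concrete, here is a minimal sketch of that digit-narrowing process in Python. It only illustrates the bracketing idea from the example above (try each candidate digit, keep the largest one that doesn’t overshoot); real Pi computations use much faster series, and the function name `digits_of_quotient` is just a name I made up for this sketch.

```python
def digits_of_quotient(numerator, denominator, places):
    """Find numerator/denominator one digit at a time by bracketing:
    at each step, keep the largest digit d that doesn't overshoot."""
    result = []
    remainder = numerator
    for _ in range(places + 1):
        # the biggest digit d with d * denominator <= remainder
        d = max(d for d in range(10) if d * denominator <= remainder)
        result.append(d)
        # carry what's left over into the next decimal place
        remainder = (remainder - d * denominator) * 10
    return result

# Splitting 10 into 3 even parts, as in the example: each part is 3.33333...
print(digits_of_quotient(10, 3, 5))  # → [3, 3, 3, 3, 3, 3]
```

Because this uses only integer arithmetic, each digit it produces is exact; once a digit is pinned down, no later step can change it, which is the same reason the 3s in the example never need revisiting.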