What do mathematicians mean when they say that the calculus that Newton and Leibniz invented was not rigorous?


Was it that Newton and Leibniz didn’t really know what they were doing? I presume that they had to create proofs for how their integrals and derivatives worked.

Were their calculations incorrect because they weren’t rigorous?


6 Answers

Anonymous

Newton and Leibniz knew what they were doing, but they were making some unstated assumptions about how the real numbers work.

For example, consider the following statement:

* Given any number x > 0, there exists some number y such that 0 < y < x.

In more everyday terms, this statement says “there is no smallest positive number”. This statement is true for real numbers and for rational numbers, but it isn’t true for integers (there is no integer y such that 0 < y < 1).
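A quick way to see why this holds for the reals and the rationals (a standard one-line argument, not quoted from anywhere in particular): halving any positive number produces a smaller positive number.

```latex
% Given any x > 0, the midpoint between 0 and x is a witness:
0 < \frac{x}{2} < x \qquad \text{for every } x > 0,
% so no positive real (or rational) number can be the smallest one.
```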

Now consider the following, trickier statement:

* Consider an infinite sequence of numbers x1, x2, x3, … with the following two properties. First, each number is larger than the previous number. And second, there is a single fixed number that is bigger than every number in the sequence. Then this sequence necessarily approaches a limit.

(For you real analysis students out there, this is, essentially, the [monotone convergence theorem](https://en.wikipedia.org/wiki/Monotone_convergence_theorem).)
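In standard notation (a textbook phrasing of the statement above, not a quote from the linked page):

```latex
% Monotone convergence theorem: an increasing sequence
% bounded above by a single number M converges.
x_1 < x_2 < x_3 < \cdots \quad \text{and} \quad x_n < M \ \text{for all } n
\;\implies\; \lim_{n \to \infty} x_n \ \text{exists (and equals } \sup_n x_n\text{)}.
```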

In everyday-ish terms, this says “a sequence that moves in only one direction can have one of two behaviors. Either it runs off to infinity, or it approaches some number more and more closely.” For example, the sequence 3, 3.1, 3.14, 3.141, 3.1415, 3.14159, 3.141592… approaches the value pi, while the sequence 1, 2, 4, 8, 16 … runs off to infinity.

This statement is true for real numbers, but it is **not** true for rational numbers. The sequence 3, 3.1, 3.14… described above is a sequence of rational numbers that has the two properties given (every number is bigger than the last, and all the numbers are less than, say, 4), but it does *not* approach a rational number as a limit.
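Here is a small Python sketch of that sequence (the use of Python's `fractions` module and the cutoff at 8 terms are illustration choices of mine, not part of the answer): every term is an exact ratio of integers, both properties hold, and yet the terms crowd toward pi.

```python
from fractions import Fraction
import math

# The sequence 3, 3.1, 3.14, ... from the answer, built as exact
# fractions by truncating pi to k decimal places.
truncations = [Fraction(int(math.pi * 10**k), 10**k) for k in range(8)]

# Property 1: each term is at least as large as the previous one.
assert all(a <= b for a, b in zip(truncations, truncations[1:]))

# Property 2: one single bound works for every term (4, as in the answer).
assert all(t < 4 for t in truncations)

# Every term is a ratio of two integers, i.e. rational...
print(truncations[-1])                 # last term, as a fraction in lowest terms
# ...but the value the terms crowd toward, pi, is not rational.
print(float(truncations[-1]), math.pi)
```

The part the code can only hint at: no fraction this process could ever produce equals the limit, and that gap in the rationals is exactly what a rigorous construction of the real numbers has to fill.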

Newton and Leibniz were implicitly assuming statements like this one in their development of calculus. But since they didn’t have a rigorous definition of the real numbers, they had no rigorous way to prove such statements, so their calculus rested on the assumption that they were true. It turns out that they are true in a proper formalization of the real numbers, but Newton and Leibniz did not know that, and could not have known it, since a rigorous definition of the real numbers did not yet exist.
