Subtraction isn’t much harder than addition. Division isn’t much harder than multiplication. Logarithmization isn’t much harder than exponentiation. Then, what fundamentally makes integration so much more difficult than differentiation?

I think what’s happening here is a conflation of two aspects: the numerical and the symbolic.

It is usually (relatively) easy to perform all the mentioned operations numerically, to a given precision. We can work with decimals and, if required, carefully track significant digits and error margins. That even works for differentiation and integration: there are better methods, but one can often get by with the basic approach of summing thin vertical bars (Riemann sums) to find the area under a curve.
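As a minimal sketch of that numerical approach (the function names are my own, not a standard API): a midpoint Riemann sum for the integral, and a central difference quotient for the derivative.

```python
import math

def riemann_integral(f, a, b, n=100_000):
    """Approximate the integral of f over [a, b] by summing n thin
    rectangles, sampling f at the midpoint of each (midpoint rule)."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

def central_difference(f, x, h=1e-6):
    """Approximate f'(x) by the slope of a short secant through x."""
    return (f(x + h) - f(x - h)) / (2 * h)

print(riemann_integral(math.sin, 0, math.pi))  # close to the exact value 2
print(central_difference(math.sin, 0.0))       # close to cos(0) = 1
```

Both routines know nothing about the formula for f; they just evaluate it, which is exactly why the numerical versions of these problems are comparatively tame.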

Symbolically, I would argue that all of these operations are difficult! Yes, we can easily and without any numerical error add, subtract, multiply, or form powers of integers and the like. But the analogue of that for differentiation and integration is working with basic polynomials such as x³+7x²−5x+2. It is not hard to find the derivative and antiderivative of such expressions.

Things however get ugly when we involve slightly more complicated numbers: ask yourself whether e^(𝜋·√163) is actually an integer. Or whether there is any expression using e and 𝜋 (but no other numbers or constants) and only +, −, · that amounts to exactly 1. The answer to the former is no, but you need to do quite a bit of digit- and error-tracking to properly see it. The second one is an open problem that all of humanity has, so far, failed to solve!
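You can watch the first of these near-misses happen, assuming the arbitrary-precision library mpmath is available (an ordinary 64-bit float cannot resolve the gap at all):

```python
from mpmath import mp, exp, pi, sqrt, nint

mp.dps = 50  # work with 50 significant digits
x = exp(pi * sqrt(163))

print(x)            # 262537412640768743.99999999999925..., agonizingly close to an integer
print(x - nint(x))  # a tiny but nonzero gap, roughly -7.5e-13
```

Only by carrying far more precision than the eventual answer needs can we be sure the gap is genuinely nonzero rather than rounding noise.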

So why would we expect symbolic differentiation and integration to be any simpler? Differentiation works reasonably simply for what we call _elementary functions_, those built from the four basic arithmetical operations, exponentials, logarithms, and roots. But as soon as we allow more complicated functions such as Riemann’s zeta function or the Gamma function, we don’t get a result with any better description than being that derivative.
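The mechanical nature of differentiating elementary functions is easy to demonstrate with a computer algebra system such as SymPy (assumed installed here): the product and chain rules apply recursively, no cleverness required.

```python
import sympy as sp

x = sp.symbols('x')
f = sp.exp(sp.sin(x)) * sp.log(x)  # an arbitrary elementary function

# Product rule + chain rule, applied purely mechanically:
print(sp.diff(f, x))  # exp(sin(x))*cos(x)*log(x) + exp(sin(x))/x, up to term ordering
```

Going the other way, finding an elementary antiderivative of exp(sin(x))·log(x), is a different story entirely.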

Same with integration: there actually _is_ an algorithm, due to Risch, that finds an antiderivative of an elementary function as an elementary expression if one exists, and tells you when none does*. It only uses the more commonly known methods to accomplish this. And again, with more complex things such as the zeta function(s), the antiderivative is “just that”; it lacks any better description, just as with derivatives.
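A classic example of the “no elementary antiderivative” outcome, again using SymPy (which implements Risch-style methods among others):

```python
import sympy as sp

x = sp.symbols('x')

# The Gaussian e^(-x^2) famously has no elementary antiderivative;
# the answer can only be expressed via the non-elementary error function erf.
F = sp.integrate(sp.exp(-x**2), x)
print(F)  # sqrt(pi)*erf(x)/2
```

Here erf is essentially *defined* as this antiderivative (up to scaling), which illustrates the point: sometimes the best available description of an integral is “the function whose derivative is that”.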

*: terms and conditions apply: it has to assume that the question “do two elementary expressions describe the same function?” can be decided algorithmically; we currently don’t know whether that is possible.
