Why do I have to put a constant on an indefinite integral but don’t have to on a definite one?



In: Mathematics

2 Answers

Anonymous

Think of integration as differentiation in reverse. If you differentiate the result of your integration, you get back to where you started, right?

But it also works if you add any constant to the end of your integrated function, since the derivative of a constant is 0. So the true antiderivative you’re looking for could have any number as that last constant. You can’t know what that number is without more info (an initial condition), so you leave it as C.
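A quick numerical sketch of that point (plain Python, with hypothetical function names chosen for illustration): two antiderivatives of f(x) = x that differ by a constant have the same derivative everywhere, so differentiating can never tell you which C you started from.

```python
# Two antiderivatives of f(x) = x that differ only by a constant.
def F1(x):
    return x**2 / 2          # one antiderivative of f(x) = x

def F2(x):
    return x**2 / 2 + 5.0    # the same antiderivative shifted by C = 5

def derivative(g, x, h=1e-6):
    # Central-difference approximation of g'(x).
    return (g(x + h) - g(x - h)) / (2 * h)

for x in [0.5, 1.0, 2.0]:
    # Both derivatives come out approximately equal to x;
    # the constant C has vanished, so it can't be recovered.
    print(x, derivative(F1, x), derivative(F2, x))
```

Since the constant disappears under differentiation, integration (its reverse) must report every possible constant, which is what the "+ C" records.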

When you’re given a definite integral, your result will be a number, not a function. There’s no need to worry about whether you’re missing possible functions when your answer isn’t even a function.

Anonymous

One way to think about it: a definite integral is an indefinite integral with an extra step applied at the end, and that extra step happens to cancel out the constant.

Suppose you’re doing the definite integral of x^(2) dx from x = 1 to 3.

First do the indefinite integral of x^(2) dx, which is (1/3)*x^(3) + C, for arbitrary constant C. Then for the definite part we plug in x=3 and subtract from that what we get when we plug in x=1:

[(1/3)*3^(3) + C] – [(1/3)*1^(3) + C]

In the arithmetic the Cs cancel.

So in a sense you’re ‘always’ using the constant; when you’re doing a definite integral you ignore it because you know in advance it won’t survive the subtraction.
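The cancellation above can be checked directly (a small Python sketch; the function name is made up for illustration): evaluate (1/3)x^3 + C at the endpoints for several different values of C and subtract, and the answer is the same every time.

```python
# Antiderivative of x^2, with an explicit constant of integration C.
def antiderivative(x, C):
    return x**3 / 3 + C

# The definite integral from 1 to 3 via the subtraction step.
# Whatever C we pick, it appears in both terms and cancels:
# [(1/3)*27 + C] - [(1/3)*1 + C] = 26/3.
for C in [0.0, 7.0, -100.0]:
    value = antiderivative(3, C) - antiderivative(1, C)
    print(C, value)  # the same 26/3 = 8.666... every time
```

This is why textbooks drop the "+ C" entirely when computing definite integrals: carrying it along just means writing a term that is guaranteed to subtract away.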