Why does dividing the final (chosen) height in a line by 2 give you the average height (y) in that line?


I was taught, of course, that the average of something is the sum of all the parts divided by the number of those parts.

This is easy when it is a set of values (2, 6, 8), because it's simply (2 + 6 + 8) / 3 = 16/3 ≈ 5.33.

But when you have a line, you have an **infinite number of values** divided by **infinity**, right?

So how is it even possible to take the average? And why can you simply divide by 2 when it's a line?

And then of course there is the question about the average of curves and whatnot…

I feel so dumb, plz help me understand.

**Also, this question comes from the idea that average velocity is half the final velocity (when you accelerate uniformly from rest).**


13 Answers

Anonymous

For a linear value, which is what this question is about, it's enough to take the average of the start and end points:

Avg = (a + b) / 2

When we start at 0, i.e. a = 0, it reduces to

Avg = (0 + b) / 2, which is

Avg = b / 2
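
A minimal numerical check of that claim (my own sketch, not part of the original answer): sample a line that climbs steadily from 0 up to a final height b, average the samples, and compare the result with b / 2. The values of `b` and `n` below are arbitrary choices for the illustration.

```python
# Sample a straight line rising from 0 to b at n evenly spaced points,
# then compare the plain average of those heights with b / 2.
b = 10.0        # final height of the line (arbitrary example value)
n = 1_000_001   # number of sample points (arbitrary, just "a lot")

heights = [b * i / (n - 1) for i in range(n)]
average = sum(heights) / n

print(average)  # ~5.0
print(b / 2)    # 5.0
```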

For non-linear functions (curves, etc.), this doesn't work: the average depends on the whole shape of the curve, not just its endpoints.
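
To see why it breaks for curves, here is the same sampling idea applied to y = x² on the interval [0, 1] (again my own example, not from the answer). Half the final height would be 0.5, but the sampled average comes out near 1/3, which is what the integral of x² from 0 to 1 gives.

```python
# Average the heights of the curve y = x**2 between x = 0 and x = 1.
# The "divide the final height by 2" shortcut predicts 0.5, but the
# actual average is the integral of x**2 over [0, 1], i.e. 1/3.
n = 1_000_001
xs = [i / (n - 1) for i in range(n)]
heights = [x**2 for x in xs]

print(sum(heights) / n)  # ~0.3333, not 0.5
```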
