Why does dividing the final (chosen) height of a line by 2 give you the average height (y) of that line?


I was taught, of course, that the average of something is the sum of all the parts divided by the number of those parts.

This is easy when it is a set of values (2, 6, 8), because it's simply (2 + 6 + 8)/3 = 16/3 ≈ 5.33.

But when you have a line, you have an **infinite number of values** divided by **infinity**, right?

So how is it even possible to take the average? And why can you simply divide it by 2 when it's a line?

And then of course there is the question about the average of curves and whatnot…

I feel so dumb, please help me understand.

**Also, this question comes from the idea that average velocity is half the final velocity (for something accelerating steadily from rest).**


Anonymous

Here’s the 5-year-old explanation. Imagine I’m on the ground floor about to walk up 10 steps. If I climb them one at a time, my height off the ground goes steadily from 0 to 1 to 2 and so on until I hit 10. I start at height 0 and end at height 10. My average height is halfway up the stairs: 5, which is 10/2. You get the same answer the long way: (0 + 1 + 2 + … + 10)/11 equals 5 (divide by 11, not 10, because the starting height of zero counts too).
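A minimal Python sketch of that same computation (the step count of 10 and the sample sizes below are just example choices). It also shows that slicing the same steady 0-to-10 climb into more and more sample points keeps the average pinned at 5, which is what an "average of infinitely many values" settles down to:

```python
# The 11 heights you stand at on the stairs: 0, 1, 2, ..., 10
heights = list(range(11))
print(sum(heights) / len(heights))  # 5.0, the same as 10 / 2

# Slice the same steady 0-to-10 climb into finer and finer sample points.
# The average stays at 5 (up to tiny floating-point rounding), no matter
# how fine the slicing gets.
for n in (10, 100, 1000, 1_000_000):
    samples = [10 * i / n for i in range(n + 1)]  # heights from 0 up to 10
    print(n, sum(samples) / len(samples))
```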

Now, imagine I don’t climb steadily. I go up 2, down 2, up 3, down 1, up 3, and so on, but eventually make it to the top. My height goes 0, 2, 0, 3, 2, 5… and finally hits 10. This height over time isn’t a nice straight line; it’s a jagged one with both positive and negative slope. I still end at height 10, but now my average height depends on how I spent my time: if I loitered near the bottom it comes out below 5, and if I rushed up and then lingered near the top it comes out above 5. You can still compute the average the long way (add up every height I stood at and divide by how many there were), but the tidy "divide the final height by 2" shortcut only works for the steady, straight-line climb. That is exactly the situation behind "average velocity is half the final velocity": it holds when the velocity rises steadily from zero, i.e. constant acceleration from rest.
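If calculus notation helps, the standard way to make "infinitely many values divided by infinity" precise is to define the average of a quantity over a stretch of time as its integral divided by the length of that stretch. For a value that climbs in a straight line from 0 to a final value v_f over a time T (these symbols are just labels for the final value and the total time), that average works out to exactly half the final value:

```latex
% Average of the straight-line climb v(t) = (v_f / T) t over [0, T]
\bar{v} \;=\; \frac{1}{T}\int_0^T \frac{v_f}{T}\, t \,\mathrm{d}t
        \;=\; \frac{v_f}{T^2}\cdot\frac{T^2}{2}
        \;=\; \frac{v_f}{2}
```

For a jagged climb the same integral still defines the average, but it no longer simplifies to v_f / 2.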
