eli5: In statistics, why do we square the deviation before dividing by n to get the variance instead of just using absolute values?


I’m assuming there’s a good reason this doesn’t work, but I’ve never had it explained to me why.


4 Answers

Anonymous

Mostly because of the normal distribution.

In practice, most real-world random processes (or at least a lot of them) follow the normal distribution or something close to it. And the mathematical function describing a normal distribution has a square in it (more precisely, an exp(-x^2) term). It follows that:

* The standard deviation (with squares) of a normal distribution is easy to compute from its mathematical formula.
* The average deviation (with absolute values) of a normal distribution is much messier to compute from its mathematical formula (a quick numerical check follows this list).
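To see this concretely, here is a minimal numerical sketch (Python with NumPy; the σ = 2 is just a made-up value for illustration). For a normal distribution, the standard deviation recovers σ directly, while the average absolute deviation lands on the less tidy closed form σ·√(2/π):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 2.0
x = rng.normal(loc=0.0, scale=sigma, size=1_000_000)

std = x.std()                      # standard deviation (uses squares)
mad = np.abs(x - x.mean()).mean()  # average absolute deviation

print(std)                         # ~2.0: recovers sigma itself
print(mad)                         # ~1.596: NOT sigma
print(sigma * np.sqrt(2 / np.pi))  # the messier closed form, ~1.5958
```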

And it’s not just normal distributions: there are lots of mathematical formulas where the standard deviation is more elegant to work with than the average deviation.
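One classic example of that elegance (my illustration, not something the answer spells out): for independent random variables, variances simply add, Var(X+Y) = Var(X) + Var(Y), while average absolute deviations don’t. A quick check, again with made-up σ values:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(0, 3, size=1_000_000)  # independent normals with
y = rng.normal(0, 4, size=1_000_000)  # sigma = 3 and sigma = 4
s = x + y

def mad(a):
    """Average absolute deviation from the mean."""
    return np.abs(a - a.mean()).mean()

print(x.var() + y.var(), s.var())  # ~25 and ~25: variances add cleanly
print(mad(x) + mad(y), mad(s))     # ~5.59 vs ~3.99: MADs do NOT add
```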
