The reason we don’t use absolute values for calculating variance


I know the reason for using squared values is so the deviations don’t cancel each other out. But why not absolute values? I understand that when calculating error metrics, squaring penalizes the model more for bigger errors, but I don’t see the reason for it here.



Anonymous

There is “Mean absolute deviation” (MAD) which is exactly what you are describing.

I can think of two reasons why variance has become the standard:

For statistical purposes, it is often desirable that variance puts more weight on extreme observations than MAD does.
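As a rough illustration (using Python’s standard `statistics` module; the `mad` helper here is ad hoc, not a library function), compare how the two measures react to a single outlier:

```python
import statistics

def mad(xs):
    """Mean absolute deviation from the mean."""
    m = statistics.fmean(xs)
    return sum(abs(x - m) for x in xs) / len(xs)

data = [1, 2, 3, 4, 5]
with_outlier = [1, 2, 3, 4, 50]

# Population variance reacts much more strongly to the outlier than MAD does.
print(statistics.pvariance(data), mad(data))                  # 2.0 1.2
print(statistics.pvariance(with_outlier), mad(with_outlier))  # 362.0 15.2
```

On this toy data the single outlier multiplies the variance by about 180 while the MAD only grows by a factor of about 13, which is the extra weight on extreme observations in action.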

From a mathematical point of view, the square function is preferred because it is differentiable everywhere (abs() isn’t differentiable at 0), which makes variance easier to handle in more complicated mathematical applications.
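The kink at 0 can be seen numerically (a minimal sketch; the one-sided difference quotients below are purely illustrative):

```python
h = 1e-6

# One-sided difference quotients at 0 for the two candidate penalty functions.
sq_right = ((0 + h) ** 2 - 0) / h   # ~0: slope approaching from the right
sq_left = (0 - (0 - h) ** 2) / h    # ~0: both sides agree -> differentiable at 0
abs_right = (abs(0 + h) - 0) / h    # 1.0: slope from the right
abs_left = (0 - abs(0 - h)) / h     # -1.0: slopes disagree -> kink at 0

print(sq_right, sq_left, abs_right, abs_left)
```

For x² the left and right slopes both approach 0, so the derivative exists there; for abs() they approach -1 and +1, so there is no single derivative at 0, which is exactly what makes abs-based measures awkward in calculus-heavy settings like optimization.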
