The reason we don’t use absolute values for calculating variance

I know the reason for using squared values is so that positive and negative deviations from the mean don't cancel each other out. But why not absolute values, which would do the same thing? I get it when calculating error metrics that squaring punishes the model more for bigger errors, but I don't get the reason here.

7 Answers

Anonymous

One reason for preferring variance over some differently-defined measure of spread (like the mean absolute deviation) is that variance adds up nicely: if you add together random variables that vary in uncorrelated ways, which is common in real-world applications, the variance of the sum is just the sum of the individual variances, i.e. Var(X + Y) = Var(X) + Var(Y). Mean absolute deviation obeys no such simple rule, so variance makes a lot of calculations simpler.
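If you want to see this numerically, here is a minimal sketch using NumPy (the distributions, sample size, and variable names are just illustrative choices, not anything from the question):

    import numpy as np

    rng = np.random.default_rng(0)
    # Two independent (hence uncorrelated) variables with different spreads.
    x = rng.normal(loc=0.0, scale=2.0, size=1_000_000)
    y = rng.normal(loc=0.0, scale=3.0, size=1_000_000)

    def mad(a):
        # Mean absolute deviation from the mean.
        return np.mean(np.abs(a - a.mean()))

    # Variance is additive for uncorrelated variables:
    print(np.var(x) + np.var(y))  # ~13.0  (4 + 9)
    print(np.var(x + y))          # ~13.0, matches the sum

    # Mean absolute deviation is not:
    print(mad(x) + mad(y))        # ~3.99
    print(mad(x + y))             # ~2.88, no simple relationship

The two variance numbers agree (up to sampling noise), while the two absolute-deviation numbers don't, which is exactly the convenience described above.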
