EDIT: okay so I just realized what I've calculated is the MAD (mean absolute deviation) of the population. And the use of the standard deviation is that it shows how spread out a set is from the mean. The SD is also differentiable and weights outliers more heavily; please let me know if I got anything wrong.
​
For example, for a population of (1, 3, 5) the standard deviation is 1.63. What exactly does it represent? Shouldn't the standard deviation represent how much a data point differs from the mean on average?
Each data point has a difference of 2, 0, and 2 from the mean, respectively. When we take the arithmetic mean of the three differences, we get 1.33, so why is the standard deviation 1.63?
I'm absolutely fascinated by statistics, but sometimes I just don't get things. I'd be really happy if someone explained this simply. Thanks in advance!
​
​
>Each data point has a difference of 2, 0, and 2 from the mean, respectively. When we take the arithmetic mean of the three differences, we get 1.33, so why is the standard deviation 1.63?
It's not the arithmetic mean but the quadratic mean (also called the root mean square, RMS): you square the deviations, average them, then take the square root. For your example that's √((2² + 0² + 2²)/3) = √(8/3) ≈ 1.63. It weights outliers more heavily and has some neat mathematical properties, like being differentiable, which occasionally make things easier to work with.
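A quick sketch in Python (using the population formulas, i.e. dividing by n) that computes both quantities for (1, 3, 5), so you can see the two averaging methods side by side:

```python
import math

data = [1, 3, 5]
mean = sum(data) / len(data)  # mean = 3

# Mean absolute deviation: arithmetic mean of the absolute deviations
mad = sum(abs(x - mean) for x in data) / len(data)

# Population standard deviation: quadratic mean (RMS) of the deviations
sd = math.sqrt(sum((x - mean) ** 2 for x in data) / len(data))

print(round(mad, 2))  # 1.33  -> (2 + 0 + 2) / 3
print(round(sd, 2))   # 1.63  -> sqrt((4 + 0 + 4) / 3)
```

The only difference between the two lines is squaring before averaging and taking the root afterwards; because squaring grows faster than the absolute value, large deviations pull the SD up more than the MAD.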