What is the difference between what variance measures and what standard deviation measures?




2 Answers

Anonymous

Variance is the base metric: it measures the ‘spread’ of the data. A high variance indicates the data covers a wide range, while a low variance means it clusters tightly around the mean.

However, the units of variance are the square of the units of the original data. So if we take the square root of the variance, we get a measure of spread in the original units – the standard deviation.
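A minimal sketch using Python's built-in statistics module makes the relationship visible (the small height sample below is made up for illustration):

```python
import statistics

heights = [64, 67, 70, 70, 73, 76]  # hypothetical heights, in inches

var = statistics.pvariance(heights)  # population variance, in square inches
sd = statistics.pstdev(heights)      # population standard deviation, in inches

print(f"variance           = {var:.2f} square inches")
print(f"standard deviation = {sd:.2f} inches")
print(f"sqrt(variance)     = {var ** 0.5:.2f} inches (same as the standard deviation)")
```

The square root simply converts the squared units back into the units the data was measured in.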

Anonymous

Variance is just the square of the standard deviation. They’re essentially a measure of the same thing.

They’re a measure of how spread out the data is. The smaller the standard deviation, the more likely a data point is to be close to the mean.

If you are familiar with a normal distribution (AKA Gaussian distribution or bell curve), the standard deviation tells you how peaked the curve is. A small standard deviation results in a narrow, high peak, and a large standard deviation results in a wide, low hill.

For a normal distribution (a very common distribution, thanks in part to the central limit theorem) the standard deviation can be used to predict how much of a dataset falls within any given range: ~68% of the dataset will be within 1 standard deviation of the mean, ~95% within 2 standard deviations, and ~99.7% within 3 standard deviations.

For example, the heights of adult US males approximately follow a normal distribution with a mean of 70 inches and a standard deviation of 3 inches. From this we can estimate that about 68% of adult US males are between 67 and 73 inches tall, about 95% are between 64 and 76 inches tall, and about 99.7% are between 61 and 79 inches tall.
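As a rough sketch, you can check the 68-95-99.7 rule against this height example by simulating draws from a normal distribution with Python's random.gauss (the mean and standard deviation are the ones quoted above):

```python
import random

mean, sd = 70, 3  # mean height and standard deviation, in inches
samples = [random.gauss(mean, sd) for _ in range(100_000)]

for k in (1, 2, 3):
    lo, hi = mean - k * sd, mean + k * sd
    frac = sum(lo <= x <= hi for x in samples) / len(samples)
    print(f"within {k} SD ({lo}-{hi} inches): {frac:.1%}")

# Typical output is close to 68.3%, 95.4%, and 99.7%.
```

The simulated fractions land very close to the rule-of-thumb percentages, which is exactly what the rule promises for normally distributed data.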