Hello
Even after searching online, I still don’t understand in which **concrete** cases one would rather work with the standard deviation and in which cases the variance. I do see that the difference between the two is that one is the square of the other, but that doesn’t tell me when to choose one over the other.
So my questions are:
– what is the difference between variance and standard deviation in terms of their meaning?
– when would someone use variance instead of standard deviation, and vice versa?
Thank you
Since there’s a direct relationship between the two, they basically mean the same thing: both are measures of how far the samples are from their mean, i.e. the “spread” of the data. A higher variance or a higher standard deviation means the samples are more spread out around their mean.
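To make that concrete, here’s a minimal sketch using Python’s standard library (the data values are just made up for illustration):

```python
import math
import statistics

data = [2, 4, 4, 4, 5, 5, 7, 9]

var = statistics.pvariance(data)  # population variance
sd = statistics.pstdev(data)      # population standard deviation

print(var)  # 4.0
print(sd)   # 2.0
print(math.isclose(sd, math.sqrt(var)))  # True: the SD is just the square root of the variance
```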
Variance is usually more convenient mathematically; a lot of formulas don’t behave well around square roots because, algebraically, every positive number has two square roots (one negative, one positive), and that can mess up other formulas. A “negative standard deviation” has no meaning, but the square root doesn’t know that. In addition, when you add independent random variables, their variances simply add, which is convenient. That is *not* generally true for standard deviations.
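For the “variances add” point, a quick simulation (assuming numpy is available; the SDs of 3 and 4 are arbitrary choices) shows it:

```python
import numpy as np

# Two independent random variables with SDs 3 and 4 (variances 9 and 16).
rng = np.random.default_rng(0)
x = rng.normal(loc=0.0, scale=3.0, size=1_000_000)
y = rng.normal(loc=0.0, scale=4.0, size=1_000_000)

print(np.var(x + y))  # ~25 = 9 + 16: the variances add
print(np.std(x + y))  # ~5 = sqrt(25), NOT 3 + 4 = 7: the SDs do not
```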
Standard deviation, as /u/znn94x notes, is in the same units as the data so is more intuitive.
In general… use variance for all your calculations, and drop down to standard deviation when you actually want to talk about the spread of the data or need to work with distributions that take the SD as an input (very common for Excel distribution formulas, for example).
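As a rough sketch of that workflow (assuming scipy, and with made-up variances for two hypothetical independent error sources):

```python
import math
from scipy.stats import norm

var_a = 9.0   # hypothetical variance of one independent error source
var_b = 16.0  # hypothetical variance of another
var_total = var_a + var_b  # do the combining in variance terms

sd_total = math.sqrt(var_total)  # convert to SD only at the end, for reporting

# scipy's normal distribution takes the SD (scale), not the variance:
p = norm(loc=0.0, scale=sd_total).cdf(10.0)
print(sd_total, p)  # 5.0, ~0.977
```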