By definition, variance and standard deviation are both measures of variation for interval-ratio variables. They describe how much variation or diversity there is in a distribution. Both the variance and standard deviation increase or decrease based on how closely the scores cluster around the mean.
The standard deviation is a measure of how spread out the numbers in a distribution are. It indicates how much, on average, each of the values in the distribution deviates from the mean, or center, of the distribution. It is calculated by taking the square root of the variance.
Variance is defined as the average of the squared deviations from the mean. To calculate the variance, you first subtract the mean from each number and then square the results to find the squared differences. You then find the average of those squared differences. The result is the variance.
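In symbols, for a population of N scores with mean \mu, the definitions above can be written as (this uses the population formulas, dividing by N, which matches the divide-by-5 calculation in the worked example; sample variance divides by N − 1 instead):

```latex
\sigma^2 = \frac{1}{N} \sum_{i=1}^{N} (x_i - \mu)^2, \qquad \sigma = \sqrt{\sigma^2}
```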
Let’s say we want to find the variance and standard deviation of the ages in your group of 5 close friends. The ages of you and your friends are: 25, 26, 27, 30, and 32.
First, we must find the mean age: (25 + 26 + 27 + 30 + 32) / 5 = 28.
Then, we need to calculate the differences from the mean for each of the 5 friends.
25 – 28 = -3
26 – 28 = -2
27 – 28 = -1
30 – 28 = 2
32 – 28 = 4
Next, to calculate the variance, we take each difference from the mean, square it, and then average the results.
Variance = ( (−3)² + (−2)² + (−1)² + 2² + 4² ) / 5
= (9 + 4 + 1 + 4 + 16 ) / 5 = 6.8
So, the variance is 6.8. And the standard deviation is the square root of the variance, which is 2.61.
What this means is that, on average, each person’s age deviates from the group’s mean age of 28 by about 2.61 years.
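The steps worked through above can be checked with a short Python sketch, using only the standard library (the variable names are illustrative):

```python
import math

ages = [25, 26, 27, 30, 32]

# Step 1: the mean age
mean = sum(ages) / len(ages)  # 28.0

# Step 2: each person's deviation from the mean
deviations = [a - mean for a in ages]  # [-3.0, -2.0, -1.0, 2.0, 4.0]

# Step 3: variance = average of the squared deviations
variance = sum(d ** 2 for d in deviations) / len(ages)  # 6.8

# Step 4: standard deviation = square root of the variance
std_dev = math.sqrt(variance)  # about 2.61

print(variance, round(std_dev, 2))
```

Python’s `statistics` module offers `pvariance` and `pstdev`, which compute the same population (divide-by-N) quantities directly.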