Hi Traffic,
Doing the calculation is not that hard.
If you understand what standard deviation is all about, the calculation even makes sense!
So let's see what it is we're doing in the calculation.
Start with a collection of numbers.
You'd like to describe them somehow, perhaps to compare them with other collections of numbers.
It could be the heights of your classmates, or how much money they have in their pockets.
First thing you could do is find the average amount, called the mean.
Add all the numbers; say the total comes to $60.95.
If there are, say, 23 classmates, then $60.95 divided by 23 gives $2.65 apiece. The mean is $2.65.
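If you happen to have a computer handy, that first step looks like this in Python (I'm just reusing the made-up numbers from above):

    # Mean = the total of everyone's money divided by how many classmates there are.
    total = 60.95       # the made-up total from above, in dollars
    classmates = 23
    mean = total / classmates
    print(mean)         # prints 2.65 (give or take a tiny rounding error)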
What else might you want to know?
Does everyone have the same amount? Probably not.
But if not everyone has exactly $2.65, how much more or less does each person have?
You could find each person's difference from the mean; but since some are above the mean and some are below, the positive and negative differences cancel out, and their average is always exactly 0. Not helpful.
But if you square the differences first, you get positive numbers, and the average won't be zero.
So that's the idea.
The average of the squared differences from the mean gives you something called the variance.
And its square root gives you the standard deviation.
There's only one catch:
When you take that last average, you don't divide by the number of classmates [23]; you divide by one less: 22.
I won't go into why you do that here; just take my word that it gives a more meaningful result.
So here's the deal:
Standard Deviation:

  1. Find the mean.
  2. Find the differences from the mean.
  3. Square the differences.
  4. Find the average of the squares [but use N-1 instead of N].
  5. Take the square root.
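If you know a little Python, here is that same recipe written as a short program; each step in the list above shows up as one line. The list of amounts is made up, just so there's something to run it on:

    import math

    def standard_deviation(values):
        # Step 1: find the mean.
        mean = sum(values) / len(values)
        # Steps 2 and 3: find each difference from the mean, and square it.
        squared_diffs = [(v - mean) ** 2 for v in values]
        # Step 4: average the squares, but divide by N-1 instead of N.
        variance = sum(squared_diffs) / (len(values) - 1)
        # Step 5: take the square root.
        return math.sqrt(variance)

    # Made-up pocket-money amounts (dollars); their mean happens to be $2.65.
    amounts = [0.25, 0.50, 1.00, 2.65, 4.00, 5.00, 5.15]
    print(standard_deviation(amounts))   # roughly 2.11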
Now you have two ways to compare groups of numbers.
The mean tells you the average amount of money owned by your classmates.
The standard deviation tells you, roughly, the average amount by which what each person has differs from that mean.
For example, if the standard deviation were zero, everyone would have exactly $2.65.
If the standard deviation were huge, some of your classmates would have very little, and others quite a lot.
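To see that last point in actual numbers, here are two made-up groups that both average $2.65, run through Python's built-in statistics module (which uses the same N-1 rule as the recipe above):

    import statistics

    group_a = [2.65] * 7                                  # everyone has exactly $2.65
    group_b = [0.25, 0.50, 1.00, 2.65, 4.00, 5.00, 5.15]  # same mean, but very spread out

    print(statistics.mean(group_a), statistics.stdev(group_a))   # 2.65  0.0
    print(statistics.mean(group_b), statistics.stdev(group_b))   # 2.65  about 2.11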
Hope that helps.
- bn