I wanted to measure the "average distance from the average value", but I can't make sense of the "standard deviation". Many textbooks say the standard deviation is used to "summarize" the deviation, but I have trouble understanding why it uses squares to do it.
If I have many numbers $x_i$, the formula is:
$$\sigma = (\frac{1}{N} \sum_i \lvert x_i - avg \rvert^\color{red}{2.0})^{1/\color{red}{2.0}}$$
So, there it is... that magic number $\color{red}{2.0}$. It shows up twice. What is so magic about it? Can't I use $1$, $3$, $4$, or $6$? Or even $1.9$ or $2.1$?
Why 2.0?
Edit: I just learned that the exponent-$1$ version is called the "mean deviation".
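To make the question concrete, here is a small Python sketch (my own illustration; the sample data and the `generalized_deviation` name are made up) that plugs different exponents into the same formula:

```python
def generalized_deviation(xs, p):
    """Compute (mean of |x_i - avg|^p)^(1/p) for a chosen exponent p."""
    avg = sum(xs) / len(xs)
    return (sum(abs(x - avg) ** p for x in xs) / len(xs)) ** (1 / p)

# A made-up sample just for illustration.
data = [2, 4, 4, 4, 5, 5, 7, 9]

for p in (1, 1.9, 2, 2.1, 3):
    print(f"p = {p}: {generalized_deviation(data, p):.4f}")
```

With this made-up data, exponent $1$ gives the mean deviation ($1.5$) and exponent $2$ gives the familiar standard deviation ($2.0$), and the formula runs just as happily for any other exponent. So the question stands: why is $2$ the one everybody uses?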