Variance is defined as $$V(X) = \sum (x-\mu)^2 \, p(x)$$ and standard deviation as $\sigma_X = \sqrt{V(X)}$.
But I feel it would make more sense to define $\sigma_X$ as $\sum |x-\mu| \, p(x)$ instead: the absolute value takes care of negative distances, and multiplying by $p(x)$ gives the expected deviation from the mean (the mean absolute deviation). So why is the standard deviation defined the way it is?
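For what it's worth, here is a quick numerical sketch (in Python, using a made-up three-point distribution chosen purely for illustration) showing that the two quantities really are different, so the choice of definition matters:

```python
import math

# A small, made-up discrete distribution (for illustration only):
# X takes the values 0, 1, 10 with probabilities 0.4, 0.4, 0.2.
values = [0.0, 1.0, 10.0]
probs = [0.4, 0.4, 0.2]

# Mean: mu = sum of x * p(x)  -> 2.4
mu = sum(x * p for x, p in zip(values, probs))

# Variance: V(X) = sum of (x - mu)^2 * p(x)  -> 14.64
var = sum((x - mu) ** 2 * p for x, p in zip(values, probs))

# Standard deviation: sqrt(V(X))  -> about 3.83
sd = math.sqrt(var)

# Proposed alternative (mean absolute deviation):
# sum of |x - mu| * p(x)  -> 3.04
mad = sum(abs(x - mu) * p for x, p in zip(values, probs))

print(f"mean = {mu}, variance = {var}, sd = {sd:.3f}, mad = {mad}")
```

In this example the standard deviation (about 3.83) comes out larger than the mean absolute deviation (3.04), so the two definitions genuinely disagree rather than being two notations for the same number.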