Standard deviation (symbol σ) is a measure of the dispersion of a set of numerical values. Intuitively, it can be thought of as roughly the average deviation of the values from their mean. It is calculated as the square root of the variance:

$ \sigma = \sqrt{\frac{1}{N} \sum_{i=1}^N (x_i - \mu)^2}, \qquad \mu = \frac{1}{N} \sum_{i=1}^N x_i, $

where $x_i$ is the value of the $i$th data point, $\mu$ is the mean of the data, and $N$ is the total number of data points.
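The formula above translates directly into code. As a minimal sketch (the function name `std_dev` is illustrative), this computes the population standard deviation of a list of numbers:

```python
import math

def std_dev(values):
    """Population standard deviation: the square root of the
    mean squared deviation from the mean."""
    n = len(values)
    mu = sum(values) / n                            # the mean, μ
    variance = sum((x - mu) ** 2 for x in values) / n
    return math.sqrt(variance)

print(std_dev([2, 4, 4, 4, 5, 5, 7, 9]))  # → 2.0
```

Note that this divides by $N$, matching the population formula above; sample standard deviation instead divides by $N - 1$ to correct for bias when estimating from a sample.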
