Math, asked by Paramjeet5184, 1 year ago

Calculating range from mean and standard deviation

Answers

Answered by Chetnadav1
The standard deviation and range are both measures of the spread of a data set. Each tells us in its own way how spread out the data are, since both are measures of variation. Although there is not an explicit relationship between the range and the standard deviation, there is a rule of thumb that can be useful for relating the two statistics. This relationship is sometimes referred to as the range rule for standard deviation.

The range rule tells us that the standard deviation of a sample is approximately equal to one fourth of the range of the data. In other words, s = (Maximum – Minimum)/4. Turning the formula around, the range can be estimated as roughly four standard deviations: Range ≈ 4s. The formula is straightforward to use, but it should only be treated as a very rough estimate of the standard deviation.
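A minimal sketch of the rule of thumb in Python (the function names here are just illustrative, not from the original answer):

```python
def range_rule_sd(minimum, maximum):
    """Rough estimate of the standard deviation: range divided by four."""
    return (maximum - minimum) / 4

def range_rule_range(sd):
    """Inverting the rule: the range is roughly four standard deviations."""
    return 4 * sd
```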

An Example

To see how the range rule works, suppose we start with the data values 12, 12, 14, 15, 16, 18, 18, 20, 20, 25. These values have a mean of 17 and a sample standard deviation of about 4.1. If instead we first calculate the range of the data as 25 – 12 = 13 and then divide this number by four, we get an estimate of the standard deviation of 13/4 = 3.25. This is relatively close to the true standard deviation and good enough for a rough estimate.
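The example can be checked directly with Python's standard library, comparing the sample standard deviation against the range-rule estimate:

```python
import statistics

data = [12, 12, 14, 15, 16, 18, 18, 20, 20, 25]

mean = statistics.mean(data)            # 17
sd = statistics.stdev(data)             # sample standard deviation, about 4.06
estimate = (max(data) - min(data)) / 4  # range rule: 13 / 4 = 3.25

print(mean, round(sd, 2), estimate)
```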
