Explain the concept of range in statistics.
Answers
Answer:
In statistics, the range is the difference between the largest value and the smallest value in a data set; it is the simplest numerical measure of variability. The range is therefore determined by only the two extreme data values.
Explanation:
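As a quick illustration, here is a minimal Python sketch using a made-up data set; the values are invented just to show the calculation.

```python
# Made-up data set for illustration only.
data = [4, 7, 2, 9, 5]

# Range: largest value minus smallest value.
data_range = max(data) - min(data)
print(data_range)  # 9 - 2 = 7
```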
Hope it helps, mate!
Answer:
I hope this will help you.
Explanation:
A simple measure of variability is the range, given as the difference between the largest and the smallest results. For small data sets, however, it has little statistical significance. Another measure, the average deviation, is calculated by summing the absolute differences between each result and the average of all the results (ignoring the sign of each difference) and then dividing that sum by the number of results. Confidence limits at a given probability level are values greater than and less than the average, between which the results are statistically expected to fall a given percentage of the time.
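A short Python sketch of these three quantities, using invented replicate results; the 95% t-value for 4 degrees of freedom is taken from a standard t-table, and the confidence limits shown are for the mean.

```python
from statistics import mean, stdev

# Invented replicate measurements, for illustration only.
results = [10.1, 10.4, 9.8, 10.2, 10.0]
n = len(results)
avg = mean(results)

# Range: difference between the largest and smallest result.
value_range = max(results) - min(results)

# Average deviation: mean of the absolute differences from the average.
avg_deviation = sum(abs(x - avg) for x in results) / n

# 95% confidence limits for the mean: avg +/- t * s / sqrt(n).
s = stdev(results)
t_95 = 2.776  # two-sided 95% t-value for n - 1 = 4 degrees of freedom
half_width = t_95 * s / n ** 0.5
lower, upper = avg - half_width, avg + half_width

print(f"range = {value_range:.2f}")
print(f"average deviation = {avg_deviation:.3f}")
print(f"95% confidence limits: {lower:.2f} to {upper:.2f}")
```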