Economy, asked by srutidas511, 17 days ago

What do you mean by standard deviation ? Write its merits and demerits.​

Answers

Answered by swatiravibhadoriya12

Answer:

Standard Deviation

A standard deviation (or σ) is a measure of how dispersed the data are in relation to the mean. A low standard deviation means the data are clustered around the mean, while a high standard deviation indicates the data are more spread out. A standard deviation close to zero indicates that the data points lie close to the mean; the larger the standard deviation, the further, on average, the data points lie from the mean.
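The definition above can be sketched directly from the formula σ = √(Σ(x − mean)²/n). This is a minimal illustration (the data values are made up for the example) showing that a clustered series gives a small σ and a spread-out series a large one:

```python
import math

def std_dev(data):
    """Population standard deviation: square root of the mean squared deviation."""
    mean = sum(data) / len(data)
    variance = sum((x - mean) ** 2 for x in data) / len(data)
    return math.sqrt(variance)

clustered = [48, 49, 50, 51, 52]   # values close to the mean of 50
spread = [10, 30, 50, 70, 90]      # same mean of 50, values far from it

print(std_dev(clustered))  # ≈ 1.41  — low σ: data clustered around the mean
print(std_dev(spread))     # ≈ 28.28 — high σ: data spread out
```

Both series have the same mean, so it is only the standard deviation that distinguishes the tight series from the dispersed one.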

Merits

It is rigidly defined and free from any ambiguity.

Its calculation is based on all the observations of a series; it cannot be correctly calculated if any item of the series is ignored.

It strictly follows algebraic principles and, unlike the mean deviation, it does not ignore the + and − signs.

It is capable of further algebraic treatment, as it has many useful algebraic properties.

It serves as a key instrument in advanced statistical analysis, e.g. correlation, regression, skewness, and sampling studies.
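The point about the + and − signs can be made concrete: signed deviations from the mean always sum to zero, so the standard deviation squares them instead of discarding the signs. A small sketch (example data chosen for illustration):

```python
data = [2, 4, 4, 4, 5, 5, 7, 9]
mean = sum(data) / len(data)               # mean = 5.0

signed = [x - mean for x in data]          # signed deviations: + and − cancel out
squared = [(x - mean) ** 2 for x in data]  # squaring keeps every deviation positive

print(sum(signed))                         # ≈ 0 — signed deviations alone measure nothing
variance = sum(squared) / len(data)        # mean of squared deviations
sigma = variance ** 0.5                    # standard deviation
print(sigma)                               # ≈ 2.0
```

Because σ is built from squares rather than absolute values, it stays algebraically tractable, which is why it can be carried into further analysis such as correlation and regression.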

Demerits

It is not easily understood by a layperson.

Its calculation is relatively difficult, as it involves several mathematical steps.

It is greatly affected by the extreme values of a series, since the squared deviations of large items are proportionately larger than those of small items.

It cannot be used to compare the dispersion of two or more series expressed in different units.
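The last demerit follows from the fact that σ carries the unit of the data: the same observations measured in different units give different standard deviations. A minimal sketch (heights invented for the example):

```python
def std_dev(data):
    """Population standard deviation."""
    mean = sum(data) / len(data)
    return (sum((x - mean) ** 2 for x in data) / len(data)) ** 0.5

heights_cm = [150, 160, 170, 180, 190]
heights_m = [h / 100 for h in heights_cm]   # identical data, in metres

# Same real-world dispersion, but sigma changes with the unit chosen:
print(std_dev(heights_cm))  # ≈ 14.14 (in cm)
print(std_dev(heights_m))   # ≈ 0.1414 (in m)
```

Because the two values differ by the unit-conversion factor, the raw standard deviations of series in different units are not directly comparable.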

Answered by shoaibkhan4201122

Explanation:

A standard deviation (or σ) is a measure of how dispersed the data are in relation to the mean. A low standard deviation means the data are clustered around the mean; a high standard deviation indicates the data are more spread out.

Merits:

(i) The standard deviation is the best measure of variation because of its mathematical characteristics. It is based on every item of the distribution. It is also amenable to algebraic treatment and is less affected by fluctuations of sampling than most other measures of dispersion.

Demerits:

It is greatly affected by the extreme values of a series, since the squared deviations of large items are proportionately larger than those of small items. It cannot be used to compare the dispersion of two or more series expressed in different units.
