Math, asked by manishr7631, 11 months ago

Why are squared deviations used over absolute deviations?

Answers

Answered by ouytt

Answer:

Step-by-step explanation:

In the definition of standard deviation, why do we square the difference from the mean, take the expected value (E), and then take the square root at the end? Couldn't we simply take the absolute value of the difference instead and take the expected value (mean) of those, and wouldn't that also show the variation of the data? The number will differ from the square method (the absolute-value method gives a smaller number), but it should still show the spread of the data. Does anybody know why we take this square approach as the standard?

The definition of standard deviation:

σ = √(E[(X − μ)²])

Couldn't we just take the absolute value instead and still have a good measure of spread?

σ = E[|X − μ|]
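
A quick numeric sketch (plain Python, using the population formulas; the function name spread_measures is just for illustration) shows that both quantities do measure spread, and that the absolute-value version comes out smaller, as the question notes:

```python
import math

def spread_measures(data):
    """Compute both spread measures from the question:
    the standard deviation sqrt(E[(X - mu)^2]) and the
    mean absolute deviation E[|X - mu|]."""
    mu = sum(data) / len(data)
    variance = sum((x - mu) ** 2 for x in data) / len(data)
    sd = math.sqrt(variance)                           # sigma = sqrt(E[(X - mu)^2])
    mad = sum(abs(x - mu) for x in data) / len(data)   # E[|X - mu|]
    return sd, mad

# Tiny example: mean is 2, deviations are -2, -2, 4.
sd, mad = spread_measures([0, 0, 6])
print(f"standard deviation  = {sd:.3f}")   # 2.828
print(f"mean abs. deviation = {mad:.3f}")  # 2.667
```

The gap between the two is not an accident of this data set: because the square root is concave, Jensen's inequality guarantees E[|X − μ|] ≤ √(E[(X − μ)²]) for any distribution.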
