Economy, asked by MitaSinha7966, 1 year ago

Explain the ordinary least squares method along with the basic assumptions.

Answers

Answered by RakeshPateL555
Hello! Here is your answer:

In statistics, ordinary least squares (OLS) is a type of linear least squares method for estimating the unknown parameters in a linear regression model. OLS chooses the parameters of a linear function of a set of explanatory variables by the principle of least squares: minimizing the sum of the squares of the differences between the observed values of the dependent variable (the variable being predicted) in the given dataset and the values predicted by the linear function.

Geometrically, this is the sum of the squared distances, measured parallel to the axis of the dependent variable, between each data point and the corresponding point on the regression surface; the smaller these differences, the better the model fits the data. The resulting estimator can be expressed by a simple formula, especially in the case of simple linear regression, where there is a single regressor on the right-hand side of the regression equation.

The basic assumptions are that the regressors are exogenous and that the errors are homoscedastic, serially uncorrelated, and have finite variance. Under these conditions the OLS estimator is consistent and is optimal in the class of linear unbiased estimators, i.e. it provides minimum-variance mean-unbiased estimation. Under the additional assumption that the errors are normally distributed, OLS coincides with the maximum likelihood estimator.

OLS is used in fields as diverse as economics (econometrics), data science, political science, psychology, and engineering (control theory and signal processing).
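As a concrete sketch in standard notation (the symbols X, y, and β below are conventional choices, not taken from the answer above): with an n×k matrix of regressors X and an n×1 vector of observations y, OLS picks the coefficient vector that minimizes the residual sum of squares, and when X\top X is invertible the estimator has a closed form:

\[
\hat{\beta} \;=\; \arg\min_{\beta}\;\sum_{i=1}^{n}\bigl(y_i - x_i^{\top}\beta\bigr)^{2}
\;=\; \bigl(X^{\top}X\bigr)^{-1}X^{\top}y .
\]

In the simple linear regression case mentioned above (a single regressor), this reduces to the familiar formulas \(\hat{\beta}_1 = \operatorname{Cov}(x,y)/\operatorname{Var}(x)\) and \(\hat{\beta}_0 = \bar{y} - \hat{\beta}_1\bar{x}\).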

By Rakesh Patel

