English, asked by Pratham4066, 1 year ago

Reducing the number of features can reduce overfitting

Answers

Answered by nanu95star89

Hey mate,


Yes, you can, although it is not always the best approach.

Imagine you are a teacher and you want to predict your students' scores on the next exam, using data from previous years. You have each student's attendance rate, GPA, sex, and hair color. Of course we know that sex and hair color are not relevant features, but if you do not have enough data your model might assume they are important and will therefore overfit.

In most cases, we do not know beforehand which features are important. Instead of removing features, it is usually better to regularize your model. Adding L2 or L1 regularization, or using Dropout, are typically good choices.
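To make the idea concrete, here is a minimal pure-Python sketch of L2 (ridge) regularization on the teacher example above. The dataset, learning rate, and penalty strength are all invented for illustration; the point is only that the L2 penalty pulls every weight toward zero, so the overall weight vector of the regularized fit is smaller.

```python
# Sketch: L2 (ridge) regularization via gradient descent.
# All numbers below are invented for illustration.

def fit_linear(xs, ys, l2=0.0, lr=0.01, steps=5000):
    """Fit y ~ w.x + b by gradient descent, with an optional L2 penalty on w."""
    n_feat = len(xs[0])
    w = [0.0] * n_feat
    b = 0.0
    n = len(xs)
    for _ in range(steps):
        grad_w = [0.0] * n_feat
        grad_b = 0.0
        for x, y in zip(xs, ys):
            pred = sum(wi * xi for wi, xi in zip(w, x)) + b
            err = pred - y
            for j in range(n_feat):
                grad_w[j] += err * x[j]
            grad_b += err
        for j in range(n_feat):
            # the l2 * w[j] term is the regularizer: it shrinks weights toward 0
            w[j] -= lr * (grad_w[j] / n + l2 * w[j])
        b -= lr * grad_b / n
    return w, b

# Invented rows: [attendance_rate, gpa, hair_color_code] -> exam score
xs = [[0.9, 3.5, 1], [0.8, 3.0, 0], [0.95, 3.8, 1], [0.6, 2.5, 0]]
ys = [85, 75, 92, 60]

w_plain, _ = fit_linear(xs, ys, l2=0.0)
w_reg, _ = fit_linear(xs, ys, l2=1.0)
norm_plain = sum(v * v for v in w_plain)
norm_reg = sum(v * v for v in w_reg)
print(norm_reg < norm_plain)  # the regularized weights are smaller overall
```

With only four data points, the unregularized fit can latch onto the irrelevant hair-color feature; the penalty keeps all weights modest instead of deleting the feature entirely.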
Hope this will help you.



Answered by laraibmukhtar55

We can reduce overfitting by reducing the number of features.

  • The simplest way to avoid overfitting is to make sure that the number of free (adjustable) parameters in your fit is much smaller than the number of data points you have.
  • The "classic" way to avoid overfitting is to split your data into 3 groups -- a training set, a validation set, and a test set. We fit the coefficients on the training set, choose the best form of the model on the validation set, and check for overfitting on the test set.
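The three-way split described above can be sketched in a few lines of Python. The 60/20/20 proportions and the helper name `three_way_split` are my own choices for illustration, not something prescribed by the answer:

```python
import random

def three_way_split(data, seed=0, frac_train=0.6, frac_val=0.2):
    """Shuffle the rows, then split them into train/validation/test groups."""
    rows = data[:]
    random.Random(seed).shuffle(rows)     # shuffle so the split is unbiased
    n = len(rows)
    n_train = int(n * frac_train)
    n_val = int(n * frac_val)
    train = rows[:n_train]                # fit the coefficients here
    val = rows[n_train:n_train + n_val]   # choose the model form here
    test = rows[n_train + n_val:]         # final check for overfitting here
    return train, val, test

data = list(range(10))
train, val, test = three_way_split(data)
print(len(train), len(val), len(test))  # 6 2 2
```

The key point is that the test set is touched only once, at the very end; if you tune the model on it, it stops being an honest measure of overfitting.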

Hope it helped.
