For an overfitted model, the cross-validation error will be much bigger than the training error. True or false?
Answers
Answered by
7
True. For an overfitted model, the cross-validation error is much bigger than the training error: because the model has fit the noise in the training data, its training error becomes very low while its validation error stays high. If a dropout regularization layer is used in the network, the gap shrinks, and the validation error can even become smaller than the reported training error, since dropout perturbs the network during training but is disabled at evaluation time.
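The gap between training and validation error can be seen in a small sketch. This is purely illustrative (the data and the degree-9 polynomial model are hypothetical, standing in for any high-capacity learner): with 10 training points, a degree-9 polynomial can pass through every point, so its training error is near zero while its held-out error is much larger.

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy samples of a simple underlying function (synthetic data).
x = np.linspace(0, 1, 20)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.3, size=x.size)

# Hold out every other point as a validation set.
x_train, y_train = x[::2], y[::2]
x_val, y_val = x[1::2], y[1::2]

def mse(coeffs, xs, ys):
    """Mean squared error of a fitted polynomial on (xs, ys)."""
    return float(np.mean((np.polyval(coeffs, xs) - ys) ** 2))

# Degree 9 on 10 points: enough capacity to interpolate the noise.
overfit = np.polyfit(x_train, y_train, deg=9)
# Degree 3: capacity better matched to the underlying curve.
sensible = np.polyfit(x_train, y_train, deg=3)

print("overfit:  train", mse(overfit, x_train, y_train),
      "val", mse(overfit, x_val, y_val))
print("sensible: train", mse(sensible, x_train, y_train),
      "val", mse(sensible, x_val, y_val))
```

The overfit model's training error is essentially zero, but its validation error is far larger, which is exactly the signature the question asks about.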
Answered by
5
The answer is true.
Overfitting is closely tied to high variance. A high-variance estimator is overly sensitive to the particular training set it was fitted on: retrain the same learning algorithm on a different sample of the data and you get a very different function. Such a model memorizes the noise in its training data rather than the underlying pattern, which drives the training error down while the cross-validation error stays high. In short, the algorithm is not robust to changes in the data set, which is precisely what overfitting means.
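The variance point can be demonstrated directly. In this illustrative sketch (synthetic sine data; the polynomial models are hypothetical stand-ins for high- and low-capacity learners), the same two models are refitted on many noisy resamples of the same underlying function, and we measure how much each model's prediction at a fixed point swings from one fit to the next:

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0, 1, 10)
true_y = np.sin(2 * np.pi * x)

# Refit both models on 200 noisy resamples and record
# each one's prediction at x = 0.5.
preds9, preds1 = [], []
for _ in range(200):
    y = true_y + rng.normal(scale=0.3, size=x.size)
    preds9.append(np.polyval(np.polyfit(x, y, deg=9), 0.5))  # high capacity
    preds1.append(np.polyval(np.polyfit(x, y, deg=1), 0.5))  # low capacity

preds9, preds1 = np.array(preds9), np.array(preds1)
print(f"high-capacity prediction spread: {preds9.std():.3f}")
print(f"low-capacity prediction spread:  {preds1.std():.3f}")
```

The high-capacity model's predictions scatter much more widely across resamples: its output depends heavily on which noisy data set it happened to see, which is what "high variance" means.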