In a neural network, which one of the following techniques is NOT useful to reduce overfitting?
A) Dropout
B) Regularization
C) Batch normalization
D) Adding more layers
Answers
D) Adding more layers is the answer.
In a neural network, D) Adding more layers is NOT useful to reduce overfitting.
Explanation for the answer:
- Adding more layers increases model capacity, which makes a network more prone to overfitting, not less.
- Regularization methods such as weight decay provide a simple way to control overfitting in large neural network models.
- A common modern recommendation is to combine early stopping with a weight constraint and dropout.
- Batch normalization also has a mild regularizing effect that can help reduce overfitting.
- Hence, the correct answer among the options is D) Adding more layers.
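To make the dropout and weight-decay points above concrete, here is a minimal sketch in plain Python (the function names `dropout` and `l2_penalty` are illustrative, not from any particular framework). It shows inverted dropout, which zeroes units at random during training and rescales the survivors, and an L2 weight-decay term added to the loss:

```python
import random

def dropout(x, p, training=True, rng=random):
    """Inverted dropout: zero each unit with probability p during training,
    scaling the survivors by 1/(1-p) so the expected activation is unchanged.
    At inference time (training=False) the input passes through untouched."""
    if not training or p == 0.0:
        return list(x)
    keep = 1.0 - p
    return [xi / keep if rng.random() < keep else 0.0 for xi in x]

def l2_penalty(weights, lam):
    """Weight-decay (L2 regularization) term added to the loss: lam * sum(w^2).
    Penalizing large weights discourages the model from fitting noise."""
    return lam * sum(w * w for w in weights)
```

With p = 0.5 each surviving activation is doubled, so the expected value of every unit matches its value without dropout; at inference no units are dropped and no scaling is needed.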