Computer Science, asked by shaiktasleem, 10 months ago

In a neural network, which one of the following techniques is NOT useful to reduce overfitting?
A) Dropout
B) Regularization
C) Batch normalization
D) Adding more layers

Answers

Answered by meshal99
Dropout is the answer
Answered by anjalin

In a neural network, D) Adding more layers is NOT useful to reduce overfitting.

Explanation for the answer:

  • Regularization methods such as weight decay provide a simple way to control overfitting in large neural network models.
  • A common modern recommendation is to combine early stopping with a weight constraint and dropout.
  • Batch normalization also helps to reduce overfitting.
  • Adding more layers increases the model's capacity, which tends to make overfitting worse, not better. Hence, the correct answer among the options is D) Adding more layers.
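The three helpful techniques above can be sketched in a few lines. This is a minimal NumPy illustration (not a full training loop); the function names and the hyperparameter values (`p`, `lam`, `eps`) are illustrative choices, not part of the original answer:

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(activations, p=0.5, training=True):
    """Inverted dropout: randomly zero units during training,
    scaling survivors by 1/(1-p) so expected activation is unchanged.
    At inference time the input passes through untouched."""
    if not training or p == 0.0:
        return activations
    mask = rng.random(activations.shape) >= p
    return activations * mask / (1.0 - p)

def l2_penalty(weights, lam=1e-4):
    """Weight decay (L2 regularization): a penalty term added to the
    loss that discourages large weights."""
    return lam * np.sum(weights ** 2)

def batch_norm(x, eps=1e-5):
    """Batch normalization (training-time form): normalize each
    feature to zero mean and unit variance across the batch."""
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    return (x - mean) / np.sqrt(var + eps)
```

Note there is no "add more layers" helper here: extra layers only add capacity, which is why option D does not reduce overfitting.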

