Asked by kalia8712

In training a batch neural network, after running the first few epochs, you notice that the loss does not decrease. The reasons for this could be:

1. The learning rate is too low.
2. The neural net is stuck in a local minimum.
3. The neural net has too many units in the hidden layer.

A) 1 or 2
B) 1 or 3
C) 2 or 3
D) 1 only

Answers

Answered by keshavramaiah

The answer to this question is C) 2 or 3.

Answered by jeevan9447

The loss can fail to decrease for either of these reasons:

1. The learning rate is too low, so the weight updates are too small for the loss to change noticeably over a few epochs.

2. The neural net is stuck in a local minimum, where the gradient is near zero and the updates stall.

Having too many units in the hidden layer (reason 3) increases model capacity but does not by itself prevent the loss from decreasing. The problem can occur due to either of the first two reasons, so the correct answer is

A) 1 or 2
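To see reason 1 concretely, here is a minimal sketch (not from the original answers; the quadratic loss and the learning-rate values are illustrative assumptions). With a very small learning rate, batch gradient descent barely changes the loss over the first few epochs, while a larger rate makes it drop quickly:

```python
import numpy as np

def loss(w):
    # Simple quadratic loss with its minimum at w = 3
    return (w - 3.0) ** 2

def grad(w):
    # Gradient of the quadratic loss
    return 2.0 * (w - 3.0)

for lr in (1e-5, 1e-1):  # too-low vs. reasonable learning rate
    w = 0.0
    print(f"learning rate = {lr}")
    for epoch in range(5):
        w -= lr * grad(w)  # batch gradient-descent update
        print(f"  epoch {epoch}: loss = {loss(w):.6f}")
```

With lr = 1e-5 the loss stays near its starting value of 9.0, mimicking the "loss does not decrease" symptom; with lr = 0.1 it falls steadily. A similar stall occurs on a non-convex loss when descent settles into a local minimum, since the gradient there is close to zero.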
