Computer Science, asked by aditiseal176, 1 year ago

While training a neural network with batch gradient descent, you notice after running the first few epochs that the loss does not decrease. The reasons for this could be:
1. The learning rate is too low.
2. The neural net is stuck in a local minimum.
3. The neural net has too many units in the hidden layer.
A) 1 or 2
B) 1 or 3
C) 2 or 3
D) 1 only

Answers

Answered by aqibkincsem

If the learning rate is too low, each gradient step changes the weights only slightly, so the loss can look flat over the first few epochs even though training is technically progressing. The network may also be stuck in a local minimum, where the gradient is close to zero and further batch updates make essentially no progress.


Having too many units in the hidden layer (option 3) increases capacity and may slow training or cause overfitting, but it does not by itself stop the loss from decreasing. Hence the answer is A) 1 or 2. The sketch below illustrates the low-learning-rate case.
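As a rough illustration, here is a minimal NumPy sketch of full-batch gradient descent. The synthetic data, the single-weight linear model, and the specific learning rates are all illustrative assumptions, not part of the original question; the point is only that a very low rate leaves the loss nearly unchanged after several epochs, while a sensible rate drives it down.

```python
import numpy as np

# Toy regression task (assumed for illustration): fit y = 3x with one weight.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=100)
y = 3.0 * x

def train(lr, epochs=5):
    """Full-batch gradient descent on mean squared error."""
    w = 0.0
    for epoch in range(epochs):
        pred = w * x
        loss = np.mean((pred - y) ** 2)      # MSE over the entire batch
        grad = np.mean(2.0 * (pred - y) * x) # dLoss/dw
        w -= lr * grad                       # one batch update per epoch
        print(f"lr={lr:<6} epoch {epoch}: loss={loss:.4f}")
    return w

train(lr=1e-5)  # loss barely moves: looks "stuck" after a few epochs
train(lr=0.5)   # loss drops quickly with a reasonable learning rate
```

With lr=1e-5 the printed loss stays near its starting value for all five epochs, which is exactly the "loss does not decrease" symptom in the question, even though nothing is wrong with the model itself.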
