Computer Science, asked by Shyamsr, 11 months ago

In training a neural network in batch mode, after running the first few epochs you notice that the loss does not decrease. The reasons for this could be:
1. The learning rate is low.
2. The neural net is stuck in a local minimum.
3. The neural net has too many units in the hidden layer.

A) 1 or 2
B) 1 or 3
C) 2 or 3
D) 1 only

Answers

Answered by jeevan9447

1. The learning rate is low.

2. The neural net is stuck in a local minimum.

Either of these can cause the symptom: with a very low learning rate, each update barely changes the weights, so the loss moves too slowly to show a visible decrease; at a local minimum, the gradient is close to zero, so the updates effectively stop. Too many hidden units (option 3) affects model capacity and overfitting, not whether the loss decreases in the first few epochs. So the correct answer is

A) 1 or 2
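
To see reason 1 concretely, here is a minimal sketch in plain Python (the quadratic loss, the two learning-rate values, and the epoch count are made-up for illustration, not part of the question): with a very small learning rate the printed loss stays essentially flat across epochs, while a reasonable rate drops it quickly.

```python
# Toy demo of reason 1: gradient descent on f(w) = (w - 3)^2.
# All constants (the target 3.0, the two learning rates, 5 epochs)
# are made up for illustration.

def loss(w):
    return (w - 3.0) ** 2

def grad(w):
    return 2.0 * (w - 3.0)

for lr in (1e-6, 0.1):  # too-low vs. reasonable learning rate
    w = 0.0
    print(f"learning rate = {lr}")
    for epoch in range(5):
        w -= lr * grad(w)
        print(f"  epoch {epoch + 1}: loss = {loss(w):.6f}")
# With lr = 1e-6 the loss stays ~9.0 every epoch (no visible decrease);
# with lr = 0.1 it falls from 5.76 toward 0 within a few epochs.
```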

Answered by Arslankincsem

The question asks why the loss does not decrease after running the first few epochs when training a neural network in batch mode, and we have to pick the correct option.

The correct option is A) 1 or 2. A low learning rate makes each update too small for the loss to fall noticeably, and a local minimum leaves the gradient near zero, so either condition can keep the loss flat.
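
Reason 2 can be sketched the same way; the loss function f(w) = (w² - 1)² + 0.3·w below, along with the starting point and learning rate, are illustrative assumptions. This function has a global minimum near w = -1 and a shallower local minimum near w = +1, and gradient descent started on the wrong side settles into the local one:

```python
# Toy demo of reason 2: gradient descent trapped in a local minimum of
# f(w) = (w^2 - 1)^2 + 0.3*w. The starting point, learning rate, and
# epoch count are made-up values for the demo.

def loss(w):
    return (w ** 2 - 1.0) ** 2 + 0.3 * w

def grad(w):
    return 4.0 * w * (w ** 2 - 1.0) + 0.3

w = 1.5    # start in the basin of the shallower (local) minimum
lr = 0.01  # a perfectly reasonable learning rate
for epoch in range(200):
    w -= lr * grad(w)

# Prints w ~ 0.96 with loss ~ 0.29: the gradient is ~0 there, so extra
# epochs change nothing, even though the global minimum near w ~ -1.04
# has loss ~ -0.31.
print(f"final w = {w:.3f}, final loss = {loss(w):.3f}")
```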
