Computer Science, asked by mohan8856, 1 year ago

Is the learning rate learned by the network during training?

Answers

Answered by choudhary21

Explanation:

If your learning rate is too high, the updates can repeatedly hop back and forth across the valley of the loss surface, wasting a lot of training time.

You can see this by checking the *sign* of your gradient at each iteration: if it keeps switching between positive and negative, your learning rate is too high.
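The oscillation described above can be sketched with plain gradient descent on a simple quadratic (an assumed toy example, not from the original answer): with a too-high learning rate the gradient's sign flips every step as the iterate hops across the valley, while a smaller rate keeps the sign constant as it converges.

```python
# Toy example: gradient descent on f(x) = x^2, whose gradient is 2x.
# Tracks the sign of the gradient at each iteration.

def gradient_sign_history(lr, steps=10, x=1.0):
    signs = []
    for _ in range(steps):
        grad = 2.0 * x                      # gradient of f(x) = x^2
        signs.append(1 if grad > 0 else -1)
        x -= lr * grad                      # standard gradient-descent update
    return signs

print(gradient_sign_history(lr=1.1))  # sign alternates: hopping across the valley
print(gradient_sign_history(lr=0.1))  # sign stays constant: steadily converging
```

With `lr=1.1` the sign alternates every iteration, which is exactly the diagnostic the answer describes; with `lr=0.1` it never flips.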

Answered by Anonymous

Answer:

The amount that the weights are updated during training is referred to as the step size or the “learning rate.” Specifically, the learning rate is a configurable hyperparameter used in the training of neural networks that has a small positive value, often in the range between 0.0 and 1.0. Because it is a hyperparameter, it is not learned by the network itself; it is set by the practitioner before training (and possibly adjusted by a schedule during training).
