Computer Science, asked by Vibaksingh353, 1 year ago

What is gradient descent in neural network?

Answers

Answered by Akshaymas
Gradient descent is an optimization algorithm used to train neural networks. It minimizes the network's loss (error) function by repeatedly updating the weights in the direction opposite to the gradient of the loss with respect to those weights, taking small steps whose size is set by a learning rate: w = w - learning_rate * dLoss/dw. Backpropagation is the method used in artificial neural networks to compute that gradient efficiently; gradient descent then uses it to update the weights. Together they are commonly used to train deep neural networks, a term for networks with more than one hidden layer.
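
A minimal sketch of the idea in Python, assuming a toy least-squares fit (the data, the variable names X, y, w, lr, and the step count are illustrative, not part of the answer above):

```python
import numpy as np

def loss(w, X, y):
    # Mean squared error between predictions X @ w and targets y
    return np.mean((X @ w - y) ** 2)

def gradient(w, X, y):
    # Analytic gradient of the loss with respect to the weights w
    # (in a deep network, backpropagation would compute this quantity)
    return 2 * X.T @ (X @ w - y) / len(y)

# Toy data: y ~ 3*x plus noise
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 1))
y = 3 * X[:, 0] + 0.1 * rng.normal(size=100)

w = np.zeros(1)   # initial weight
lr = 0.1          # learning rate (step size)
for step in range(200):
    w -= lr * gradient(w, X, y)   # step against the gradient to reduce the loss

print(w, loss(w, X, y))           # w should end up close to 3.0
```

Each pass of the loop is one gradient descent step: compute how the loss changes with each weight, then nudge the weights the opposite way.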