Computer Science, asked by ksbenhur1999, 1 year ago

Which of the following is true?
A) In batch gradient descent we update the weights and biases of the neural network after a forward pass over each training example.
B) In batch gradient descent we update the weights and biases of our neural network after a forward pass over all the training examples.
C) Each step of stochastic gradient descent takes more time than each step of batch gradient descent.
D) None of these three options is correct

Answers

Answered by divyansha1115

B should be the correct answer

Answered by omegads04

Option B is correct: in batch gradient descent we update the weights and biases of our neural network only after a forward pass over all the training examples, i.e. the gradient is computed over the entire training set before a single update step is taken. In stochastic gradient descent, by contrast, the parameters are updated after each individual example, so each step is cheaper rather than more expensive (which is why option C is false). Batch gradient descent is one way to train neural networks. Backpropagation is the technique used in artificial neural networks to compute the gradient of the loss with respect to the weights, which gradient descent then uses to update those weights.
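As a rough illustration of the difference between the two update schedules, here is a minimal NumPy sketch on a toy linear model (the data, function names, and learning-rate values are hypothetical and only for illustration, not part of the original question):

```python
import numpy as np

# Toy data: y = 3x + 2 with a little noise (hypothetical example).
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(100, 1))
y = 3 * X[:, 0] + 2 + rng.normal(0, 0.1, size=100)

def batch_gradient_descent(X, y, lr=0.1, epochs=50):
    """One update per epoch, using the gradient over ALL examples (option B)."""
    w, b = 0.0, 0.0
    n = len(y)
    for _ in range(epochs):
        pred = w * X[:, 0] + b              # forward pass over every training example
        grad_w = (2 / n) * np.sum((pred - y) * X[:, 0])
        grad_b = (2 / n) * np.sum(pred - y)
        w -= lr * grad_w                    # single update after the full pass
        b -= lr * grad_b
    return w, b

def stochastic_gradient_descent(X, y, lr=0.1, epochs=50):
    """One update per example; each step is cheaper, not slower (why option C is false)."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for xi, yi in zip(X[:, 0], y):
            pred = w * xi + b               # forward pass over a single example
            grad_w = 2 * (pred - yi) * xi
            grad_b = 2 * (pred - yi)
            w -= lr * grad_w                # update immediately after this example
            b -= lr * grad_b
    return w, b

print("batch GD:     ", batch_gradient_descent(X, y))
print("stochastic GD:", stochastic_gradient_descent(X, y))
```

Both versions converge to roughly the same weights on this toy problem; the difference is only how often the parameters are updated per epoch.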
