Gradient Descent is one of the techniques used together with backward propagation to find the best set of parameters for a network.
True or False?
Answers
Answered by
Yes, it's true. The Gradient Descent technique is commonly used in machine learning to minimize a loss function.
Answered by
Answer:
Yes, the statement is true.
Explanation:
Gradient Descent is an optimization algorithm that adjusts the weights of the neurons in a network so that the loss (error) of the output decreases step by step.
The two techniques work together: backward propagation computes the gradient of the loss function with respect to each weight, and gradient descent then uses those gradients to update the weights. In other words, the update step of gradient descent cannot run until backpropagation has supplied the exact gradients.
At each iteration, you compute the gradient of the loss function and move every weight a small step in the opposite direction of its gradient.
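As a minimal sketch of the steps above (an illustrative example, not part of the original answer), here is gradient descent on a single-weight linear neuron with a squared-error loss. The gradient computed inside the loop plays the role that backpropagation plays in a full network:

```python
# Minimal sketch: gradient descent on one weight w for the loss
# L(w) = (w * x - y)**2, i.e. a single linear neuron with squared error.
def gradient_descent(x, y, w=0.0, lr=0.1, steps=100):
    for _ in range(steps):
        grad = 2 * (w * x - y) * x  # dL/dw, the gradient backprop would supply
        w -= lr * grad              # the gradient-descent update step
    return w

w = gradient_descent(x=2.0, y=6.0)
print(round(w, 3))  # w converges toward 3.0, since 3.0 * 2.0 == 6.0
```

With a small enough learning rate, each update shrinks the loss, and `w` settles at the value that makes the prediction match the target.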