Computer Science, asked by anikshah1001, 11 months ago

____________ controls the magnitude of a step taken during Gradient Descent .

Answers

Answered by PoojaBurra

Gradient descent is a first-order optimization algorithm for finding the minimum of a function.

To find a local minimum of a function using gradient descent, one takes steps proportional to the negative of the gradient of the function at the current point.

If one instead takes steps proportional to the positive of the gradient, one approaches a local maximum of that function; that procedure is known as gradient ascent.

Gradient descent is also known as steepest descent.

However, gradient descent should not be confused with the method of steepest descent for approximating integrals.
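
As a minimal illustrative sketch in Python (the quadratic function and the values below are assumptions, not part of the original answer), the learning rate scales the gradient and so controls the magnitude of each step:

# Minimize f(x) = (x - 3)^2 with gradient descent.
def grad(x):
    return 2 * (x - 3)   # derivative of f

learning_rate = 0.1      # controls the magnitude of each step
x = 0.0                  # starting point

for _ in range(100):
    x = x - learning_rate * grad(x)   # step in the negative gradient direction

print(x)   # converges toward the minimum at x = 3

A larger learning rate takes bigger steps (and may overshoot the minimum), while a smaller one converges more slowly.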

Answered by pourkodi

Answer:

Learning Rate

Explanation:

The learning rate (also called the step size) is the hyperparameter that scales the gradient at each iteration, so it controls how large a step is taken during gradient descent.
