Math, asked by Himanshukatoch7168, 1 year ago

The rate at which the cost changes with respect to a weight or bias is called

Answers

Answered by abhi178
Answer :- Backpropagation

Explanation :- Backpropagation gives an expression for the partial derivative ∂C/∂w of the cost function with respect to any weight or bias in the network. This expression tells us how quickly the cost changes when we change the weights and biases. Hence, the rate at which the cost changes with respect to a weight or bias is called backpropagation.
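To make this concrete, here is a minimal sketch in Python that backpropagates through a single sigmoid neuron with quadratic cost C = ½(a − y)². The setup and the names (w, b, x, y) are illustrative assumptions for this answer, not part of the original question:

```python
import math

# Illustrative toy example: one sigmoid neuron, quadratic cost C = (a - y)^2 / 2.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def grad_cost(w, b, x, y):
    """Backpropagate to get dC/dw and dC/db for one training example."""
    z = w * x + b            # weighted input to the neuron
    a = sigmoid(z)           # activation (the network's output)
    dC_da = a - y            # derivative of the quadratic cost w.r.t. the output
    da_dz = a * (1 - a)      # derivative of the sigmoid
    dC_dz = dC_da * da_dz    # chain rule: the "error" at the neuron
    return dC_dz * x, dC_dz  # dC/dw = dC/dz * x,  dC/db = dC/dz

dC_dw, dC_db = grad_cost(w=0.5, b=0.1, x=1.0, y=0.0)
print(dC_dw, dC_db)  # rate of change of the cost w.r.t. the weight and the bias
```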
Answered by SerenaBochenek

The correct answer is "Backpropagation".

Step-by-step explanation:

Backpropagation is an expression for the partial derivative ∂C/∂w of the cost function with respect to any weight or bias in the network, and it is the core of the standard training procedure for neural nets.

This expression tells us how rapidly the cost changes whenever the weights or biases shift.
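As a sanity check on that claim, the same rates of change can be estimated numerically by nudging a weight or bias and watching the cost move. The sketch below reuses the same toy single-neuron setup assumed in the first answer and should closely match the backpropagated values:

```python
import math

def cost(w, b, x=1.0, y=0.0):
    """Quadratic cost of a single sigmoid neuron (same toy setup as above)."""
    a = 1.0 / (1.0 + math.exp(-(w * x + b)))
    return 0.5 * (a - y) ** 2

# Central finite differences: how quickly the cost changes per small nudge.
eps = 1e-6
w, b = 0.5, 0.1
dC_dw = (cost(w + eps, b) - cost(w - eps, b)) / (2 * eps)
dC_db = (cost(w, b + eps) - cost(w, b - eps)) / (2 * eps)
print(dC_dw, dC_db)  # should agree with the backpropagated values above
```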

Learn more:

https://brainly.in/question/3680632
