The rate at which the cost changes with respect to a weight or bias is called
Answers
Answered by
The rate at which the cost changes with respect to a weight or bias is known as backpropagation. It is based on the partial derivative ∂C/∂w, where C is the cost function and w is a weight in the network. This expression gives us an understanding not just of how the cost changes with the weights and biases, but also of how the overall behaviour of the network changes with them.
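The partial derivative ∂C/∂w can be made concrete with a small sketch. The single sigmoid neuron, the quadratic cost, and all numeric values below are assumptions chosen for illustration (they are not from the answer above): the chain rule gives ∂C/∂w analytically, and a finite-difference estimate confirms it.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# One input x, one weight w, one bias b, target y.
# All values are illustrative, not from the original answer.
x, y = 1.5, 0.0
w, b = 0.8, 0.2

def cost(wv, bv):
    a = sigmoid(wv * x + bv)        # activation of the single neuron
    return 0.5 * (a - y) ** 2       # quadratic cost C

# Chain rule: dC/dw = (a - y) * sigma'(z) * x, with sigma'(z) = a(1 - a)
z = w * x + b
a = sigmoid(z)
dC_dw = (a - y) * a * (1 - a) * x

# Numerical check: finite-difference estimate of the same derivative
eps = 1e-6
numeric = (cost(w + eps, b) - cost(w - eps, b)) / (2 * eps)
print(dC_dw, numeric)               # the two values should agree closely
```

The printed pair shows that the analytic rate of change computed by the chain rule matches the directly measured change in cost per unit change in the weight.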
Answered by
Answer :- Backpropagation
Explanation :- Backpropagation gives an expression for the partial derivative ∂C/∂w of the cost function with respect to any weight or bias in the network. This expression tells us how quickly the cost changes when we change the weights and biases. Hence, the rate at which the cost changes with respect to a weight or bias is called backpropagation.
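The "back" in backpropagation refers to how the error is propagated from the output layer towards the input. Below is a minimal sketch for a tiny 1-1-1 network (the architecture, weights, biases, input, and target are all made-up values for illustration): the output-layer error is pushed back one layer to obtain ∂C/∂w for a hidden-layer weight, with a numerical check.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Tiny 1-1-1 network: input -> hidden neuron -> output neuron.
# All numbers here are made-up illustrations.
x, y = 1.0, 0.5
w1, b1 = 0.4, 0.1    # hidden layer
w2, b2 = 0.7, -0.2   # output layer

# Forward pass
z1 = w1 * x + b1
a1 = sigmoid(z1)
z2 = w2 * a1 + b2
a2 = sigmoid(z2)
C = 0.5 * (a2 - y) ** 2

# Backward pass: the output error delta2 is propagated back to delta1
delta2 = (a2 - y) * a2 * (1 - a2)       # error at the output layer
delta1 = delta2 * w2 * a1 * (1 - a1)    # error at the hidden layer

dC_dw2 = delta2 * a1    # rate at which the cost changes with w2
dC_dw1 = delta1 * x     # rate at which the cost changes with w1

# Finite-difference check on the hidden-layer weight
eps = 1e-6
def cost_of_w1(w1v):
    a1v = sigmoid(w1v * x + b1)
    a2v = sigmoid(w2 * a1v + b2)
    return 0.5 * (a2v - y) ** 2

numeric = (cost_of_w1(w1 + eps) - cost_of_w1(w1 - eps)) / (2 * eps)
print(dC_dw1, numeric)  # should agree closely
```

Each layer's error is obtained from the layer after it, which is why these rates of change can be computed for every weight and bias in a single backward sweep.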