Computer Science, asked by adityakumarpad5225, 8 months ago

In a neural network, the error of the network is fed back to the network along with the gradient of the cost function to update the weights. What is this process known as?

Answers

Answered by aditya738451396

Answer:

Once we have received the output for a single iteration, we can calculate the error of the network. This error is then fed back to the network along with the gradient of the cost function to update the weights of the network. This updating of the weights using the gradient of the cost function is known as back-propagation.
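To make the idea concrete, here is a minimal NumPy sketch of one training loop: compute the output, measure the error, propagate it backwards with the gradient of the cost, and update the weights. The layer sizes, learning rate, and toy data are arbitrary assumptions chosen only for illustration, not part of the original answer.

import numpy as np

# Minimal back-propagation sketch for a one-hidden-layer network.
# Sizes, learning rate, and data are illustrative assumptions.

rng = np.random.default_rng(0)

# Toy data: 4 samples, 2 input features, 1 binary target each
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

W1 = rng.normal(size=(2, 4))   # input -> hidden weights
W2 = rng.normal(size=(4, 1))   # hidden -> output weights
lr = 0.5                       # learning rate (assumed value)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(5000):
    # Forward pass: compute the network output for this iteration
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)

    # Error of the network under a squared-error cost
    error = out - y
    cost = np.mean(error ** 2)

    # Backward pass: feed the error back, scaled by the derivatives
    # of the activations (constant factors are absorbed into lr)
    d_out = error * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Update the weights using the gradient of the cost function:
    # this step is what the answer calls back-propagation
    W2 -= lr * (h.T @ d_out) / len(X)
    W1 -= lr * (X.T @ d_h) / len(X)

print("final cost:", cost)

Running the loop repeatedly drives the cost down; each pass repeats the same pattern the answer describes: forward pass, error, gradient, weight update.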
