Computer Science, asked by Rocknain4295, 10 months ago

Why should the values not be zero after normalisation in a neural network?

Answers

Answered by RishiAEC

Answer:

Among the best practices for training a neural network is to normalize your data to obtain a mean close to 0. Also, the (logistic) sigmoid function is hardly ever used any more as an activation function in the hidden layers of neural networks, because the tanh function (among others) seems to be strictly superior.
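A minimal sketch of what that normalization means in practice, using a made-up toy feature matrix: each feature is standardized to mean 0 and standard deviation 1, and the zero-centered nature of tanh (versus sigmoid, which maps 0 to 0.5) is shown alongside.

```python
import numpy as np

# Hypothetical toy feature matrix (rows = samples, columns = features).
X = np.array([[10.0, 200.0],
              [12.0, 180.0],
              [14.0, 220.0]])

# Standardize each feature column: subtract its mean, divide by its std.
# The result has per-column mean ~0 and std ~1.
mean = X.mean(axis=0)
std = X.std(axis=0)
X_norm = (X - mean) / std

print(X_norm.mean(axis=0))  # close to [0, 0]
print(X_norm.std(axis=0))   # close to [1, 1]

# tanh is zero-centered: an input of 0 yields an output of 0,
# while the logistic sigmoid maps 0 to 0.5.
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

print(np.tanh(0.0))   # 0.0
print(sigmoid(0.0))   # 0.5
```

Note that "mean close to 0" refers to the centered inputs; individual normalized values can of course still be zero or any other real number.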

Answered by Itzgaurav

Answer:

Refer to the attachment.
