Why, after normalisation, should the values have a mean close to zero (rather than all being zero) in a neural network?
Answers
Answer:
One of the best practices for training a neural network is to normalise your input data so that each feature has a mean close to 0 — the individual values are not zero, only their average is centred at zero. ... Also, the (logistic) sigmoid is now rarely used as an activation function in the hidden layers of neural networks, because the tanh function (among others) is generally superior: its outputs are already centred around zero, which helps keep the activations flowing through the network normalised as well.
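To make the point concrete, here is a minimal sketch (using NumPy, with a made-up feature matrix) of standard score normalisation. Note that after normalising, each feature's *mean* is approximately 0 and its standard deviation is approximately 1, but the individual values are generally nonzero:

```python
import numpy as np

# Hypothetical feature matrix: 4 samples, 2 features on very different scales.
X = np.array([[100.0, 0.01],
              [150.0, 0.02],
              [200.0, 0.03],
              [250.0, 0.04]])

# Standardise each feature: subtract the column mean, divide by the column
# standard deviation. The result has mean ~0 and std ~1 per feature, but the
# individual entries are NOT zero -- only the mean is centred at zero.
X_norm = (X - X.mean(axis=0)) / X.std(axis=0)

print(X_norm.mean(axis=0))  # per-feature means, each close to 0
print(X_norm.std(axis=0))   # per-feature standard deviations, each close to 1
```

If every value actually were zero after normalisation, the inputs would carry no information at all; centring the mean at zero merely balances positive and negative values, which helps gradient-based training converge faster.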