Q. What is not a good weight initialization idea for deep neural networks?
A. Initialize all the weights with 0
B. Initialize all the weights with small random numbers scaled by 0.1
C. Initialize all the weights with random numbers scaled by 100
D. None of the options.
Answer: A and C
Explanation:
The aim of weight initialization is to keep layer activations from exploding or vanishing as an input propagates forward through a deep network. Initializing every weight to 0 (option A) fails: every neuron in a layer then computes the same output and receives the same gradient update, so symmetry is never broken and the activations stay degenerate. Scaling random weights by 100 (option C) makes the activations grow without bound from layer to layer. Small random numbers scaled by 0.1 (option B) keep the activations in a stable range, which is why it is a reasonable baseline.
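To make this concrete, here is a minimal NumPy sketch, not part of the original question, that pushes a random batch through a stack of linear layers under each of the three initializations and prints how the activation scale evolves. The depth, width, batch size, and seed are arbitrary illustrative choices:

```python
# Toy forward pass through a 10-layer linear network, tracking the
# standard deviation of the activations under each initialization.
import numpy as np

rng = np.random.default_rng(0)

def activation_stds(init_fn, depth=10, width=100, batch=64):
    """Propagate a random batch forward, recording each layer's activation std."""
    h = rng.standard_normal((batch, width))  # unit-variance input
    stds = []
    for _ in range(depth):
        W = init_fn(width)
        h = h @ W  # nonlinearity omitted to keep the statistics easy to follow
        stds.append(h.std())
    return stds

inits = {
    "A: all zeros ": lambda n: np.zeros((n, n)),
    "B: randn*0.1 ": lambda n: rng.standard_normal((n, n)) * 0.1,
    "C: randn*100 ": lambda n: rng.standard_normal((n, n)) * 100,
}

for name, init in inits.items():
    stds = activation_stds(init)
    print(f"{name} layer 1 std = {stds[0]:.3g}, layer 10 std = {stds[-1]:.3g}")

# Typical behavior:
#   A (all zeros): std is 0 at every layer; the activations vanish immediately.
#   B (randn * 0.1): std stays near 1, since 0.1 = 1/sqrt(100) at this width,
#       the classic 1/sqrt(fan_in) scale that roughly preserves variance.
#   C (randn * 100): std grows by a factor of about 1000 per layer; explodes.
```

In this linear setting each layer multiplies the activation std by roughly scale * sqrt(width), which is why 0.1 is stable at width 100 while 100 blows up.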