Q. Which of the following is not a good weight initialization idea for deep neural networks?
A. Initialize all the weights with 0
B. Initialize all the weights with small random numbers scaled by 0.1
C. Initialize all the weights with random numbers scaled by 100
D. None of the options
Answer:
A. Initialize all the weights with 0 (option C is also a poor choice, since weights scaled by 100 make activations explode).
Explanation:
The aim of weight initialization is to prevent layer activations from exploding or vanishing during a forward pass through a deep neural network. Initializing every weight to the same value (such as 0) makes all neurons in a layer compute identical outputs and receive identical gradients, so they can never learn distinct features, while very large weights (scaled by 100) drive activations to extreme values and saturate the nonlinearity.
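This behavior can be observed directly. Below is a minimal NumPy sketch (the function name `run`, the depth/width values, and the tanh nonlinearity are illustrative choices, not from the question) that pushes a random input through a stack of tanh layers under the three initialization schemes and reports the spread of the final activations and how many of them are saturated:

```python
import numpy as np

rng = np.random.default_rng(0)
depth, width = 20, 256  # illustrative network size

def run(scale):
    """Forward pass through `depth` tanh layers with weights ~ N(0, 1) * scale.

    Returns (std of final activations, fraction with |a| > 0.99, i.e. saturated).
    """
    a = rng.standard_normal(width)
    for _ in range(depth):
        W = rng.standard_normal((width, width)) * scale
        a = np.tanh(W @ a)
    return a.std(), np.mean(np.abs(a) > 0.99)

for scale in (0.0, 0.1, 100.0):
    std, sat = run(scale)
    print(f"scale={scale:>5}: activation std={std:.3f}, saturated={sat:.0%}")
```

With `scale=0.0` every activation collapses to exactly 0 (std 0, nothing to learn); with `scale=100.0` essentially every neuron is saturated at ±1, so gradients through tanh vanish; only the small random scale keeps activations in a healthy, non-saturated range.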