In a simple MLP model with 6 neurons in the input layer, 4 neurons in the hidden layer, and 1 neuron in the output layer, what are the sizes of the weight matrices between the hidden-output layer and the input-hidden layer?
a) [1x4], [4x6]
b) [6x4], [1x4]
c) [6x4], [4x1]
d) [4x1], [6x4]
Answers
Answered by
Answer:
Counting the entries in each option's pair of matrices:
a) 4 and 24
b) 24 and 4
c) 24 and 4
d) 4 and 24
Answered by
In a simple MLP model, there are 6 neurons in the input layer, 4 in the hidden layer, and 1 in the output layer. The sizes of the weight matrices between the hidden-output layer and the input-hidden layer are [4x1] and [6x4] (Option d).
- Univariate time series forecasting problems can be modeled with Multilayer Perceptrons, or MLPs. A simple MLP model has a single hidden layer of nodes and an output layer that produces the prediction.
- The formula [nodes in layer 1 x nodes in layer 2] gives the size of the weight matrix between any two consecutive layers, as the sketch below illustrates.
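A minimal NumPy sketch (not part of the original answer) that builds the 6-4-1 network from the question and checks that matrices of shape [6x4] and [4x1] let the data flow through correctly:

```python
import numpy as np

# Layer sizes from the question: 6 input, 4 hidden, 1 output neuron.
layer_sizes = [6, 4, 1]

# Weight matrix between two consecutive layers has shape
# [nodes in layer 1, nodes in layer 2].
weights = [
    np.random.randn(n_in, n_out)      # (6, 4) for input-hidden, (4, 1) for hidden-output
    for n_in, n_out in zip(layer_sizes, layer_sizes[1:])
]

for w in weights:
    print(w.shape)                    # prints (6, 4) then (4, 1)

# Forward pass with a small batch of 3 samples, each with 6 features.
x = np.random.randn(3, 6)
hidden = np.tanh(x @ weights[0])      # shape (3, 4)
output = hidden @ weights[1]          # shape (3, 1)
print(output.shape)                   # prints (3, 1)
```

Listing the matrices in the order the question asks (hidden-output first, then input-hidden) gives [4x1], [6x4], which matches option d.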