17. Suppose you are using activation function X in the hidden layers of a neural network. At a neuron, for a given input, you get the output −0.0001. Which of the following activation functions could X represent?
(A) ReLU
(B) tanh
(C) Sigmoid
(D) None of these
Answers
Answer: The answer is option (B) tanh.
Explanation:
tanh outputs values in the range (−1, 1), so it is the only option that can produce a negative output such as −0.0001. ReLU outputs are always ≥ 0 (negative inputs are clipped to zero), and the sigmoid's outputs lie strictly in (0, 1), so neither can ever be negative.
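This can be verified numerically; a minimal sketch (the `relu` and `sigmoid` helpers are defined here just for illustration):

```python
import math

def relu(x):
    # ReLU clips all negative inputs to zero, so its output is never negative.
    return max(0.0, x)

def sigmoid(x):
    # Sigmoid maps any real input into the open interval (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

x = -0.0001
print(math.tanh(x))   # small negative value: only tanh can output this
print(relu(x))        # 0.0
print(sigmoid(x))     # about 0.5, always strictly between 0 and 1
```

For a small input x, tanh(x) ≈ x, so a hidden-unit output of −0.0001 is consistent with tanh but impossible for ReLU or sigmoid.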