Computer Science, asked by maheshwarimohit354, 6 months ago

17. Suppose you are using activation function X in the hidden layers of a neural network. At a particular neuron, for a given input, you get the output "-0.0001". Which of the following activation functions could X represent?

(A) ReLU
(B) tanh
(C) Sigmoid
(D) None of these

Answers

Answered by Itzabhi001

Answer:

The answer is option (B) tanh. The output of tanh lies in the range (-1, 1), so it can produce a small negative value such as -0.0001. ReLU outputs are always greater than or equal to 0, and sigmoid outputs lie in (0, 1), so neither of them can ever be negative.
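
As a quick check, here is a minimal Python sketch (assuming NumPy is available) that evaluates all three activations at a slightly negative pre-activation; only tanh returns a negative output:

import numpy as np

def relu(z):
    # ReLU clips negatives to zero, so its output is always >= 0
    return np.maximum(0.0, z)

def sigmoid(z):
    # Sigmoid squashes values into (0, 1), so its output is always positive
    return 1.0 / (1.0 + np.exp(-z))

def tanh(z):
    # tanh squashes values into (-1, 1), so small negative outputs are possible
    return np.tanh(z)

z = -0.0001  # a pre-activation just below zero
print(f"ReLU:    {relu(z):.6f}")     # 0.000000   (never negative)
print(f"Sigmoid: {sigmoid(z):.6f}")  # ~0.499975  (always in (0, 1))
print(f"tanh:    {tanh(z):.6f}")     # ~-0.000100 (can be negative)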

Answered by abhi3023

Explanation:

(B) tanh
