Activation functions are non-differentiable in data science
Answer:
An activation function, also known as a transfer function, maps a node's inputs to its output in a defined way.
Activation functions are used to introduce non-linearity into a model.
Many activation functions are used in machine learning; the most commonly used are illustrated below.
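As an illustration, here is a minimal NumPy sketch of a few commonly used activation functions (sigmoid, tanh, ReLU). The function names and the NumPy dependency are assumptions for this sketch, not part of the original answer; the comments note where each function is differentiable, since that is what the question asks about.

```python
import numpy as np

def sigmoid(x):
    # Squashes any real input into (0, 1); differentiable everywhere.
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Squashes any real input into (-1, 1); differentiable everywhere.
    return np.tanh(x)

def relu(x):
    # Outputs max(0, x); non-differentiable only at x = 0.
    return np.maximum(0.0, x)

# Map the same inputs through each activation for comparison.
x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print("sigmoid:", sigmoid(x))
print("tanh:   ", tanh(x))
print("relu:   ", relu(x))
```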