Computer Science, asked by dsvss, 1 year ago

RELU in deep learning stands for

Answers

Answered by nishant6517
Rectified Linear Unit (ReLU)
Answered by imhkp4u

ReLU stands for Rectified Linear Unit, one of the activation functions used in deep learning. It can be smoothly approximated by the softplus function, and its only non-linearity comes from the selection of the path through the network, i.e., whether each individual neuron is active or inactive.

Mathematically, it is defined as:

y = max(0, x).

It is most commonly used in convolutional neural networks (CNNs).
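A minimal sketch in Python/NumPy of ReLU and its softplus approximation (the function names relu and softplus here are illustrative, not from any particular library):

import numpy as np

def relu(x):
    # Rectified Linear Unit: passes positive values through, zeroes out negatives
    return np.maximum(0, x)

def softplus(x):
    # Smooth approximation of ReLU: log(1 + e^x)
    return np.log1p(np.exp(x))

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))      # [0.  0.  0.  1.5 3. ]
print(softplus(x))  # close to relu(x) for inputs far from zero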
