Activation Function Cheat Sheet

Common Functions

ReLU

$$ReLU(x) = max(0,x)$$
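
A minimal NumPy sketch of the definition above (PyTorch provides this as `torch.nn.ReLU`):

```python
import numpy as np

def relu(x):
    # ReLU(x) = max(0, x), applied elementwise
    return np.maximum(0, x)
```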

Sigmoid

$$Sigmoid(x) = \frac{1}{1+exp(-x)}$$
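
The same formula in NumPy (PyTorch: `torch.nn.Sigmoid` / `torch.sigmoid`):

```python
import numpy as np

def sigmoid(x):
    # Sigmoid(x) = 1 / (1 + exp(-x)), squashes inputs to (0, 1)
    return 1.0 / (1.0 + np.exp(-x))
```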


tanh

$$tanh(x) = \frac{e^{x}-e^{-x}}{e^{x}+e^{-x}}$$
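
A direct NumPy translation of the formula (PyTorch: `torch.nn.Tanh`; NumPy also ships `np.tanh`):

```python
import numpy as np

def tanh(x):
    # tanh(x) = (e^x - e^-x) / (e^x + e^-x), squashes inputs to (-1, 1)
    return (np.exp(x) - np.exp(-x)) / (np.exp(x) + np.exp(-x))
```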


Softmax

$$Softmax(x_{i}) = \frac{exp(x_{i})}{\sum_{j} exp(x_{j})}$$

Does not work with NLLLoss, which expects log-probabilities; use LogSoftmax instead.
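
A NumPy sketch of the formula for a 1-D input (PyTorch: `torch.nn.Softmax(dim=...)`); shifting by the max does not change the result but avoids overflow:

```python
import numpy as np

def softmax(x):
    # Softmax(x_i) = exp(x_i) / sum_j exp(x_j); subtracting max(x)
    # is mathematically equivalent and numerically safer
    e = np.exp(x - np.max(x))
    return e / e.sum()
```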


LogSoftmax

$$LogSoftmax(x_{i}) = log\bigg( \frac{exp(x_{i})}{\sum_{j} exp(x_{j})}\bigg)$$
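
A NumPy sketch for a 1-D input (PyTorch: `torch.nn.LogSoftmax(dim=...)`, which pairs with `NLLLoss`):

```python
import numpy as np

def log_softmax(x):
    # log(Softmax(x_i)) = (x_i - max(x)) - log(sum_j exp(x_j - max(x)))
    shifted = x - np.max(x)
    return shifted - np.log(np.exp(shifted).sum())
```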


Other Functions

ELU

$$ELU(x) = max(0,x) + min(0,\alpha * (exp(x)-1))$$
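
A NumPy sketch of the formula (PyTorch: `torch.nn.ELU`, whose default `alpha` is 1.0):

```python
import numpy as np

def elu(x, alpha=1.0):
    # ELU(x) = max(0, x) + min(0, alpha * (exp(x) - 1))
    return np.maximum(0, x) + np.minimum(0, alpha * (np.exp(x) - 1))
```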


HardShrink

$$HardShrink(x)=\begin{cases}
x, & \text{if}\ x > \lambda \\
x, & \text{if}\ x < -\lambda \\
0, & \text{otherwise}
\end{cases}$$
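
A NumPy sketch of the piecewise definition (PyTorch: `torch.nn.Hardshrink`, default `lambd=0.5`):

```python
import numpy as np

def hardshrink(x, lambd=0.5):
    # Keep x where |x| > lambda, zero it out otherwise
    return np.where(np.abs(x) > lambd, x, 0.0)
```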


HardTanh

$$HardTanh(x)=\begin{cases}
1, & \text{if}\ x > 1 \\
-1, & \text{if}\ x < -1 \\
x, & \text{otherwise}
\end{cases}$$
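
A NumPy sketch (PyTorch: `torch.nn.Hardtanh`); clipping to [-1, 1] reproduces the three cases:

```python
import numpy as np

def hardtanh(x, min_val=-1.0, max_val=1.0):
    # 1 if x > 1, -1 if x < -1, x otherwise
    return np.clip(x, min_val, max_val)
```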


LeakyReLU

$$LeakyReLU(x) = max(0,x) + \text{negative\_slope} * min(0,x)$$
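
A NumPy sketch (PyTorch: `torch.nn.LeakyReLU`, default `negative_slope=0.01`):

```python
import numpy as np

def leaky_relu(x, negative_slope=0.01):
    # LeakyReLU(x) = max(0, x) + negative_slope * min(0, x)
    return np.maximum(0, x) + negative_slope * np.minimum(0, x)
```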


LogSigmoid

$$LogSigmoid(x) = log\bigg(\frac{1}{1+exp(-x)}\bigg)$$
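
A NumPy sketch of the formula (PyTorch: `torch.nn.LogSigmoid`):

```python
import numpy as np

def log_sigmoid(x):
    # Direct translation of log(1 / (1 + exp(-x)));
    # a numerically stabler variant uses -np.log1p(np.exp(-x))
    return np.log(1.0 / (1.0 + np.exp(-x)))
```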


PReLU

$$PReLU(x) = max(0,x) + a * min(0,x)$$
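
A NumPy sketch of the formula; in PyTorch, `torch.nn.PReLU` learns `a` during training (initialized to 0.25), while here it is just a fixed parameter:

```python
import numpy as np

def prelu(x, a=0.25):
    # PReLU(x) = max(0, x) + a * min(0, x); `a` is a learnable slope in practice
    return np.maximum(0, x) + a * np.minimum(0, x)
```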


ReLU6

$$ReLU6(x) = min(max(0,x),6)$$
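
A NumPy sketch (PyTorch: `torch.nn.ReLU6`):

```python
import numpy as np

def relu6(x):
    # ReLU6(x) = min(max(0, x), 6): ReLU capped at 6
    return np.minimum(np.maximum(0, x), 6)
```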


SELU

$$SELU(x) = scale * (max(0,x) + min(0, \alpha * (exp(x)-1)))$$
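
A NumPy sketch; the constants below are the fixed values used in the SELU paper and by PyTorch's `torch.nn.SELU`:

```python
import numpy as np

def selu(x, alpha=1.6732632423543772, scale=1.0507009873554805):
    # SELU(x) = scale * (max(0, x) + min(0, alpha * (exp(x) - 1)))
    return scale * (np.maximum(0, x) + np.minimum(0, alpha * (np.exp(x) - 1)))
```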


Softplus

$$Softplus(x)=\frac{1}{\beta}log(1+exp(\beta x))$$
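
A NumPy sketch (PyTorch: `torch.nn.Softplus`, default `beta=1`):

```python
import numpy as np

def softplus(x, beta=1.0):
    # Softplus(x) = (1 / beta) * log(1 + exp(beta * x)): a smooth ReLU
    return np.log1p(np.exp(beta * x)) / beta
```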


Softshrink

$$SoftShrink(x)=\begin{cases}
x - \lambda, & \text{if}\ x > \lambda \\
x + \lambda, & \text{if}\ x < -\lambda \\
0, & \text{otherwise}
\end{cases}$$
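
A NumPy sketch of the piecewise definition (PyTorch: `torch.nn.Softshrink`, default `lambd=0.5`); the sign/max form reproduces all three cases:

```python
import numpy as np

def softshrink(x, lambd=0.5):
    # x - lambda for x > lambda, x + lambda for x < -lambda, 0 otherwise
    return np.sign(x) * np.maximum(np.abs(x) - lambd, 0.0)
```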


Softsign

$$SoftSign(x) = \frac{x}{1+|x|}$$
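
A NumPy sketch (PyTorch: `torch.nn.Softsign`):

```python
import numpy as np

def softsign(x):
    # SoftSign(x) = x / (1 + |x|), a gentler alternative to tanh
    return x / (1.0 + np.abs(x))
```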


Tanhshrink

$$Tanhshrink(x) = x - tanh(x)$$
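
A NumPy sketch (PyTorch: `torch.nn.Tanhshrink`):

```python
import numpy as np

def tanhshrink(x):
    # Tanhshrink(x) = x - tanh(x)
    return x - np.tanh(x)
```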