tensornn.activation
This file contains the activation functions of TensorNN. Activation functions transform their input to introduce non-linearity into the network, which allows it to model more complex problems. In use, they behave much like a layer.
Classes
Base activation class.

Exponential linear unit (ELU) is similar to ReLU, but instead of ReLU's sharp corner at zero it curves smoothly (exponentially) for negative inputs.

Leaky ReLU is extremely similar to ReLU, but negative inputs are scaled by a small slope instead of being zeroed out.

Haven't seen this one anywhere, so I am not sure if it is good, but it seemed like a good candidate.

Linear activation function; it doesn't change anything.

The rectified linear unit (ReLU) is one of the simplest activation functions: negative inputs become zero and positive inputs pass through unchanged.

The sigmoid function's output is always between 0 and 1. Formula: 1 / (1 + e^-x).

The softmax activation function is most commonly used in the output layer; it converts raw scores into values that sum to 1, so they can be read as probabilities.

The swish activation function is the output of the sigmoid function multiplied by x.

TODO: add description
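As a rough illustration of the functions summarized above, here is a plain-NumPy sketch of the standard formulas. These are generic implementations, not TensorNN's own classes or API; the function names and default parameters here are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch of the standard activation formulas.
# Not TensorNN's API; names and defaults are assumptions.

def relu(x):
    # max(0, x) elementwise: negatives become zero
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # like ReLU, but negative inputs are scaled by a small slope
    return np.where(x > 0, x, alpha * x)

def elu(x, alpha=1.0):
    # ReLU-like, but curves smoothly for negative inputs: alpha * (e^x - 1)
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def sigmoid(x):
    # squashes input into (0, 1): 1 / (1 + e^-x)
    return 1.0 / (1.0 + np.exp(-x))

def swish(x):
    # x multiplied by sigmoid(x)
    return x * sigmoid(x)

def softmax(x):
    # exponentiate, then normalize so the outputs sum to 1
    # (subtracting the max first keeps the exponentials from overflowing)
    e = np.exp(x - np.max(x))
    return e / e.sum()

x = np.array([-2.0, 0.0, 3.0])
print(relu(x))           # negatives zeroed, positives unchanged
print(softmax(x))        # sums to 1 (up to float rounding)
```

Each function applies elementwise except softmax, which normalizes over the whole input; that is why softmax is typically reserved for the output layer.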