tensornn.activation

This module contains the activation functions of TensorNN. An activation function transforms a layer's output to introduce non-linearity into the network, which allows your network to handle more complex problems. In use, an activation behaves much like a layer: it takes inputs and produces transformed outputs.

Classes

Activation

Base activation class.
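
This reference does not show the base class's interface, so the following is only a sketch of what such a base class commonly looks like; the forward/backward method names and the NumPy types are assumptions, not TensorNN's confirmed API.

    import numpy as np

    class Activation:
        """Hypothetical base class: activations map inputs elementwise."""

        def forward(self, inputs: np.ndarray) -> np.ndarray:
            """Apply the activation to a layer's raw outputs."""
            raise NotImplementedError

        def backward(self, inputs: np.ndarray) -> np.ndarray:
            """Derivative of the activation, used during backpropagation."""
            raise NotImplementedError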

ELU

The exponential linear unit (ELU) is similar to ReLU, but instead of cutting negative inputs off sharply at zero, it curves smoothly toward a negative limit, avoiding ReLU's corner at the origin.
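
A minimal NumPy sketch of the standard ELU formula, x for positive inputs and alpha * (e^x - 1) otherwise; the alpha parameter and its default of 1.0 are assumptions, since this reference does not state them.

    import numpy as np

    def elu(x: np.ndarray, alpha: float = 1.0) -> np.ndarray:
        # positive inputs pass through; negatives curve smoothly toward -alpha
        return np.where(x > 0, x, alpha * (np.exp(x) - 1))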

LeakyReLU

Leaky ReLU is very similar to ReLU, except negative inputs are scaled by a small slope instead of being set to zero.
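
A minimal NumPy sketch of the usual leaky ReLU; the slope parameter and its 0.01 default are common conventions, not values taken from TensorNN.

    import numpy as np

    def leaky_relu(x: np.ndarray, slope: float = 0.01) -> np.ndarray:
        # negative inputs keep a small slope instead of being zeroed out
        return np.where(x > 0, x, slope * x)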

NewtonsSerpentine

An experimental activation based on Newton's serpentine curve. It does not appear to be used elsewhere, so its effectiveness is unproven, but it seemed like a promising candidate.
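
This reference gives no formula; Newton's serpentine is the classical curve y = abx / (x^2 + a^2), so a plausible reading with a = b = 1 is sketched below. This is a guess at the intended function, not confirmed by the source.

    import numpy as np

    def newtons_serpentine(x: np.ndarray) -> np.ndarray:
        # serpentine curve with a = b = 1: odd, smooth, and bounded
        return x / (1 + x**2)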

NoActivation

The linear (identity) activation function; it passes its input through unchanged.
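
Since NoActivation is the identity, its sketch is a one-liner, shown for symmetry with the others:

    import numpy as np

    def no_activation(x: np.ndarray) -> np.ndarray:
        return x  # identity: output equals input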

ReLU

The rectified linear unit (ReLU) is one of the simplest activation functions: negative inputs become zero and positive inputs pass through unchanged.
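
A minimal NumPy sketch of the standard ReLU formula, max(0, x):

    import numpy as np

    def relu(x: np.ndarray) -> np.ndarray:
        # zero out negatives; positives pass through unchanged
        return np.maximum(0, x)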

Sigmoid

The sigmoid function's output is always between 0 and 1. Formula: 1 / (1 + e^(-x)) | constants: e (Euler's number, 2.718...)
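
A direct NumPy translation of the formula above:

    import numpy as np

    def sigmoid(x: np.ndarray) -> np.ndarray:
        # squashes any real input into the range (0, 1)
        return 1 / (1 + np.exp(-x))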

Softmax

The softmax activation function is most commonly used in the output layer; it converts a vector of raw scores into a probability distribution that sums to 1.
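
A minimal NumPy sketch of the standard softmax, e^(x_i) / sum(e^(x_j)); subtracting the maximum first is a common numerical-stability trick added here, not something this reference mentions.

    import numpy as np

    def softmax(x: np.ndarray) -> np.ndarray:
        # shift by the max so exp() cannot overflow; the result is unchanged
        shifted = np.exp(x - np.max(x))
        return shifted / np.sum(shifted)  # entries are positive and sum to 1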

Swish

The swish activation function multiplies the input by the sigmoid of the input: swish(x) = x * sigmoid(x).
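
A minimal NumPy sketch built directly from that description:

    import numpy as np

    def swish(x: np.ndarray) -> np.ndarray:
        # x times sigmoid(x)
        return x * (1 / (1 + np.exp(-x)))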

Tanh

The hyperbolic tangent function's output is always between -1 and 1. Formula: (e^x - e^(-x)) / (e^x + e^(-x)) | constants: e (Euler's number, 2.718...)
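
A minimal sketch; NumPy provides tanh directly:

    import numpy as np

    def tanh(x: np.ndarray) -> np.ndarray:
        # (e^x - e^-x) / (e^x + e^-x), squashing inputs into (-1, 1)
        return np.tanh(x)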