tensornn.loss

This module contains the loss functions used in TensorNN. A loss function measures how far off your neural network's predictions are from the desired values; this information is then used to train and improve the network.

Classes

BinaryCrossEntropy

Sigmoid is the only activation function compatible with BinaryCrossEntropy loss.
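
A minimal NumPy sketch of the textbook binary cross entropy formula, assuming pred holds sigmoid outputs in (0, 1) and desired holds 0/1 labels; the names and the clipping constant are illustrative, not necessarily TensorNN's exact implementation:

    import numpy as np

    def binary_cross_entropy(pred, desired, eps=1e-7):
        # clip predictions away from 0 and 1 to avoid log(0)
        pred = np.clip(pred, eps, 1 - eps)
        # average of -(y*log(p) + (1-y)*log(1-p))
        return np.mean(-(desired * np.log(pred) + (1 - desired) * np.log(1 - pred)))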

CategoricalCrossEntropy

It is recommended to use the Softmax activation function with this loss.
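
A sketch of the standard categorical cross entropy, assuming pred holds softmax outputs and desired is one-hot encoded; averaging over samples is a common convention and is assumed here:

    import numpy as np

    def categorical_cross_entropy(pred, desired, eps=1e-7):
        # clip predictions to avoid log(0)
        pred = np.clip(pred, eps, 1.0)
        # -sum over classes of desired*log(pred), averaged over samples
        return np.mean(-np.sum(desired * np.log(pred), axis=-1))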

Loss

Base loss class.

MAE

Mean absolute error is like MSE, but instead of squaring the differences, you take their absolute value.
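
A minimal NumPy sketch of that description (pred and desired are illustrative names):

    import numpy as np

    def mae(pred, desired):
        # average of |pred - desired|
        return np.mean(np.abs(pred - desired))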

MSE

Mean squared error is the average of the squared differences between the predicted and desired values.
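
A minimal NumPy sketch of that formula:

    import numpy as np

    def mse(pred, desired):
        # average of (pred - desired)^2
        return np.mean((pred - desired) ** 2)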

MSLE

Mean squared logarithmic error is like MSE, but the logarithm of the values is taken before subtracting.
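
A sketch following that description; implementations commonly use log(1 + x) so the loss stays defined at 0, which is assumed here:

    import numpy as np

    def msle(pred, desired):
        # average of (log(1+pred) - log(1+desired))^2
        return np.mean((np.log1p(pred) - np.log1p(desired)) ** 2)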

Poisson

Poisson loss is calculated with this formula: average of (pred-desired*logₑ(pred))
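
A minimal NumPy sketch of that formula (the small eps guard against log(0) is an added assumption):

    import numpy as np

    def poisson(pred, desired, eps=1e-7):
        # average of (pred - desired*ln(pred))
        return np.mean(pred - desired * np.log(pred + eps))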

RMSE

Root mean squared error is just MSE, but it includes a square root after taking the average.
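
A minimal NumPy sketch of that description:

    import numpy as np

    def rmse(pred, desired):
        # square root of the average of (pred - desired)^2
        return np.sqrt(np.mean((pred - desired) ** 2))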

RSS

Residual sum of squares loss is like MSE, but instead of taking the mean, you take the sum.
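
A minimal NumPy sketch of that description:

    import numpy as np

    def rss(pred, desired):
        # sum of (pred - desired)^2 instead of the mean
        return np.sum((pred - desired) ** 2)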

SquaredHinge

Squared hinge loss is calculated with this formula: max(0, 1-pred*desired)^2
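
A sketch of that formula, assuming desired labels are -1/+1 and averaging over samples, which is the common convention (not necessarily TensorNN's exact reduction):

    import numpy as np

    def squared_hinge(pred, desired):
        # max(0, 1 - pred*desired)^2, averaged over samples
        return np.mean(np.maximum(0.0, 1.0 - pred * desired) ** 2)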