tensornn.loss
This file contains the loss functions used in TensorNN. A loss function measures how far your neural network's predictions are from the desired outputs; that measurement is then used to improve/train the network.
Classes
- Sigmoid is the only activation function compatible with BinaryCrossEntropy loss.
- It is recommended to use the Softmax activation function with this loss.
- Base loss class.
- Mean absolute error is MSE, but instead of squaring the differences, you take their absolute values.
- Mean squared error is calculated very simply: average the squared differences between predictions and desired values.
- Mean squared logarithmic error is MSE, but the logarithm of the values is taken before subtraction.
- Poisson loss is calculated with this formula: average of (pred - desired*ln(pred)).
- Root mean squared error is MSE with a square root taken after averaging.
- Residual sum of squares is MSE, but you take the sum instead of the mean.
- Squared hinge loss is calculated with this formula: max(0, 1 - pred*desired)²
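The formulas above can be sketched directly in NumPy. This is an illustrative sketch only, not TensorNN's actual API; the variable names `pred` and `desired`, the example values, and the `eps` guard against log(0) are assumptions for the demonstration.

```python
import numpy as np

pred = np.array([0.2, 0.7, 0.1])     # example network outputs (assumed values)
desired = np.array([0.0, 1.0, 0.0])  # example desired outputs (assumed values)

# Mean squared error: average of the squared differences.
mse = np.mean((pred - desired) ** 2)

# Mean absolute error: absolute values instead of squares.
mae = np.mean(np.abs(pred - desired))

# Root mean squared error: square root after averaging.
rmse = np.sqrt(mse)

# Residual sum of squares: sum instead of mean.
rss = np.sum((pred - desired) ** 2)

# Mean squared logarithmic error: log the values before subtracting.
# (log1p is a common convention so that 0-valued entries are valid.)
msle = np.mean((np.log1p(pred) - np.log1p(desired)) ** 2)

# Poisson loss: average of (pred - desired * ln(pred)).
eps = 1e-7  # small constant to avoid log(0)
poisson = np.mean(pred - desired * np.log(pred + eps))

# Squared hinge loss: max(0, 1 - pred * desired)², averaged here;
# typically used with desired targets of -1/+1.
squared_hinge = np.mean(np.maximum(0.0, 1.0 - pred * desired) ** 2)
```

Note that the sum-versus-mean distinction (RSS vs. MSE) only rescales the loss by a constant, but it changes gradient magnitudes and therefore effective learning rates during training.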