LogLoss Class
Definition
Important
Some information relates to prerelease product that may be substantially modified before it’s released. Microsoft makes no warranties, express or implied, with respect to the information provided here.
The Log Loss, also known as the Cross-Entropy Loss, is commonly used in classification tasks.
public sealed class LogLoss : Microsoft.ML.Trainers.ILossFunction<float,float>, Microsoft.ML.Trainers.ISupportSdcaClassificationLoss
type LogLoss = class
interface ISupportSdcaClassificationLoss
interface ISupportSdcaLoss
interface IScalarLoss
interface ILossFunction<single, single>
interface IClassificationLoss
Public NotInheritable Class LogLoss
Implements ILossFunction(Of Single, Single), ISupportSdcaClassificationLoss
- Inheritance: Object → LogLoss
- Implements: ISupportSdcaClassificationLoss, ISupportSdcaLoss, IScalarLoss, ILossFunction<Single,Single>, IClassificationLoss
Remarks
The Log Loss function is defined as:
$L(p(\hat{y}), y) = -y \ln(p(\hat{y})) - (1 - y) \ln(1 - p(\hat{y}))$
where $\hat{y}$ is the predicted score, $p(\hat{y})$ is the probability of belonging to the positive class by applying a sigmoid function to the score, and $y \in \{0, 1\}$ is the true label.
Note that the labels used in this calculation are 0 and 1, unlike Hinge Loss and Exponential Loss, where the labels used are -1 and 1.
The Log Loss function provides a measure of how certain a classifier's predictions are, instead of just measuring how correct they are. For example, a predicted probability of 0.80 for a true label of 1 gets penalized more than a predicted probability of 0.99.
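A minimal sketch of the formula above, evaluating the loss through the ILossFunction<Single, Single> interface that LogLoss implements. It assumes Loss takes the raw predicted score as its first argument and the 0/1 label as its second; because the positive-class probability is the sigmoid of the score, the score passed in is the logit of the target probability.

using System;
using Microsoft.ML.Trainers;

class LogLossSketch
{
    static void Main()
    {
        ILossFunction<float, float> logLoss = new LogLoss();

        // The probability of the positive class is sigmoid(score), so to evaluate
        // the loss at a target probability p, pass the score logit(p) = ln(p / (1 - p)).
        float ScoreFor(double p) => (float)Math.Log(p / (1 - p));

        // True label is 1 (positive class).
        double lossAt080 = logLoss.Loss(ScoreFor(0.80), 1f); // about -ln(0.80) = 0.223
        double lossAt099 = logLoss.Loss(ScoreFor(0.99), 1f); // about -ln(0.99) = 0.010

        Console.WriteLine($"p = 0.80: loss = {lossAt080:F3}");
        Console.WriteLine($"p = 0.99: loss = {lossAt099:F3}");
    }
}

The more confident prediction (0.99) incurs a much smaller loss than the less confident one (0.80), which is the certainty-measuring behavior described above.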
Constructors
LogLoss()
Methods
ComputeDualUpdateInvariant(Single)
Derivative(Single, Single)
DualLoss(Single, Single)
DualUpdate(Single, Single, Single, Single, Int32)
Loss(Single, Single)
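Because LogLoss implements ISupportSdcaClassificationLoss, it is typically constructed only to be handed to an SDCA binary classification trainer. The sketch below assumes the SdcaNonCalibrated extension on MLContext accepts a lossFunction argument of that interface type; the InputRow class, sample rows, and column names are illustrative only.

using Microsoft.ML;
using Microsoft.ML.Data;
using Microsoft.ML.Trainers;

// Hypothetical input schema for illustration.
class InputRow
{
    public bool Label { get; set; }

    [VectorType(2)]
    public float[] Features { get; set; }
}

class SdcaWithLogLoss
{
    static void Main()
    {
        var mlContext = new MLContext();

        var data = mlContext.Data.LoadFromEnumerable(new[]
        {
            new InputRow { Label = true,  Features = new[] { 1f, 2f } },
            new InputRow { Label = false, Features = new[] { -1f, -2f } },
        });

        // Assumption: the non-calibrated SDCA binary trainer takes an
        // ISupportSdcaClassificationLoss, which LogLoss implements.
        var trainer = mlContext.BinaryClassification.Trainers.SdcaNonCalibrated(
            labelColumnName: "Label",
            featureColumnName: "Features",
            lossFunction: new LogLoss());

        var model = trainer.Fit(data);
    }
}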