Interface Loss
- All Superinterfaces:
BiFunction<Tensor, Tensor, Tensor>
Loss functions.
-
Method Summary
Static Methods
- static Loss bce()
  Binary Cross-Entropy Loss Function.
- static Loss bceWithLogits()
  Binary Cross-Entropy with Logits Loss Function.
- static Loss crossEntropy()
  Cross Entropy Loss Function.
- static Loss hingeEmbedding()
  Hinge Embedding Loss Function.
- static Loss huber(double delta)
  Huber Loss Function.
- static Loss kl()
  Kullback-Leibler Divergence Loss Function.
- static Loss l1()
  Mean Absolute Error (L1) Loss Function.
- static Tensor marginRanking(Tensor input1, Tensor input2, Tensor target)
  Margin Ranking Loss Function.
- static Loss mse()
  Mean Squared Error (L2) Loss Function.
- static Loss nll()
  Negative Log-Likelihood Loss Function.
- static Loss smoothL1()
  Smooth L1 (Huber) Loss Function.
- static Tensor tripleMarginRanking(Tensor anchor, Tensor positive, Tensor negative)
  Triplet Margin Ranking Loss Function.

Methods inherited from interface BiFunction
andThen, apply
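Because a loss is just a BiFunction of prediction and target, it can be stored, passed around, and invoked with apply. The sketch below illustrates that contract only: it uses double[] as a stand-in for the library's Tensor type, and the class and field names (LossSketch, MSE) are illustrative, not part of this API.

```java
import java.util.function.BiFunction;

// Sketch only: double[] stands in for Tensor, and the lambda is a
// hand-rolled mean squared error, not this interface's Loss.mse().
public class LossSketch {
    // A loss is a BiFunction<prediction, target, loss value>.
    static final BiFunction<double[], double[], Double> MSE = (pred, target) -> {
        double sum = 0.0;
        for (int i = 0; i < pred.length; i++) {
            double e = pred[i] - target[i];
            sum += e * e;
        }
        return sum / pred.length;  // mean of the squared errors
    };

    public static void main(String[] args) {
        double[] pred = {1.0, 2.0, 3.0};
        double[] target = {1.0, 2.0, 5.0};
        // Squared errors are {0, 0, 4}, so the mean is 4/3.
        System.out.println(MSE.apply(pred, target));
    }
}
```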
-
Method Details
-
l1
Mean Absolute Error (L1) Loss Function.
- Returns:
- the loss functor.
-
mse
Mean Squared Error (L2) Loss Function.
- Returns:
- the loss functor.
-
nll
Negative Log-Likelihood Loss Function.
- Returns:
- the loss functor.
-
crossEntropy
Cross Entropy Loss Function.
- Returns:
- the loss functor.
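Cross entropy is conventionally computed on raw scores (logits) as a log-softmax followed by the negative log-likelihood of the true class. The sketch below shows that formula with the usual max-shift for numerical stability; it is an illustration of the math, not this library's implementation, and the class name is hypothetical.

```java
public class CrossEntropyDemo {
    // Cross entropy for one example: -log softmax(logits)[target].
    // Formula sketch only, not the library's code.
    static double crossEntropy(double[] logits, int target) {
        double max = Double.NEGATIVE_INFINITY;
        for (double v : logits) max = Math.max(max, v);
        double sum = 0.0;
        for (double v : logits) sum += Math.exp(v - max);  // shifted for stability
        // log softmax(target) = logits[target] - max - log(sum)
        return -(logits[target] - max - Math.log(sum));
    }

    public static void main(String[] args) {
        double[] logits = {2.0, 0.5, 0.1};
        System.out.println(crossEntropy(logits, 0));  // small: class 0 has the top score
        System.out.println(crossEntropy(logits, 2));  // larger: class 2 scores low
    }
}
```

A quick sanity check: exponentiating the negated losses over all classes recovers the softmax probabilities, which sum to 1.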
-
hingeEmbedding
Hinge Embedding Loss Function.
- Returns:
- the loss functor.
-
bce
Binary Cross-Entropy Loss Function. Measures the binary cross-entropy between the target and the input probabilities. Input should be in [0, 1].
- Returns:
- the loss functor.
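The underlying formula is -mean(y*log(p) + (1-y)*log(1-p)) for probabilities p in [0, 1] and binary targets y. The sketch below illustrates that math directly; it is not this library's implementation, and the class name is hypothetical.

```java
public class BceDemo {
    // Binary cross-entropy over probabilities (formula sketch only):
    //   BCE = -mean( y*log(p) + (1-y)*log(1-p) )
    static double bce(double[] p, double[] y) {
        double sum = 0.0;
        for (int i = 0; i < p.length; i++) {
            sum += y[i] * Math.log(p[i]) + (1.0 - y[i]) * Math.log(1.0 - p[i]);
        }
        return -sum / p.length;
    }

    public static void main(String[] args) {
        // A confident correct prediction gives a small loss: -log(0.9)
        System.out.println(bce(new double[]{0.9}, new double[]{1.0}));
        // A confident wrong prediction gives a large loss: -log(0.1)
        System.out.println(bce(new double[]{0.9}, new double[]{0.0}));
    }
}
```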
-
bceWithLogits
Binary Cross-Entropy with Logits Loss Function. Combines a sigmoid activation and binary cross-entropy in a numerically stable way.
- Returns:
- the loss functor.
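The standard stable formulation is max(x, 0) - x*y + log(1 + exp(-|x|)), which avoids the overflow of computing the sigmoid first. The sketch below contrasts it with the naive version; this is an illustration of the technique, not this library's code, and the names are hypothetical.

```java
public class BceLogitsDemo {
    // Numerically stable BCE on a raw logit x (the usual log-sum-exp trick;
    // a sketch, not the library's implementation):
    //   loss = max(x, 0) - x*y + log(1 + exp(-|x|))
    static double bceWithLogits(double x, double y) {
        return Math.max(x, 0.0) - x * y + Math.log1p(Math.exp(-Math.abs(x)));
    }

    // Naive version: sigmoid first, then BCE. Saturates for large |x|.
    static double naive(double x, double y) {
        double p = 1.0 / (1.0 + Math.exp(-x));
        return -(y * Math.log(p) + (1.0 - y) * Math.log(1.0 - p));
    }

    public static void main(String[] args) {
        // Both agree for moderate logits.
        System.out.println(bceWithLogits(2.0, 1.0));
        System.out.println(naive(2.0, 1.0));
        // For an extreme logit, only the stable form stays finite.
        System.out.println(bceWithLogits(1000.0, 0.0));  // 1000.0
        System.out.println(naive(1000.0, 0.0));          // Infinity
    }
}
```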
-
smoothL1
Smooth L1 (Huber) Loss Function. Uses a squared term if the absolute element-wise error falls below beta (default 1) and an L1 term otherwise. This is less sensitive to outliers than MSE and avoids the gradient discontinuity of plain MAE.
- Returns:
- the loss functor.
-
huber
Huber Loss Function. Equivalent to smooth L1 when delta = 1.
- Parameters:
delta - the threshold at which to change between L1 and L2.
- Returns:
- the loss functor.
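Per element, the Huber loss is 0.5*e^2 when |e| <= delta and delta*(|e| - 0.5*delta) otherwise, so it is quadratic near zero and linear in the tails. A formula sketch (not this library's implementation; the class name is hypothetical):

```java
public class HuberDemo {
    // Per-element Huber loss:
    //   0.5 * e^2                 if |e| <= delta
    //   delta * (|e| - 0.5*delta) otherwise
    static double huber(double e, double delta) {
        double a = Math.abs(e);
        return a <= delta ? 0.5 * e * e : delta * (a - 0.5 * delta);
    }

    public static void main(String[] args) {
        System.out.println(huber(0.5, 1.0));  // 0.125: squared regime
        System.out.println(huber(3.0, 1.0));  // 2.5: linear regime, 1*(3 - 0.5)
        // With delta = 1 this is exactly the smooth L1 loss above.
    }
}
```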
-
kl
Kullback-Leibler Divergence Loss Function.
- Returns:
- the loss functor.
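The divergence itself is KL(p || q) = sum_i p_i * log(p_i / q_i). The sketch below illustrates that formula on plain probability vectors; note it is not this library's implementation, which (like many deep-learning APIs) may expect log-probabilities as input. The class name is hypothetical.

```java
public class KlDemo {
    // KL(p || q) = sum_i p_i * log(p_i / q_i), with the 0*log(0) = 0 convention.
    // Formula sketch only, not the library's code.
    static double kl(double[] p, double[] q) {
        double sum = 0.0;
        for (int i = 0; i < p.length; i++) {
            if (p[i] > 0.0) sum += p[i] * Math.log(p[i] / q[i]);
        }
        return sum;
    }

    public static void main(String[] args) {
        double[] p = {0.5, 0.5};
        System.out.println(kl(p, p));                       // 0.0: identical distributions
        System.out.println(kl(p, new double[]{0.9, 0.1})); // positive: q diverges from p
    }
}
```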
-
marginRanking
Margin Ranking Loss Function.
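The conventional margin ranking loss for a pair of scores is max(0, -y*(x1 - x2) + margin), where the target y = 1 means x1 should rank higher than x2 and y = -1 the reverse. A formula sketch on scalar scores (not this library's implementation; the names are hypothetical):

```java
public class MarginRankingDemo {
    // Margin ranking loss for one pair of scores (formula sketch only):
    //   loss = max(0, -y * (x1 - x2) + margin)
    static double marginRanking(double x1, double x2, double y, double margin) {
        return Math.max(0.0, -y * (x1 - x2) + margin);
    }

    public static void main(String[] args) {
        // Correct ordering by more than the margin: zero loss.
        System.out.println(marginRanking(2.0, 0.5, 1.0, 1.0));  // 0.0
        // Wrong ordering: penalized by the violation plus the margin.
        System.out.println(marginRanking(0.5, 2.0, 1.0, 1.0));  // 2.5
    }
}
```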
-
tripleMarginRanking
Triplet Margin Ranking Loss Function.
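The standard triplet margin loss over embeddings is max(0, d(anchor, positive) - d(anchor, negative) + margin), which pushes the positive closer to the anchor than the negative by at least the margin. A sketch with Euclidean distance (not this library's implementation; the names are hypothetical):

```java
public class TripletDemo {
    // Euclidean distance between two embedding vectors.
    static double dist(double[] u, double[] v) {
        double s = 0.0;
        for (int i = 0; i < u.length; i++) s += (u[i] - v[i]) * (u[i] - v[i]);
        return Math.sqrt(s);
    }

    // Triplet margin loss (formula sketch only):
    //   loss = max(0, d(anchor, positive) - d(anchor, negative) + margin)
    static double triplet(double[] a, double[] p, double[] n, double margin) {
        return Math.max(0.0, dist(a, p) - dist(a, n) + margin);
    }

    public static void main(String[] args) {
        double[] anchor = {0.0, 0.0};
        double[] positive = {0.1, 0.0};  // close to the anchor
        double[] negative = {5.0, 0.0};  // far from the anchor
        System.out.println(triplet(anchor, positive, negative, 1.0));  // 0.0
    }
}
```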
-