incerto.calibration.evidential_loss

incerto.calibration.evidential_loss(evidence, targets, num_classes, epoch, num_epochs, kl_weight=1.0)

Evidential Deep Learning loss.

Learns Dirichlet distributions over class probabilities, enabling second-order uncertainty estimation.

Loss = MSE(p, y) + λ * KL[Dir(α_tilde) || Dir(1)]

where α = evidence + 1 are the Dirichlet parameters, α̃ = y + (1 − y) ⊙ α removes the evidence assigned to the true class, and λ is annealed from 0 to kl_weight over training.

Reference:

Sensoy et al. “Evidential Deep Learning to Quantify Classification Uncertainty” (NeurIPS 2018)

Parameters:
  • evidence (Tensor) – Non-negative evidence values (N, C)

  • targets (Tensor) – Ground truth labels (N,)

  • num_classes (int) – Number of classes

  • epoch (int) – Current epoch

  • num_epochs (int) – Total number of epochs

  • kl_weight (float) – Maximum KL weight (default: 1.0)

Return type:

tuple[Tensor, Tensor, Tensor]

Returns:

tuple of (total_loss, mse_loss, kl_loss)

Example

>>> evidence = F.softplus(model(x))  # Ensure non-negative
>>> loss, mse, kl = evidential_loss(evidence, targets, 10, epoch, total_epochs)
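The loss can be reproduced directly from the formula above. The sketch below is not the library implementation: the name `evidential_loss_sketch`, the linear annealing schedule λ = kl_weight · min(1, epoch / num_epochs), and the use of the variance-augmented MSE term (Eq. 5 of Sensoy et al., 2018) are assumptions consistent with the docstring.

```python
import torch
import torch.nn.functional as F


def evidential_loss_sketch(evidence, targets, num_classes, epoch, num_epochs, kl_weight=1.0):
    # alpha = evidence + 1 are the Dirichlet concentration parameters
    alpha = evidence + 1.0
    y = F.one_hot(targets, num_classes).float()
    S = alpha.sum(dim=1, keepdim=True)   # Dirichlet strength
    p = alpha / S                        # expected class probabilities

    # MSE term: squared error plus the variance of the Dirichlet mean
    # (Eq. 5 in Sensoy et al., 2018)
    mse = ((y - p) ** 2 + p * (1 - p) / (S + 1)).sum(dim=1).mean()

    # alpha_tilde removes the evidence assigned to the true class,
    # so only misleading evidence is penalized by the KL term
    alpha_tilde = y + (1 - y) * alpha

    # KL[Dir(alpha_tilde) || Dir(1)] in closed form
    beta = torch.ones_like(alpha_tilde)
    S_tilde = alpha_tilde.sum(dim=1, keepdim=True)
    kl = (
        torch.lgamma(S_tilde).squeeze(1)
        - torch.lgamma(alpha_tilde).sum(dim=1)
        + torch.lgamma(beta).sum(dim=1)
        - torch.lgamma(beta.sum(dim=1, keepdim=True)).squeeze(1)
        + ((alpha_tilde - beta)
           * (torch.digamma(alpha_tilde) - torch.digamma(S_tilde))).sum(dim=1)
    ).mean()

    # anneal lambda from 0 up to kl_weight over training
    lam = kl_weight * min(1.0, epoch / num_epochs)
    return mse + lam * kl, mse, lam * kl
```

Once trained, the same Dirichlet parameters also give a second-order uncertainty estimate: the total uncertainty is commonly taken as `u = num_classes / S` (the vacuity of the Dirichlet), which approaches 1 when the model has collected no evidence.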