incerto.bayesian.expected_calibration_error

incerto.bayesian.expected_calibration_error(predictions, labels, n_bins=10)

Compute Expected Calibration Error for Bayesian predictions.

This is a convenience wrapper around incerto.calibration.ece_score that handles the mean prediction from an ensemble.
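For reference, the textbook ECE definition partitions the top-1 confidences into M bins B_m and averages the per-bin gap between accuracy and confidence, weighted by bin size. (Whether incerto.calibration.ece_score uses exactly this equal-width binning scheme should be confirmed against its own documentation.)

    ECE = \sum_{m=1}^{M} \frac{|B_m|}{n} \, \bigl| \mathrm{acc}(B_m) - \mathrm{conf}(B_m) \bigr|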

Parameters:
  • predictions (Tensor) – Mean predicted probabilities with shape (batch_size, num_classes). For ensembles, average the member predictions over the ensemble dimension first.

  • labels (Tensor) – True class labels with shape (batch_size,).

  • n_bins (int) – Number of confidence bins used when computing the calibration error.

Return type:

float

Returns:

The ECE score. Lower is better; 0 indicates perfect calibration.

Example

>>> import torch
>>> from incerto.bayesian import expected_calibration_error
>>> # Ensemble predictions: (n_members, batch_size, num_classes)
>>> ensemble_preds = torch.softmax(torch.randn(10, 32, 5), dim=-1)
>>> mean_preds = ensemble_preds.mean(dim=0)
>>> labels = torch.randint(0, 5, (32,))
>>> ece = expected_calibration_error(mean_preds, labels, n_bins=10)
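As a point of reference, the sketch below computes equal-width-bin ECE directly from mean predictions. It is an illustrative standalone implementation of the textbook definition above, not the incerto source; the name ece_reference is hypothetical.

import torch

def ece_reference(predictions, labels, n_bins=10):
    # Illustrative equal-width-bin ECE; not the incerto implementation.
    confidences, predicted = predictions.max(dim=-1)   # top-1 confidence and class
    accuracies = predicted.eq(labels).float()          # 1.0 where correct, else 0.0
    bin_edges = torch.linspace(0.0, 1.0, n_bins + 1)
    ece = torch.zeros(1)
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        in_bin = (confidences > lo) & (confidences <= hi)
        prop = in_bin.float().mean()                   # |B_m| / n
        if prop > 0:
            # Per-bin |accuracy - confidence| gap, weighted by bin mass
            gap = (accuracies[in_bin].mean() - confidences[in_bin].mean()).abs()
            ece += prop * gap
    return ece.item()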

See also

incerto.calibration.ece_score: The canonical ECE implementation