incerto.bayesian.expected_calibration_error
- incerto.bayesian.expected_calibration_error(predictions, labels, n_bins=10)
Compute Expected Calibration Error for Bayesian predictions.
This is a convenience wrapper around incerto.calibration.ece_score that handles the mean prediction from an ensemble.
- Parameters:
  - predictions – predicted class probabilities of shape (batch, n_classes); for an ensemble, pass the mean over ensemble members
  - labels – ground-truth class labels of shape (batch,)
  - n_bins – number of confidence bins used to estimate calibration (default: 10)
- Returns:
ECE score
Example
>>> # For ensemble predictions
>>> import torch
>>> ensemble_preds = torch.softmax(torch.randn(10, 32, 5), dim=-1)
>>> mean_preds = ensemble_preds.mean(dim=0)
>>> labels = torch.randint(0, 5, (32,))
>>> ece = expected_calibration_error(mean_preds, labels, n_bins=10)
See also
incerto.calibration.ece_score: The canonical ECE implementation
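For readers unfamiliar with the metric, the standard ECE computation that incerto.calibration.ece_score is described as implementing can be sketched in pure Python: bin samples by confidence, then take the bin-size-weighted average of |accuracy − confidence|. This is an illustrative sketch of the textbook definition, not the library's actual code; `ece_sketch` is a hypothetical name.

```python
def ece_sketch(confidences, predictions, labels, n_bins=10):
    """Illustrative sketch of Expected Calibration Error.

    confidences: max predicted probability per sample
    predictions: argmax class per sample
    labels:      ground-truth class per sample
    """
    n = len(labels)
    ece = 0.0
    for b in range(n_bins):
        lo, hi = b / n_bins, (b + 1) / n_bins
        # Samples whose confidence falls in bin (lo, hi]
        # (the first bin also catches an exact 0.0).
        idx = [i for i, c in enumerate(confidences)
               if lo < c <= hi or (b == 0 and c == 0.0)]
        if not idx:
            continue
        acc = sum(predictions[i] == labels[i] for i in idx) / len(idx)
        conf = sum(confidences[i] for i in idx) / len(idx)
        # Weight each bin's calibration gap by its share of samples.
        ece += len(idx) / n * abs(acc - conf)
    return ece
```

A perfectly calibrated predictor (e.g. 90% confidence with 90% accuracy in a bin) yields an ECE of 0, while a predictor that is 90% confident but always wrong yields 0.9.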