CalibrationPaper

This repository accompanies the paper "Calibration tests in multi-class classification: A unifying framework" by Widmann, Lindsten, and Zachariah, which was presented at NeurIPS 2019.

2021-05-04: We extended the calibration errors and tests to general probabilistic predictive models in our paper "Calibration tests beyond classification", which was presented at ICLR 2021.

Structure

The folder paper contains the LaTeX source code of the paper.

The folder experiments contains the source code and the results of our experiments.

The folder src contains shared implementations, such as the definitions of the generative models, which are used both for generating the figures in our paper and in some of the experiments.

Reproducibility

You can rerun our experiments and recompile our paper. Each folder contains instructions on how to build and run the files it contains.
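For example, the Julia parts of the repository can typically be set up by instantiating the pinned package environment of the corresponding folder. The snippet below is a minimal sketch using the standard Pkg workflow; the per-folder instructions remain the authoritative reference, and the folder path used here is only an assumption for illustration.

```julia
# Minimal sketch of the standard Pkg workflow; the instructions in each
# folder are authoritative, and the folder path below is an assumption.
using Pkg

Pkg.activate("experiments")  # activate the pinned environment of a folder
Pkg.instantiate()            # install the exact dependency versions
```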

Software

We published software packages for the proposed calibration errors and calibration tests.

Julia packages

  • CalibrationErrors.jl provides estimators of the proposed calibration errors for classification models.
  • CalibrationErrorsDistributions.jl extends the calibration errors to general probabilistic predictive models ("Calibration tests beyond classification").
  • CalibrationTests.jl implements the proposed calibration tests.

Python and R interfaces

  • pycalibration is a Python interface for CalibrationErrors.jl, CalibrationErrorsDistributions.jl, and CalibrationTests.jl.
  • rcalibration is an R interface for CalibrationErrors.jl, CalibrationErrorsDistributions.jl, and CalibrationTests.jl.
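
As a rough illustration of how the packages fit together, the sketch below estimates the expected calibration error (ECE) on synthetic data and computes the p-value of a consistency test. It assumes the interface of recent versions of CalibrationErrors.jl and CalibrationTests.jl (predictions given as a vector of probability vectors, outcomes as integer class labels, callable estimators); please consult the package documentation for the exact API of the version you install.

```julia
# Minimal sketch, assuming recent versions of CalibrationErrors.jl and
# CalibrationTests.jl; names and signatures may differ between releases.
using CalibrationErrors, CalibrationTests
using LinearAlgebra: normalize!
using Random

rng = MersenneTwister(1234)

# Synthetic data: 250 predicted probability vectors over 3 classes,
# together with the observed class labels.
predictions = [normalize!(rand(rng, 3), 1) for _ in 1:250]
outcomes = rand(rng, 1:3, 250)

# Expected calibration error (ECE) with 10 uniform bins per dimension.
ece = ECE(UniformBinning(10))
estimate = ece(predictions, outcomes)

# Calibration test based on consistency resampling of the ECE estimate;
# the test objects follow the HypothesisTests.jl interface.
test = ConsistencyTest(ece, predictions, outcomes)
p = pvalue(test)
```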
