A Unifying Theory of Distance from Calibration


We study the fundamental question of how to define and measure the distance from calibration for probabilistic predictors. While the notion of perfect calibration is well understood, there is no consensus on how to quantify the distance from perfect calibration. Numerous calibration measures have been proposed in the literature, but it is unclear how they compare to one another, and many popular measures, such as Expected Calibration Error (ECE), fail to satisfy basic properties like continuity. We present a rigorous framework for analyzing calibration measures, inspired by the literature on property testing. We propose a ground-truth notion of distance from calibration: the distance to the nearest perfectly calibrated predictor. We define a consistent calibration measure as one that is a polynomial-factor approximation to this distance. Applying our framework, we identify three calibration measures that are consistent and can be estimated efficiently: smooth calibration, interval calibration, and Laplace kernel calibration. The former two give quadratic approximations to the ground-truth distance, which we show is information-theoretically optimal. Our work thus establishes fundamental lower and upper bounds on measuring the distance to calibration, and also provides theoretical justification for preferring certain metrics (like Laplace kernel calibration) in practice.
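To make two of these measures concrete, here is a minimal Python sketch on synthetic data. The function names, the synthetic data, and the specific estimators are our own illustrations, not taken from the paper: `binned_ece` is the standard equal-width-bin ECE, and `laplace_kernel_ce_sq` is a standard unbiased U-statistic estimate of a squared kernel calibration error with the Laplace kernel, in the style of kernel calibration tests.

```python
# Illustrative sketch (assumptions: names, data, and estimators are ours,
# not the paper's) of two measures mentioned above: binned ECE and the
# Laplace kernel calibration error.
import numpy as np

def binned_ece(probs, labels, n_bins=10):
    """Expected Calibration Error with equal-width bins.

    Note the discontinuity the abstract refers to: nudging a prediction
    across a bin boundary can change the value abruptly.
    """
    # Assign each prediction to a bin; p = 1.0 goes into the last bin.
    idx = np.minimum((probs * n_bins).astype(int), n_bins - 1)
    ece = 0.0
    for b in range(n_bins):
        mask = idx == b
        if mask.any():
            # Bin weight times |mean label - mean prediction| within the bin.
            ece += mask.mean() * abs(labels[mask].mean() - probs[mask].mean())
    return ece

def laplace_kernel_ce_sq(probs, labels):
    """Unbiased U-statistic estimate of a squared kernel calibration error
    with the Laplace kernel k(p, q) = exp(-|p - q|):

        (1 / (n(n-1))) * sum_{i != j} (y_i - p_i)(y_j - p_j) k(p_i, p_j).

    Can be slightly negative on finite samples from a calibrated predictor.
    """
    n = len(probs)
    r = labels - probs                                    # residuals y - p
    k = np.exp(-np.abs(probs[:, None] - probs[None, :]))  # kernel matrix
    total = r @ k @ r - np.sum(r * r * np.diag(k))        # drop i == j terms
    return total / (n * (n - 1))

rng = np.random.default_rng(0)
p = rng.uniform(size=2000)
y_cal = (rng.uniform(size=2000) < p).astype(float)       # perfectly calibrated
y_off = (rng.uniform(size=2000) < p ** 2).astype(float)  # miscalibrated
print(binned_ece(p, y_cal), laplace_kernel_ce_sq(p, y_cal))  # both near 0
print(binned_ece(p, y_off), laplace_kernel_ce_sq(p, y_off))  # both clearly > 0
```

On the perfectly calibrated sample both estimates should be near zero, while on the miscalibrated sample both are clearly positive; the consistency guarantee described above says that measures like the kernel one track the ground-truth distance to calibration up to polynomial factors.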
