We formalize the idea of probability distributions that lead to reliable predictions about some, but not all, aspects of a domain. The resulting notion of 'safety' provides a fresh perspective on foundational issues in statistics, offering a middle ground between imprecise-probability and multiple-prior models on the one hand and strictly Bayesian approaches on the other. It also allows us to formalize fiducial distributions in terms of the set of random variables that they can safely predict, thus taking some of the sting out of the fiducial idea. By restricting probabilistic inference to safe uses, one also automatically avoids pitfalls such as those arising in the Monty Hall problem. Safety comes in a variety of degrees, such as 'validity' (the strongest notion), 'calibration', 'confidence safety' and 'unbiasedness' (almost the weakest notion).
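The Monty Hall problem mentioned above is a standard illustration of how an unsafe use of probability misleads: after the host opens a door, the naive "the prize is now 50-50 between the two closed doors" update is wrong, because it ignores how the host's choice was constrained. A minimal simulation (not from the paper; function names are illustrative) makes the gap visible by comparing the win rates of switching and staying:

```python
import random

def monty_hall_trial(switch, rng):
    """Play one round; return True if the contestant wins the prize."""
    doors = [0, 1, 2]
    prize = rng.choice(doors)
    pick = rng.choice(doors)
    # The host must open a door that hides no prize and was not picked.
    opened = rng.choice([d for d in doors if d != pick and d != prize])
    if switch:
        # Switch to the single remaining closed door.
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == prize

def win_rate(switch, n=100_000, seed=0):
    rng = random.Random(seed)
    return sum(monty_hall_trial(switch, rng) for _ in range(n)) / n

print(win_rate(switch=True))   # close to 2/3
print(win_rate(switch=False))  # close to 1/3
```

Switching wins about two thirds of the time, not half: the conditional distribution given the host's action is only safe to use if the host's protocol is modeled correctly.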
Journal: Journal of Statistical Planning and Inference
Project: Safe Bayesian Inference: A Theory of Misspecification based on Statistical Learning
Grünwald, P.D. (2018). Safe probability. Journal of Statistical Planning and Inference, 195, 47–63. doi:10.1016/j.jspi.2017.09.014