We give a novel, unified derivation of conditional PAC-Bayesian and mutual information (MI) generalization bounds. We derive conditional MI bounds as an instance, with a special choice of prior, of conditional MAC-Bayesian (Mean Approximately Correct) bounds, themselves derived from conditional PAC-Bayesian bounds, where ‘conditional’ means that one can use priors conditioned on a joint training and ghost sample. First, this allows us to obtain nontrivial PAC-Bayes and MI-style bounds for general VC classes, something recently shown to be impossible with standard PAC-Bayesian/MI bounds. Second, it allows us to obtain faster rates of order O((KL/n)^γ) for γ > 1/2 if a Bernstein condition holds, and for exp-concave losses (with γ = 1), which is impossible with both standard PAC-Bayes generalization and MI bounds. Our work extends the recent work of Steinke and Zakynthinou (2020), who handle MI with VC but neither PAC-Bayes nor fast rates; the recent work of Hellström and Durisi (2020), who extend the Steinke–Zakynthinou bounds to the PAC-Bayes setting via a unifying exponential inequality; and Mhammedi et al. (2019), who initiated fast-rate PAC-Bayes generalization error bounds but handle neither MI nor general VC classes.
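As an illustrative sketch only (not the paper's precise statement), the fast-rate bounds announced above take the following schematic form, where ρ is the learned posterior, π the (possibly sample-conditional) prior, L and L̂ the population and empirical risk, and n the sample size:

```latex
% Schematic fast-rate PAC-Bayes generalization bound (illustrative;
% the paper's theorems state the exact constants and conditions).
\mathbb{E}_{h \sim \rho}\!\left[ L(h) - \hat{L}(h) \right]
  \;=\; O\!\left( \left( \frac{\mathrm{KL}(\rho \,\|\, \pi)}{n} \right)^{\gamma} \right),
  \qquad \tfrac{1}{2} < \gamma \le 1 .
```

The standard PAC-Bayes/MI regime corresponds to γ = 1/2; under a Bernstein condition one gets γ > 1/2, and for exp-concave losses the abstract states the fastest rate γ = 1.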

Google Research, Zurich, Switzerland
Proceedings of Machine Learning Research
34th Annual Conference on Learning Theory (COLT 2021)

Grünwald, P., Steinke, T., & Zakynthinou, L. (2021). PAC-Bayes, MAC-Bayes and Conditional Mutual Information: Fast rate bounds that handle general VC classes. In 34th Conference on Learning Theory (pp. 2217–2247).