We show that for any concave utility, the expected utility of an e-variable can only increase after conditioning on a sufficient statistic. The simplest form of the result has an extremely straightforward proof, which follows from a single application of Jensen's inequality. Similar statements hold for compound e-variables, asymptotic e-variables, and e-processes. These results echo the Rao-Blackwell theorem, which states that the expected squared error of an estimator can only decrease after conditioning on a sufficient statistic. We provide several applications of this insight, including a simplified derivation of the log-optimal e-variable for linear regression with known variance.
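The core argument can be illustrated with a small numerical sketch. This toy example is not from the paper: we take two Bernoulli(p) flips under the null H0: p = 1/2, a base e-variable that is the likelihood ratio for an alternative q = 0.7 using only the first flip, and Rao-Blackwellize it by conditioning on the sufficient statistic T = X1 + X2. By sufficiency, the conditional law of X1 given T is parameter-free, so the conditioned e-variable is well-defined; by conditional Jensen's inequality, its expected log-utility under the alternative can only increase.

```python
import math

# Toy illustration (hypothetical example, not taken from the paper):
# two iid Bernoulli(p) flips, null H0: p = 1/2, alternative q = 0.7.
# Base e-variable uses only the first flip:
#   E(x1) = (q/0.5)^x1 * ((1-q)/0.5)^(1-x1).
q = 0.7
e = {0: (1 - q) / 0.5, 1: q / 0.5}  # E(0) = 0.6, E(1) = 1.4

# Rao-Blackwellized e-variable E'(t) = E[E(X1) | T = t], where T = X1 + X2
# is sufficient for p: T=0 forces X1=0, T=2 forces X1=1, and given T=1
# X1 is Bernoulli(1/2) regardless of p.
e_rb = {0: e[0], 1: 0.5 * (e[0] + e[1]), 2: e[1]}

def pmf_T(p):
    """Distribution of T = X1 + X2 for two iid Bernoulli(p) flips."""
    return {0: (1 - p) ** 2, 1: 2 * p * (1 - p), 2: p ** 2}

# Under the null, the conditioned e-variable still has expectation <= 1 ...
null_mean = sum(pmf_T(0.5)[t] * e_rb[t] for t in e_rb)

# ... and under the alternative its expected log-utility only improves,
# as conditional Jensen's inequality guarantees for any concave utility.
log_u_base = (1 - q) * math.log(e[0]) + q * math.log(e[1])
log_u_rb = sum(pmf_T(q)[t] * math.log(e_rb[t]) for t in e_rb)

print(f"null mean of RB e-variable: {null_mean:.3f}")
print(f"E[log E ] = {log_u_base:.4f}")
print(f"E[log E'] = {log_u_rb:.4f}")
```

Here the base e-variable earns expected log-utility of about 0.082 under the alternative, while the Rao-Blackwellized version earns about 0.119, with the null expectation still exactly 1.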


de Roos, D., Chugg, B., Grünwald, P., & Ramdas, A. (2025). Rao-Blackwellized e-variables. doi:10.48550/arXiv.2512.16759