We discuss a simple definition of conditional mutual information (CMI) for fields and $\sigma$-fields. The new definition also applies in nonregular cases, unlike the well-known but more restricted definition of CMI due to Dobrushin. We establish certain properties of the two notions of CMI and their equivalence for countably generated $\sigma$-fields. We also consider an application concerning the ergodic decomposition of mutual information for stationary processes. In this case, CMI is tightly linked, via the additivity of information, with entropy defined as self-information. We therefore reconsider the latter concept in some detail.

Statistics & Probability Letters

Debowski, L. J. (2009). A general definition of conditional information and its application to ergodic decomposition. Statistics & Probability Letters, 79, 1260–1268.