Relative information and the dual numbers
- 10 January 2025
Relative information (Kullback–Leibler divergence) is a key concept in statistics, machine learning and information theory. In fact, singular learning theory tells us that the generalization error of a learning algorithm depends on the structure of the algebro-geometric singularities of relative information. This leads us to wonder: what properties of relative information contribute to its ubiquity? In this talk, we will study fundamental properties of conditional relative information I_{q||p}(Y|X), where q, p are joint distributions on random variables X, Y. We define the rig category C of random variables and their conditional maps, as well as the rig category R(ε) of dual numbers. Relative information can then be constructed, up to a scalar multiple, via rig monoidal functors from C to R(ε). Closely related to this construction is the information cohomology of Baudot, Bennequin and Vigneaux.
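For readers who want the quantity spelled out: under the usual convention for finite random variables (the talk fixes its own definitions, so take this as the standard one), conditional relative information is

\[
I_{q\|p}(Y \mid X) \;=\; \sum_{x,y} q(x,y)\,\log\frac{q(y \mid x)}{p(y \mid x)},
\]

and it satisfies the chain rule

\[
I_{q\|p}(X,Y) \;=\; I_{q\|p}(X) \;+\; I_{q\|p}(Y \mid X).
\]

This additivity under composition is, roughly speaking, the kind of structure the dual numbers can carry: since ε² = 0, multiplying a + bε and c + dε gives ac + (ad + bc)ε, so the ratios of ε-parts to real parts add, and a multiplicative functor into R(ε) can turn composition of conditional maps into addition of information.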
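As a minimal numerical sketch (my own illustration, not code from the talk), here is a check of the chain rule under the standard definitions above, with q, p given as joint distributions on a finite X × Y:

```python
# Conditional relative information and its chain rule, numerically.
import numpy as np

def relative_information(q, p):
    """I_{q||p} = sum q * log(q / p), with the convention 0 log 0 = 0."""
    q, p = np.asarray(q, float), np.asarray(p, float)
    mask = q > 0
    return np.sum(q[mask] * np.log(q[mask] / p[mask]))

def conditional_relative_information(q, p):
    """I_{q||p}(Y|X) for joint distributions q, p on X x Y (rows indexed by x)."""
    qx = q.sum(axis=1, keepdims=True)   # marginal q(x)
    px = p.sum(axis=1, keepdims=True)   # marginal p(x)
    mask = q > 0
    # sum_{x,y} q(x,y) log( q(y|x) / p(y|x) )
    ratio = (q / qx)[mask] / (p / px)[mask]
    return np.sum(q[mask] * np.log(ratio))

# Two joint distributions on a 2 x 3 state space X x Y.
q = np.array([[0.10, 0.20, 0.10],
              [0.25, 0.15, 0.20]])
p = np.array([[0.15, 0.15, 0.10],
              [0.20, 0.20, 0.20]])

# Chain rule: I(X,Y) = I(X) + I(Y|X).
lhs = relative_information(q, p)
rhs = (relative_information(q.sum(axis=1), p.sum(axis=1))
       + conditional_relative_information(q, p))
assert np.isclose(lhs, rhs)
```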