KL Divergence #
This file imports Mathlib's Kullback-Leibler divergence development for use in the project.
On a finite measurable space, the Kullback-Leibler divergence between finite measures is ∞
exactly when some singleton has zero ν-mass and nonzero μ-mass; otherwise it is the finite
sum of μ {ω} * log (μ {ω} / ν {ω}) over the singleton masses, plus Mathlib's
finite-measure correction term.
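In conventional notation this dichotomy reads as follows; the correction term is written abstractly as c(μ, ν), since this file refers to it only as Mathlib's finite-measure correction term (by the probability-measure statement below, it vanishes when μ and ν are probability measures):

```latex
\mathrm{KL}(\mu \,\|\, \nu) =
\begin{cases}
  \infty
    & \text{if } \nu\{\omega\} = 0 \text{ and } \mu\{\omega\} \neq 0
      \text{ for some } \omega, \\[4pt]
  \displaystyle\sum_{\omega} \mu\{\omega\}
    \log \frac{\mu\{\omega\}}{\nu\{\omega\}} + c(\mu, \nu)
    & \text{otherwise.}
\end{cases}
```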
On a finite measurable space, the Kullback-Leibler divergence between probability measures is
∞ if some singleton has zero ν-mass and nonzero μ-mass; otherwise it is the finite sum of
p ω * log (p ω / q ω), where p and q are the PMFs of the two measures.
On a finite measurable space, the Kullback-Leibler divergence between PMFs is ∞ if some
point has zero q-mass and nonzero p-mass; otherwise it is ∑ ω, p ω * log (p ω / q ω).
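A minimal Lean sketch of the finite-sum expression, assuming p and q are the weight functions of the two PMFs; `klSum` is a hypothetical name used for illustration and is not Mathlib's `klDiv`:

```lean
import Mathlib

/-- Hypothetical helper (not a Mathlib name): the discrete KL sum over a
finite type. This covers only the "otherwise" branch above; it does not
handle the case where some `q ω` vanishes while `p ω` does not. -/
noncomputable def klSum {Ω : Type*} [Fintype Ω] (p q : Ω → ℝ) : ℝ :=
  ∑ ω, p ω * Real.log (p ω / q ω)
```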
On a finite space, the KL divergence from a measure μ to a probability measure with full
support is finite.