
CommunicationComplexity.InformationTheory.KLDivergence

KL Divergence #

This file imports Mathlib's Kullback-Leibler divergence development for use in the project.

On a finite measurable space, the Kullback-Leibler divergence between finite measures is ⊤ if some singleton has zero ν-mass and nonzero μ-mass; otherwise it is the finite sum of the log-likelihood ratio weighted by the singleton masses, with Mathlib's finite-measure correction term.
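Spelled out, and assuming Mathlib's convention that klDiv between finite measures carries the total-mass correction term ν(Ω) − μ(Ω) (an assumption about the convention, not stated in this file), the finite branch is:

```latex
D_{\mathrm{KL}}(\mu \,\|\, \nu)
  \;=\; \sum_{\omega \in \Omega} \mu\{\omega\}\,
        \log\frac{\mu\{\omega\}}{\nu\{\omega\}}
  \;+\; \nu(\Omega) - \mu(\Omega)
```

For probability measures the correction term vanishes, since ν(Ω) = μ(Ω) = 1, which is why it does not appear in the probability-measure statements below.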

theorem CommunicationComplexity.FiniteMeasureSpace.probabilityMeasure_klDiv_eq_sum_llr {Ω : Type u_1} [MeasurableSpace Ω] [FiniteMeasureSpace Ω] (μ ν : MeasureTheory.ProbabilityMeasure Ω) :
InformationTheory.klDiv ↑μ ↑ν = if ∃ (ω : Ω), (↑ν).toPMF ω = 0 ∧ (↑μ).toPMF ω ≠ 0 then ⊤ else ENNReal.ofReal (∑ ω : Ω, ((↑μ).toPMF ω).toReal * MeasureTheory.llr (↑μ) (↑ν) ω)

On a finite measurable space, the Kullback-Leibler divergence between probability measures is ⊤ if some singleton has zero ν-mass and nonzero μ-mass; otherwise it is a finite sum over the PMFs.

On a finite measurable space, the Kullback-Leibler divergence between PMFs is ⊤ if some point has zero q-mass and nonzero p-mass; otherwise it is a finite sum over the PMFs.

On a finite measurable space, the Kullback-Leibler divergence between finite measures is ⊤ if some singleton has zero ν-mass and nonzero μ-mass; otherwise it is the finite sum of μ {ω} * log (μ {ω} / ν {ω}), with Mathlib's finite-measure correction term.

theorem CommunicationComplexity.FiniteMeasureSpace.probabilityMeasure_klDiv_eq_sum_log {Ω : Type u_1} [MeasurableSpace Ω] [FiniteMeasureSpace Ω] (μ ν : MeasureTheory.ProbabilityMeasure Ω) :
InformationTheory.klDiv ↑μ ↑ν = if ∃ (ω : Ω), (↑ν).toPMF ω = 0 ∧ (↑μ).toPMF ω ≠ 0 then ⊤ else ENNReal.ofReal (∑ ω : Ω, ((↑μ).toPMF ω).toReal * Real.log (((↑μ).toPMF ω).toReal / ((↑ν).toPMF ω).toReal))

On a finite measurable space, the Kullback-Leibler divergence between probability measures is ⊤ if some singleton has zero ν-mass and nonzero μ-mass; otherwise it is the finite sum of p ω * log (p ω / q ω), where p and q are the PMFs of the two measures.
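As a worked instance of the finite branch (illustrative numbers, not from this file): take Ω = {0, 1} with PMFs p = (1/2, 1/2) and q = (3/4, 1/4). Both have full support, so the sum formula applies:

```latex
D_{\mathrm{KL}}(p \,\|\, q)
  = \tfrac{1}{2}\log\frac{1/2}{3/4} + \tfrac{1}{2}\log\frac{1/2}{1/4}
  = \tfrac{1}{2}\log\tfrac{2}{3} + \tfrac{1}{2}\log 2
  = \tfrac{1}{2}\log\tfrac{4}{3}
```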

On a finite space, the KL divergence from μ to a probability measure with full support is finite.

On a finite measurable space, the Kullback-Leibler divergence between PMFs is ⊤ if some point has zero q-mass and nonzero p-mass; otherwise it is ∑ ω, p ω * log (p ω / q ω).
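The ⊤ branch is the usual absolute-continuity failure; an illustrative pair of PMFs on {0, 1} (example numbers, not from this file):

```latex
p = \bigl(\tfrac{1}{2}, \tfrac{1}{2}\bigr),\quad q = (1, 0)
  \;\Longrightarrow\;
  q(1) = 0 \text{ and } p(1) \neq 0,
  \text{ so } D_{\mathrm{KL}}(p \,\|\, q) = \top
```

Swapping the roles gives a finite value: q is supported inside the support of p, so with the convention 0 · log 0 = 0 one gets D_KL(q ‖ p) = 1 · log(1 / (1/2)) = log 2.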