
CommunicationComplexity.InformationTheory.Entropy

Entropy

This file imports the entropy definitions and basic API from PFR.ForMathlib.Entropy.Basic.

The main definitions live in the ProbabilityTheory namespace. The file also provides notations such as H[X], H[X ; μ], H[X | Y], and I[X : Y].

theorem ProbabilityTheory.entropy_le_log_of_card_le {Ω : Type u_1} {S : Type u_2} [MeasurableSpace Ω] [MeasurableSpace S] [Fintype S] [Nonempty S] [MeasurableSingletonClass S] (X : Ω → S) (μ : MeasureTheory.Measure Ω) {N : ℕ} (hcard : Fintype.card S ≤ N) :
H[X ; μ] ≤ Real.log N

Entropy is bounded by the logarithm of any natural-number upper bound on the alphabet size.
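In conventional notation (a restatement for orientation, not part of the Lean source), writing $p_s$ for the point masses of the law of $X$:

```latex
H[X ; \mu] \;=\; -\sum_{s \in S} p_s \log p_s
\;\le\; \log |S| \;\le\; \log N,
\qquad p_s = \mu\!\left(X^{-1}\{s\}\right).
```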

Entropy is bounded by c * log 2 when the alphabet has size at most 2 ^ c.

Mutual information is bounded by the entropy of the left variable.

Mutual information is bounded by the entropy of the right variable.

theorem ProbabilityTheory.mutualInfo_congr_ae {Ω : Type u_1} {S : Type u_2} [MeasurableSpace Ω] [MeasurableSpace S] {T : Type u_3} [MeasurableSpace T] {X : Ω → S} {Y : Ω → T} {μ : MeasureTheory.Measure Ω} {X' : Ω → S} {Y' : Ω → T} (hX : Measurable X) (hY : Measurable Y) (hXae : X =ᵐ[μ] X') (hYae : Y =ᵐ[μ] Y') :
I[X : Y ; μ] = I[X' : Y' ; μ]

Mutual information is unchanged when both variables are replaced by almost-everywhere equal variables.

Mutual information is unchanged by an injective recoding of the right variable.

theorem ProbabilityTheory.mutualInfo_eq_zero_of_ae_eq_const_left {Ω : Type u_1} {S : Type u_2} [MeasurableSpace Ω] [MeasurableSpace S] {T : Type u_3} [MeasurableSpace T] [MeasurableSingletonClass S] [MeasurableSingletonClass T] {X : Ω → S} {Y : Ω → T} {μ : MeasureTheory.Measure Ω} [MeasureTheory.IsZeroOrProbabilityMeasure μ] [FiniteRange X] [FiniteRange Y] (hX : Measurable X) (hY : Measurable Y) (c : S) (hconst : X =ᵐ[μ] fun (x : Ω) => c) :
I[X : Y ; μ] = 0

If the left variable is almost surely constant, its mutual information with any finite variable is zero.

theorem ProbabilityTheory.mutualInfo_eq_zero_of_ae_eq_const_right {Ω : Type u_1} {S : Type u_2} [MeasurableSpace Ω] [MeasurableSpace S] {T : Type u_3} [MeasurableSpace T] [MeasurableSingletonClass S] [MeasurableSingletonClass T] [Countable S] [Countable T] {X : Ω → S} {Y : Ω → T} {μ : MeasureTheory.Measure Ω} [MeasureTheory.IsZeroOrProbabilityMeasure μ] [FiniteRange X] [FiniteRange Y] (hX : Measurable X) (hY : Measurable Y) (c : T) (hconst : Y =ᵐ[μ] fun (x : Ω) => c) :
I[X : Y ; μ] = 0

If the right variable is almost surely constant, its mutual information with any finite variable is zero.

theorem ProbabilityTheory.condMutualInfo_eq_zero_of_ae_eq_const_left {Ω : Type u_1} {S : Type u_2} [MeasurableSpace Ω] [MeasurableSpace S] {T : Type u_3} {U : Type u_4} [MeasurableSpace T] [MeasurableSpace U] [MeasurableSingletonClass S] [MeasurableSingletonClass T] [MeasurableSingletonClass U] [Countable U] {X : Ω → S} {Y : Ω → T} {Z : Ω → U} {μ : MeasureTheory.Measure Ω} [MeasureTheory.IsZeroOrProbabilityMeasure μ] [FiniteRange X] [FiniteRange Y] [FiniteRange Z] (hX : Measurable X) (hY : Measurable Y) (c : S) (hconst : X =ᵐ[μ] fun (x : Ω) => c) :
I[X : Y|Z;μ] = 0

If the left variable is almost surely constant, its conditional mutual information with any finite variable is zero.

theorem ProbabilityTheory.condMutualInfo_eq_zero_of_ae_eq_const_right {Ω : Type u_1} {S : Type u_2} [MeasurableSpace Ω] [MeasurableSpace S] {T : Type u_3} {U : Type u_4} [MeasurableSpace T] [MeasurableSpace U] [MeasurableSingletonClass S] [MeasurableSingletonClass T] [MeasurableSingletonClass U] [Countable S] [Countable T] [Countable U] {X : Ω → S} {Y : Ω → T} {Z : Ω → U} {μ : MeasureTheory.Measure Ω} [MeasureTheory.IsZeroOrProbabilityMeasure μ] [FiniteRange X] [FiniteRange Y] [FiniteRange Z] (hX : Measurable X) (hY : Measurable Y) (c : T) (hconst : Y =ᵐ[μ] fun (x : Ω) => c) :
I[X : Y|Z;μ] = 0

If the right variable is almost surely constant, its conditional mutual information with any finite variable is zero.

theorem ProbabilityTheory.indepFun_of_measureReal_inter_preimage_singleton_eq_mul {Ω : Type u_5} {S : Type u_6} {T : Type u_7} [MeasurableSpace Ω] [MeasurableSpace S] [MeasurableSpace T] [MeasurableSingletonClass S] [MeasurableSingletonClass T] [Finite S] [Finite T] (μ : MeasureTheory.Measure Ω) [MeasureTheory.IsFiniteMeasure μ] (X : Ω → S) (Y : Ω → T) (hX : Measurable X) (hY : Measurable Y) (h : ∀ (x : S) (y : T), μ.real (X ⁻¹' {x} ∩ Y ⁻¹' {y}) = μ.real (X ⁻¹' {x}) * μ.real (Y ⁻¹' {y})) :
IndepFun X Y μ

For finite alphabets, independence of two random variables follows from factorization on singleton fibers.
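In conventional notation (a restatement, not part of the Lean source): factorization of all singleton joint masses suffices because, on finite alphabets, the point masses determine the joint law.

```latex
\Bigl(\forall x \in S,\ \forall y \in T:\quad
\mu(X = x,\ Y = y) \;=\; \mu(X = x)\,\mu(Y = y)\Bigr)
\;\Longrightarrow\; X \perp\!\!\!\perp Y .
```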

Conditional mutual information is bounded by the left conditional entropy.

Conditional mutual information is bounded by the right conditional entropy.

Conditional mutual information is bounded by the entropy of the left variable.

Conditional mutual information is bounded by the entropy of the right variable.

theorem ProbabilityTheory.condMutualInfo_comp_left_le_of_comp_conditioning {Ω : Type u_1} {S : Type u_2} [MeasurableSpace Ω] [MeasurableSpace S] {T : Type u_3} {U : Type u_4} [MeasurableSpace T] [MeasurableSpace U] [MeasurableSingletonClass S] [MeasurableSingletonClass T] [MeasurableSingletonClass U] [Countable S] [Countable T] [Countable U] {X : Ω → S} {Y : Ω → T} {Z : Ω → U} {μ : MeasureTheory.Measure Ω} {V : Type u_5} [MeasurableSpace V] [MeasurableSingletonClass V] [Countable V] [MeasureTheory.IsProbabilityMeasure μ] [FiniteRange X] [FiniteRange Y] [FiniteRange Z] (hX : Measurable X) (hY : Measurable Y) (hZ : Measurable Z) (f : U → S → V) :
I[fun (ω : Ω) => f (Z ω) (X ω) : Y|Z;μ] ≤ I[X : Y|Z;μ]

Conditional data processing where the left-side postprocessing may depend on the conditioning value.

theorem ProbabilityTheory.condMutualInfo_comp_right_le_of_comp_conditioning {Ω : Type u_1} {S : Type u_2} [MeasurableSpace Ω] [MeasurableSpace S] {T : Type u_3} {U : Type u_4} [MeasurableSpace T] [MeasurableSpace U] [MeasurableSingletonClass S] [MeasurableSingletonClass T] [MeasurableSingletonClass U] [Countable S] [Countable T] [Countable U] {X : Ω → S} {Y : Ω → T} {Z : Ω → U} {μ : MeasureTheory.Measure Ω} {V : Type u_5} [MeasurableSpace V] [MeasurableSingletonClass V] [Countable V] [MeasureTheory.IsProbabilityMeasure μ] [FiniteRange X] [FiniteRange Y] [FiniteRange Z] (hX : Measurable X) (hY : Measurable Y) (hZ : Measurable Z) (f : U → T → V) (hf : Measurable (Function.uncurry f)) :
I[X : fun (ω : Ω) => f (Z ω) (Y ω)|Z;μ] ≤ I[X : Y|Z;μ]

Conditional data processing where the right-side postprocessing may depend on the conditioning value.

theorem ProbabilityTheory.condMutualInfo_conditioning_prod_function_eq {Ω : Type u_1} {S : Type u_2} [MeasurableSpace Ω] [MeasurableSpace S] {T : Type u_3} {U : Type u_4} [MeasurableSpace T] [MeasurableSpace U] [MeasurableSingletonClass S] [MeasurableSingletonClass T] [MeasurableSingletonClass U] [Countable S] [Countable T] [Countable U] {X : Ω → S} {Y : Ω → T} {Z : Ω → U} {μ : MeasureTheory.Measure Ω} {V : Type u_5} [MeasurableSpace V] [MeasurableSingletonClass V] [Countable V] [MeasureTheory.IsZeroOrProbabilityMeasure μ] [FiniteRange X] [FiniteRange Y] [FiniteRange Z] (hX : Measurable X) (hY : Measurable Y) (hZ : Measurable Z) (f : U → V) :
I[X : Y|fun (ω : Ω) => (Z ω, f (Z ω));μ] = I[X : Y|Z;μ]

Adding a deterministic function of the conditioning variable to the conditioning variable does not change conditional mutual information.

theorem ProbabilityTheory.condMutualInfo_conditioning_prod_left_function_le {Ω : Type u_1} {S : Type u_2} [MeasurableSpace Ω] [MeasurableSpace S] {T : Type u_3} {U : Type u_4} [MeasurableSpace T] [MeasurableSpace U] [MeasurableSingletonClass S] [MeasurableSingletonClass T] [MeasurableSingletonClass U] [Countable S] [Countable T] [Countable U] {X : Ω → S} {Y : Ω → T} {Z : Ω → U} {μ : MeasureTheory.Measure Ω} {V : Type u_5} [MeasurableSpace V] [MeasurableSingletonClass V] [Countable V] [MeasureTheory.IsZeroOrProbabilityMeasure μ] [FiniteRange X] [FiniteRange Y] [FiniteRange Z] (hX : Measurable X) (hY : Measurable Y) (hZ : Measurable Z) (f : U → S → V) (hf : Measurable (Function.uncurry f)) :
I[X : Y|fun (ω : Ω) => (Z ω, f (Z ω) (X ω));μ] ≤ I[X : Y|Z;μ]

Conditioning additionally on a deterministic function of the left variable and the existing conditioning data cannot increase conditional mutual information.

theorem ProbabilityTheory.condMutualInfo_conditioning_prod_right_function_le {Ω : Type u_1} {S : Type u_2} [MeasurableSpace Ω] [MeasurableSpace S] {T : Type u_3} {U : Type u_4} [MeasurableSpace T] [MeasurableSpace U] [MeasurableSingletonClass S] [MeasurableSingletonClass T] [MeasurableSingletonClass U] [Countable S] [Countable T] [Countable U] {X : Ω → S} {Y : Ω → T} {Z : Ω → U} {μ : MeasureTheory.Measure Ω} {V : Type u_5} [MeasurableSpace V] [MeasurableSingletonClass V] [Countable V] [MeasureTheory.IsZeroOrProbabilityMeasure μ] [FiniteRange X] [FiniteRange Y] [FiniteRange Z] (hX : Measurable X) (hY : Measurable Y) (hZ : Measurable Z) (f : U → T → V) (hf : Measurable (Function.uncurry f)) :
I[X : Y|fun (ω : Ω) => (Z ω, f (Z ω) (Y ω));μ] ≤ I[X : Y|Z;μ]

Conditioning additionally on a deterministic function of the right variable and the existing conditioning data cannot increase conditional mutual information.

If a coarse conditioning variable is a deterministic function of a finer one, and the finer conditioning variable carries no conditional information about X beyond the coarse variable, then the finer conditioning can only increase I[X : Y | -].

theorem ProbabilityTheory.condMutualInfo_prod_left_eq_add {Ω : Type u_1} {S : Type u_2} [MeasurableSpace Ω] [MeasurableSpace S] {T : Type u_3} {U : Type u_4} [MeasurableSpace T] [MeasurableSpace U] [MeasurableSingletonClass S] [MeasurableSingletonClass T] [MeasurableSingletonClass U] [Countable S] [Countable T] [Countable U] {X : Ω → S} {Y : Ω → T} {Z : Ω → U} {μ : MeasureTheory.Measure Ω} {V : Type u_5} [MeasurableSpace V] [MeasurableSingletonClass V] [Countable V] {W : Ω → V} (hX : Measurable X) (hW : Measurable W) (hY : Measurable Y) (hZ : Measurable Z) [MeasureTheory.IsZeroOrProbabilityMeasure μ] [FiniteRange X] [FiniteRange W] [FiniteRange Y] [FiniteRange Z] :
I[fun (ω : Ω) => (X ω, W ω) : Y|Z;μ] = I[X : Y|Z;μ] + I[W : Y|fun (ω : Ω) => (X ω, Z ω);μ]

Chain rule for conditional mutual information, splitting a pair on the left.
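In conventional notation (a restatement, not part of the Lean source):

```latex
I\big[(X, W) : Y \,\big|\, Z\big]
\;=\; I\big[X : Y \,\big|\, Z\big]
\;+\; I\big[W : Y \,\big|\, (X, Z)\big].
```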

theorem ProbabilityTheory.condMutualInfo_prod_right_eq_add {Ω : Type u_1} {S : Type u_2} [MeasurableSpace Ω] [MeasurableSpace S] {T : Type u_3} {U : Type u_4} [MeasurableSpace T] [MeasurableSpace U] [MeasurableSingletonClass S] [MeasurableSingletonClass T] [MeasurableSingletonClass U] [Countable S] [Countable T] [Countable U] {X : Ω → S} {Y : Ω → T} {Z : Ω → U} {μ : MeasureTheory.Measure Ω} {V : Type u_5} [MeasurableSpace V] [MeasurableSingletonClass V] [Countable V] {W : Ω → V} (hX : Measurable X) (hY : Measurable Y) (hW : Measurable W) (hZ : Measurable Z) [MeasureTheory.IsZeroOrProbabilityMeasure μ] [FiniteRange X] [FiniteRange Y] [FiniteRange W] [FiniteRange Z] :
I[X : fun (ω : Ω) => (Y ω, W ω)|Z;μ] = I[X : Y|Z;μ] + I[X : W|fun (ω : Ω) => (Y ω, Z ω);μ]

Chain rule for conditional mutual information, splitting a pair on the right.

theorem ProbabilityTheory.condMutualInfo_le_prod_right_snd {Ω : Type u_1} {S : Type u_2} [MeasurableSpace Ω] [MeasurableSpace S] {T : Type u_3} {U : Type u_4} [MeasurableSpace T] [MeasurableSpace U] [MeasurableSingletonClass S] [MeasurableSingletonClass T] [MeasurableSingletonClass U] [Countable S] [Countable T] {X : Ω → S} {Y : Ω → T} {Z : Ω → U} {μ : MeasureTheory.Measure Ω} {V : Type u_5} [MeasurableSpace V] [MeasurableSingletonClass V] [Countable V] {W : Ω → V} (hX : Measurable X) (hY : Measurable Y) (hW : Measurable W) (hZ : Measurable Z) [MeasureTheory.IsProbabilityMeasure μ] [FiniteRange X] [FiniteRange Y] [FiniteRange W] [FiniteRange Z] :
I[X : W|Z;μ] ≤ I[X : fun (ω : Ω) => (Y ω, W ω)|Z;μ]

Adding an extra right-side variable cannot decrease conditional mutual information.

def ProbabilityTheory.boolVectorStrictPrefix {Ω : Type u_6} {m : ℕ} (X : Ω → Fin m → Bool) (i : Fin m) (ω : Ω) :
Fin i → Bool

The strict prefix of a boolean vector-valued random variable.

theorem ProbabilityTheory.condMutualInfo_boolVector_eq_sum_strictPrefix {Ω : Type u_6} {T : Type u_7} {U : Type u_8} [MeasurableSpace Ω] [MeasurableSpace T] [MeasurableSpace U] [MeasurableSingletonClass T] [MeasurableSingletonClass U] [Countable T] [Countable U] {m : ℕ} {Y : Ω → T} {Z : Ω → U} {μ : MeasureTheory.Measure Ω} [MeasureTheory.IsZeroOrProbabilityMeasure μ] [FiniteRange Y] [FiniteRange Z] (X : Ω → Fin m → Bool) (hX : Measurable X) (hY : Measurable Y) (hZ : Measurable Z) :
I[X : Y|Z;μ] = ∑ i : Fin m, I[fun (ω : Ω) => X ω i : Y|fun (ω : Ω) => (boolVectorStrictPrefix X i ω, Z ω);μ]

Chain rule for conditional mutual information against a finite boolean vector, exposing coordinates from left to right.
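In conventional notation (a restatement, not part of the Lean source), with $X_{<i} = (X_0, \dots, X_{i-1})$ denoting the strict prefix of the coordinates:

```latex
I\big[X : Y \,\big|\, Z\big]
\;=\; \sum_{i=0}^{m-1} I\big[X_i : Y \,\big|\, (X_{<i}, Z)\big],
```

which is the coordinate-by-coordinate form obtained by iterating the pair-splitting chain rule.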

theorem ProbabilityTheory.cond_cond_eq_cond_of_subset {Ω₀ : Type u_6} [MeasurableSpace Ω₀] (μ : MeasureTheory.Measure Ω₀) [MeasureTheory.IsFiniteMeasure μ] {A F : Set Ω₀} (hA : MeasurableSet A) (hF : MeasurableSet F) (hFA : F ⊆ A) :
μ[|A][|F] = μ[|F]

Conditioning on a larger event and then on a smaller event is the same as conditioning directly on the smaller event.
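The identity can be checked directly from the definition of the conditional measure (a sketch, not part of the Lean source): for $F \subseteq A$ and any measurable $E$,

```latex
\mu[|A][|F](E)
\;=\; \frac{\mu[|A](E \cap F)}{\mu[|A](F)}
\;=\; \frac{\mu(E \cap F \cap A)/\mu(A)}{\mu(F \cap A)/\mu(A)}
\;=\; \frac{\mu(E \cap F)}{\mu(F)}
\;=\; \mu[|F](E),
```

using $F \cap A = F$; the degenerate cases where a conditioning event has measure zero need separate treatment.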

Real masses obey the same reweighting identity for nested conditioning events.

theorem ProbabilityTheory.condMutualInfo_prod_conditioning_eq_sum {Ω₀ : Type u_6} {S₀ : Type u_7} {T₀ : Type u_8} {A : Type u_9} {U₀ : Type u_10} [MeasurableSpace Ω₀] [MeasurableSpace S₀] [MeasurableSpace T₀] [MeasurableSpace A] [MeasurableSpace U₀] [MeasurableSingletonClass S₀] [MeasurableSingletonClass T₀] [MeasurableSingletonClass A] [MeasurableSingletonClass U₀] [Countable S₀] [Countable T₀] [Fintype A] [Finite U₀] {X : Ω₀ → S₀} {Y : Ω₀ → T₀} {K : Ω₀ → A} {Z : Ω₀ → U₀} {μ : MeasureTheory.Measure Ω₀} [MeasureTheory.IsFiniteMeasure μ] (hK : Measurable K) (hZ : Measurable Z) [FiniteRange X] [FiniteRange Y] :
I[X : Y|fun (ω : Ω₀) => (K ω, Z ω);μ] = ∑ k : A, μ.real (K ⁻¹' {k}) * I[X : Y|Z;μ[|K ⁻¹' {k}]]

If the conditioning variable is a pair (K, Z), conditional mutual information is the average over the fibers of K of the conditional mutual information given Z.
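In conventional notation (a restatement, not part of the Lean source):

```latex
I\big[X : Y \,\big|\, (K, Z) \,;\, \mu\big]
\;=\; \sum_{k \in A} \mu(K = k)\,
I\big[X : Y \,\big|\, Z \,;\, \mu[\,|\,K = k\,]\big].
```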

theorem ProbabilityTheory.measureReal_mul_cond_condMutualInfo_le_condMutualInfo_of_event_eq_preimage {Ω₀ : Type u_6} {S₀ : Type u_7} {T₀ : Type u_8} {U₀ : Type u_9} [MeasurableSpace Ω₀] [MeasurableSpace S₀] [MeasurableSpace T₀] [MeasurableSpace U₀] [MeasurableSingletonClass S₀] [MeasurableSingletonClass T₀] [MeasurableSingletonClass U₀] [Countable S₀] [Countable T₀] [Countable U₀] (μ : MeasureTheory.Measure Ω₀) [MeasureTheory.IsProbabilityMeasure μ] (X : Ω₀ → S₀) (Y : Ω₀ → T₀) (Z : Ω₀ → U₀) (hX : Measurable X) (hY : Measurable Y) (hZ : Measurable Z) [FiniteRange X] [FiniteRange Y] [FiniteRange Z] {A : Set Ω₀} {B : Set U₀} (hB : MeasurableSet B) (hA : A = Z ⁻¹' B) :
μ.real A * I[X : Y|Z;μ[|A]] ≤ I[X : Y|Z;μ]

If an event is determined by the conditioning variable, then the contribution of the conditional mutual information on that event is bounded by the original conditional mutual information. This is the information-theoretic reweighting used in Claim 6.21.
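In conventional notation (a restatement, not part of the Lean source): for $A = Z^{-1}(B)$,

```latex
\mu(A)\; I\big[X : Y \,\big|\, Z \,;\, \mu[\,|\,A\,]\big]
\;\le\; I\big[X : Y \,\big|\, Z \,;\, \mu\big],
```

which follows from decomposing $I[X : Y \mid Z]$ over the fibers of $Z$ and discarding the nonnegative contributions of the fibers outside $B$.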

theorem ProbabilityTheory.mutualInfo_prod_right_eq_add {Ω : Type u_1} {S : Type u_2} [MeasurableSpace Ω] [MeasurableSpace S] {T : Type u_3} [MeasurableSpace T] [MeasurableSingletonClass S] [MeasurableSingletonClass T] [Countable S] [Countable T] {X : Ω → S} {Y : Ω → T} {μ : MeasureTheory.Measure Ω} {V : Type u_5} [MeasurableSpace V] [MeasurableSingletonClass V] [Countable V] {W : Ω → V} (hX : Measurable X) (hY : Measurable Y) (hW : Measurable W) [MeasureTheory.IsZeroOrProbabilityMeasure μ] [FiniteRange X] [FiniteRange Y] [FiniteRange W] :
I[X : fun (ω : Ω) => (Y ω, W ω) ; μ] = I[X : Y ; μ] + I[X : W|Y;μ]

Chain rule for mutual information, splitting a pair on the right.
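In conventional notation (a restatement, not part of the Lean source):

```latex
I\big[X : (Y, W)\big] \;=\; I\big[X : Y\big] \;+\; I\big[X : W \,\big|\, Y\big].
```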

Chain-rule rearrangement: if conditioning on Y does not increase the dependence between X and W, then conditioning on W cannot increase the information that Y has about X beyond the unconditioned I[X : Y].

theorem ProbabilityTheory.condEntropy_congr_ae {Ω : Type u_1} {S : Type u_2} [MeasurableSpace Ω] [MeasurableSpace S] {T : Type u_3} [MeasurableSpace T] [MeasurableSingletonClass S] [MeasurableSingletonClass T] [Countable S] [Countable T] {X : Ω → S} {Y : Ω → T} {μ : MeasureTheory.Measure Ω} {X' : Ω → S} {Y' : Ω → T} [MeasureTheory.IsProbabilityMeasure μ] [FiniteRange X] [FiniteRange Y] [FiniteRange X'] [FiniteRange Y'] (hX : Measurable X) (hY : Measurable Y) (hX' : Measurable X') (hY' : Measurable Y') (hXae : X =ᵐ[μ] X') (hYae : Y =ᵐ[μ] Y') :
H[X | Y ; μ] = H[X' | Y' ; μ]

Conditional entropy is unchanged when both variables are replaced by almost-everywhere equal variables.

theorem ProbabilityTheory.condMutualInfo_congr_ae_left_right {Ω : Type u_1} {S : Type u_2} [MeasurableSpace Ω] [MeasurableSpace S] {T : Type u_3} {U : Type u_4} [MeasurableSpace T] [MeasurableSpace U] [MeasurableSingletonClass S] [MeasurableSingletonClass T] [MeasurableSingletonClass U] [Countable S] [Countable T] [Countable U] {X : Ω → S} {Y : Ω → T} {Z : Ω → U} {μ : MeasureTheory.Measure Ω} {X' : Ω → S} {Y' : Ω → T} [MeasureTheory.IsProbabilityMeasure μ] [FiniteRange X] [FiniteRange Y] [FiniteRange X'] [FiniteRange Y'] [FiniteRange Z] (hX : Measurable X) (hY : Measurable Y) (hZ : Measurable Z) (hX' : Measurable X') (hY' : Measurable Y') (hXae : X =ᵐ[μ] X') (hYae : Y =ᵐ[μ] Y') :
I[X : Y|Z;μ] = I[X' : Y'|Z;μ]

Conditional mutual information is unchanged when the two measured variables are replaced by almost-everywhere equal variables and the conditioning variable is unchanged.

theorem ProbabilityTheory.condMutualInfo_congr_ae {Ω : Type u_1} {S : Type u_2} [MeasurableSpace Ω] [MeasurableSpace S] {T : Type u_3} {U : Type u_4} [MeasurableSpace T] [MeasurableSpace U] [MeasurableSingletonClass S] [MeasurableSingletonClass T] [MeasurableSingletonClass U] [Countable S] [Countable T] [Countable U] {X : Ω → S} {Y : Ω → T} {Z : Ω → U} {μ : MeasureTheory.Measure Ω} {X' : Ω → S} {Y' : Ω → T} {Z' : Ω → U} [MeasureTheory.IsProbabilityMeasure μ] [FiniteRange X] [FiniteRange Y] [FiniteRange Z] [FiniteRange X'] [FiniteRange Y'] [FiniteRange Z'] (hX : Measurable X) (hY : Measurable Y) (hZ : Measurable Z) (hX' : Measurable X') (hY' : Measurable Y') (hZ' : Measurable Z') (hXae : X =ᵐ[μ] X') (hYae : Y =ᵐ[μ] Y') (hZae : Z =ᵐ[μ] Z') :
I[X : Y|Z;μ] = I[X' : Y'|Z';μ]

Conditional mutual information is unchanged when all three variables are replaced by almost-everywhere equal variables.

theorem ProbabilityTheory.IdentDistrib.condMutualInfo_eq {Ω : Type u_1} {S : Type u_2} [MeasurableSpace Ω] [MeasurableSpace S] {T : Type u_3} {U : Type u_4} [MeasurableSpace T] [MeasurableSpace U] [MeasurableSingletonClass S] [MeasurableSingletonClass T] [MeasurableSingletonClass U] [Countable S] [Countable T] [Countable U] {X : Ω → S} {Y : Ω → T} {Z : Ω → U} {μ : MeasureTheory.Measure Ω} {Ω' : Type u_6} [MeasurableSpace Ω'] {μ' : MeasureTheory.Measure Ω'} {X' : Ω' → S} {Y' : Ω' → T} {Z' : Ω' → U} [MeasureTheory.IsProbabilityMeasure μ] [MeasureTheory.IsProbabilityMeasure μ'] [FiniteRange X] [FiniteRange Y] [FiniteRange Z] [FiniteRange X'] [FiniteRange Y'] [FiniteRange Z'] (hX : Measurable X) (hY : Measurable Y) (hZ : Measurable Z) (hX' : Measurable X') (hY' : Measurable Y') (hZ' : Measurable Z') (h : IdentDistrib (fun (ω : Ω) => (X ω, Y ω, Z ω)) (fun (ω : Ω') => (X' ω, Y' ω, Z' ω)) μ μ') :
I[X : Y|Z;μ] = I[X' : Y'|Z';μ']

Conditional mutual information is determined by the joint law of (X, Y, Z).

theorem ProbabilityTheory.condMutualInfo_comp_right_conditioning_of_injective {Ω : Type u_1} {S : Type u_2} [MeasurableSpace Ω] [MeasurableSpace S] {T : Type u_3} {U : Type u_4} [MeasurableSpace T] [MeasurableSpace U] [MeasurableSingletonClass S] [MeasurableSingletonClass T] [MeasurableSingletonClass U] [Countable S] [Countable T] [Countable U] {X : Ω → S} {Y : Ω → T} {Z : Ω → U} {μ : MeasureTheory.Measure Ω} {V : Type u_6} {W : Type u_7} [MeasurableSpace V] [MeasurableSpace W] [MeasurableSingletonClass V] [MeasurableSingletonClass W] [Countable V] [Countable W] {f : T → V} {g : U → W} [MeasureTheory.IsProbabilityMeasure μ] [FiniteRange X] [FiniteRange Y] [FiniteRange Z] (hX : Measurable X) (hY : Measurable Y) (hZ : Measurable Z) (hfmeas : Measurable f) (hgmeas : Measurable g) (hfinj : Function.Injective f) (hginj : Function.Injective g) :
I[X : f ∘ Y|g ∘ Z;μ] = I[X : Y|Z;μ]

Conditional mutual information is unchanged by injective recodings of the right variable and the conditioning variable.

theorem ProbabilityTheory.IdentDistrib.cond_of_pair {Ω : Type u_1} [MeasurableSpace Ω] {μ : MeasureTheory.Measure Ω} {A : Type u_6} {B : Type u_7} [MeasurableSpace A] [MeasurableSpace B] {Ω' : Type u_8} [MeasurableSpace Ω'] {μ' : MeasureTheory.Measure Ω'} {X : Ω → A} {Y : Ω → B} {X' : Ω' → A} {Y' : Ω' → B} {s : Set B} (hs : MeasurableSet s) (hY : Measurable Y) (hY' : Measurable Y') (h : IdentDistrib (fun (ω : Ω) => (X ω, Y ω)) (fun (ω : Ω') => (X' ω, Y' ω)) μ μ') :
IdentDistrib X X' μ[|Y ⁻¹' s] μ'[|Y' ⁻¹' s]

If (X, Y) and (X', Y') have the same joint law, then conditioning on the same measurable event in the Y/Y' coordinate preserves the law of X/X'. This is the heterogeneous-codomain version of ProbabilityTheory.IdentDistrib.cond.