Equivalence between two possible versions of the first condition for the finiteness of the conditional KL divergence; the second version is the preferred one.
Equivalence between two possible versions of the second condition for the finiteness of the conditional KL divergence; the first version is the preferred one.
Kullback-Leibler divergence between two kernels κ and η conditional to a measure μ. It is defined as KL(κ, η | μ) := ∫ x, KL(κ x, η x) ∂μ.
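In conventional measure-theoretic notation, with kernels κ, η from a space X to a space Y and a measure μ on X (the spaces are not named in the docstring; they are assumed here for illustration), the definition above reads:

```latex
\mathrm{KL}(\kappa, \eta \mid \mu) \;:=\; \int_X \mathrm{KL}\bigl(\kappa(x), \eta(x)\bigr) \,\mathrm{d}\mu(x)
```

That is, the conditional divergence averages the pointwise KL divergence between the two kernels over the conditioning measure μ.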
This handles the case in condKL_compProd_meas where the lhs is ⊤. In this case the rhs is 'morally' also ⊤, so the equality holds, but in Lean the equality fails because of how infinities are handled in the integrals, so a separate lemma is needed for this case.
The chain rule for the KL divergence.
The chain rule for the KL divergence.
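Assuming the chain rule takes its standard form (the docstring does not spell out the statement), it decomposes the KL divergence of composition-products into a marginal term plus a conditional term:

```latex
\mathrm{KL}(\mu \otimes_m \kappa,\; \nu \otimes_m \eta)
  \;=\; \mathrm{KL}(\mu, \nu) \;+\; \mathrm{KL}(\kappa, \eta \mid \mu)
```

Here μ ⊗ₘ κ denotes the composition-product of a measure μ with a kernel κ; the second summand is the conditional KL divergence defined above.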
Tensorization property for the KL divergence.
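Tensorization is the special case of independent components: under the usual statement (assumed here, since the docstring gives no formula), the KL divergence of product measures splits additively:

```latex
\mathrm{KL}(\mu \times \mu',\; \nu \times \nu')
  \;=\; \mathrm{KL}(\mu, \nu) \;+\; \mathrm{KL}(\mu', \nu')
```

This follows from the chain rule by taking the kernels to be constant (independent of the first coordinate), so the conditional term reduces to an unconditional KL divergence.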