Rényi divergence #
Rényi divergence of order `a`. If `a = 1`, it is defined as `kl μ ν`; otherwise it is `(a - 1)⁻¹ * log (ν(α) + (a - 1) * Hₐ(μ, ν))`.
If `ν` is a probability measure this becomes the more usual definition `(a - 1)⁻¹ * log (1 + (a - 1) * Hₐ(μ, ν))`, but the definition above retains some useful properties for a general finite measure `ν` as well, in particular the integral form `Rₐ(μ, ν) = (a - 1)⁻¹ * log (∫ x, ((∂μ/∂ν) x) ^ a ∂ν)`.
We use `ENNReal.log` instead of `Real.log` because it is monotone on `ℝ≥0∞`, while the real logarithm is monotone only on `(0, ∞)` (note `Real.log 0 = 0`). This allows us to transfer inequalities from the Hellinger divergence to the Rényi divergence.
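To see how the definition connects to the Hellinger divergence, one can sketch the simplification of the logarithm's argument. This assumes the integral form `Hₐ(μ, ν) = (a - 1)⁻¹ * ∫ x, (((∂μ/∂ν) x) ^ a - 1) ∂ν` for the Hellinger divergence, which is a heuristic restatement rather than the exact Lean formulation:

```latex
\nu(\alpha) + (a - 1)\, H_a(\mu, \nu)
  = \nu(\alpha) + \int \left( \left(\frac{\partial\mu}{\partial\nu}\right)^{a} - 1 \right) \mathrm{d}\nu
  = \int \left(\frac{\partial\mu}{\partial\nu}\right)^{a} \mathrm{d}\nu ,
```

using `∫ x, 1 ∂ν = ν(α)` for a finite measure `ν`. Taking `(a - 1)⁻¹ * log` of both sides recovers the integral form of `Rₐ(μ, ν)` stated above.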
Equations
- ProbabilityTheory.renyiDiv a μ ν = if a = 1 then ProbabilityTheory.kl μ ν else ↑(a - 1)⁻¹ * (↑(ν Set.univ) + (↑a - 1) * ProbabilityTheory.hellingerDiv a μ ν).toENNReal.log
The Rényi divergence `renyiDiv a μ ν` can be written as the log of an integral with respect to `ν`.
The Rényi divergence `renyiDiv a μ ν` can be written as the log of an integral with respect to `ν`. If `a < 1`, use `renyiDiv_eq_log_integral_of_lt_one` instead.
If `μ ≪ ν`, the Rényi divergence `renyiDiv a μ ν` can be written as the log of an integral with respect to `μ`.
If `μ ≪ ν`, the Rényi divergence `renyiDiv a μ ν` can be written as the log of an integral with respect to `μ`. If `a < 1`, use `renyiDiv_eq_log_integral_of_lt_one'` instead.
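The change of integration measure behind these `μ`-integral versions can be sketched as follows, assuming `μ ≪ ν` so that `∂μ/∂ν` is a genuine density of `μ` with respect to `ν` (a heuristic computation, not the exact Lean proof):

```latex
\int \left(\frac{\partial\mu}{\partial\nu}\right)^{a} \mathrm{d}\nu
  = \int \left(\frac{\partial\mu}{\partial\nu}\right)^{a-1} \frac{\partial\mu}{\partial\nu}\, \mathrm{d}\nu
  = \int \left(\frac{\partial\mu}{\partial\nu}\right)^{a-1} \mathrm{d}\mu ,
```

so that `Rₐ(μ, ν) = (a - 1)⁻¹ * log (∫ x, ((∂μ/∂ν) x) ^ (a - 1) ∂μ)`.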
Density of the Rényi measure `renyiMeasure a μ ν` with respect to `μ + ν`.
Equations
- ProbabilityTheory.renyiDensity a μ ν x = μ.rnDeriv (μ + ν) x ^ a * ν.rnDeriv (μ + ν) x ^ (1 - a) * ENNReal.ofReal (Real.exp (-(a - 1) * (ProbabilityTheory.renyiDiv a μ ν).toReal))
Tilted measure of `μ` with respect to `ν`, parametrized by `a`.
Equations
- ProbabilityTheory.renyiMeasure a μ ν = (μ + ν).withDensity (ProbabilityTheory.renyiDensity a μ ν)
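Heuristically (ignoring null sets and finiteness issues, and writing `π = μ + ν`), the exponential factor in `renyiDensity` normalizes the tilted measure: by the integral form above,

```latex
\int \left(\frac{\partial\mu}{\partial\pi}\right)^{a}
     \left(\frac{\partial\nu}{\partial\pi}\right)^{1-a} \mathrm{d}\pi
  = \int \left(\frac{\partial\mu}{\partial\nu}\right)^{a} \mathrm{d}\nu
  = e^{(a-1) R_a(\mu, \nu)} ,
```

so multiplying by `exp (-(a - 1) * Rₐ(μ, ν))` gives total mass `1`, and `renyiMeasure a μ ν` is, under these heuristic assumptions, a probability measure.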
The Data Processing Inequality for the Rényi divergence.
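In standard notation, writing `κ ∘ μ` for the composition of a Markov kernel `κ` with the measure `μ` (a hedged sketch of the statement, not the exact Lean formulation), the inequality reads:

```latex
R_a(\kappa \circ \mu,\; \kappa \circ \nu) \;\le\; R_a(\mu, \nu) .
```

Processing both measures through the same kernel cannot increase their Rényi divergence.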