Lower bounds for hypothesis testing based on information theory

Bibliography

CDDG17

Florence Clerc, Vincent Danos, Fredrik Dahlqvist, and Ilias Garnier, Pointless learning, Foundations of Software Science and Computation Structures: 20th International Conference, FOSSACS 2017, Held as Part of the European Joint Conferences on Theory and Practice of Software, ETAPS 2017, Uppsala, Sweden, April 22-29, 2017, Proceedings 20, Springer, 2017, pp. 355–369.

Csi63

Imre Csiszár, Eine informationstheoretische Ungleichung und ihre Anwendung auf den Beweis der Ergodizität von Markoffschen Ketten [An information-theoretic inequality and its application to the proof of the ergodicity of Markov chains], A Magyar Tudományos Akadémia Matematikai Kutató Intézetének Közleményei 8 (1963), no. 1-2, 85–108.

DSDG18

Fredrik Dahlqvist, Alexandra Silva, Vincent Danos, and Ilias Garnier, Borel kernels and their approximation, categorically, Electronic Notes in Theoretical Computer Science 341 (2018), 91–119.

HV11

Peter Harremoës and Igor Vajda, On pairs of \( f \)-divergences and their joint range, IEEE Transactions on Information Theory 57 (2011), no. 6, 3230–3235.

Lie12

Friedrich Liese, \(\phi \)-divergences, sufficiency, Bayes sufficiency, and deficiency, Kybernetika 48 (2012), no. 4, 690–713.

LV06

Friedrich Liese and Igor Vajda, On divergences and informations in statistics and information theory, IEEE Transactions on Information Theory 52 (2006), no. 10, 4394–4412.

PW24

Yury Polyanskiy and Yihong Wu, Information theory: From coding to learning.

SV16

Igal Sason and Sergio Verdú, \( f \)-divergence inequalities, IEEE Transactions on Information Theory 62 (2016), no. 11, 5973–6006.

VEH14

Tim van Erven and Peter Harremoës, Rényi divergence and Kullback-Leibler divergence, IEEE Transactions on Information Theory 60 (2014), no. 7, 3797–3820.

ZL18

Zhixin Zhou and Ping Li, Non-asymptotic Chernoff lower bound and its application to community detection in stochastic block model, arXiv preprint arXiv:1812.11269 (2018).