On Relations Between the Relative Entropy and χ2-Divergence, Generalizations and Applications

Keywords: Markov chains; chi-squared divergence; f-divergences; information contraction; large deviations; maximal correlation; method of types; relative entropy; strong data-processing inequalities

Journal

Entropy (Basel, Switzerland)
ISSN: 1099-4300
Abbreviated title: Entropy (Basel)
Country: Switzerland
NLM ID: 101243874

Publication information

Publication date:
18 May 2020
History:
received: 22 April 2020
revised: 12 May 2020
accepted: 17 May 2020
entrez: 8 December 2020
pubmed: 9 December 2020
medline: 9 December 2020
Status: epublish

Abstract

The relative entropy and the chi-squared divergence are fundamental divergence measures in information theory and statistics. This paper is focused on a study of integral relations between the two divergences, the implications of these relations, their information-theoretic applications, and some generalizations pertaining to the rich class of f-divergences.
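As background for the abstract, the two divergences have standard definitions, and they satisfy the well-known relation D(P‖Q) ≤ log(1 + χ²(P‖Q)) by Jensen's inequality. The sketch below is illustrative only (it uses standard textbook definitions, not the paper's own integral identities) and checks this relation numerically for a pair of example distributions:

```python
import math

def kl_divergence(p, q):
    """Relative entropy D(P||Q) = sum_i p_i * log(p_i / q_i), natural log.

    Terms with p_i = 0 contribute 0 by convention.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def chi_squared(p, q):
    """Chi-squared divergence chi^2(P||Q) = sum_i (p_i - q_i)^2 / q_i."""
    return sum((pi - qi) ** 2 / qi for pi, qi in zip(p, q))

# Example distributions (arbitrary choice for illustration).
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

d = kl_divergence(p, q)
c = chi_squared(p, q)

# Standard inequality relating the two divergences:
#   D(P||Q) <= log(1 + chi^2(P||Q))
assert d <= math.log(1 + c)
```

Both divergences are instances of an f-divergence D_f(P‖Q) = Σ q_i f(p_i/q_i), with f(t) = t log t for the relative entropy and f(t) = (t − 1)² for the chi-squared divergence, which is the class of generalizations the abstract refers to.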

Identifiers

pubmed: 33286335
pii: e22050563
doi: 10.3390/e22050563
pmc: PMC7848888

Publication types

Journal Article

Languages

eng

Authors

Tomohiro Nishiyama (T)

Independent Researcher, Tokyo 206-0003, Japan.

Igal Sason (I)

Faculty of Electrical Engineering, Technion-Israel Institute of Technology, Technion City, Haifa 3200003, Israel.

MeSH classifications