Rényi Cross-Entropy Measures for Common Distributions and Processes with Memory.

Entropy (Basel)

Department of Mathematics and Statistics, Queen's University, Kingston, ON K7L 3N6, Canada.

Published: October 2022

Two Rényi-type generalizations of the Shannon cross-entropy, the Rényi cross-entropy and the Natural Rényi cross-entropy, were recently used as loss functions for the improved design of deep learning generative adversarial networks. In this work, we derive the Rényi and Natural Rényi differential cross-entropy measures in closed form for a wide class of common continuous distributions belonging to the exponential family, and we tabulate the results for ease of reference. We also summarize the Rényi-type cross-entropy rates between stationary Gaussian processes and between finite-alphabet time-invariant Markov sources.
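For orientation, a brief sketch of the two measures named in the abstract (the notation and exact conventions below are assumptions made here, not necessarily the paper's): for probability densities p and q and order \alpha > 0, \alpha \neq 1, the Rényi cross-entropy and the Natural Rényi cross-entropy are commonly written as

\[
H_\alpha(p; q) = \frac{1}{1-\alpha} \log \int p(x)\, q(x)^{\alpha - 1}\, dx,
\qquad
\tilde{H}_\alpha(p; q) = D_\alpha(p \,\|\, q) + h_\alpha(q),
\]

where \(D_\alpha(p \,\|\, q) = \frac{1}{\alpha - 1} \log \int p(x)^{\alpha}\, q(x)^{1-\alpha}\, dx\) is the Rényi divergence and \(h_\alpha(q) = \frac{1}{1-\alpha} \log \int q(x)^{\alpha}\, dx\) is the Rényi differential entropy. Both expressions recover the Shannon differential cross-entropy \(-\int p(x) \log q(x)\, dx\) in the limit \(\alpha \to 1\).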

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC9601846
DOI: http://dx.doi.org/10.3390/e24101417

