Two Rényi-type generalizations of the Shannon cross-entropy, the Rényi cross-entropy and the Natural Rényi cross-entropy, were recently used as loss functions to improve the design of deep-learning generative adversarial networks (GANs). In this work, we derive the Rényi and Natural Rényi differential cross-entropy measures in closed form for a wide class of common continuous distributions belonging to the exponential family, and we tabulate the results for ease of reference. We also summarize the Rényi-type cross-entropy rates between stationary Gaussian processes and between finite-alphabet time-invariant Markov sources.
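For orientation, the two measures are commonly defined in the literature as follows; the precise conventions (logarithm base, differential vs. discrete form) used in the paper may differ, so this is a reference sketch rather than the paper's own statement. For densities $p$ and $q$ and order $\alpha > 0$, $\alpha \neq 1$:

\[
H_\alpha(P;Q) \;=\; \frac{1}{1-\alpha}\,\log \int p(x)\,q(x)^{\alpha-1}\,dx,
\qquad
\widetilde{H}_\alpha(P;Q) \;=\; H_\alpha(P) + D_\alpha(P\|Q),
\]

where $H_\alpha(P)$ is the Rényi entropy and $D_\alpha(P\|Q)$ the Rényi divergence. The second definition mirrors the Shannon identity $H(P;Q) = H(P) + D(P\|Q)$, and both measures recover the Shannon cross-entropy $-\int p(x)\log q(x)\,dx$ in the limit $\alpha \to 1$.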
Full text: PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC9601846 | DOI: http://dx.doi.org/10.3390/e24101417