Jensen's inequality is important for obtaining inequalities for the divergence between probability distributions. By applying a refinement of Jensen's inequality (Horváth et al. in Math. Inequal. Appl. 14:777-791, 2011) and introducing a new functional based on an f-divergence functional, we obtain estimates for the new functionals, the f-divergence, and the Rényi divergence. Some inequalities for Rényi and Shannon estimates are constructed. The Zipf-Mandelbrot law is used to illustrate the results. In addition, we generalize the refinement of Jensen's inequality and derive new inequalities for the Rényi and Shannon entropies for an -convex function using the Montgomery identity. It is also shown that the maximization of Shannon entropy is a transition from the Zipf-Mandelbrot law to a hybrid Zipf-Mandelbrot law.
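The quantities the abstract works with can be illustrated numerically. The sketch below (not from the paper; the function names and parameter values are illustrative assumptions) computes the Zipf-Mandelbrot distribution, its Shannon and Rényi entropies, and a Kullback-Leibler divergence, whose nonnegativity is the classic consequence of Jensen's inequality that the paper's estimates refine:

```python
import math

def zipf_mandelbrot(N, q, s):
    # Zipf-Mandelbrot law: p_k proportional to 1/(k+q)^s for k = 1..N
    weights = [1.0 / (k + q) ** s for k in range(1, N + 1)]
    total = sum(weights)
    return [w / total for w in weights]

def shannon_entropy(p):
    # H(p) = -sum p_k log p_k
    return -sum(x * math.log(x) for x in p if x > 0)

def renyi_entropy(p, alpha):
    # H_alpha(p) = log(sum p_k^alpha) / (1 - alpha), alpha > 0, alpha != 1
    return math.log(sum(x ** alpha for x in p)) / (1.0 - alpha)

def kl_divergence(p, q):
    # KL divergence: the f-divergence with f(t) = t log t;
    # nonnegative by Jensen's inequality applied to the convex f
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

p = zipf_mandelbrot(100, 2.0, 1.1)   # illustrative parameters
u = [1.0 / 100] * 100                # uniform reference distribution
print(shannon_entropy(p), renyi_entropy(p, 2.0), kl_divergence(p, u))
```

As expected from Jensen's inequality, `kl_divergence(p, u)` is nonnegative, and the Shannon entropy of any distribution on 100 points is bounded above by that of the uniform distribution, log 100 — the maximum-entropy behavior the abstract's final sentence refers to.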


Source

PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6244721
DOI: http://dx.doi.org/10.1186/s13660-018-1902-9

Publication Analysis

Top Keywords

jensen's inequality (16)
refinement jensen's (12)
zipf-mandelbrot law (12)
rényi divergence (8)
montgomery identity (8)
inequalities rényi (8)
rényi shannon (8)
inequality (4)
inequality estimation (4)
rényi (4)

