Jensen's inequality is important for obtaining inequalities for divergences between probability distributions. By applying a refinement of Jensen's inequality (Horváth et al. in Math. Inequal. Appl. 14:777-791, 2011) and introducing a new functional based on an f-divergence functional, we obtain some estimates for the new functionals, the f-divergence, and the Rényi divergence. Some inequalities for Rényi and Shannon estimates are constructed. The Zipf-Mandelbrot law is used to illustrate the results. In addition, we generalize the refinement of Jensen's inequality and obtain new inequalities for Rényi and Shannon entropies for an n-convex function using the Montgomery identity. It is also shown that the maximization of Shannon entropy gives a transition from the Zipf-Mandelbrot law to a hybrid Zipf-Mandelbrot law.
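For context, the objects named in the abstract have standard forms in this literature; the definitions below are the usual textbook ones, given as background, and are not necessarily the paper's exact notation. For a convex function $f$ and weights $p_i \ge 0$ with $\sum_i p_i = 1$, Jensen's inequality states

$$ f\Big(\sum_{i=1}^n p_i x_i\Big) \le \sum_{i=1}^n p_i f(x_i). $$

The Csiszár $f$-divergence of discrete distributions $P=(p_1,\dots,p_n)$ and $Q=(q_1,\dots,q_n)$ is

$$ D_f(P,Q) = \sum_{i=1}^n q_i \, f\!\left(\frac{p_i}{q_i}\right), $$

the Rényi divergence of order $\lambda$ (with $\lambda > 0$, $\lambda \ne 1$) is

$$ D_\lambda(P \,\|\, Q) = \frac{1}{\lambda - 1} \log \sum_{i=1}^n p_i^{\lambda} q_i^{1-\lambda}, $$

and the Zipf-Mandelbrot law with parameters $N \in \mathbb{N}$, $q \ge 0$, $s > 0$ assigns to rank $i \in \{1,\dots,N\}$ the probability

$$ f(i; N, q, s) = \frac{1}{(i+q)^s \, H_{N,q,s}}, \qquad H_{N,q,s} = \sum_{k=1}^N \frac{1}{(k+q)^s}. $$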
| Download full-text PDF | Source |
|---|---|
| http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6244721 | PMC |
| http://dx.doi.org/10.1186/s13660-018-1902-9 | DOI Listing |