A finite-sample, distribution-free, probabilistic lower bound on mutual information.

Neural Computation

Computer and Information Science and Engineering, University of Florida, Gainesville, FL 32611, USA.

Published: July 2011

For any memoryless communication channel with a binary-valued input and a one-dimensional, real-valued output, we introduce a probabilistic lower bound on the mutual information given empirical observations of the channel. The bound is built on the Dvoretzky-Kiefer-Wolfowitz (DKW) inequality and is distribution-free. We describe a quadratic-time algorithm for computing the bound and its corresponding class-conditional distribution functions. We compare our approach with existing techniques and show that our bound is superior to a method inspired by Fano's inequality in which the continuous random variable is discretized.
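The DKW inequality underlying the bound gives a distribution-free confidence band around an empirical CDF: with probability at least 1 - alpha, the true CDF lies within sqrt(ln(2/alpha) / (2n)) of the empirical CDF at every point. The sketch below illustrates only this band construction for two class-conditional samples; it is not the paper's quadratic-time algorithm, and the sample data and variable names are hypothetical.

```python
import bisect
import math

def dkw_epsilon(n, alpha):
    """DKW band half-width: with probability at least 1 - alpha,
    sup_x |F_n(x) - F(x)| <= sqrt(ln(2/alpha) / (2n))."""
    return math.sqrt(math.log(2.0 / alpha) / (2.0 * n))

def empirical_cdf(samples):
    """Return the empirical CDF of the samples as a callable step function."""
    xs = sorted(samples)
    n = len(xs)
    return lambda x: bisect.bisect_right(xs, x) / n

# Hypothetical channel observations, split by the binary input symbol.
y0 = [0.1, 0.4, 0.5, 0.9, 1.2]   # outputs observed when the input was 0
y1 = [0.8, 1.1, 1.5, 1.9, 2.3]   # outputs observed when the input was 1
alpha = 0.05

F0, F1 = empirical_cdf(y0), empirical_cdf(y1)
eps0, eps1 = dkw_epsilon(len(y0), alpha), dkw_epsilon(len(y1), alpha)
# Any class-conditional CDF pair (G0, G1) consistent with the data satisfies
# |G0(x) - F0(x)| <= eps0 and |G1(x) - F1(x)| <= eps1 for all x,
# with both bands holding simultaneously with probability >= (1 - alpha)^2.
```

A lower bound on the mutual information would then come from minimizing I(X; Y) over all distribution pairs inside these two bands, which is the optimization the paper's algorithm performs.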


Source: http://dx.doi.org/10.1162/NECO_a_00144
