Entropy-based divergence measures have proven effective in many areas of computer vision and pattern recognition. However, their implementation cost can be prohibitive in resource-limited applications, since they require estimates of probability densities, which are expensive to compute directly for high-dimensional data. In this paper, we investigate the use of a non-parametric, distribution-free metric, the Henze-Penrose test statistic, to obtain bounds on the $k$-nearest neighbors ($k$-NN) classification accuracy. Simulation results demonstrate the effectiveness and reliability of this metric in estimating inter-class separability. In addition, the proposed bounds on $k$-NN classification accuracy are exploited to evaluate the efficacy of different pre-processing techniques and to select the smallest number of features that achieves the desired classification performance.
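To make the statistic concrete, below is a minimal sketch (not code from the paper) of the standard Friedman-Rafsky construction used to estimate the Henze-Penrose divergence: build the Euclidean minimum spanning tree over the pooled two-class sample and count the edges that join points with different labels. The function name `hp_divergence` and the toy data are illustrative assumptions; the estimator formula follows the widely used form from Berisha et al. (2016).

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.sparse.csgraph import minimum_spanning_tree

def hp_divergence(X, y):
    """Friedman-Rafsky estimate of the Henze-Penrose divergence.

    X : (N, d) array of pooled samples from both classes.
    y : (N,) array of binary labels (0 or 1).
    """
    n0 = int(np.sum(y == 0))
    n1 = int(np.sum(y == 1))
    N = n0 + n1
    # Euclidean MST over the pooled sample. A dense distance matrix
    # is fine for moderate N; large N would need an approximate MST.
    dists = squareform(pdist(X))
    mst = minimum_spanning_tree(dists)
    i, j = mst.nonzero()
    # R = number of MST edges joining points from different classes.
    R = int(np.sum(y[i] != y[j]))
    # Consistent estimator of the HP divergence: near 0 when the
    # classes overlap completely, near 1 when they are well separated.
    return 1.0 - R * N / (2.0 * n0 * n1)

# Example: two well-separated Gaussian clouds should give a value near 1.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (100, 5)), rng.normal(4, 1, (100, 5))])
y = np.repeat([0, 1], 100)
print(hp_divergence(X, y))
```

Note that the MST route avoids density estimation entirely, which is the appeal of the metric in resource-limited settings: the only cost is a pairwise-distance computation and a spanning-tree pass.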

Source: http://dx.doi.org/10.1109/TIP.2018.2862352

