Entropy-based divergence measures have proven effective in many areas of computer vision and pattern recognition. However, their implementation cost can be prohibitive in resource-limited applications, since they require estimates of probability densities, which are expensive to compute directly for high-dimensional data. In this paper, we investigate the use of a non-parametric, distribution-free metric, known as the Henze-Penrose test statistic, to obtain bounds on the $k$-nearest neighbors ($k$-NN) classification accuracy. Simulation results demonstrate the effectiveness and reliability of this metric in estimating inter-class separability. In addition, the proposed bounds on the $k$-NN classification accuracy are exploited to evaluate the efficacy of different pre-processing techniques and to select the smallest number of features that achieves a desired classification performance.
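As a rough illustration of the quantities involved (not the paper's exact procedure), below is a minimal Python sketch of the Friedman-Rafsky minimal-spanning-tree estimator of the Henze-Penrose divergence, together with the equal-prior Bayes-error bounds it is known to induce (Berisha et al., 2016). The function names `henze_penrose_divergence` and `bayes_error_bounds` are our own, introduced here only for demonstration.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.spatial.distance import cdist

def henze_penrose_divergence(X, Y):
    """Friedman-Rafsky MST estimate of the Henze-Penrose divergence.

    Returns a value in [0, 1]: near 0 when X and Y come from the same
    distribution, near 1 when the two samples are well separated.
    """
    n, m = len(X), len(Y)
    Z = np.vstack([X, Y])                               # pooled sample
    labels = np.concatenate([np.zeros(n), np.ones(m)])  # sample-of-origin tags

    # Euclidean minimal spanning tree over the pooled sample.
    dist = cdist(Z, Z)
    mst = minimum_spanning_tree(dist).tocoo()

    # R = number of MST edges joining points from different samples.
    R = np.sum(labels[mst.row] != labels[mst.col])

    # HP divergence estimate: 1 - (n + m) R / (2 n m).
    return max(0.0, 1.0 - (n + m) * R / (2.0 * n * m))

def bayes_error_bounds(d_hp):
    """Equal-prior bounds on the Bayes error rate from the HP divergence
    (Berisha et al., 2016): (1 - sqrt(u))/2 <= eps* <= (1 - u)/2, u = d_hp.
    """
    return (1.0 - np.sqrt(d_hp)) / 2.0, (1.0 - d_hp) / 2.0

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(0.0, 1.0, size=(200, 10))
    Y = rng.normal(1.5, 1.0, size=(200, 10))  # shifted second class
    d = henze_penrose_divergence(X, Y)
    lo, hi = bayes_error_bounds(d)
    print(f"HP divergence: {d:.3f}, Bayes error in [{lo:.3f}, {hi:.3f}]")
```

Because the estimator requires only an MST over the pooled sample, it avoids explicit density estimation entirely, which is precisely what makes it attractive for high-dimensional, resource-limited settings as the abstract notes.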
DOI: http://dx.doi.org/10.1109/TIP.2018.2862352