Feature extraction using information-theoretic learning.

IEEE Trans Pattern Anal Mach Intell

Biomagnetic Imaging Laboratory, University of California at San Francisco, Room C-324B, San Francisco, CA 94122, USA.

Published: September 2006

A classification system typically consists of both a feature extractor (preprocessor) and a classifier. These two components can be trained either independently or simultaneously. The former option has an implementation advantage, since the extractor need be trained only once for use with any classifier, whereas the latter can minimize classification error directly. Certain criteria, such as Minimum Classification Error, are better suited to simultaneous training, whereas others, such as Mutual Information, are amenable to training the feature extractor either independently or simultaneously. Herein, an information-theoretic criterion is introduced and evaluated for training the extractor independently of the classifier. The proposed method uses nonparametric estimation of Renyi's entropy to train the extractor by maximizing an approximation of the mutual information between the class labels and the output of the feature extractor. The evaluations show that the proposed method, even though it uses independent training, performs at least as well as three feature extraction methods that train the extractor and classifier simultaneously.
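The appeal of this style of criterion is computational: with Gaussian Parzen windows, the integrals in a quadratic (Renyi-entropy-based) mutual-information measure between the extractor output and the discrete class labels reduce to sums of pairwise kernel evaluations, so no explicit density integration is needed. The Python sketch below illustrates that construction for a linear extractor y = Wx; the function names, the kernel width sigma2, the choice of the Euclidean-distance quadratic MI, and the off-the-shelf Powell optimizer are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch: estimate a quadratic (Renyi-entropy-based) mutual
# information between a linear projection y = Wx and the class labels
# using Gaussian Parzen windows, then train W by maximizing it.
import numpy as np
from scipy.optimize import minimize

def pairwise_gaussian(Y, sigma2):
    """Parzen cross-terms G(y_i - y_j; 2*sigma2*I): two Gaussian kernels
    of variance sigma2 convolve into one of variance 2*sigma2."""
    d2 = ((Y[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    dim = Y.shape[1]
    return (4.0 * np.pi * sigma2) ** (-dim / 2.0) * np.exp(-d2 / (4.0 * sigma2))

def quadratic_mi(Y, labels, sigma2):
    """Euclidean-distance quadratic MI between continuous Y and discrete
    labels; every density integral reduces to a pairwise kernel sum."""
    N = len(Y)
    G = pairwise_gaussian(Y, sigma2)
    V_all = G.mean()  # information potential of the marginal p(y)
    mi = 0.0
    for c in np.unique(labels):
        in_c = labels == c
        p_c = in_c.mean()                             # class prior P(c)
        V_joint = G[np.ix_(in_c, in_c)].sum() / N**2  # within-class term
        V_cross = G[in_c, :].sum() / N**2             # class-vs-marginal term
        mi += V_joint + p_c**2 * V_all - 2.0 * p_c * V_cross
    return mi

def train_extractor(X, labels, n_out=2, sigma2=0.25, seed=0):
    """Fit a linear extractor W (rows unit-norm) by maximizing the MI estimate."""
    rng = np.random.default_rng(seed)
    n_in = X.shape[1]

    def neg_mi(w):
        W = w.reshape(n_out, n_in)
        W = W / np.linalg.norm(W, axis=1, keepdims=True)
        return -quadratic_mi(X @ W.T, labels, sigma2)

    res = minimize(neg_mi, rng.standard_normal(n_out * n_in), method="Powell")
    W = res.x.reshape(n_out, n_in)
    return W / np.linalg.norm(W, axis=1, keepdims=True)

if __name__ == "__main__":
    # Toy data: two classes separated along the first two of five inputs.
    rng = np.random.default_rng(1)
    X0 = rng.normal(0.0, 1.0, size=(60, 5))
    X1 = rng.normal(0.0, 1.0, size=(60, 5)); X1[:, :2] += 3.0
    X = np.vstack([X0, X1])
    y = np.array([0] * 60 + [1] * 60)
    W = train_extractor(X, y, n_out=2)
    print("learned extractor rows:\n", W)  # should weight the first two inputs
```

Maximizing this pairwise-kernel estimate pulls same-class outputs together while separating the class-conditional output densities, which is the sense in which the extractor can be trained without committing to any particular classifier.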

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6528827
DOI: http://dx.doi.org/10.1109/TPAMI.2006.186

Publication Analysis

Top Keywords

feature extractor: 12
feature extraction: 8
independently simultaneously: 8
classification error: 8
extractor independently: 8
proposed method: 8
train extractor: 8
extractor: 7
feature: 5
extraction information-theoretic: 4
