Efficient and robust feature extraction by maximum margin criterion.

IEEE Trans Neural Netw

Department of Computer Science and Engineering, University of California, Riverside, CA 92521, USA.

Published: January 2006

In pattern recognition, feature extraction techniques are widely employed to reduce the dimensionality of data and to enhance the discriminatory information. Principal component analysis (PCA) and linear discriminant analysis (LDA) are the two most popular linear dimensionality reduction methods. However, PCA is not very effective for the extraction of the most discriminant features, and LDA is not stable due to the small sample size problem. In this paper, we propose some new (linear and nonlinear) feature extractors based on maximum margin criterion (MMC). Geometrically, feature extractors based on MMC maximize the (average) margin between classes after dimensionality reduction. It is shown that MMC can represent class separability better than PCA. As a connection to LDA, we may also derive LDA from MMC by incorporating some constraints. By using some other constraints, we establish a new linear feature extractor that does not suffer from the small sample size problem, which is known to cause serious stability problems for LDA. The kernelized (nonlinear) counterpart of this linear feature extractor is also established in the paper. Our extensive experiments demonstrate that the new feature extractors are effective, stable, and efficient.
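The abstract describes maximizing the average margin between classes without inverting the within-class scatter matrix. A minimal sketch of that idea, assuming the standard MMC formulation tr(W^T (S_b − S_w) W) subject to W^T W = I (solved by the top eigenvectors of S_b − S_w; function and variable names are illustrative, not from the paper):

```python
import numpy as np

def mmc_transform(X, y, d):
    """Project X onto the top-d eigenvectors of S_b - S_w.

    X: (n_samples, n_features) data matrix; y: class labels; d: target dim.
    Because no inverse of S_w is needed, the method remains well defined
    even when S_w is singular (the small sample size case).
    """
    classes = np.unique(y)
    n, n_features = X.shape
    mean_all = X.mean(axis=0)
    Sb = np.zeros((n_features, n_features))  # between-class scatter
    Sw = np.zeros((n_features, n_features))  # within-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        p = len(Xc) / n                          # class prior
        Sb += p * np.outer(mc - mean_all, mc - mean_all)
        Sw += p * np.cov(Xc, rowvar=False, bias=True)
    # S_b - S_w is symmetric, so eigh applies; eigenvalues come back ascending
    vals, vecs = np.linalg.eigh(Sb - Sw)
    W = vecs[:, np.argsort(vals)[::-1][:d]]      # top-d eigenvectors
    return X @ W, W
```

Unlike LDA's generalized eigenproblem on S_w^{-1} S_b, this is an ordinary symmetric eigenproblem, which is the source of the stability the abstract claims.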

DOI: http://dx.doi.org/10.1109/TNN.2005.860852

