A Bayesian approach to joint feature selection and classifier design.

IEEE Trans Pattern Anal Mach Intell

Department of Electrical Engineering, Duke University, Durham, NC 27708-0291, USA.

Published: September 2004

This paper adopts a Bayesian approach to simultaneously learn both an optimal nonlinear classifier and a subset of predictor variables (or features) that are most relevant to the classification task. The approach uses heavy-tailed priors to promote sparsity in the utilization of both basis functions and features; these priors act as regularizers for the likelihood function that rewards good classification on the training data. We derive an expectation-maximization (EM) algorithm to efficiently compute a maximum a posteriori (MAP) point estimate of the various parameters. The algorithm is an extension of recent state-of-the-art sparse Bayesian classifiers, which in turn can be seen as Bayesian counterparts of support vector machines. Experimental comparisons using kernel classifiers demonstrate both parsimonious feature selection and excellent classification accuracy on a range of synthetic and benchmark data sets.
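To make the sparsity mechanism described above concrete, the sketch below shows one common way such heavy-tailed priors are handled in practice: the prior is viewed as a Gaussian scale mixture, the E-step computes expected inverse scales for each coefficient, and the M-step solves a reweighted ridge logistic regression by Newton steps. This is a minimal illustration for a linear logistic classifier, not the paper's kernel classifier with joint basis-function and feature selection; the function name sparse_map_logistic, the toy data, and all parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def sigmoid(z):
    # Clip for numerical stability before exponentiating.
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30.0, 30.0)))

def sparse_map_logistic(X, y, lam=1.0, n_em=30, n_newton=5, eps=1e-6):
    """EM-style MAP estimation of a logistic classifier with a
    sparsity-promoting (Laplace-type) prior on the weights.
    X may hold raw features or kernel basis columns; y is in {0, 1}."""
    n, d = X.shape
    w = np.zeros(d)
    omega = lam * np.ones(d)  # initial ridge-like per-coefficient penalties
    for _ in range(n_em):
        # M-step: weighted-ridge logistic regression via a few Newton steps.
        for _ in range(n_newton):
            p = sigmoid(X @ w)
            g = X.T @ (y - p) - omega * w            # penalized log-posterior gradient
            s = np.clip(p * (1.0 - p), 1e-6, None)   # logistic curvature terms
            H = X.T @ (X * s[:, None]) + np.diag(omega)
            w = w + np.linalg.solve(H, g)
        # E-step: expected inverse scales of the Gaussian scale mixture;
        # small |w_j| yields a large penalty, driving that coefficient to zero.
        omega = lam / (np.abs(w) + eps)
    w[np.abs(w) < 1e-4] = 0.0                        # prune effectively-zero weights
    return w

# Toy check: only the first two of ten features carry signal.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = (X[:, 0] - 2.0 * X[:, 1] + 0.3 * rng.normal(size=200) > 0).astype(float)
w = sparse_map_logistic(X, y, lam=2.0)
print("retained feature indices:", np.nonzero(w)[0])
```

On this toy problem the reweighting typically zeroes out the uninformative columns while keeping the two informative ones, which mirrors the parsimonious feature selection the abstract reports, though the paper's EM updates and kernel parameterization differ from this simplified sketch.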

Source: http://dx.doi.org/10.1109/TPAMI.2004.55
