Nonnegative matrix factorization (NMF) is a well-known paradigm for data representation. Traditional NMF-based classification methods first perform NMF or one of its variants on input data samples to obtain their low-dimensional representations, which are subsequently classified with a standard classifier [e.g., k-nearest neighbors (KNN) or support vector machine (SVM)]. This stepwise pipeline may overlook the dependency between the two processes, compromising classification accuracy. In this paper, we unify the two processes in a novel semi-supervised constrained optimization model, namely dual embedding regularized NMF (DENMF). DENMF finds the low-dimensional representations and the assignment matrix simultaneously, via joint optimization, for better classification. Specifically, input data samples are projected onto a pair of low-dimensional spaces (i.e., feature and label spaces), and locally linear embedding is employed to preserve the same local geometric structure in both spaces. Moreover, we propose an alternating iteration algorithm to solve the resulting DENMF, whose convergence is theoretically proven. Experimental results on five benchmark datasets demonstrate that DENMF achieves higher classification accuracy than state-of-the-art algorithms.
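To make the baseline concrete: the "traditional" first stage the abstract describes factors a nonnegative data matrix X into low-rank nonnegative factors W and H, and the columns (or rows) of one factor serve as low-dimensional representations fed to a separate classifier. The sketch below shows plain NMF with the classic multiplicative updates of Lee and Seung; it is an illustration of that first stage only, not the authors' DENMF model (which adds label-space embedding, locally linear embedding regularization, and joint optimization of the assignment matrix). All names and parameters here are illustrative.

```python
import numpy as np

def nmf(X, k, n_iter=200, eps=1e-10, seed=0):
    """Plain NMF via multiplicative updates: X ~= W @ H with W, H >= 0.

    Minimizes the Frobenius reconstruction error ||X - W H||_F.
    The rows of H (or columns of W) are k-dimensional representations.
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, k)) + eps  # nonnegative random init
    H = rng.random((k, n)) + eps
    for _ in range(n_iter):
        # Multiplicative updates keep W and H nonnegative by construction.
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Toy usage: factor a small nonnegative matrix into rank-2 factors.
X = np.random.default_rng(1).random((6, 4))
W, H = nmf(X, k=2)
err = np.linalg.norm(X - W @ H)
```

In the stepwise pipeline criticized by the abstract, `W` (the sample representations) would then be handed to an off-the-shelf KNN or SVM, with no feedback from the classifier into the factorization; DENMF's contribution is to couple the two stages in one objective.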

Source
DOI: http://dx.doi.org/10.1109/TIP.2019.2907054

