Learning from data streams that emerge from nonstationary environments has many real-world applications and poses various challenges. A key characteristic of such a task is the varying nature of target functions and data distributions over time (concept drifts). Most existing work relies solely on labeled data to adapt to concept drifts in classification problems. However, labeling all instances in a potentially life-long data stream is frequently prohibitively expensive, hindering such approaches. Therefore, we propose a novel algorithm to exploit unlabeled instances, which are typically plentiful and easily obtained. The algorithm is an online semisupervised radial basis function neural network (OSNN) with manifold-based training to exploit unlabeled data while tackling concept drifts in classification problems. OSNN employs a novel semisupervised learning vector quantization (SLVQ) to train network centers and learn meaningful data representations that change over time. It uses manifold learning on dynamic graphs to adjust the network weights. Our experiments confirm that OSNN can effectively use unlabeled data to elucidate underlying structures of data streams while its dynamic topology learning provides robustness to concept drifts.
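The abstract describes OSNN at a high level: an RBF network whose centers are adapted online by a semisupervised LVQ rule (labeled instances attract or repel the winning center; unlabeled instances still inform the data representation) and whose output weights are learned from the stream. The sketch below is a minimal illustration of that general idea, not the paper's actual OSNN/SLVQ algorithm: the unlabeled-attraction factor, the delta-rule weight update, and all class and parameter names are assumptions made here for demonstration, and the paper's manifold learning on dynamic graphs is omitted entirely.

```python
import numpy as np

class OnlineSemiSupervisedRBF:
    """Illustrative sketch only (NOT the paper's OSNN): an RBF classifier
    whose centers are adapted online, LVQ-style, from labeled instances
    and nudged toward nearby unlabeled instances."""

    def __init__(self, n_centers, n_features, n_classes, lr=0.05, width=1.0, seed=0):
        rng = np.random.default_rng(seed)
        self.centers = rng.normal(size=(n_centers, n_features))
        # each center carries a class label, as in LVQ (random init here)
        self.center_labels = rng.integers(0, n_classes, size=n_centers)
        self.weights = np.zeros((n_centers, n_classes))
        self.lr = lr
        self.width = width

    def _activations(self, x):
        # Gaussian RBF activations of all centers for input x
        d2 = ((self.centers - x) ** 2).sum(axis=1)
        return np.exp(-d2 / (2 * self.width ** 2))

    def predict(self, x):
        scores = self._activations(x) @ self.weights
        return int(np.argmax(scores))

    def partial_fit(self, x, y=None):
        """One online update; pass y=None for an unlabeled instance."""
        d2 = ((self.centers - x) ** 2).sum(axis=1)
        k = int(np.argmin(d2))  # winning (closest) center
        if y is None:
            # unlabeled: weak attraction toward the sample, so centers
            # keep tracking the (possibly drifting) input distribution
            self.centers[k] += 0.1 * self.lr * (x - self.centers[k])
        else:
            # labeled, LVQ-style: attract if labels match, repel otherwise
            sign = 1.0 if self.center_labels[k] == y else -1.0
            self.centers[k] += sign * self.lr * (x - self.centers[k])
            # simple delta rule on the output weights (an assumption here;
            # the paper instead adjusts weights via manifold learning)
            a = self._activations(x)
            target = np.zeros(self.weights.shape[1])
            target[y] = 1.0
            self.weights += self.lr * np.outer(a, target - a @ self.weights)
```

Because `partial_fit` accepts unlabeled instances, the model can consume every element of the stream while only occasionally receiving a label, which mirrors the motivation stated in the abstract.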


Source: http://dx.doi.org/10.1109/TNNLS.2021.3132584

