IEEE Trans Neural Netw Learn Syst
July 2020
Vector-valued neural learning has recently emerged as a promising direction in deep learning. Traditionally, the input to a neural network (NN) is formulated as a vector of scalars; however, performance may be suboptimal because associations among adjacent scalars are not modeled. In this article, we propose a new vector neural architecture, the Arbitrary BIlinear Product NN (ABIPNN), which processes information as vectors in each neuron and defines the feedforward projections using arbitrary bilinear products.
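The core idea of a vector neuron with a bilinear-product projection can be sketched as follows. This is a hypothetical NumPy illustration, not the paper's exact formulation: the bilinear product is parameterized here by a third-order tensor T, so that B(u, v)_k = sum_{i,j} T[k,i,j] u_i v_j, and the layer class, its names, and the tanh activation are all assumptions for illustration.

```python
import numpy as np

def bilinear_product(T, u, v):
    # Generic bilinear form: B(u, v)_k = sum_{i,j} T[k, i, j] * u_i * v_j.
    return np.einsum('kij,i,j->k', T, u, v)

class VectorNeuronLayer:
    """Hypothetical sketch of one vector-neuron layer: each scalar weight of
    an ordinary layer becomes a weight *vector*, and the scalar multiply is
    replaced by an arbitrary bilinear product defined by the tensor T."""
    def __init__(self, n_in, n_out, dim, T, rng=None):
        rng = np.random.default_rng(rng)
        self.T = T                                        # (dim, dim, dim)
        self.W = rng.standard_normal((n_out, n_in, dim)) * 0.1
        self.b = np.zeros((n_out, dim))

    def forward(self, X):
        # X: (n_in, dim) array of input vectors -> (n_out, dim) output vectors.
        out = np.array([
            sum(bilinear_product(self.T, self.W[o, i], X[i])
                for i in range(X.shape[0]))
            for o in range(self.W.shape[0])
        ]) + self.b
        return np.tanh(out)
```

As a usage note: choosing T[(i+j) mod d, i, j] = 1 makes the bilinear product a circular convolution, one example of the family of products such an architecture can express.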
This paper presents a novel VLSI architecture for the training of radial basis function (RBF) networks. The architecture contains circuits for fuzzy C-means (FCM) clustering and recursive least-mean-square (LMS) operations. The FCM circuit is designed for training the centers in the hidden layer of the RBF network.
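The two training stages named in the abstract can be sketched in software as follows. This is a minimal NumPy sketch of the standard FCM and LMS algorithms, not of the paper's VLSI circuits; function names, the fuzzifier m=2, and all hyperparameters are assumptions.

```python
import numpy as np

def fcm_centers(X, c, m=2.0, iters=50, rng=None):
    """Fuzzy C-means sketch: returns c cluster centers for data X (n, d).
    These centers would serve as the RBF hidden-layer centers."""
    rng = np.random.default_rng(rng)
    U = rng.random((c, X.shape[0]))          # fuzzy memberships, (c, n)
    U /= U.sum(axis=0)
    for _ in range(iters):
        Um = U ** m
        centers = Um @ X / Um.sum(axis=1, keepdims=True)
        d = np.linalg.norm(X[None] - centers[:, None], axis=2) + 1e-12
        U = d ** (-2.0 / (m - 1))            # standard FCM membership update
        U /= U.sum(axis=0)
    return centers

def lms_train(Phi, y, lr=0.01, epochs=100):
    """LMS sketch for the RBF output weights: w <- w + lr * error * phi,
    applied sample by sample over the hidden-layer activations Phi."""
    w = np.zeros(Phi.shape[1])
    for _ in range(epochs):
        for phi, t in zip(Phi, y):
            e = t - phi @ w
            w += lr * e * phi
    return w
```

In an RBF pipeline, `fcm_centers` would fix the hidden-layer centers first, Phi would be built from Gaussian activations around those centers, and `lms_train` would then fit the linear output layer.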