Kernel orthonormalization in radial basis function neural networks.

IEEE Trans Neural Netw

Fac. of Process. and Environ. Eng., Tech. Univ. Lodz.

Published: October 2012

This paper addresses optimization of the computations involved in training radial basis function (RBF) neural networks. The main contribution of the reported work is a method for calculating network weights in which the key idea is to transform the RBF kernels into an orthonormal set of functions by means of the standard Gram-Schmidt orthogonalization. This significantly reduces computing time when the adopted RBF training scheme adds one kernel hidden node at a time to improve network performance. Another property of the method is that, after the RBF network weights are computed, the original network structure can be restored. A further strength is that the proposed computing task can be decomposed into a number of parallel subtasks, yielding additional savings in computing time. The proposed weight calculation technique also has low storage requirements. These features make the method very attractive for hardware implementation. The paper presents a detailed derivation of the proposed network weight calculation procedure and demonstrates its validity for RBF network training on a number of data classification and function approximation problems.
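The core idea described in the abstract can be sketched in a few lines of NumPy. The following is a hedged illustration, not the paper's exact procedure: kernel columns of a Gaussian RBF design matrix are orthonormalized one at a time by classical Gram-Schmidt, so adding a hidden node costs only inner products with the columns already in the model and leaves previously computed weights unchanged; a triangular back-substitution then restores the weights of the original (non-orthonormal) kernels. All function and variable names here (`rbf_kernel_matrix`, `incremental_gram_schmidt_fit`, the Gaussian kernel width) are assumptions introduced for the sketch.

```python
import numpy as np

def rbf_kernel_matrix(x, centers, width):
    # Gaussian RBF design matrix: Phi[i, j] = exp(-(x_i - c_j)^2 / (2 * width^2))
    d = x[:, None] - centers[None, :]
    return np.exp(-d**2 / (2.0 * width**2))

def incremental_gram_schmidt_fit(x, y, centers, width):
    """Illustrative sketch (not the paper's exact algorithm): add one kernel
    node at a time, orthonormalizing each new kernel column against those
    already in the model via Gram-Schmidt.  Each new weight in the
    orthonormal basis is a single inner product with the targets, and the
    earlier weights never need recomputing."""
    n, k = len(x), len(centers)
    Phi = rbf_kernel_matrix(x, centers, width)
    Q = np.zeros((n, k))   # orthonormal columns built so far
    R = np.zeros((k, k))   # upper-triangular change-of-basis factor: Phi = Q @ R
    g = np.zeros(k)        # weights in the orthonormal basis
    for j in range(k):
        v = Phi[:, j].copy()
        r = Q[:, :j].T @ v          # projections onto the existing columns
        v -= Q[:, :j] @ r           # remove those components
        R[:j, j] = r
        R[j, j] = np.linalg.norm(v)
        Q[:, j] = v / R[j, j]
        g[j] = Q[:, j] @ y          # new weight: one inner product; old ones unchanged
    # Restore the original network structure: solve the triangular system
    # R @ w = g to recover weights for the original, non-orthonormal kernels.
    w = np.linalg.solve(R, g)
    return w, g
```

Because the orthonormal-basis solution coincides with ordinary least squares on the full kernel matrix, the recovered weights `w` reproduce the same fitted network outputs as a direct batch solve, while each node's contribution to the error can be judged independently as it is added.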

Source: http://dx.doi.org/10.1109/72.623218
