In this paper, we present a new class of quasi-Newton methods for effective learning in large multilayer perceptron (MLP) networks. The algorithms introduced here, named LQN, use an iterative scheme of generalized BFGS type involving a suitable family of matrix algebras L. The main advantages of these methods are their O(n log n) complexity per step and their O(n) memory requirement. Numerical experiments on a set of standard MLP benchmarks show that the LQN methods are competitive, especially for large values of n.
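
The abstract does not spell out where the O(n log n) cost and O(n) memory come from, so the following is a minimal illustrative sketch, not the authors' LQN implementation. It assumes, purely for illustration, that L is the circulant algebra diagonalized by the unitary FFT: the structured Hessian approximation B = F* diag(d) F is then represented by its n real eigenvalues d alone, and the two rank-one BFGS correction terms are projected onto L via their transform-domain diagonals. All names here (lqn_step, grad_fn, the toy quadratic) are hypothetical, and the fixed step length stands in for the line search one would use in practice.

```python
import numpy as np

def unitary_fft(x):
    # Unitary DFT F (F F* = I); the assumed algebra L is the set of
    # circulant matrices F* diag(d) F, all diagonalized by this transform.
    return np.fft.fft(x) / np.sqrt(len(x))

def unitary_ifft(x_hat):
    return np.fft.ifft(x_hat) * np.sqrt(len(x_hat))

def lqn_step(d, w, g, grad_fn, lr=0.2):
    # d    : real eigenvalues of the structured approximation B = F* diag(d) F;
    #        this O(n) vector replaces the dense n x n matrix of plain BFGS.
    # w, g : current parameters and gradient; grad_fn is a hypothetical
    #        user-supplied gradient oracle.
    # Search direction p = -B^{-1} g: two FFTs plus an O(n) divide,
    # hence O(n log n) work per step.
    p = -np.real(unitary_ifft(unitary_fft(g) / d))
    w_new = w + lr * p
    g_new = grad_fn(w_new)
    s, y = w_new - w, g_new - g
    # Project the two rank-one BFGS correction terms onto L: the best
    # Frobenius-norm approximation of u v^T in L has eigenvalues
    # (F u)_i * conj((F v)_i), so y y^T contributes |y_hat|^2 and
    # (B s)(B s)^T contributes d^2 |s_hat|^2.
    s_hat, y_hat = unitary_fft(s), unitary_fft(y)
    ys = float(y @ s)                               # y^T s
    sBs = float(np.sum(d * np.abs(s_hat) ** 2))     # s^T B s in transform space
    if ys > 1e-12 and sBs > 1e-12:                  # usual BFGS curvature guard
        d = d + np.abs(y_hat) ** 2 / ys - d ** 2 * np.abs(s_hat) ** 2 / sBs
        d = np.maximum(d, 1e-8)                     # keep B positive definite
    return d, w_new, g_new

# Toy usage: minimize f(w) = 0.5 * ||A w - b||^2, a made-up test problem
# standing in for an MLP loss.
rng = np.random.default_rng(0)
n = 64
A = rng.standard_normal((n, n)) / np.sqrt(n)
b = rng.standard_normal(n)
grad = lambda w: A.T @ (A @ w - b)

w, d = np.zeros(n), np.ones(n)                      # start from B_0 = I
g = grad(w)
for _ in range(500):
    d, w, g = lqn_step(d, w, g, grad)
print("final gradient norm:", np.linalg.norm(g))
```

The design point the sketch makes explicit is that every per-step operation is either a fast transform (O(n log n)) or elementwise on length-n vectors (O(n)), and the only persistent state besides the iterate is the eigenvalue vector d, which is where the O(n) memory claim of the abstract comes from.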

Source: http://dx.doi.org/10.1109/TNN.2003.809425
