Constraining the topology of neural networks to ensure dynamics with symmetry properties.

Phys Rev E Stat Nonlin Soft Matter Phys

Programa de Pós-Graduação em Engenharia Elétrica, Universidade Federal de Minas Gerais, Avenida Antônio Carlos 6627, 31270-901 Belo Horizonte, Minas Gerais, Brazil.

Published: February 2004

This paper addresses the training of network models from data produced by systems with symmetry properties. It is argued that although general networks are global approximators, in practice some properties such as symmetry are very hard to learn from data. In order to guarantee that the final network will be symmetrical, constraints are developed for two types of models, namely, the multilayer perceptron (MLP) network and the radial basis function (RBF) network. In global modeling problems it becomes crucial to impose conditions for symmetry in order to stand a chance of reproducing symmetry-related phenomena. Sufficient conditions are given for MLP and RBF networks to have a set of fixed points that are symmetrical with respect to the origin of the phase space. In the case of MLP networks, such conditions reduce to the absence of bias parameters and the requirement of odd activation functions. This turns out to be important from a dynamical point of view, since some phenomena are only observed in the context of symmetry, which is not a structurally stable property. The results are illustrated using benchmark systems that display symmetry, such as the Duffing-Ueda oscillator and the Lorenz system.
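The MLP condition stated in the abstract (no bias parameters, odd activation functions) can be illustrated with a minimal sketch. The code below is not the authors' implementation; it is a hypothetical bias-free MLP with tanh activations (tanh is odd), built with randomly drawn weights, showing that the resulting map f satisfies f(-x) = -f(x). Consequently f(0) = 0, and if x* is a fixed point then so is -x*, so the fixed-point set is symmetric with respect to the origin, as the paper requires.

```python
import numpy as np

rng = np.random.default_rng(0)

# Random weights for a two-hidden-layer MLP mapping R^3 -> R^3.
# All bias vectors are omitted, per the sufficient condition for symmetry.
W1 = rng.standard_normal((8, 3))
W2 = rng.standard_normal((8, 8))
W3 = rng.standard_normal((3, 8))

def mlp(x):
    """Bias-free MLP with odd (tanh) activations: an odd map of x."""
    h = np.tanh(W1 @ x)
    h = np.tanh(W2 @ h)
    return W3 @ h

x = rng.standard_normal(3)
print(np.allclose(mlp(-x), -mlp(x)))   # the map is odd
print(np.allclose(mlp(np.zeros(3)), np.zeros(3)))  # origin is a fixed point
```

Adding any nonzero bias, or using a non-odd activation such as the logistic sigmoid, breaks the oddness of the map, which is why both conditions appear together in the paper's result for MLPs.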


Source: http://dx.doi.org/10.1103/PhysRevE.69.026701
