Recursive self-organizing network models.

Neural Networks

Research Group LNM, Department of Mathematics/Computer Science, University of Osnabrück, Albrechtstrasse 28, Osnabrück D-49069, Germany.

Published: January 2005

Self-organizing models constitute valuable tools for data visualization, clustering, and data mining. Here, we focus on extensions of basic vector-based models by recursive computation in such a way that sequential and tree-structured data can be processed directly. The aim of this article is to give a unified review of important models recently proposed in the literature, to investigate fundamental mathematical properties of these models, and to compare the approaches by experiments. We first review several models proposed in the literature from a unifying perspective, making use of an underlying general framework which also includes supervised recurrent and recursive models as special cases. We briefly discuss how the models can be related to different neuron lattices. Then, we investigate theoretical properties of the models in detail: we explicitly formalize how structures are internally stored in different context models and which similarity measures are induced by the recursive mapping onto the structures. We assess the representational capabilities of the models, and we briefly discuss the issues of topology preservation and noise tolerance. The models are compared in an experiment with time series data. Finally, we add an experiment with one context model on tree-structured data to demonstrate its capability to process complex structures.
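To make the recursive-context idea concrete, the following is a minimal sketch in the spirit of a recursive SOM for sequences (one of the model families such a review covers): each unit holds an input weight and a context vector over the map's previous activation profile, and the winner is chosen by a weighted sum of input and context distances. All hyperparameters, the 1-D lattice, and the function and variable names below are illustrative assumptions, not the article's settings.

```python
import numpy as np

# Minimal recursive SOM sketch (RecSOM-style context), assuming:
# - a 1-D lattice of `n_units` neurons,
# - each unit i stores an input weight w_i and a context vector c_i
#   over the previous activation profile y_{t-1} of the whole map,
# - distance d_i(t) = alpha*||x_t - w_i||^2 + beta*||y_{t-1} - c_i||^2.
# Hyperparameters are illustrative, not taken from the article.

rng = np.random.default_rng(0)

def train_recsom(series, n_units=20, dim=1, alpha=1.0, beta=0.5,
                 lr=0.1, sigma=2.0, epochs=5):
    W = rng.normal(size=(n_units, dim))        # input weights
    C = np.zeros((n_units, n_units))           # context weights
    grid = np.arange(n_units)                  # 1-D lattice coordinates
    for _ in range(epochs):
        y_prev = np.zeros(n_units)             # previous activation profile
        for x in series:
            # combined input + context distance for every unit
            d = alpha * np.sum((W - x) ** 2, axis=1) \
                + beta * np.sum((C - y_prev) ** 2, axis=1)
            winner = int(np.argmin(d))
            # Gaussian neighborhood on the lattice around the winner
            h = np.exp(-((grid - winner) ** 2) / (2 * sigma ** 2))
            # move weights toward the current input and the previous context
            W += lr * h[:, None] * (x - W)
            C += lr * h[:, None] * (y_prev - C)
            y_prev = np.exp(-d)                # new activation profile
    return W, C

# Usage: a noisy binary sequence as toy time-series data
series = rng.integers(0, 2, size=500).reshape(-1, 1).astype(float)
W, C = train_recsom(series)
```

The context term is what lets the map encode temporal history: two identical inputs are mapped to different winners if they occur after different prefixes, which induces the structure-dependent similarity measures the article analyzes.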

DOI: 10.1016/j.neunet.2004.06.009
