Publications by authors named "Stanislaw H Zak"

Variable neural adaptive robust control strategies are proposed for the output tracking control of a class of multi-input, multi-output uncertain systems. The controllers incorporate a novel variable-structure radial basis function (RBF) network as a self-organizing approximator for the unknown system dynamics. The network determines its structure online, dynamically adding or removing RBFs according to the tracking performance.
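A minimal sketch of such a structure-adaptation step. The specific add/prune criteria and the threshold names (`add_tol`, `dist_tol`, `prune_tol`) are illustrative assumptions, not the paper's actual rules: here an RBF is added when the tracking error is large and no existing center is near the current input, and RBFs whose output weights have decayed are pruned.

```python
import math

def adapt_structure(centers, weights, x, error,
                    add_tol=0.5, dist_tol=1.0, prune_tol=1e-3):
    """One illustrative structure-adaptation step for a
    variable-structure RBF network (criteria are assumptions)."""
    # Prune RBFs whose output weights have decayed to (near) zero.
    kept = [(c, w) for c, w in zip(centers, weights) if abs(w) > prune_tol]
    centers[:], weights[:] = ([list(t) for t in zip(*kept)] if kept
                              else ([], []))
    # Add an RBF when the tracking error is large and no center is nearby.
    nearest = min((math.dist(x, c) for c in centers), default=float("inf"))
    if abs(error) > add_tol and nearest > dist_tol:
        centers.append(list(x))   # new RBF centered at the current input
        weights.append(0.1)       # small initial weight, above prune_tol
    return centers, weights
```

The point of the add condition is to grow the network only where the approximation is actually failing, keeping the online complexity bounded.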

The hypothalamic-pituitary-adrenal (HPA) axis is critical in maintaining homeostasis under physical and psychological stress by modulating cortisol levels in the body. Dysregulation of cortisol levels is linked to numerous stress-related disorders. In this paper, an automated treatment methodology is proposed, employing a variant of nonlinear model predictive control (NMPC) called explicit MPC (EMPC).
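Explicit MPC moves the optimization offline: the control law is precomputed as a piecewise-affine function over polyhedral state-space regions, so the online controller only does a region lookup. A hedged sketch of that online step follows; the region data `(H, h, K, c)` would come from an offline multiparametric solve, and the toy regions in the usage below are illustrative, not a cortisol model.

```python
def empc_control(x, regions):
    """Evaluate an explicit-MPC piecewise-affine law: find the polyhedral
    region {z : H z <= h} containing state x, then apply that region's
    stored affine gain u = K x + c."""
    for H, h, K, c in regions:
        if all(sum(Hi[j] * x[j] for j in range(len(x))) <= hi
               for Hi, hi in zip(H, h)):
            return [sum(Ki[j] * x[j] for j in range(len(x))) + ci
                    for Ki, ci in zip(K, c)]
    raise ValueError("state outside the explored state-space partition")
```

For example, with two 1-D regions (`x <= 0` giving `u = -x`, `x >= 0` giving `u = -2x`), `empc_control([2.0], regions)` returns `[-4.0]`.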

In this paper, a generalized Brain-State-in-a-Box (gBSB)-based hybrid neural network is proposed for storing and retrieving pattern sequences. The hybrid network consists of autoassociative and heteroassociative parts. A large-scale image storage and retrieval neural system is then constructed using the gBSB-based hybrid neural network and the pattern decomposition concept.
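The gBSB update iterates a linear map followed by a saturation onto the hypercube, x(k+1) = g((I + aW)x(k) + ab). A minimal sketch of the autoassociative retrieval part is below; the outer-product weights in the usage example are a toy storage choice, not the paper's hybrid design, and the heteroassociative part (which would map a retrieved pattern to the next one in the sequence) is not shown.

```python
def sat(v):
    """Linear saturation onto the hypercube [-1, 1]^n."""
    return [max(-1.0, min(1.0, vi)) for vi in v]

def gbsb_step(x, W, b, alpha=0.2):
    """One gBSB update: x+ = sat(x + alpha * (W x + b))."""
    Wx = [sum(Wi[j] * x[j] for j in range(len(x))) for Wi in W]
    return sat([x[i] + alpha * (Wx[i] + b[i]) for i in range(len(x))])

def retrieve(x, W, b, steps=50, alpha=0.2):
    """Iterate the autoassociative update toward a stored vertex."""
    for _ in range(steps):
        x = gbsb_step(x, W, b, alpha)
    return x
```

Starting from a noisy version of a stored pattern, the iteration drives the state to the corresponding vertex of the hypercube.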

Real-time approximators for continuous-time dynamical systems with many inputs are presented. These approximators employ a novel self-organizing radial basis function (RBF) network that varies its structure dynamically to maintain the prescribed approximation accuracy. RBFs can be added or removed online to achieve the appropriate network complexity for real-time approximation of the dynamical systems and to maintain overall computational efficiency.
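The approximator's output itself is a weighted sum of radial basis functions of the (possibly many-dimensional) input. A minimal sketch, assuming Gaussian RBFs with a shared width (both assumptions; the paper's basis-function details may differ):

```python
import math

def rbf_output(x, centers, weights, width=1.0):
    """Output of an RBF network approximator: a weighted sum of Gaussian
    basis functions, each centered on one stored RBF center."""
    total = 0.0
    for c, w in zip(centers, weights):
        sq = sum((xi - ci) ** 2 for xi, ci in zip(x, c))
        total += w * math.exp(-sq / (2 * width ** 2))
    return total
```

Because each Gaussian decays quickly away from its center, inputs far from every center contribute almost nothing, which is what makes local add/remove decisions effective.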

Motivation: The still-emerging combination of technologies that enable description and characterization of all expressed proteins in a biological system is known as proteomics. Although many separation and analysis technologies have been employed in proteomics, it remains a challenge to predict peptide behavior during separation processes. New informatics tools are needed to model the experimental analysis method that will allow scientists to predict peptide separation and assist with required data mining steps, such as protein identification.

A class of interconnected neural networks composed of generalized Brain-State-in-a-Box (gBSB) neural subnetworks is considered. Interconnected gBSB neural network architectures are proposed along with their stability conditions. The design of the interconnected neural networks is reduced to the problem of solving linear matrix inequalities (LMIs) to determine the interconnection parameters.
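Solving LMIs properly requires a convex solver; purely as a hedged illustration of the kind of matrix-inequality condition involved, the sketch below checks whether a candidate matrix P certifies the generic discrete-time Lyapunov LMI A'PA - P < 0 (negative definite) via Sylvester's criterion. This is a stand-in stability LMI for illustration; the paper's interconnection LMIs are different.

```python
def det(M):
    """Determinant by cofactor expansion (fine for small matrices)."""
    n = len(M)
    if n == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j+1:] for row in M[1:]])
               for j in range(n))

def is_negative_definite(M):
    """Sylvester's criterion applied to -M: every leading principal
    minor of -M must be positive."""
    N = [[-m for m in row] for row in M]
    return all(det([row[:k] for row in N[:k]]) > 0 for k in range(1, len(M) + 1))

def satisfies_lyapunov_lmi(A, P):
    """Check A^T P A - P < 0 for a candidate P: a sufficient condition
    for asymptotic stability of the linear iteration x -> A x."""
    n = len(A)
    AtPA = [[sum(A[k][i] * P[k][l] * A[l][j]
                 for k in range(n) for l in range(n))
             for j in range(n)] for i in range(n)]
    M = [[AtPA[i][j] - P[i][j] for j in range(n)] for i in range(n)]
    return is_negative_definite(M)
```

In a real design the interconnection parameters would be decision variables and a semidefinite-programming solver would search for a feasible P; this sketch only verifies a given candidate.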

This paper is concerned with large-scale associative memory design. A serious problem with neural associative memories is the quadratic growth of the number of interconnections with problem size. An overlapping decomposition algorithm is proposed to attack this problem.
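The core idea of splitting one large pattern vector into overlapping blocks, so each subnetwork only needs interconnections within its own block, can be sketched as follows (block and overlap sizes are illustrative parameters, not the paper's choices):

```python
def overlapping_decompose(pattern, block, overlap):
    """Split a pattern vector into overlapping sub-vectors. A full
    network needs ~n^2 weights; the decomposed subnetworks together
    need only the sum of the (much smaller) block^2 weight counts."""
    step = block - overlap
    return [pattern[i:i + block] for i in range(0, len(pattern) - overlap, step)]
```

The shared (overlapping) components are what let neighboring subnetworks stay consistent with each other during retrieval.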

The problem of implementing associative memories using a sparsely interconnected generalized Brain-State-in-a-Box (gBSB) network is addressed in this paper. In particular, a "designer" neural network that synthesizes the associative memories is proposed. An upper bound on the time required for the designer network to reach a solution is determined.
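The designer network itself solves a constrained synthesis problem not reproduced here. Purely to illustrate what "sparsely interconnected" means for the weight matrix, the sketch below restricts a simple Hebbian-style outer-product synthesis to a prescribed sparsity mask; this rule is an assumption for illustration, not the paper's synthesis method.

```python
def sparse_outer_product_weights(patterns, mask):
    """Accumulate pattern correlations into W only where the sparsity
    mask allows a connection, so each neuron is wired only to its
    permitted neighbors."""
    n = len(patterns[0])
    W = [[0.0] * n for _ in range(n)]
    for v in patterns:
        for i in range(n):
            for j in range(n):
                if mask[i][j]:
                    W[i][j] += v[i] * v[j]
    return W
```

A sparse mask is what keeps the interconnection count from growing quadratically with the network size.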

We propose and investigate new types of neural network models. They can be viewed as discrete linear systems operating on closed and bounded, that is, compact, convex domains. We first analyze the dynamic behavior of a neural network model on an arbitrary convex domain.
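A minimal sketch of such a model, assuming the iteration is a discrete linear map followed by projection back onto the compact convex domain (here a Euclidean ball, as one example of a domain other than the usual hypercube; the update form is an assumption based on the description above):

```python
import math

def project_ball(v, radius=1.0):
    """Euclidean projection onto the ball {x : ||x|| <= radius},
    one example of a compact convex domain."""
    norm = math.sqrt(sum(vi * vi for vi in v))
    if norm <= radius:
        return list(v)
    return [radius * vi / norm for vi in v]

def network_step(x, A, b, project=project_ball):
    """One model iteration: apply the discrete linear system
    x -> A x + b, then project the result onto the convex domain."""
    Ax = [sum(Ai[j] * x[j] for j in range(len(x))) for Ai in A]
    return project([Ax[i] + b[i] for i in range(len(x))])
```

With the domain taken as the hypercube [-1, 1]^n and projection replaced by coordinatewise saturation, this reduces to the familiar BSB-type update.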

We propose learning and forgetting techniques for generalized Brain-State-in-a-Box (BSB)-based associative memories. The generalization of the BSB model allows each neuron to have its own bias, and the synaptic weight matrix need not be symmetric. A pattern is learned by a memory if a noisy or incomplete version of it, when presented to the memory, is mapped back to the pattern.
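The paper's specific learning and forgetting rules are not reproduced in this listing; as a generic hedged sketch of the two mechanisms, incremental Hebbian-style learning nudges the weights toward storing a new pattern, while forgetting uniformly shrinks all weights so stale patterns fade:

```python
def learn(W, v, rate=0.1):
    """Incremental learning: outer-product correction toward
    storing pattern v (a generic Hebbian sketch, not the
    paper's rule)."""
    n = len(v)
    return [[W[i][j] + rate * v[i] * v[j] for j in range(n)] for i in range(n)]

def forget(W, decay=0.05):
    """Forgetting: uniform weight decay so old patterns fade."""
    return [[(1.0 - decay) * w for w in row] for row in W]
```

Combining the two gives a memory whose capacity is continually recycled: recently reinforced patterns dominate, while unrefreshed ones decay away.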
