Residual connections have been proposed as an architecture-based inductive bias to mitigate the problem of exploding and vanishing gradients and to increase task performance in both feed-forward and recurrent networks (RNNs) trained with the backpropagation algorithm. Yet, little is known about how residual connections in RNNs influence their dynamics and fading memory properties. Here, we introduce weakly coupled residual recurrent networks (WCRNNs), in which residual connections result in well-defined Lyapunov exponents and allow the properties of fading memory to be studied.
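The abstract above does not give the WCRNN update equation, but the connection between residual recurrence and well-defined Lyapunov exponents can be illustrated with a generic sketch. The form `h ← h + α·tanh(W h)`, the coupling strength `α`, and all parameter values below are assumptions for illustration, not the paper's actual model; the largest Lyapunov exponent is estimated by the standard method of propagating a tangent vector through the Jacobian and averaging the log growth rate.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 32          # hidden units (illustrative)
alpha = 0.1     # assumed weak-coupling strength of the residual branch
W = rng.normal(0.0, 1.0 / np.sqrt(n), (n, n))  # random recurrent weights

def step(h):
    # Residual update: identity path plus a weakly coupled nonlinear branch.
    return h + alpha * np.tanh(W @ h)

def jacobian(h):
    # d(step)/dh = I + alpha * diag(1 - tanh(Wh)^2) @ W
    s = 1.0 - np.tanh(W @ h) ** 2
    return np.eye(n) + alpha * s[:, None] * W

# Estimate the largest Lyapunov exponent by evolving a normalized
# tangent vector and accumulating its log stretch per step.
h = rng.normal(size=n)
v = rng.normal(size=n)
v /= np.linalg.norm(v)
T = 2000
log_growth = 0.0
for _ in range(T):
    v = jacobian(h) @ v
    nv = np.linalg.norm(v)
    log_growth += np.log(nv)
    v /= nv
    h = step(h)

lyap = log_growth / T  # largest Lyapunov exponent (nats per step)
print(lyap)
```

In this toy setting, small `α` keeps the per-step Jacobian close to the identity, so the exponent stays near zero: the dynamics neither explode nor contract sharply, which is the regime in which fading memory can be meaningfully characterized.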
Perceptual, motor and cognitive processes are based on rich interactions between remote regions in the human brain. Such interactions can be carried out through phase synchronization of oscillatory signals. Neuronal synchronization has been primarily studied within the same frequency range, e.