Learning Continuous-Time Dynamics With Attention.

IEEE Trans Pattern Anal Mach Intell

Published: February 2023

Learning the hidden dynamics underlying sequence data is crucial for sequential learning. An attention mechanism can be introduced to spotlight the regions of interest. Traditional attention is measured between a query and a sequence based on a discrete-time state trajectory, so it cannot characterize irregularly-sampled sequence data. This paper presents an attentive differential network (ADN) in which attention is developed over continuous-time dynamics. The continuous-time attention is performed over the dynamics at all times, so the information missing from irregular or sparse samples can be seamlessly compensated and attended. Self-attention is computed to find the attended state trajectory. However, the memory cost of the attention scores between a query and a sequence is demanding, since self-attention treats all time instants as query points in an ordinary differential equation solver. This issue is tackled by imposing a causality constraint in the causal ADN (CADN), where the query is merged up to the current time. To enhance model robustness, this study further explores a latent CADN in which the attended dynamics are calculated in an encoder-decoder structure via Bayesian learning. Experiments on irregularly-sampled actions, dialogues, and bio-signals illustrate the merits of the proposed methods in action recognition, emotion recognition, and mortality prediction, respectively.
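For intuition, the sketch below shows one plausible way to realize continuous-time attention with an off-the-shelf ODE solver: a learned dynamics function is integrated on a dense time grid (covering the gaps between irregular samples), and self-attention is computed over the resulting state trajectory, with an optional causal mask standing in for the causality constraint of CADN. This is a minimal illustration in Python/PyTorch assuming the torchdiffeq package; the class names (Dynamics, ContinuousTimeAttention), shapes, and hyperparameters are hypothetical and are not taken from the paper.

# Hypothetical sketch, not the authors' released code.
# Requires: pip install torch torchdiffeq
import torch
import torch.nn as nn
from torchdiffeq import odeint


class Dynamics(nn.Module):
    """ODE right-hand side f(t, h) parameterizing the hidden dynamics."""
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, dim), nn.Tanh(), nn.Linear(dim, dim)
        )

    def forward(self, t, h):
        return self.net(h)


class ContinuousTimeAttention(nn.Module):
    """Attend over states h(t) evaluated on a dense time grid, so the
    gaps between irregular observations are still covered."""
    def __init__(self, dim):
        super().__init__()
        self.f = Dynamics(dim)
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)

    def forward(self, h0, t_grid, causal=False):
        # Integrate the dynamics: traj has shape (T, batch, dim).
        traj = odeint(self.f, h0, t_grid)
        Q = self.q(traj)  # every time instant acts as a query point
        K = self.k(traj)
        V = self.v(traj)
        scores = torch.einsum('tbd,sbd->bts', Q, K) / Q.shape[-1] ** 0.5
        if causal:
            # Causality constraint: the query at time t only attends to
            # instants s <= t, mimicking the memory-saving idea of CADN.
            T = t_grid.numel()
            mask = torch.triu(torch.ones(T, T, dtype=torch.bool), diagonal=1)
            scores = scores.masked_fill(mask, float('-inf'))
        attn = scores.softmax(dim=-1)
        # Return the attended state trajectory, shape (T, batch, dim).
        return torch.einsum('bts,sbd->tbd', attn, V)


# Toy usage: 4 sequences, 16-dim state, 20 query instants on [0, 1].
model = ContinuousTimeAttention(dim=16)
h0 = torch.randn(4, 16)
t_grid = torch.linspace(0., 1., 20)
out = model(h0, t_grid, causal=True)
print(out.shape)  # torch.Size([20, 4, 16])

The causal mask here caps each query to past instants only, which is the constraint the abstract attributes to CADN; the latent, encoder-decoder variant trained via Bayesian learning is not shown.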


Source: http://dx.doi.org/10.1109/TPAMI.2022.3162711

Publication Analysis

Top Keywords

continuous-time dynamics (8)
attention (8)
sequence data (8)
query sequence (8)
state trajectory (8)
dynamics (5)
learning (4)
learning continuous-time (4)
dynamics attention (4)
attention learning (4)
