Unsupervised domain adaptation (UDA) has successfully addressed the domain shift problem for visual applications. Yet, these approaches may have limited performance on time series data for the following reasons. First, they mainly rely on large-scale datasets (e.g., ImageNet) for source pretraining, which is not applicable to time series data. Second, they ignore the temporal dimension in the feature space of the source and target domains during the domain alignment step. Finally, most prior UDA methods can only align the global features without considering the fine-grained class distribution of the target domain. To address these limitations, we propose a SeLf-supervised AutoRegressive Domain Adaptation (SLARDA) framework. In particular, we first design a self-supervised (SL) learning module that uses forecasting as an auxiliary task to improve the transferability of source features. Second, we propose a novel autoregressive domain adaptation technique that incorporates the temporal dependence of both source and target features during domain alignment. Finally, we develop an ensemble teacher model to align the class-wise distribution in the target domain via a confident pseudo-labeling approach. Extensive experiments have been conducted on three real-world time series applications with 30 cross-domain scenarios. The results demonstrate that our proposed SLARDA method significantly outperforms state-of-the-art approaches for time series domain adaptation. Our source code is available at: https://github.com/mohamedr002/SLARDA.
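To make the autoregressive alignment idea concrete, below is a minimal PyTorch sketch of one plausible realization: a recurrent domain discriminator that reads the full sequence of encoder features, so its source-vs-target decision depends on temporal structure rather than a single pooled vector. All class names, shapes, and the training step shown here are illustrative assumptions, not the paper's actual implementation; see the linked repository for that.

```python
import torch
import torch.nn as nn

# Sketch of temporally-aware adversarial alignment: the discriminator
# consumes the whole feature sequence through a GRU, so domain decisions
# account for temporal dependence. Names/shapes are hypothetical.

class AutoregressiveDiscriminator(nn.Module):
    def __init__(self, feat_dim: int, hidden_dim: int = 128):
        super().__init__()
        self.rnn = nn.GRU(feat_dim, hidden_dim, batch_first=True)
        self.classifier = nn.Linear(hidden_dim, 1)  # source-vs-target logit

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        # feats: (batch, seq_len, feat_dim) sequence of backbone features
        _, h_n = self.rnn(feats)
        return self.classifier(h_n[-1])  # classify from last hidden state

# Illustrative discriminator update (encoder update omitted).
disc = AutoregressiveDiscriminator(feat_dim=64)
src_feats = torch.randn(8, 30, 64)  # hypothetical source feature sequences
trg_feats = torch.randn(8, 30, 64)  # hypothetical target feature sequences
bce = nn.BCEWithLogitsLoss()
d_loss = bce(disc(src_feats), torch.ones(8, 1)) + \
         bce(disc(trg_feats), torch.zeros(8, 1))
```

In the full method, the feature encoder would also be updated adversarially (e.g., via a gradient reversal layer or an alternating generator loss) so that source and target sequences become indistinguishable to this discriminator.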

DOI: 10.1109/TNNLS.2022.3183252
