Transformer-Based Contrastive Learning With Dynamic Masking and Adaptive Pathways for Time Series Anomaly Detection
Funding: The authors received no specific funding for this work.
ABSTRACT
Time Series Anomaly Detection (TSAD) has broad applicability across industries, including manufacturing, healthcare, and finance. Its primary objective is to identify unusual deviations in test data by learning the typical behavioral patterns of time series. Although current reconstruction-based approaches offer strong detection capabilities when labeled data is unavailable, they still struggle with interference from anomalies during training and with extracting higher-level semantic information from time series. To tackle these problems, we propose a multi-scale dual-domain patch attention contrastive learning model (DMAP-DDCL) that incorporates adaptive path selection and dynamic context-aware masking. Specifically, DMAP-DDCL uses dynamic context-aware masks to improve the model's generalization ability and mitigate the bias introduced by anomalous data during training. It applies multi-scale patch segmentation and dual-domain attention over the segmented patches to capture both local details and global correlations as temporal dependencies. By enlarging the contrast between the global and local views of the data, DMAP-DDCL improves its capacity to differentiate normal from abnormal patterns. In addition, an adaptive pathway in the multi-scale dual-domain attention network adapts the multi-scale modeling process to the temporal dynamics of the input, further improving accuracy. Experimental results show that DMAP-DDCL outperforms eight state-of-the-art baselines on five real-world datasets from various domains. Specifically, our model improves F1 and R_AUC_ROC by an average of 7.5% and 16.67%, respectively.
Open Research
Data Availability Statement
The data that support the findings of this study are openly available in Anomaly-Transformer at https://drive.google.com/drive/folders/1gisthCoE-RrKJ0j3KPV7xiibhHWT9qRm?usp=sharing.