Electronic data

  • manuscript V3 20210925 for ICASSP2022

    Rights statement: ©2022 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.

    Accepted author manuscript, 913 KB, PDF document

    Available under license: CC BY-NC: Creative Commons Attribution-NonCommercial 4.0 International License

STGAT-MAD: Spatial-Temporal Graph Attention Network for Multivariate Time Series Anomaly Detection

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN > Conference contribution/Paper > peer-review

Published
  • Jun Zhan
  • Siqi Wang
  • Xiandong Ma
  • Chengkun Wu
  • Canqun Yang
  • Detian Zeng
  • Shilin Wang
Publication date: 27/04/2022
Host publication: (ICASSP 2022) 2022 IEEE International Conference on Acoustics, Speech and Signal Processing, May 22-27, 2022, Singapore
Publisher: IEEE
Number of pages: 5
Original language: English

Abstract

Anomaly detection in multivariate time series data is challenging due to complex temporal and feature correlations and heterogeneity. This paper proposes a novel unsupervised multi-scale stacked spatial-temporal graph attention network for multivariate time series anomaly detection (STGAT-MAD). The core of our framework is to coherently capture the feature and temporal correlations among multivariate time-series data with stackable STGAT networks. Meanwhile, a multi-scale input network is exploited to capture temporal correlations at different time scales. Experiments on a new wind turbine dataset (built and released by us) and three public datasets show that our method detects anomalies more accurately than baseline approaches and provides interpretability by exposing the attention scores among multiple sensors and across time steps.
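
The abstract describes two architectural ideas: graph attention applied along both the feature (sensor) axis and the temporal axis, and a multi-scale input path that feeds downsampled views of the input window into stackable STGAT blocks. Since the paper itself is not reproduced here, the following is only a minimal, hypothetical PyTorch sketch of that description; every class name, tensor shape, and hyperparameter below is an illustrative assumption, not the authors' implementation.

```python
# Hypothetical sketch of the STGAT idea from the abstract: GAT-style attention
# over sensors, then over time steps, fed by multi-scale views of the window.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GraphAttention(nn.Module):
    """Single-head GAT-style attention over a fully connected set of nodes."""

    def __init__(self, dim: int):
        super().__init__()
        self.w = nn.Linear(dim, dim, bias=False)
        self.a = nn.Linear(2 * dim, 1, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, nodes, dim); every node attends to every other node
        h = self.w(x)
        n = h.size(1)
        hi = h.unsqueeze(2).expand(-1, -1, n, -1)    # (B, N, N, D)
        hj = h.unsqueeze(1).expand(-1, n, -1, -1)    # (B, N, N, D)
        e = F.leaky_relu(self.a(torch.cat([hi, hj], dim=-1))).squeeze(-1)
        attn = torch.softmax(e, dim=-1)              # inspectable attention scores
        return attn @ h                              # (B, N, D)


class STGATBlock(nn.Module):
    """Feature-oriented attention (sensors as nodes) followed by
    time-oriented attention (time steps as nodes)."""

    def __init__(self, n_features: int, window: int):
        super().__init__()
        self.feature_gat = GraphAttention(window)       # node = sensor
        self.temporal_gat = GraphAttention(n_features)  # node = time step

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, window, n_features)
        x = self.feature_gat(x.transpose(1, 2)).transpose(1, 2)
        return self.temporal_gat(x)


class STGATMADSketch(nn.Module):
    """Stacks STGAT blocks over multi-scale input views; the output can be
    scored against the observed last step as a reconstruction-style anomaly score."""

    def __init__(self, n_features: int, window: int = 64, scales=(1, 2, 4)):
        super().__init__()
        self.scales = scales
        self.branches = nn.ModuleList(
            [STGATBlock(n_features, window // s) for s in scales]
        )
        self.head = nn.Linear(len(scales) * n_features, n_features)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, window, n_features); average-pool to build coarser time scales
        outs = []
        for s, block in zip(self.scales, self.branches):
            xs = F.avg_pool1d(x.transpose(1, 2), kernel_size=s).transpose(1, 2)
            outs.append(block(xs)[:, -1, :])         # last-step representation
        return self.head(torch.cat(outs, dim=-1))    # reconstruct the last step

# Example anomaly score at inference (one assumed convention among several):
#   score = (model(window) - window[:, -1, :]).abs().sum(-1)
```

In a design of this shape, the attention matrices computed inside `GraphAttention` are the quantities one would inspect to attribute an anomaly to particular sensors or time steps, which corresponds to the interpretability the abstract refers to.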
