Title: Deep learning techniques for anomaly detection in time series data using transformer
Authors: Chan, Rachel Si Min
Keywords: Engineering::Computer science and engineering
Issue Date: 2022
Publisher: Nanyang Technological University
Source: Chan, R. S. M. (2022). Deep learning techniques for anomaly detection in time series data using transformer. Final Year Project (FYP), Nanyang Technological University, Singapore.
Abstract: Anomaly detection is becoming increasingly important due to growing demand and applications in many fields that produce massive amounts of high-dimensional data. On top of the challenges of univariate anomaly detection, correlations between sequences make multivariate anomalies harder to identify, and the proliferation of data modalities makes anomaly detection in large-scale databases harder still. The main objective of this project is therefore to identify irregularities in multivariate time series data using deep learning techniques. To this end, Transformers, which have achieved state-of-the-art performance on various natural language processing and computer vision tasks, are used to detect anomalies in multivariate time series data. Further improvements are also proposed to address issues in the base model. The base model is relatively weak on very low-dimensional time series where the granularity of the data is exceedingly small; to overcome this, a 1D-convolutional layer is used to extract more meaningful representations of the low-dimensional input segments. Because a multivariate time series comprises many channels, it is important to capture both temporal and spatial information. A two-tower framework is used in which the encoder in each tower is specifically designed to capture step-wise or channel-wise correlations, and the two towers are combined to merge the features of the two encoders. When a Convolutional Neural Network is loosely coupled with a Transformer, it is not fully utilised. To address this problem and to reduce computation costs, a Tightly Coupled Convolutional Transformer architecture with two mechanisms, CSPAttention and a passthrough mechanism, is proposed. CSPAttention, which combines CSPNet with a self-attention mechanism, reduces the computational cost of self-attention by 30%.
The passthrough mechanism enables Transformer-like models to obtain more precise information at minimally increased computational cost when applied to a stack of self-attention blocks. The models are evaluated on several benchmark datasets for multivariate time series regression and classification. Overall, the modelling approaches exceed the current state-of-the-art performance of supervised methods.
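The core idea the abstract describes, a 1D-convolutional input embedding feeding a Transformer encoder that reconstructs multivariate windows, with anomalies flagged by reconstruction error, can be sketched as follows. This is a minimal illustrative sketch, not the report's actual model: all class names, layer sizes, and hyperparameters here are assumptions.

```python
# Hypothetical sketch (not the report's implementation): a 1D-convolutional
# embedding followed by a standard Transformer encoder for multivariate
# time series, scored by per-window reconstruction error.
import torch
import torch.nn as nn

class ConvTransformerEncoder(nn.Module):
    def __init__(self, n_channels, d_model=64, n_heads=4, n_layers=2):
        super().__init__()
        # 1D convolution over the time axis lifts low-dimensional inputs
        # into a richer d_model-dimensional representation.
        self.embed = nn.Conv1d(n_channels, d_model, kernel_size=3, padding=1)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        # Reconstruction head: anomalies show up as high reconstruction error.
        self.head = nn.Linear(d_model, n_channels)

    def forward(self, x):  # x: (batch, time, channels)
        z = self.embed(x.transpose(1, 2)).transpose(1, 2)  # (batch, time, d_model)
        z = self.encoder(z)
        return self.head(z)  # reconstruction with the same shape as x

model = ConvTransformerEncoder(n_channels=3)
x = torch.randn(8, 100, 3)  # 8 windows, 100 time steps, 3 channels
recon = model(x)
score = (recon - x).pow(2).mean(dim=(1, 2))  # one anomaly score per window
```

In a full pipeline, windows whose score exceeds a threshold (chosen on validation data) would be flagged as anomalous; the report's two-tower and CSPAttention variants replace the plain encoder above.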
Schools: School of Computer Science and Engineering 
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:SCSE Student Reports (FYP/IA/PA/PI)

Files in This Item:
File: Rachel Chan - FYP Final Report.pdf (Restricted Access)
Size: 972.46 kB
Format: Adobe PDF

Download(s): 50 (updated on Apr 22, 2024)


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.