Title: Statistical graph signal processing
Authors: Shi, Enbing
Keywords: Engineering::Computer science and engineering::Computing methodologies::Symbolic and algebraic manipulation
Issue Date: 2023
Publisher: Nanyang Technological University
Source: Shi, E. (2023). Statistical graph signal processing. Master's thesis, Nanyang Technological University, Singapore.
Abstract: This study provides insight into the application of different filters in Graph Signal Processing (GSP) to different datasets. First, a comprehensive overview of GSP-related concepts is given, including the derivation of graph signals, the computation of graph Laplacian matrices, and their application in various practical scenarios. Next, we discuss in detail the latest GSP filter design methods, covering a range of linear and non-linear approaches. We review graph filtering, graph signal sampling, graph signal compression/reconstruction, graph neural networks, etc., and compare and analyse the advantages and limitations of each type of filter from a theoretical perspective. Four models were selected: linear regression (LR), linear regression graph (LRG), kernel regression (KR) and kernel regression graph (KRG). These were applied to different datasets, and the effects of training-sample size and noise interference on prediction accuracy, measured by the normalised mean square error (NMSE), were investigated in depth. Through simulation experiments, we found that the performance of all models improved as the number of training samples increased. In some cases, the KR and KRG models outperformed the LR and LRG models, which capture only linear relationships, owing to their ability to model non-linear relationships in the data. In noisy environments, by contrast, the LRG and KRG models were more robust to additive white Gaussian noise because their design explicitly accounts for Gaussian noise. In addition, we observed that the NMSE of all models stabilised once the training sample size reached a certain level, which may indicate that model performance had reached its limit and that further increasing the amount of training data would not significantly improve it.
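The linear-versus-kernel comparison described in the abstract can be sketched with a toy example. The following is a hypothetical illustration only (not the thesis's actual code, models, or datasets): it fits ordinary least squares and Gaussian-kernel ridge regression to synthetic non-linear data and scores both with the NMSE metric, showing why a kernel method can outperform a purely linear one when the underlying relationship is non-linear.

```python
import numpy as np

def nmse(y_true, y_pred):
    # Normalised mean square error: MSE divided by the signal power.
    return np.mean((y_true - y_pred) ** 2) / np.mean(y_true ** 2)

def linear_regression(X_train, y_train, X_test):
    # Ordinary least squares with a bias column appended to the features.
    A = np.hstack([X_train, np.ones((len(X_train), 1))])
    w, *_ = np.linalg.lstsq(A, y_train, rcond=None)
    return np.hstack([X_test, np.ones((len(X_test), 1))]) @ w

def kernel_regression(X_train, y_train, X_test, gamma=1.0, lam=1e-3):
    # Kernel ridge regression with a Gaussian (RBF) kernel; gamma and lam
    # are illustrative hyperparameter choices, not values from the thesis.
    def rbf(A, B):
        d2 = (np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :]
              - 2.0 * A @ B.T)
        return np.exp(-gamma * d2)
    K = rbf(X_train, X_train)
    alpha = np.linalg.solve(K + lam * np.eye(len(K)), y_train)
    return rbf(X_test, X_train) @ alpha

# Synthetic non-linear target with a small amount of observation noise.
rng = np.random.default_rng(0)
X = rng.uniform(-2.0, 2.0, (200, 1))
y = np.sin(2.0 * X[:, 0]) + 0.05 * rng.standard_normal(200)
X_tr, y_tr, X_te, y_te = X[:150], y[:150], X[150:], y[150:]

err_lr = nmse(y_te, linear_regression(X_tr, y_tr, X_te))
err_kr = nmse(y_te, kernel_regression(X_tr, y_tr, X_te))
```

On such data the linear model's NMSE stays high because a straight line cannot track the sinusoid, while the kernel model's NMSE is much lower; this mirrors the abstract's observation that KR/KRG can outperform LR/LRG when the data contain non-linear structure.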
Schools: School of Electrical and Electronic Engineering 
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:EEE Theses

Files in This Item:
File: Restricted Access
Size: 393.51 kB
Format: Adobe PDF

Page view(s)

Updated on Sep 23, 2023



Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.