Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/76922
Title: Semi-supervised learning with graph convolutional networks
Authors: Ong, Jia Rui
Keywords: DRNTU::Engineering::Computer science and engineering::Computing methodologies::Artificial intelligence
Issue Date: 2019
Abstract: Deep learning has achieved unprecedented performance on a broad range of problems involving data in Euclidean space, such as 2-D images in object recognition and 1-D paragraphs of text in machine translation. The availability of new datasets in non-Euclidean domains, such as social networks and 3D point clouds, has spurred recent efforts to generalise deep neural networks to graphs. In this report, we present the first comparative study of Graph Convolutional Networks (GCNs), Residual Gated Graph ConvNets (RGGCNs) and Graph Attention Networks (GATs) on two fundamental tasks in network science, semi-supervised classification and semi-supervised clustering, and analyse their experimental performance. We improve the existing capabilities of GATs by increasing the number of graph attention layers, and of RGGCNs by reducing the number of learnable parameters and normalizing the edge gates. We introduce edge dropin, a novel method for regularizing graphs through the addition of edge-level noise. Our final RGGCN and GAT models are within 1% and 5% of the test accuracy of GCN and RGGCN, on the Cora dataset and a semi-supervised clustering dataset generated with the stochastic block model, respectively.
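The abstract describes edge dropin only as "the addition of edge-level noise"; the precise formulation is in the restricted full text. A minimal sketch of one plausible reading, where random edges are added to a symmetric adjacency matrix with some probability during training (the function name, probability parameter `p`, and NumPy formulation are assumptions, not taken from the report):

```python
import numpy as np

def edge_dropin(adj, p=0.05, seed=None):
    """Hypothetical sketch of edge dropin: randomly add ("drop in")
    edges to a symmetric adjacency matrix with probability p, injecting
    edge-level noise. The report's actual method may differ."""
    rng = np.random.default_rng(seed)
    n = adj.shape[0]
    # Sample candidate noise edges uniformly at random.
    noise = rng.random((n, n)) < p
    noise = np.triu(noise, k=1)   # upper triangle only, no self-loops
    noise = noise | noise.T       # symmetrise for an undirected graph
    # Union of the original edges and the sampled noise edges.
    return np.maximum(adj, noise.astype(adj.dtype))

# Example: a 4-node path graph perturbed with noise edges.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
A_noisy = edge_dropin(A, p=0.2, seed=0)
```

Under this reading, edge dropin is the mirror image of edge dropout: it perturbs graph structure by inserting edges rather than removing them, so the original edge set is always preserved.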
URI: http://hdl.handle.net/10356/76922
Rights: Nanyang Technological University
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:SCSE Student Reports (FYP/IA/PA/PI)

Files in This Item:
File: FYP Report (Jia Rui) Updated.pdf (Restricted Access)
Size: 1.99 MB
Format: Adobe PDF

Page view(s): 143 (updated on Oct 16, 2021)
Download(s): 20 (updated on Oct 16, 2021)

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.