Title: Understanding variations (variant & invariant) of classification tasks/targets
Authors: Wan, Tai Fong
Keywords: Engineering::Computer science and engineering::Computing methodologies::Artificial intelligence
Engineering::Computer science and engineering::Computing methodologies::Image processing and computer vision
Issue Date: 2020
Publisher: Nanyang Technological University
Project: SCSE19-0087
Abstract: There is still no established mechanism to cater for variance in data, and the degrees of impact that variance brings remain poorly characterised. We introduce a composite metric called learning, defined as the average improvement per epoch divided by the previous loss value, which provides a standard reference across models of differing architecture. We use specially designed datasets with a Convolutional Neural Network (CNN) and a Recurrent Neural Network (RNN) to observe the effects of variance on bottom-up and top-down neural network architectures respectively. We find that variance has degrees: given datasets with different applied operations, the amount of loss varies notably. We find that variance has dimensions: the amount of variance introduced into an image affects the confidence of the model's prediction. We find that even when provided a single training example with no operation applied to it, both the CNN and RNN architectures can yield lower validation losses (with the CNN significantly lower). This study shows the significance of variance, as manifested in data, on model performance, and the pressing need to understand variance in order to design better mitigations and mechanisms for handling it.
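The composite "learning" metric described in the abstract might be sketched as follows. This is a minimal illustration assuming the metric is the relative per-epoch improvement, (previous loss − current loss) / previous loss, averaged over epochs; the function name and the exact formulation are assumptions, as the report itself defines the precise formula:

```python
def learning_metric(losses):
    """Sketch of a 'learning' metric: the average per-epoch
    improvement divided by the previous epoch's loss value.
    This interpretation is an assumption based on the abstract,
    not the report's exact definition."""
    if len(losses) < 2:
        raise ValueError("need loss values from at least two epochs")
    # Relative improvement at each epoch transition: (prev - curr) / prev
    ratios = [(prev - curr) / prev for prev, curr in zip(losses, losses[1:])]
    # Averaging gives a reference comparable across architectures,
    # since it is normalised by each model's own loss scale
    return sum(ratios) / len(ratios)

# Example: a loss curve that improves over four epochs
print(learning_metric([1.0, 0.8, 0.6, 0.5]))
```

Because each term is normalised by the model's own previous loss, the metric is scale-free, which is what allows comparison across CNN and RNN models whose raw loss magnitudes differ.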
Schools: School of Computer Science and Engineering 
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:SCSE Student Reports (FYP/IA/PA/PI)

Files in This Item:
File: Wan Tai Fong FYP Report SCSE19-0087 Understanding Variations.pdf
Description: Restricted Access
Size: 1.25 MB
Format: Adobe PDF

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.