Title: Comparing scalability of RVFL network
Authors: Yeo, Chester Jie Sheng
Keywords: Engineering::Electrical and electronic engineering
Issue Date: 2022
Publisher: Nanyang Technological University
Source: Yeo, C. J. S. (2022). Comparing scalability of RVFL network. Final Year Project (FYP), Nanyang Technological University, Singapore.
Project: A1225-212
Abstract: The random vector functional link (RVFL) network is a randomization-based neural network that has been gaining significant traction because it overcomes several shortcomings of conventional models. It has been successfully applied to a diverse range of tasks such as classification, regression, visual tracking, and forecasting. Randomization-based neural networks employ a closed-form solution to optimize parameters, which means the model is trained once, quickly, by feeding all samples to it together, unlike back-propagation-trained neural networks that require multiple iterations. RVFL is a typical representative: a single-hidden-layer network with universal approximation ability. With weights and biases randomly generated, its uniqueness lies in the direct link that passes information from the input layer to the output layer. However, this approach does not work when the training dataset is very large. This project evaluates three approaches to manage this problem: iterative learning, online learning, and vector quantization. Through the proposed methods, we hope to address the issue of scalability in RVFL. The experimental results show that a conventional least squares classifier is the best way to solve this problem, and they highlight that scalability is not a strong suit of RVFL, with vector quantization being the closest performer and an area of further research for work with RVFL.
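The closed-form training described in the abstract can be illustrated with a minimal sketch of an RVFL regressor: random hidden weights and biases are fixed, the input is concatenated with the hidden features via the direct link, and the output weights are obtained in one shot by regularized least squares. This is an assumption-laden illustration of the general technique, not code from the project; the function names, the tanh activation, and the ridge parameter `reg` are all illustrative choices.

```python
import numpy as np

def rvfl_fit(X, y, n_hidden=50, reg=1e-3, seed=0):
    """Sketch of RVFL training via closed-form regularized least squares."""
    rng = np.random.default_rng(seed)
    # Hidden-layer weights and biases are randomly generated and never trained.
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)            # random hidden-layer features
    # Direct link: concatenate raw inputs with hidden features.
    D = np.hstack([X, H])
    # Single closed-form solve for the output weights (ridge regression).
    beta = np.linalg.solve(D.T @ D + reg * np.eye(D.shape[1]), D.T @ y)
    return W, b, beta

def rvfl_predict(X, W, b, beta):
    D = np.hstack([X, np.tanh(X @ W + b)])
    return D @ beta
```

The scalability problem the project studies is visible in the solve step: `D.T @ D` grows with the number of samples and features, so forming and inverting it becomes costly when the training set is huge, motivating the iterative, online, and vector-quantization alternatives.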
Schools: School of Electrical and Electronic Engineering
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:EEE Student Reports (FYP/IA/PA/PI)

Files in This Item:
File: Restricted Access, 850.66 kB, Adobe PDF


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.