Title: Mobile platform control using machine learning and human inputs
Authors: Li, Shumiao
Keywords: Engineering::Electrical and electronic engineering
Issue Date: 2022
Publisher: Nanyang Technological University
Source: Li, S. (2022). Mobile platform control using machine learning and human inputs. Master's thesis, Nanyang Technological University, Singapore.
Project: ISM-DISS-03125
Abstract: Using human input to control a mobile platform has gained popularity in recent years. With the rapid development of machine learning, more and more types of human input can be analyzed by computers to infer human intention. In this dissertation, human eye images selected from the MPIIGaze dataset are extracted and labeled into five classes according to gaze direction, corresponding to five commands that a mobile platform can recognize. Five machine learning models (k-Nearest Neighbors, Random Forest, two types of Convolutional Neural Networks, and Vision Transformer) are explored and compared in this dissertation to find the one best suited to the eye images. K-fold cross-validation and grid search are applied to optimize each model. The conclusion is that the Vision Transformer outperforms the other models on this dataset. Further studies combining other algorithms with the Vision Transformer could be explored in the future on a larger dataset.
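The model-selection procedure described in the abstract (grid search over hyperparameters, scored by k-fold cross-validation) can be sketched in plain Python. This is an illustrative toy, not the dissertation's code: it uses a hand-rolled k-NN classifier on synthetic two-class "gaze" data rather than the MPIIGaze images, and the grid covers only the number of neighbors.

```python
import random
from collections import Counter

def knn_predict(train, x, k):
    """Classify point x by majority vote of its k nearest training points."""
    # Sort training points by squared Euclidean distance to x.
    nearest = sorted(train, key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], x)))
    votes = Counter(label for _, label in nearest[:k])
    return votes.most_common(1)[0][0]

def kfold_accuracy(data, k_neighbors, n_folds=5):
    """Mean accuracy of k-NN over n_folds cross-validation splits."""
    folds = [data[i::n_folds] for i in range(n_folds)]
    accuracies = []
    for i in range(n_folds):
        test = folds[i]
        train = [p for j, fold in enumerate(folds) if j != i for p in fold]
        correct = sum(knn_predict(train, x, k_neighbors) == y for x, y in test)
        accuracies.append(correct / len(test))
    return sum(accuracies) / n_folds

# Synthetic stand-in for labeled gaze features: two well-separated clusters.
random.seed(0)
data = ([([random.gauss(0, 1), random.gauss(0, 1)], "left") for _ in range(40)]
        + [([random.gauss(3, 1), random.gauss(3, 1)], "right") for _ in range(40)])
random.shuffle(data)

# Grid search: evaluate each candidate k by 5-fold cross-validation.
grid = [1, 3, 5, 7]
scores = {k: kfold_accuracy(data, k) for k in grid}
best_k = max(scores, key=scores.get)
print(f"best k = {best_k}, CV accuracy = {scores[best_k]:.2f}")
```

In the dissertation's setting the same loop would iterate over each model's own hyperparameter grid (e.g. tree count for Random Forest, learning rate for the CNNs and Vision Transformer), keeping the configuration with the highest cross-validated score.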
Schools: School of Electrical and Electronic Engineering 
Research Centres: Schaeffler Hub for Advanced REsearch (SHARE) Lab 
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:EEE Theses

Files in This Item:
File: Li, Shumiao_MSc_dissertation.pdf (Restricted Access)
Size: 1.54 MB
Format: Adobe PDF


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.