Please use this identifier to cite or link to this item:
Title: Research study on eye gaze estimation using neural network
Authors: Hu, Qingyao
Keywords: DRNTU::Engineering::Computer science and engineering
Issue Date: 2019
Abstract: Since the 1800s, studies of eye movement and eye tracking have attracted interest from researchers and scientists. However, eye tracking technology has gradually given way to eye gaze estimation, which has proven more useful in real-world applications. This project aims to collect eye gaze data, explore methods for eye gaze estimation using convolutional neural networks, and implement these methods on mobile devices to explore applications of eye gaze estimation technology. Neural networks are part of deep learning, a technology adopted by major IT companies around the globe; they use artificial neurons to learn the patterns in complex data such as images and audio in order to predict outcomes. To collect eye gaze data, two Android applications, GazeCollect and Gazestimate, were developed, each gathering a different type of eye gaze data. GazeCollect records participants looking at random red dots displayed on the mobile device, while Gazestimate records data through a simple card matching game that participants are required to complete. This project explored a classification approach to predicting eye gaze location on mobile devices by segmenting the device's screen into 32 areas. A combination of image and scalar inputs is used as input to the convolutional neural network model to compensate for differences in device orientation and size. ResNet is used in the convolutional network architecture, as ResNet has proven to perform accurately in image recognition tasks. The classification approach yielded a best accuracy of 47.76% to 49.37% in predicting eye gaze location.
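The classification approach described above can be illustrated with a minimal sketch. The abstract does not specify how the screen is partitioned into 32 areas, so this sketch assumes a hypothetical 4-column by 8-row grid; the function name and grid shape are illustrative assumptions, not the report's actual implementation.

```python
# Hedged sketch: map a gaze point (in pixels) to one of 32 class labels,
# assuming a 4x8 grid over the screen. The real report's partitioning
# scheme is not specified, so this grid shape is an assumption.

def gaze_point_to_class(x, y, width, height, cols=4, rows=8):
    """Return a grid-cell class index in [0, cols*rows) for a point (x, y)."""
    # Clamp to the last cell so points on the far edge stay in range.
    col = min(int(x / width * cols), cols - 1)
    row = min(int(y / height * rows), rows - 1)
    return row * cols + col

# Example on a 1080x1920 portrait screen:
top_left = gaze_point_to_class(0, 0, 1080, 1920)        # -> 0
bottom_right = gaze_point_to_class(1079, 1919, 1080, 1920)  # -> 31
```

Under this labelling, the network's 32-way softmax output would correspond to one grid cell per class, and the predicted cell's centre could serve as the estimated gaze location.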
Finally, the Gazestimate card matching game was modified to be interactive using eye gaze estimation, and additional functionality was added to display the exact eye gaze prediction on the device's screen as red dots. These features were used to showcase eye gaze estimation techniques during Nanyang Technological University's Open House 2019.
URI: http://hdl.handle.net/10356/77032
Rights: Nanyang Technological University
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections: SCSE Student Reports (FYP/IA/PA/PI)
Updated on May 13, 2021
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.