Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/83125
Full metadata record
DC Field | Value | Language
dc.contributor.author | Liu, Jigang | en
dc.contributor.author | Lee, Francis Bu Sung | en
dc.contributor.author | Rajan, Deepu | en
dc.date.accessioned | 2019-07-04T01:51:45Z | en
dc.date.accessioned | 2019-12-06T15:12:17Z | -
dc.date.available | 2019-07-04T01:51:45Z | en
dc.date.available | 2019-12-06T15:12:17Z | -
dc.date.copyright | 2019-02-01 | en
dc.date.issued | 2019 | en
dc.identifier.citation | Liu, J., Lee, F. B. S., & Rajan, D. (2019). Free-head appearance-based eye gaze estimation on mobile devices. 2019 International Conference on Artificial Intelligence in Information and Communication (ICAIIC). doi:10.1109/ICAIIC.2019.8669057 | en
dc.identifier.uri | https://hdl.handle.net/10356/83125 | -
dc.description.abstract | Eye gaze tracking plays an important role in human-computer interaction applications. In recent years, much research has been devoted to gaze estimation methods that handle free head movement, most of it focused on gaze direction estimation. Gaze point estimation on the screen is another important application. In this paper, we propose a two-step training network, called GazeEstimator, to improve the accuracy of gaze location estimation on mobile devices. The first step trains an eye landmark localization network on the 300W-LP dataset [1], and the second step trains a gaze estimation network on the GazeCapture dataset [2]. Some processing operations are performed between the two networks for data cleaning. The first network localizes the eyes precisely in the image, while the gaze estimation network uses only eye images and eye grids as inputs, making it robust to facial expressions and occlusion. Compared with the state-of-the-art gaze estimation method iTracker, our proposed deep network achieves higher accuracy and can estimate the gaze location even when the full face cannot be detected. | en
dc.format.extent | 6 p. | en
dc.language.iso | en | en
dc.rights | © 2019 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. The published version is available at: https://doi.org/10.1109/ICAIIC.2019.8669057 | en
dc.subject | Eye Gaze Estimation | en
dc.subject | Deep Learning | en
dc.subject | Engineering::Computer science and engineering | en
dc.title | Free-head appearance-based eye gaze estimation on mobile devices | en
dc.type | Conference Paper | en
dc.contributor.school | School of Computer Science and Engineering | en
dc.contributor.conference | 2019 International Conference on Artificial Intelligence in Information and Communication (ICAIIC) | en
dc.identifier.doi | 10.1109/ICAIIC.2019.8669057 | en
dc.description.version | Accepted version | en
dc.identifier.rims | 211387 | en
item.fulltext | With Fulltext | -
item.grantfulltext | open | -
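The abstract above describes a two-step pipeline: an eye landmark localization network trained first, followed by a gaze estimation network that takes only eye images and eye grids and regresses a gaze point on the screen. The sketch below is a minimal PyTorch illustration of that idea; the layer sizes, the 25x25 eye grid, the input resolutions, and the MSE loss are illustrative assumptions, not the authors' published GazeEstimator architecture.

```python
# Minimal sketch of a two-step gaze-point pipeline in the spirit of the abstract.
# All architectural details here are assumptions made for illustration only.
import torch
import torch.nn as nn

class EyeLandmarkNet(nn.Module):
    """Stage 1: predict 2D eye-landmark coordinates from a face image (assumed 128x128 RGB)."""
    def __init__(self, num_landmarks: int = 12):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(128, num_landmarks * 2)  # (x, y) per landmark

    def forward(self, face):
        return self.head(self.features(face).flatten(1))

class GazePointNet(nn.Module):
    """Stage 2: regress an (x, y) screen location from left/right eye crops and a flattened eye grid."""
    def __init__(self, grid_size: int = 25):
        super().__init__()
        def eye_branch():
            return nn.Sequential(
                nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            )
        self.left_eye, self.right_eye = eye_branch(), eye_branch()
        self.grid_fc = nn.Sequential(nn.Linear(grid_size * grid_size, 128), nn.ReLU())
        self.regressor = nn.Sequential(nn.Linear(64 + 64 + 128, 128), nn.ReLU(), nn.Linear(128, 2))

    def forward(self, left, right, grid):
        feats = torch.cat([self.left_eye(left), self.right_eye(right), self.grid_fc(grid)], dim=1)
        return self.regressor(feats)  # predicted gaze point in screen coordinates

# Usage: stage 1 localizes the eyes; its output would be used to crop eye patches and
# build the eye grid, which stage 2 maps to a gaze location on the screen.
landmark_net, gaze_net = EyeLandmarkNet(), GazePointNet()
face = torch.randn(1, 3, 128, 128)
landmarks = landmark_net(face)                                        # stage 1: eye landmarks
left, right = torch.randn(1, 3, 64, 64), torch.randn(1, 3, 64, 64)    # eye crops derived from landmarks
grid = torch.randn(1, 25 * 25)                                        # flattened eye-position grid
gaze_xy = gaze_net(left, right, grid)                                 # stage 2: (x, y) on the screen
loss = nn.functional.mse_loss(gaze_xy, torch.zeros(1, 2))             # placeholder regression loss
```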
Appears in Collections: SCSE Conference Papers
Files in This Item:
File | Description | Size | Format
Free-Head Appearance-Based Eye Gaze Estimation on Mobile Devices.pdf | | 1.01 MB | Adobe PDF

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.