Please use this identifier to cite or link to this item:
Title: Android smart phone based participatory sensing (2)
Authors: Leung, Lap Kan
Keywords: DRNTU::Engineering::Computer science and engineering::Computing methodologies::Image processing and computer vision
Issue Date: 2015
Abstract: The purpose of this project is to implement a user-friendly iOS application that allows users to enter their passwords using their eyes, with a high level of security. The application can run on both iPad and iPhone. It is built upon a working gaze tracking system that provides reasonable accuracy for the estimated gaze position. Once the application is loaded onto the iOS device, face detection of the user is performed, and a calibration process must be completed before the application can accurately estimate the gaze position. As the user shifts his gaze from one grid cell to another, the program should be able to identify the change, and it should also determine the correct number of characters the user has entered. This project is based on an existing eye tracking program named Opengazer, which runs successfully on a MacBook. The results obtained from Opengazer have horizontal and vertical errors of less than 2 degrees. This project aims to develop an eye tracking system running on iOS devices that provides accuracy similar to the MacBook version. At the time of writing, due to difficulties that will be explained in the report, the implementation of the iPad version is still a work in progress. Further test results can be obtained once the implementation has been completed.
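The abstract describes mapping an estimated gaze position to a grid cell so that each fixated cell selects a password character. The report itself is restricted, so the sketch below is only a minimal illustration of that idea: the grid dimensions, keypad layout, and function names are assumptions for illustration, not taken from the project.

```python
# Hypothetical sketch of gaze-to-grid character selection.
# The 3x4 keypad layout and screen resolution below are illustrative
# assumptions; the actual project's grid and layout are not specified here.

def gaze_to_cell(x, y, screen_w, screen_h, rows, cols):
    """Map a gaze estimate (x, y) in screen pixels to a (row, col) grid cell."""
    col = min(int(x / screen_w * cols), cols - 1)
    row = min(int(y / screen_h * rows), rows - 1)
    return row, col

# Illustrative on-screen keypad; each grid cell corresponds to one key.
KEYPAD = [
    ["1", "2", "3", "4"],
    ["5", "6", "7", "8"],
    ["9", "0", "del", "ok"],
]

def cell_to_char(row, col):
    """Return the character assigned to a grid cell."""
    return KEYPAD[row][col]

if __name__ == "__main__":
    # A gaze estimate near the top-left of a 768x1024 screen selects key "1".
    r, c = gaze_to_cell(50, 60, screen_w=768, screen_h=1024, rows=3, cols=4)
    print(cell_to_char(r, c))
```

In a real gaze-entry system, a dwell-time threshold (holding the gaze inside one cell for a fixed duration) would typically be used to distinguish an intentional key selection from a passing glance; that logic is omitted here for brevity.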
Rights: Nanyang Technological University
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:SCSE Student Reports (FYP/IA/PA/PI)

Files in This Item:
File: FYP report final.pdf
Description: Restricted Access
Size: 9.28 MB
Format: Adobe PDF

Page view(s)

Updated on Dec 5, 2020

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.