Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/59747
Full metadata record
DC Field: Value (Language)
dc.contributor.author: Janis Abols
dc.date.accessioned: 2014-05-14T01:58:05Z
dc.date.available: 2014-05-14T01:58:05Z
dc.date.copyright: 2014 (en_US)
dc.date.issued: 2014
dc.identifier.uri: http://hdl.handle.net/10356/59747
dc.description.abstract: A wrist-worn standalone device measures acceleration along three orthogonal axes (x, y and z) and communicates these values wirelessly to a computer. The acceleration values are sufficient to infer the orientation of the wearer's arm, providing the basis for software that controls the mouse pointer from the inferred hand position and movement. A hand gesture performed by the wearer is likewise described by its acceleration values, which are sufficient to distinguish between different gestures. This project develops software that allows real-time control of the mouse pointer and real-time gesture recognition to perform user-defined tasks. The gestures and their corresponding tasks are fully defined by the user of the software. Mouse pointer control is achieved by monitoring the orientation of the device in 3D, while gesture recognition is achieved by monitoring the total acceleration of the device and maintaining a number of past acceleration values in a data buffer. When a preset total-acceleration threshold is exceeded (signifying the presence of a hand gesture), the buffered 3-axis data is captured and analysed to be potentially recognised as a gesture. Gesture classification, learning and recognition are achieved using a multi-layer perceptron, i.e. an artificial neural network, which acts as an automatic data classifier, learning gesture patterns from user-recorded examples. Actions to be performed upon recognising a gesture are also user-defined and flexible in nature; they are executed when a gesture is recognised with a preset confidence level. Finally, the software maintains several user-defined gesture-action profiles that the user can cycle through and select as required, without interrupting wireless interaction with the computer. (en_US)
dc.format.extent: 51 p. (en_US)
dc.language.iso: en (en_US)
dc.rights: Nanyang Technological University
dc.subject: DRNTU::Engineering::Computer science and engineering::Software::Software engineering (en_US)
dc.title: Flying mouse project (en_US)
dc.type: Final Year Project (FYP) (en_US)
dc.contributor.supervisor: Yu Yajun (en_US)
dc.contributor.school: School of Electrical and Electronic Engineering (en_US)
dc.description.degree: Bachelor of Engineering (en_US)
dc.contributor.research: Centre for Integrated Circuits and Systems (en_US)
item.grantfulltext: restricted
item.fulltext: With Fulltext
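
The abstract describes two mechanisms: inferring arm orientation from the three acceleration values, and capturing a gesture when the total acceleration exceeds a preset threshold while a buffer of past samples is maintained. A minimal sketch of both in Python, assuming a hypothetical threshold and buffer size (the report's actual parameters are not stated in this record), with the captured window standing in for the data passed to the multi-layer perceptron:

```python
import math
from collections import deque

GESTURE_THRESHOLD = 15.0  # assumed total-acceleration threshold, m/s^2
BUFFER_SIZE = 64          # assumed number of past samples kept

def orientation_from_accel(ax, ay, az):
    """Estimate pitch and roll (radians) from the gravity vector.

    Valid when the device is near-static, so the measured acceleration
    is dominated by gravity; this is the usual tilt-sensing formula.
    """
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, az)
    return pitch, roll

class GestureDetector:
    """Keep a rolling window of 3-axis samples and flag gesture onsets."""

    def __init__(self, threshold=GESTURE_THRESHOLD, size=BUFFER_SIZE):
        self.threshold = threshold
        self.buffer = deque(maxlen=size)  # oldest samples drop out automatically

    def feed(self, ax, ay, az):
        """Add one sample; return the buffered window if the total
        acceleration exceeds the threshold, else None."""
        self.buffer.append((ax, ay, az))
        total = math.sqrt(ax * ax + ay * ay + az * az)
        if total > self.threshold:
            # Candidate gesture window, to be classified downstream
            # (by the multi-layer perceptron in the described system).
            return list(self.buffer)
        return None
```

A resting device reading only gravity (about 9.81 m/s^2 on one axis) stays below the threshold, while a sharp hand movement pushes the total acceleration over it and releases the buffered window for classification.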
Appears in Collections:EEE Student Reports (FYP/IA/PA/PI)
Files in This Item:
File: fyp-final-report-Janis-Abols.pdf (Restricted Access)
Description: Main article
Size: 1.18 MB
Format: Adobe PDF (View/Open)

Page view(s): 398 (updated on Apr 19, 2025)

Download(s): 16 (updated on Apr 19, 2025)

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.