Please use this identifier to cite or link to this item:
https://hdl.handle.net/10356/63479
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Gunawan, Christhio | |
dc.date.accessioned | 2015-05-14T02:35:33Z | |
dc.date.available | 2015-05-14T02:35:33Z | |
dc.date.copyright | 2015 | en_US |
dc.date.issued | 2015 | |
dc.identifier.uri | http://hdl.handle.net/10356/63479 | |
dc.description.abstract | Recognizing facial emotions is a fundamental aspect of interpersonal communication. People with conditions such as autism, Alzheimer's disease, or Parkinson's disease have an impaired ability to read other people's emotions. A real-time emotion recognizer is therefore needed to help people who cannot interpret facial expressions during face-to-face communication. The objective of the project is to study selected existing facial emotion recognition algorithms and implement the most suitable one. The emotion recognition software is written in Python, with OpenCV as the main image-processing library. A Haar-like cascade classifier is used to detect the face, mouth, and eye regions. After the regions of interest (ROIs) have been identified, feature extraction is performed so that the software can identify emotions. The Shi-Tomasi corner detector is used to measure the distances between the corners of the lips, and the visible teeth area is also computed to help distinguish emotions. For the eye region, the Hough circle transform is used to detect large eye openings. From the extracted features, the software can identify four basic emotions: neutral, happy, fear, and surprise. In an experiment in which each participant maintained the same expression for under three minutes in each of three runs, an accuracy table was compiled. Neutral has the highest accuracy at 100%, surprise the second highest at 93.6%, while happy and fear reach 75.33% and 74.84% respectively. (An illustrative OpenCV sketch of this pipeline follows the metadata record below.) | en_US |
dc.format.extent | 22 p. | en_US |
dc.language.iso | en | en_US |
dc.rights | Nanyang Technological University | |
dc.subject | DRNTU::Engineering::Computer science and engineering::Computer applications::Social and behavioral sciences | en_US |
dc.title | Emotion recognition from facial expressions | en_US |
dc.type | Final Year Project (FYP) | en_US |
dc.contributor.supervisor | Vinod Achutavarrier Prasad | en_US |
dc.contributor.school | School of Computer Engineering | en_US |
dc.description.degree | Bachelor of Engineering (Computer Engineering) | en_US |
item.grantfulltext | restricted | - |
item.fulltext | With Fulltext | - |
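The full report is under restricted access, so no code from it is reproduced here. The following is only a minimal sketch, assuming a standard OpenCV/Python setup, of how the pipeline described in the abstract (Haar cascades for the face and eye ROIs, the Shi-Tomasi corner detector around the mouth, and the Hough circle transform for eye opening) might be put together. The cascade file names, the lower-half-of-face mouth heuristic, the parameter values, and the `extract_features` helper are assumptions for illustration, not the author's implementation.

```python
import cv2

# Assumed cascade files shipped with OpenCV; the report does not name the exact XMLs it uses.
face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

def extract_features(frame):
    """Detect the face and eye ROIs with Haar cascades, then take Shi-Tomasi
    corners in the lower (mouth) half of the face and Hough circles in each
    eye ROI as rough stand-ins for the lip-corner and eye-opening features."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    features = []
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5):
        face = gray[y:y + h, x:x + w]

        # Shi-Tomasi corner detector on the lower half of the face (mouth region).
        mouth_roi = face[h // 2:, :]
        lip_corners = cv2.goodFeaturesToTrack(mouth_roi, maxCorners=10,
                                              qualityLevel=0.01, minDistance=10)

        # Hough circle transform on each detected eye to estimate eye opening.
        eye_circles = []
        for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(face[:h // 2, :]):
            eye_roi = cv2.medianBlur(face[ey:ey + eh, ex:ex + ew], 5)
            circles = cv2.HoughCircles(eye_roi, cv2.HOUGH_GRADIENT, dp=1, minDist=20,
                                       param1=50, param2=30, minRadius=5, maxRadius=30)
            eye_circles.append(circles)

        features.append({"lip_corners": lip_corners, "eye_circles": eye_circles})
    return features

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)   # webcam, for the real-time use case described in the abstract
    ok, frame = cap.read()
    if ok:
        print(extract_features(frame))
    cap.release()
```

The geometric quantities derived from these outputs (inter-corner lip distances, teeth area, and eye-circle radii) would then feed whatever rule-based or learned classifier maps them to the four emotions; the abstract does not specify that stage, so it is omitted here.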
Appears in Collections: | SCSE Student Reports (FYP/IA/PA/PI) |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
FINAL YEAR PROJECT REPORT.pdf (Restricted Access) | | 983.49 kB | Adobe PDF | View/Open |
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.