Title: Real-time emotion detection
Authors: Koh, Melvyn Nguan Theng
Keywords: DRNTU::Engineering::Computer science and engineering
Issue Date: 2018
Abstract: Deep learning has dominated the field of computer vision in recent years, with new techniques superseding one another every few weeks. In this project, a convolutional neural network (CNN) is applied. Facial expression detection has been a fast-growing topic in computer vision, as facial expressions play a significant role in human communication and behavioural analysis. Ever since Paul Ekman devised the Facial Action Coding System (FACS) to detect human facial features and model facial behaviours, many scientists have been inspired to conduct psychological research on detecting a person's real emotions. This, in turn, has inspired computer scientists to pursue active research in the field, seeking fast and accurate models for detecting a person's true emotion from a camera. The project uses the Extended Cohn-Kanade (CK+) and FER2013 datasets. It aims to build a real-time emotion detection application that detects seven emotions, namely Anger, Disgust, Fear, Happy, Sad, Surprise and Neutral. The software application is written in Python, with OpenCV used for processing images and videos. The CNN-based approach uses Google's TensorFlow machine-learning library to construct the trained model, and Keras serves as the high-level neural networks API (application programming interface) running on top of TensorFlow. The model is trained and evaluated on the FER2013 and CK+ datasets.
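The abstract describes a CNN classifier over the seven emotion classes of FER2013. As a minimal sketch of the final prediction step, assuming the trained model emits a 7-way softmax probability vector in the standard FER2013 label order (the report itself may use a different index mapping, and `predict_emotion` is a hypothetical helper name):

```python
import numpy as np

# The seven emotion classes, in the conventional FER2013 label order (0-6).
EMOTIONS = ["Anger", "Disgust", "Fear", "Happy", "Sad", "Surprise", "Neutral"]

def predict_emotion(probabilities):
    """Map a 7-way softmax output from the CNN to an emotion label."""
    probs = np.asarray(probabilities, dtype=float)
    if probs.shape != (7,):
        raise ValueError("expected a probability vector of length 7")
    # The predicted class is the index with the highest probability.
    return EMOTIONS[int(np.argmax(probs))]

# Example: a hypothetical model output strongly favouring class 3 ("Happy").
print(predict_emotion([0.01, 0.01, 0.02, 0.90, 0.02, 0.02, 0.02]))  # Happy
```

In a real-time loop, OpenCV would supply each camera frame, a face detector would crop the face region, and the crop would be resized to the model's input size before this step.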
Rights: Nanyang Technological University
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:SCSE Student Reports (FYP/IA/PA/PI)

Files in This Item:
Final Year Project Report (3.34 MB, Adobe PDF), Restricted Access

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.