Please use this identifier to cite or link to this item:
Title: Planning of new experimental protocol for identifying human emotions & analysis of new Kinect data
Authors: Lim, Jia-Min
Keywords: DRNTU::Engineering
Issue Date: 2016
Abstract: Humans express emotions every day – whether positive or negative, they represent our feelings at that point in time. Researchers have been analysing human facial expressions for years, motivated to discover how these emotions can be used to predict people's responses for useful applications. In this project, we look at how the Kinect platform can be utilised to better understand humans by detecting their facial expressions and associating them with emotions. This report summarises the protocol for conducting an experiment that uses a motion-sensing development device, Kinect for Windows version 1, to analyse the captured images with different classification methods. The system will be trained to recognise facial movements and identify the associated emotions based on the differences in these movements. This project is a collaboration between two Final Year Project (FYP) students and two Masters students. The author primarily focuses on setting up the protocol for conducting new experiments, followed by analysing the new data captured with Kinect 1. The other students will use other devices, such as Kinect 2 and an eye tracker. After conducting new experiments with new subjects, images from the collected data are put through different classifiers to train the system to recognise facial expressions and map them to emotions. The purpose of this project is to help applications use this system to better gauge people's responses and make more informed follow-up decisions.
Rights: Nanyang Technological University
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:EEE Student Reports (FYP/IA/PA/PI)

Files in This Item:
File: Final Report (Lim Jia-Min, Project No A1118-151).pdf (Restricted Access)
Size: 3.69 MB
Format: Adobe PDF

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.