Title: Dynamic facial expression for emotion recognition
Authors: Peh, Raymond Jin Rui
Keywords: DRNTU::Engineering::Electrical and electronic engineering::Control and instrumentation
Issue Date: 2014
Abstract: This report presents the final year project in detail, covering work done on facial visuals to determine the combination of feature extraction method and classifier that best describes six basic emotions. The need for an improved system arises from the increasing shift towards human-machine interaction in recent technologies. After surveying several implementations, the approach taken consists of feature representation using Haar features and feature selection using a Genetic Algorithm, built on a Sparse Representation Classifier. Raw images are first preprocessed before being described by the extracted features, which serve as inputs to the classifier for accurate recognition of their respective emotions. From the numerous experiments carried out and the results achieved, an evaluation weighs the significance of features against each validation set and highlights strong sparsity levels, so that further adjustments can be made to the relevant parameters. Extreme Learning Machines were also investigated, to form a hybrid of classifiers for a more robust system.
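The classification stage named in the abstract is a Sparse Representation Classifier (SRC): a test sample is coded as a sparse combination of training samples, then assigned to the class whose samples best reconstruct it. The report's actual implementation is not available here, so the sketch below is a minimal illustrative version using a simple orthogonal matching pursuit solver; all function names, the solver choice, and the toy data are assumptions, not the author's code.

```python
import numpy as np

def omp(D, y, k):
    """Orthogonal Matching Pursuit: greedily select up to k columns
    (atoms) of dictionary D to sparsely approximate signal y."""
    residual = y.copy()
    idx = []
    x = np.zeros(D.shape[1])
    coef = np.zeros(0)
    for _ in range(k):
        # pick the atom most correlated with the current residual
        j = int(np.argmax(np.abs(D.T @ residual)))
        if j not in idx:
            idx.append(j)
        # re-fit coefficients on all selected atoms
        coef, *_ = np.linalg.lstsq(D[:, idx], y, rcond=None)
        residual = y - D[:, idx] @ coef
    x[idx] = coef
    return x

def src_classify(D, labels, y, k=5):
    """SRC decision rule: sparse-code y over D, then choose the class
    whose atoms alone give the smallest reconstruction residual."""
    x = omp(D, y, k)
    best, best_r = None, np.inf
    for c in set(labels):
        mask = np.array([lab == c for lab in labels])
        xc = np.where(mask, x, 0.0)          # keep only class-c coefficients
        r = np.linalg.norm(y - D @ xc)       # class-wise residual
        if r < best_r:
            best, best_r = c, r
    return best
```

In the report's pipeline, the dictionary columns would be Haar-feature vectors of training images (after GA selection) and the labels the six basic emotions; here a synthetic dictionary stands in for them.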
Rights: Nanyang Technological University
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:EEE Student Reports (FYP/IA/PA/PI)

Files in This Item:
File: Restricted Access
Size: 2 MB
Format: Adobe PDF

Page view(s): 50 (checked on Oct 20, 2020)
Download(s): 50 (checked on Oct 20, 2020)

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.