Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/64184
Title: Automated real-time analytics for multi-party dialogues
Authors: See, Yihui
Keywords: DRNTU::Engineering::Electrical and electronic engineering
Issue Date: 2015
Abstract: This project explores the idea of detecting high-level features, such as human personality and emotion, from low-level prosodic cues displayed in a conversation. After listening to audio recordings lasting two minutes each, participants rated the high-level features displayed by each speaker during the conversation. The high-level features were selected on the basis that they are easier for humans to identify in a conversation. Analysis of the annotations received from the participants showed that not all high-level features can be well identified. In the two-party dialogues, politeness, confusion and hostility were the most easily identifiable features according to both the annotations and the classification results. The same analysis was also extended to multiparty dialogues, represented in this project by groups of four speakers. The analysis of the annotations received gave a slightly different outcome compared to the two-party dialogues: judging from the annotations, interest, disagreement, likeability, politeness and respect were the more easily identifiable features. However, the classifiers built from the annotations gave more accurate detection for likeability, friendliness, respect, confusion and hostility.
URI: http://hdl.handle.net/10356/64184
Rights: Nanyang Technological University
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:EEE Student Reports (FYP/IA/PA/PI)

Files in This Item:
File: FYPReport_final_submission.pdf (Restricted Access), 1.78 MB, Adobe PDF


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.