Full metadata record
DC Field | Value | Language
dc.contributor.author | Liu, Yisi | en
dc.contributor.author | Sourina, Olga | en
dc.description.abstract | Emotions recognized from the electroencephalogram (EEG) can reflect the true "inner" feelings of a person. Recently, research on real-time emotion recognition has received growing attention, since it can be applied in games, e-learning systems, and even marketing. The EEG signal can be divided into delta, theta, alpha, beta, and gamma waves based on their frequency bands. Based on the Valence-Arousal-Dominance emotion model, we propose a subject-dependent algorithm that uses the beta/alpha ratio to recognize high and low dominance levels of emotions from EEG. Three experiments were designed and carried out to collect EEG data labeled with emotions; sound clips from the International Affective Digitized Sounds (IADS) database and music pieces were used to evoke emotions. Our approach allows real-time recognition of emotions defined at different dominance levels in the Valence-Arousal-Dominance model. | en
dc.title | EEG-based dominance level recognition for emotion-enabled interaction | en
dc.type | Conference Paper | en
dc.contributor.school | School of Electrical and Electronic Engineering | en
dc.contributor.conference | IEEE International Conference on Multimedia and Expo (2012 : Melbourne, Australia) | en
item.fulltext | No Fulltext | -
Appears in Collections: EEE Conference Papers
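The feature named in the abstract, the beta/alpha ratio of EEG band powers, can be sketched as below. This is an illustrative reconstruction, not the authors' implementation: the function names, sampling rate, spectral estimator (Welch's method), and the exact band edges (alpha 8–13 Hz, beta 13–30 Hz) are assumptions for the sketch.

```python
# Illustrative sketch of a beta/alpha band-power ratio for one EEG channel.
# Not the paper's code: band edges, fs, and the Welch estimator are assumed.
import numpy as np
from scipy.signal import welch


def band_power(freqs, psd, lo, hi):
    """Sum the PSD bins that fall inside [lo, hi] Hz."""
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].sum()


def beta_alpha_ratio(eeg, fs=256.0):
    """Ratio of beta (13-30 Hz) to alpha (8-13 Hz) power via Welch's PSD."""
    freqs, psd = welch(eeg, fs=fs, nperseg=int(fs * 2))
    return band_power(freqs, psd, 13.0, 30.0) / band_power(freqs, psd, 8.0, 13.0)


# Synthetic check: a signal dominated by a 10 Hz (alpha-band) component
# with a weaker 20 Hz (beta-band) component should give a ratio below 1.
fs = 256.0
t = np.arange(0, 10, 1 / fs)
signal = 2.0 * np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 20 * t)
ratio = beta_alpha_ratio(signal, fs)
```

A subject-dependent classifier, as described in the abstract, would then threshold or model this ratio per subject to separate high- from low-dominance states.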

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.