Please use this identifier to cite or link to this item:
Title: Instance-based genre-specific music emotion prediction with an EEG setup
Authors: Liu, Xiaoyu
Keywords: DRNTU::Science::Biological sciences::Human anatomy and physiology::Neurobiology
Issue Date: 2018
Abstract: This paper explores a novel direction in music-induced emotion (music emotion) analysis: the effect of different genres on the prediction of music emotion. We compare the performance of various classifiers in predicting the emotion induced by music, and investigate whether advanced features (such as asymmetries) improve classification accuracy. The study is supported by real-world experiments in which 10 subjects listened to 20 musical pieces from 5 genres (classical, heavy metal, electronic dance music, pop, and rap) while electroencephalogram (EEG) data were collected. Maximum 10-fold cross-validation accuracies of 98.4% (subject-independent) and 99.0% (subject-dependent) were obtained for the classification of short instances of each song. The emotion of popular music was predicted most accurately, with a classification accuracy of 99.6%. The effect of music emotion on subjects' relaxation while listening was also examined. Part of this work has been accepted for publication at the 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC) 2018.
Schools: School of Electrical and Electronic Engineering 
Rights: Nanyang Technological University
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:EEE Student Reports (FYP/IA/PA/PI)

Files in This Item:
File: FYP Report.pdf (Restricted Access) — 1.88 MB, Adobe PDF

Page view(s)

Updated on Sep 30, 2023

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.