Please use this identifier to cite or link to this item:
https://hdl.handle.net/10356/140539
Title: EEG-based emotion recognition using deep learning techniques
Authors: Lang, Zihui
Keywords: Engineering::Electrical and electronic engineering
Issue Date: 2020
Publisher: Nanyang Technological University
Project: A3263-191
Abstract: Emotion recognition plays a vital role in human-machine interfaces as well as brain-computer interfaces. Emotion is one of the key factors in understanding human behavior and cognition. By precisely analyzing human emotion from electroencephalogram (EEG) signals via computational methods such as deep learning or other traditional statistical methods, further research related to cognitive science, neural technology and psychology can be carried out. In this paper, common state-of-the-art EEG emotion recognition techniques are reviewed, and a deep convolutional neural network is constructed to better classify subject-independent emotion on the 62-channel SEED dataset. By using segmented signals as input, the model improves accuracy by 15% over the baseline EEGNet. To help the model better characterize EEG features, channel selection is applied and five channel profiles of 4, 6, 9, 12 and 15 channels are trained separately, with the 9-channel profile achieving the highest accuracy. Differential entropy extracted from the original signal is used as an alternative input to compare the performance and robustness of the model when dealing with different input formats.
URI: https://hdl.handle.net/10356/140539
Schools: School of Electrical and Electronic Engineering
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
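The abstract mentions using differential entropy (DE) features extracted from the EEG as an alternative model input. The report's exact preprocessing pipeline is not given here, so the following is only a minimal sketch of how per-channel, per-band DE is commonly computed over short windows, assuming band-limited EEG segments are approximately Gaussian (DE = 0.5 ln(2πeσ²)); the sampling rate, band edges and window length below are illustrative assumptions, not values taken from the report.

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Assumed parameters (not stated in the abstract): sampling rate and
# the standard five EEG frequency bands often used with SEED.
FS = 200  # Hz
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 14),
         "beta": (14, 31), "gamma": (31, 50)}

def bandpass(signal, low, high, fs=FS, order=4):
    """Zero-phase Butterworth band-pass filter for one EEG channel."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, signal)

def differential_entropy(segment):
    """DE of a band-limited segment under a Gaussian assumption:
    0.5 * ln(2 * pi * e * variance)."""
    return 0.5 * np.log(2 * np.pi * np.e * np.var(segment))

def de_features(eeg, fs=FS, win_sec=1.0):
    """Per-channel, per-band DE over non-overlapping windows.
    eeg: array of shape (n_channels, n_samples), e.g. 62 SEED channels."""
    win = int(win_sec * fs)
    n_ch, n_samp = eeg.shape
    n_win = n_samp // win
    feats = np.zeros((n_win, n_ch, len(BANDS)))
    for c in range(n_ch):
        for b_idx, (lo, hi) in enumerate(BANDS.values()):
            filtered = bandpass(eeg[c], lo, hi, fs)
            for w in range(n_win):
                feats[w, c, b_idx] = differential_entropy(
                    filtered[w * win:(w + 1) * win])
    return feats  # shape: (n_windows, n_channels, n_bands)
```

The resulting (windows × channels × bands) feature tensor is one plausible input format for a CNN of the kind described in the abstract; channel selection would correspond to slicing the channel axis down to the chosen 4/6/9/12/15-channel profile.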
Appears in Collections: | EEE Student Reports (FYP/IA/PA/PI) |
Files in This Item:
File | Description | Size | Format
---|---|---|---
FYP_Final_Report.pdf (Restricted Access) | | 8.53 MB | Adobe PDF
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.