Please use this identifier to cite or link to this item:
Title: Brain computer interface via EEG signals
Authors: Koh, Ji Peng
Keywords: DRNTU::Engineering::Computer science and engineering::Computer applications::Social and behavioral sciences
Issue Date: 2018
Abstract: Emotions are states of feeling that result in physical and psychological changes and influence thought, behaviour and action. A person's true affective state can be hidden behind words and actions, but analysing their brainwaves can reveal it. This project uses two stimulus types: still images with audio (IA clips) and movie clips. Its objective is to determine the accuracy of classifying emotions into six classes, and to assess how stimulus type and the order in which stimulus types are presented affect the effectiveness of emotion elicitation. Two presentation orders were used: movie clips followed by IA clips (the MIA session) and IA clips followed by movie clips (the IAM session). Data collection took place in the Recording Room of Lee Wee Nam Library to minimise outside distraction and maximise noise isolation. The project classifies each participant's emotions from their electroencephalogram (EEG) signals using support vector machine (SVM) classification. Using an SVM with a radial basis function (RBF) kernel and feature smoothing by moving mean, the project identified the best window size and moving-mean window size for this dataset. It found that session type affects the classification results and that a single stimulus type is more effective than a combination of both; that is, using only movie clips or only IA clips as stimuli works better than mixing the two. It was also found that the first stimulus type of each session yields slightly better results than the second, possibly because participants lose concentration over the course of the data collection.
URI: http://hdl.handle.net/10356/73955
Rights: Nanyang Technological University
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
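The moving-mean feature smoothing mentioned in the abstract can be sketched as below. This is a minimal illustration, not code from the report: the function name, the trailing-window convention, and the example window size are all assumptions, and the report's actual window sizes were tuned for its dataset.

```python
def moving_mean(features, window):
    """Smooth a 1-D sequence of feature values with a trailing moving mean.

    Each output value is the mean of the current value and up to
    `window - 1` preceding values. The trailing-window convention is
    an illustrative assumption; the report does not specify alignment.
    """
    smoothed = []
    for i in range(len(features)):
        start = max(0, i - window + 1)  # clamp at the start of the sequence
        segment = features[start:i + 1]
        smoothed.append(sum(segment) / len(segment))
    return smoothed


# Hypothetical usage on a toy feature sequence with window size 2:
print(moving_mean([1, 2, 3, 4], 2))  # [1.0, 1.5, 2.5, 3.5]
```

In a pipeline like the one described, the smoothed per-window features would then be fed to an RBF-kernel SVM (e.g. scikit-learn's `sklearn.svm.SVC(kernel="rbf")`) for six-class emotion classification.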
Appears in Collections: SCSE Student Reports (FYP/IA/PA/PI)
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.