Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/87699
Title: Learning temporal information for brain-computer interface using convolutional neural networks
Authors: Sakhavi, Siavash
Guan, Cuntai
Yan, Shuicheng
Keywords: Brain-computer Interface (BCI)
Convolutional Neural Network (CNN)
Issue Date: 2018
Source: Sakhavi, S., Guan, C., & Yan, S. Learning Temporal Information for Brain-Computer Interface Using Convolutional Neural Networks. IEEE Transactions on Neural Networks and Learning Systems, in press.
Series/Report no.: IEEE Transactions on Neural Networks and Learning Systems
Abstract: Deep learning (DL) methods and architectures have been the state-of-the-art classification algorithms for computer vision and natural language processing problems. However, the successful application of these methods to boost classification performance in motor imagery (MI) brain-computer interfaces (BCIs) is still limited. In this paper, we propose a classification framework for MI data by introducing a new temporal representation of the data and utilizing a convolutional neural network (CNN) architecture for classification. The new representation is generated by modifying the filter-bank common spatial patterns (FBCSP) method, and the CNN is designed and optimized accordingly for this representation. Our framework outperforms the best classification method in the literature on the BCI competition IV-2a 4-class MI data set by a 7% increase in average subject accuracy. Furthermore, by studying the convolutional weights of the trained networks, we gain insight into the temporal characteristics of EEG.
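The representation described above builds on common spatial patterns (CSP), applied per frequency band in FBCSP. As a rough illustration only (not the authors' exact pipeline; band-pass filtering, band/filter selection, and the CNN stage are omitted), a minimal CSP feature extractor for a single band might look like:

```python
import numpy as np

def csp_filters(trials_a, trials_b, n_pairs=2):
    """Compute CSP spatial filters for two classes.

    trials_a, trials_b: arrays of shape (n_trials, n_channels, n_samples),
    assumed already band-pass filtered to one frequency band.
    Returns 2 * n_pairs spatial filters (rows), taken from both ends of
    the eigenvalue spectrum (the most discriminative directions).
    """
    def mean_cov(trials):
        covs = []
        for x in trials:
            c = x @ x.T
            covs.append(c / np.trace(c))  # normalized spatial covariance
        return np.mean(covs, axis=0)

    C1, C2 = mean_cov(trials_a), mean_cov(trials_b)
    # Whiten the composite covariance, then diagonalize class 1 in that space.
    d, U = np.linalg.eigh(C1 + C2)
    P = (U / np.sqrt(d)).T                # whitening matrix: P (C1+C2) P^T = I
    lam, V = np.linalg.eigh(P @ C1 @ P.T) # ascending eigenvalues in [0, 1]
    W = V.T @ P                           # rows = spatial filters
    idx = np.concatenate([np.arange(n_pairs), np.arange(-n_pairs, 0)])
    return W[idx]

def log_var_features(W, trial):
    """Classic CSP log-variance features for one trial (channels x samples)."""
    z = W @ trial
    v = np.var(z, axis=1)
    return np.log(v / v.sum())
```

In FBCSP this extraction is repeated for each band of a filter bank and the features are pooled; the paper's contribution is to replace the final variance/log step with a temporal representation fed to a CNN.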
URI: https://hdl.handle.net/10356/87699
http://hdl.handle.net/10220/45497
ISSN: 2162-237X
DOI: http://dx.doi.org/10.1109/TNNLS.2018.2789927
Rights: © 2018 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. The published version is available at: [http://dx.doi.org/10.1109/TNNLS.2018.2789927].
Fulltext Permission: open
Fulltext Availability: With Fulltext
Appears in Collections:SCSE Journal Articles

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.