Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/102407
Full metadata record

dc.contributor.author: Roy, Subhrajit
dc.contributor.author: Banerjee, Amitava
dc.contributor.author: Basu, Arindam
dc.date.accessioned: 2015-01-12T01:36:42Z
dc.date.accessioned: 2019-12-06T20:54:26Z
dc.date.available: 2015-01-12T01:36:42Z
dc.date.available: 2019-12-06T20:54:26Z
dc.date.copyright: 2014
dc.date.issued: 2014
dc.identifier.citation: Roy, S., Banerjee, A., & Basu, A. (2014). Liquid state machine with dendritically enhanced readout for low-power, neuromorphic VLSI implementations. IEEE Transactions on Biomedical Circuits and Systems, 8(5), 681-695.
dc.identifier.issn: 1932-4545
dc.identifier.uri: https://hdl.handle.net/10356/102407
dc.description.abstract: In this paper, we describe a new neuro-inspired, hardware-friendly readout stage for the liquid state machine (LSM), a popular model for reservoir computing. Compared to the parallel perceptron architecture trained by the p-delta algorithm, which is the state of the art in terms of performance of readout stages, our readout architecture and learning algorithm can attain better performance with significantly less synaptic resources, making it attractive for VLSI implementation. Inspired by the nonlinear properties of dendrites in biological neurons, our readout stage incorporates neurons having multiple dendrites with a lumped nonlinearity (two-compartment model). The number of synaptic connections on each branch is significantly lower than the total number of connections from the liquid neurons, and the learning algorithm tries to find the best 'combination' of input connections on each branch to reduce the error. Hence, the learning involves network rewiring (NRW) of the readout network, similar to structural plasticity observed in its biological counterparts. We show that compared to a single perceptron using analog weights, this architecture for the readout can attain, even by using the same number of binary valued synapses, up to 3.3 times less error for a two-class spike train classification problem and 2.4 times less error for an input rate approximation task. Even with 60 times larger synapses, a group of 60 parallel perceptrons cannot attain the performance of the proposed dendritically enhanced readout. An additional advantage of this method for hardware implementations is that the 'choice' of connectivity can be easily implemented exploiting address event representation (AER) protocols commonly used in current neuromorphic systems where the connection matrix is stored in memory. Also, due to the use of binary synapses, our proposed method is more robust against statistical variations.
dc.format.extent: 14 p.
dc.language.iso: en
dc.relation.ispartofseries: IEEE Transactions on Biomedical Circuits and Systems
dc.rights: © 2014 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. The published version is available at DOI: http://dx.doi.org/10.1109/TBCAS.2014.2362969.
dc.subject: DRNTU::Engineering::Electrical and electronic engineering::Electronic circuits
dc.title: Liquid state machine with dendritically enhanced readout for low-power, neuromorphic VLSI implementations
dc.type: Journal Article
dc.contributor.school: School of Electrical and Electronic Engineering
dc.identifier.doi: 10.1109/TBCAS.2014.2362969
dc.description.version: Accepted version
dc.identifier.rims: 180736
item.fulltext: With Fulltext
item.grantfulltext: open
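The readout stage summarized in the abstract — a neuron whose dendritic branches each carry a small number of binary synapses and a lumped branch nonlinearity, trained by rewiring connections rather than adjusting analog weights — can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: the quadratic branch nonlinearity, the dimensions, and the greedy accept/revert rewiring rule are all placeholders chosen for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes (not from the record): 100 liquid neurons feeding a
# readout neuron with 10 dendritic branches of 5 binary synapses each.
N_LIQUID, N_BRANCHES, K_SYN = 100, 10, 5

def readout(conn, x):
    """Two-compartment readout: each branch linearly sums its binary-synapse
    inputs, applies a lumped nonlinearity (quadratic here, an assumption),
    and the soma sums the branch outputs."""
    branch_sums = x[conn].sum(axis=1)   # shape (N_BRANCHES,)
    return float((branch_sums ** 2).sum())

def sq_error(conn, samples, targets):
    return sum((readout(conn, x) - t) ** 2 for x, t in zip(samples, targets))

def rewire_step(conn, samples, targets):
    """One greedy network-rewiring (NRW) step: re-route one randomly chosen
    synapse to a random liquid neuron and keep the change only if the
    squared error over the sample set does not increase."""
    err = sq_error(conn, samples, targets)
    b, s = rng.integers(N_BRANCHES), rng.integers(K_SYN)
    old = conn[b, s]
    conn[b, s] = rng.integers(N_LIQUID)      # candidate rewiring
    new_err = sq_error(conn, samples, targets)
    if new_err >= err:
        conn[b, s] = old                     # revert: no improvement
    return conn, min(err, new_err)

# Toy usage on random liquid-state activity vectors
conn = rng.integers(0, N_LIQUID, size=(N_BRANCHES, K_SYN))
samples = [rng.random(N_LIQUID) for _ in range(20)]
targets = [0.5] * 20
err0 = sq_error(conn, samples, targets)
for _ in range(200):
    conn, err = rewire_step(conn, samples, targets)
```

Because a candidate rewiring is reverted whenever it does not reduce the error, the training error is non-increasing by construction; the connectivity matrix `conn` is exactly the kind of address table that the abstract notes maps naturally onto AER-based neuromorphic hardware.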
Appears in Collections:EEE Journal Articles
Files in This Item:
File: J1_TBCAS_2014.pdf (728.56 kB, Adobe PDF)

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.