Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/151224
Title: Deep stacked stochastic configuration networks for lifelong learning of non-stationary data streams
Authors: Pratama, Mahardhika
Wang, Dianhui
Keywords: Engineering::Computer science and engineering
Issue Date: 2019
Source: Pratama, M. & Wang, D. (2019). Deep stacked stochastic configuration networks for lifelong learning of non-stationary data streams. Information Sciences, 495, 150-174. https://dx.doi.org/10.1016/j.ins.2019.04.055
Journal: Information Sciences
Abstract: The concept of the stochastic configuration network (SCN) offers a fast framework with a universal approximation guarantee for lifelong learning of non-stationary data streams. Its adaptive scope selection property enables proper random generation of hidden unit parameters, advancing conventional randomized approaches constrained to a fixed scope of random parameters. This paper proposes the deep stacked stochastic configuration network (DSSCN) for continual learning of non-stationary data streams, which makes two major contributions: 1) DSSCN features a self-constructing methodology for a deep stacked network structure, where hidden units and hidden layers are extracted automatically from continuously generated data streams; 2) the concept of SCN is developed to randomly assign the inverse covariance matrix of the multivariate Gaussian function in the hidden node addition step, bypassing its computationally prohibitive tuning phase. Numerical evaluation and comparison with prominent data stream algorithms under two procedures, the periodic hold-out and prequential test-then-train processes, demonstrate the advantage of the proposed methodology.
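The SCN hidden-node addition with adaptive scope selection described in the abstract can be illustrated with a minimal sketch. This is not the paper's DSSCN implementation (the full text is unavailable here); it is a simplified single-node version of the generic SCN construction scheme, where random weights are drawn from progressively enlarged scopes until a candidate node satisfies a supervisory inequality on the current residual error. The scope set, trial count, and tolerance `r` are illustrative assumptions.

```python
import numpy as np

def scn_add_node(X, e, scopes=(0.5, 1.0, 5.0, 10.0, 50.0), trials=20, r=0.999):
    """Try to add one sigmoid hidden node in SCN style.

    X : inputs, shape (n, d); e : current residual error, shape (n, m).
    Returns (w, b, h) for an accepted candidate node, or None if no
    admissible node is found after all scopes are exhausted.
    """
    n, d = X.shape
    for lam in scopes:                      # adaptive scope selection: enlarge on failure
        for _ in range(trials):
            # draw candidate parameters uniformly within the current scope [-lam, lam]
            w = np.random.uniform(-lam, lam, d)
            b = np.random.uniform(-lam, lam)
            h = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # candidate hidden output
            # simplified supervisory condition, summed over output dimensions:
            # accept if the node explains more than a (1 - r) fraction of the residual
            xi = sum((e[:, q] @ h) ** 2 / (h @ h) - (1 - r) * (e[:, q] @ e[:, q])
                     for q in range(e.shape[1]))
            if xi > 0:
                return w, b, h              # node accepted under scope lam
    return None
```

When a candidate fails the inequality at every trial, the scope widens, which is the adaptive mechanism the abstract contrasts with fixed-scope randomized networks; the paper extends this idea to randomly assigning the inverse covariance matrices of multivariate Gaussian nodes.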
URI: https://hdl.handle.net/10356/151224
ISSN: 0020-0255
DOI: 10.1016/j.ins.2019.04.055
Rights: © 2019 Elsevier Inc. All rights reserved.
Fulltext Permission: none
Fulltext Availability: No Fulltext
Appears in Collections:SCSE Journal Articles

