Please use this identifier to cite or link to this item:
https://hdl.handle.net/10356/160972
Title: DEVDAN: Deep evolving denoising autoencoder
Authors: Ashfahani, Andri; Pratama, Mahardhika; Lughofer, Edwin; Ong, Yew Soon
Keywords: Engineering::Computer science and engineering
Issue Date: 2020
Source: Ashfahani, A., Pratama, M., Lughofer, E. & Ong, Y. S. (2020). DEVDAN: Deep evolving denoising autoencoder. Neurocomputing, 390, 297-314. https://dx.doi.org/10.1016/j.neucom.2019.07.106
Journal: Neurocomputing
Abstract: The denoising autoencoder (DAE) enhances the flexibility of data stream methods in exploiting unlabeled samples. Nonetheless, the feasibility of the DAE for data stream analytics deserves in-depth study because it has a fixed network capacity that cannot adapt to rapidly changing environments. This paper proposes the deep evolving denoising autoencoder (DEVDAN), which features an open structure in both the generative and discriminative phases, where hidden units can be automatically added and discarded on the fly. The generative phase refines the predictive performance of the discriminative model by exploiting unlabeled data. Furthermore, DEVDAN is free of problem-specific thresholds and works fully in a single-pass learning fashion. We show that DEVDAN finds competitive network architectures compared with state-of-the-art methods on classification tasks across ten prominent datasets simulated under the prequential test-then-train protocol.
URI: https://hdl.handle.net/10356/160972
ISSN: 0925-2312
DOI: 10.1016/j.neucom.2019.07.106
Schools: School of Computer Science and Engineering
Rights: © 2019 Elsevier B.V. All rights reserved.
Fulltext Permission: none
Fulltext Availability: No Fulltext
Appears in Collections: SCSE Journal Articles
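The abstract describes two mechanisms that a short sketch can make concrete: a denoising autoencoder whose hidden layer grows and shrinks on the fly during a single pass over the stream, and evaluation under the prequential test-then-train protocol. The NumPy sketch below is illustrative only; the class name EvolvingDAE, the 2-sigma loss-spike growth rule, and the activation-floor pruning rule are assumptions introduced here, not DEVDAN's actual criteria (the paper's method is notably free of problem-specific thresholds).

import numpy as np

rng = np.random.default_rng(0)

class EvolvingDAE:
    """Denoising autoencoder with tied weights and a hidden layer that can
    grow and shrink while the stream is processed (single-pass learning)."""

    def __init__(self, n_in, lr=0.01, noise=0.1):
        self.lr, self.noise = lr, noise
        self.W = rng.normal(0.0, 0.1, (n_in, 1))  # encoder weights; decoder is W.T
        self.b = np.zeros(1)                      # hidden bias
        self.c = np.zeros(n_in)                   # reconstruction bias
        self.h_avg = np.zeros(1)                  # running per-unit activation
        self.n, self.mu, self.var = 0, 0.0, 0.0   # running loss statistics

    def _encode(self, x):
        return 1.0 / (1.0 + np.exp(-(x @ self.W + self.b)))  # sigmoid units

    def reconstruct(self, x):
        return self._encode(x) @ self.W.T + self.c

    def train_step(self, x):
        # One SGD step on a corrupted copy of x; each sample is seen once.
        x_noisy = x + rng.normal(0.0, self.noise, x.shape)
        h = self._encode(x_noisy)
        err = h @ self.W.T + self.c - x            # reconstruction error
        dh = (err @ self.W) * h * (1.0 - h)        # gradient through the encoder
        self.W -= self.lr * (np.outer(err, h) + np.outer(x_noisy, dh))
        self.b -= self.lr * dh
        self.c -= self.lr * err
        self.h_avg = 0.9 * self.h_avg + 0.1 * h    # track how much each unit fires

    def evolve(self, loss):
        # Illustrative grow/prune rules, NOT the paper's criteria: grow when the
        # loss spikes above its running mean, prune units that stay near zero.
        self.n += 1
        d = loss - self.mu
        self.mu += d / self.n
        self.var += d * (loss - self.mu)
        std = (self.var / max(self.n - 1, 1)) ** 0.5
        if self.n > 10 and loss > self.mu + 2.0 * std:
            self.W = np.hstack([self.W, rng.normal(0.0, 0.1, (self.W.shape[0], 1))])
            self.b = np.append(self.b, 0.0)
            self.h_avg = np.append(self.h_avg, 0.5)  # grace period for new units
        keep = (self.h_avg > 0.01) | (np.arange(self.b.size) == 0)  # keep >= 1 unit
        self.W, self.b, self.h_avg = self.W[:, keep], self.b[keep], self.h_avg[keep]

# Prequential test-then-train: every arriving sample is evaluated first and
# only then used, exactly once, for training.
stream = rng.normal(0.0, 1.0, (500, 8))  # synthetic stand-in for a data stream
model = EvolvingDAE(n_in=8)
losses = []
for x in stream:
    losses.append(0.5 * np.sum((model.reconstruct(x) - x) ** 2))  # test first
    model.train_step(x)                                           # then train
    model.evolve(losses[-1])
print(f"hidden units: {model.W.shape[1]}, mean prequential loss: {np.mean(losses):.4f}")

DEVDAN's generative phase would additionally feed the learned features into a discriminative classifier trained on the labeled portion of the stream; that part is omitted here to keep the sketch self-contained.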
SCOPUS™ Citations: 64 (updated on Sep 22, 2023)
Web of Science™ Citations: 56 (updated on Sep 27, 2023)
Page view(s): 41 (updated on Sep 29, 2023)
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.