Title: Evolutionary multi-task learning for modular knowledge representation in neural networks
Authors: Chandra, Rohitash
Keywords: Engineering::Computer science and engineering
Issue Date: 2017
Source: Chandra, R., Gupta, A., Ong, Y.-S., & Goh, C.-K. (2018). Evolutionary multi-task learning for modular knowledge representation in neural networks. Neural Processing Letters, 47(3), 993-1009. doi:10.1007/s11063-017-9718-z
Journal: Neural Processing Letters
Abstract: The brain can be viewed as a complex modular structure that processes information through knowledge storage and retrieval. Modularity ensures that knowledge is stored in such a way that complications in certain modules do not affect the overall functionality of the brain. Although artificial neural networks have been very promising in prediction and recognition tasks, they lack learning algorithms that provide modularity in knowledge representation, which would allow knowledge modules to be used when needed. Multi-task learning enables learning algorithms to capture knowledge in a general representation drawn from several related tasks. Little work has been done on incorporating multi-task learning for modular knowledge representation in neural networks. In this paper, we present multi-task learning for modular knowledge representation in neural networks via modular network topologies. In the proposed method, each task is defined by selected regions in a network topology (modules). Modular knowledge representation remains effective even if some of the neurons and connections in the selected modules are disrupted or removed. We demonstrate the effectiveness of the method using single hidden layer feedforward networks that learn selected n-bit parity problems of varying difficulty, and we further apply the method to benchmark pattern classification problems. The simulation and experimental results show that, in general, the proposed method retains performance quality even though the knowledge is represented as modules.
URI: https://hdl.handle.net/10356/138814
ISSN: 1370-4621
DOI: 10.1007/s11063-017-9718-z
Rights: © 2017 Springer Science+Business Media, LLC. All rights reserved.
Fulltext Permission: none
Fulltext Availability: No Fulltext
Appears in Collections: SCSE Journal Articles
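The modular multi-task scheme outlined in the abstract can be sketched in code. This is not the authors' implementation: the network sizes, the module assignment (task m uses the first few hidden neurons of a shared single hidden layer network), the elitist (1+1) evolution strategy, and the sigmoid output are all illustrative assumptions; only the idea of overlapping modules jointly evolved on n-bit parity tasks comes from the paper.

```python
import math
import random

# Assumed sizes: one shared single-hidden-layer network for two tasks.
N_IN, N_HIDDEN = 3, 4          # inputs padded to 3 bits; 4 hidden neurons
MODULES = [2, 4]               # task m's module = first MODULES[m] hidden units
TASK_BITS = [2, 3]             # task m is the (TASK_BITS[m])-bit parity problem

def parity_data(n_bits):
    """All n-bit inputs (zero-padded to N_IN) paired with parity targets."""
    data = []
    for v in range(2 ** n_bits):
        x = [(v >> i) & 1 for i in range(n_bits)] + [0] * (N_IN - n_bits)
        data.append((x, sum(x) % 2))
    return data

def forward(w, x, n_hid):
    """Forward pass using only the module's first n_hid hidden neurons."""
    out = w[-1]                               # output bias
    for h in range(n_hid):                    # neurons outside the module are masked
        base = h * (N_IN + 1)
        s = w[base + N_IN]                    # hidden bias
        for i in range(N_IN):
            s += w[base + i] * x[i]
        out += w[N_HIDDEN * (N_IN + 1) + h] * math.tanh(s)
    return 1.0 / (1.0 + math.exp(-out))      # sigmoid output

def fitness(w, datasets):
    """Multi-task fitness: mean classification accuracy across task modules."""
    accs = []
    for n_hid, data in zip(MODULES, datasets):
        correct = sum((forward(w, x, n_hid) > 0.5) == bool(y) for x, y in data)
        accs.append(correct / len(data))
    return sum(accs) / len(accs)

random.seed(0)
datasets = [parity_data(b) for b in TASK_BITS]
n_w = N_HIDDEN * (N_IN + 1) + N_HIDDEN + 1    # input->hidden, biases, hidden->out
best = [random.gauss(0, 1) for _ in range(n_w)]
best_fit = fitness(best, datasets)
init_fit = best_fit

for _ in range(2000):                         # elitist (1+1)-ES over shared weights
    child = [wi + random.gauss(0, 0.2) for wi in best]
    f = fitness(child, datasets)
    if f >= best_fit:                         # keep the child only if not worse
        best, best_fit = child, f
```

Because the modules overlap (the smaller task's neurons are a subset of the larger task's), the shared weights are shaped by both parity problems at once, and removing the hidden neurons outside a module leaves that module's task performance intact, which is the disruption-tolerance property the abstract describes.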
Updated on Mar 2, 2021
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.