Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/89670
Full metadata record
DC Field | Value | Language
dc.contributor.author | Wang, Di | en
dc.contributor.author | Tan, Ah-Hwee | en
dc.contributor.author | Miao, Chunyan | en
dc.date.accessioned | 2018-12-19T07:29:55Z | en
dc.date.accessioned | 2019-12-06T17:30:49Z | -
dc.date.available | 2018-12-19T07:29:55Z | en
dc.date.available | 2019-12-06T17:30:49Z | -
dc.date.copyright | 2016-05-01 | en
dc.date.issued | 2016 | en
dc.identifier.citation | Wang, D., Tan, A.-H., & Miao, C. (2016). Modeling autobiographical memory in human-like autonomous agents. Proceedings of the 2016 International Conference on Autonomous Agents & Multiagent Systems (AAMAS 2016), 845-853. | en
dc.identifier.uri | https://hdl.handle.net/10356/89670 | -
dc.description.abstract | Although autobiographical memory is an important part of the human mind, little effort has been devoted to modeling autobiographical memory in autonomous agents. Motivated by the goal of developing human-like intelligence, in this paper we delineate our approach to enabling an agent to maintain memories of its own experiences and to wander in mind. Our model, named the Autobiographical Memory-Adaptive Resonance Theory network (AM-ART), is designed to capture autobiographical memories, comprising pictorial snapshots of one's life experiences together with their associated context, namely time, location, people, activity, and emotion. In terms of both network structure and dynamics, AM-ART coincides with the autobiographical memory model established by psychologists, which is supported by neuroimaging evidence. Specifically, the bottom-up memory search and top-down memory readout operations of AM-ART replicate how the brain encodes and retrieves autobiographical memories. Furthermore, the wandering-in-reminiscence function of AM-ART mimics how humans wander in mind. For evaluation, we conducted experiments on a data set collected from the public domain to test the performance of AM-ART in response to exact, partial, and noisy memory retrieval cues. Moreover, our statistical analysis shows that AM-ART can simulate the phenomenon of wandering in reminiscence. | en
dc.description.sponsorship | NRF (National Research Foundation, Singapore) | en
dc.format.extent | 9 p. | en
dc.language.iso | en | en
dc.rights | © 2016 International Foundation for Autonomous Agents and Multiagent Systems (IFAAMAS). This paper was published in Proceedings of the 2016 International Conference on Autonomous Agents & Multiagent Systems (AAMAS 2016) and is made available as an electronic reprint (preprint) with permission of International Foundation for Autonomous Agents and Multiagent Systems (IFAAMAS). The published version is available at: [https://dl.acm.org/citation.cfm?id=2937048]. One print or electronic copy may be made for personal use only. Systematic or multiple reproduction, distribution to multiple locations via electronic or other means, duplication of any material in this paper for a fee or for commercial purposes, or modification of the content of the paper is prohibited and is subject to penalties under law. | en
dc.subject | DRNTU::Engineering::Computer science and engineering | en
dc.subject | Cognitive Model | en
dc.subject | Computational Autobiographical Memory Model | en
dc.title | Modeling autobiographical memory in human-like autonomous agents | en
dc.type | Conference Paper | en
dc.contributor.school | School of Computer Science and Engineering | en
dc.contributor.conference | Proceedings of the 2016 International Conference on Autonomous Agents & Multiagent Systems (AAMAS 2016) | en
dc.contributor.research | NTU-UBC Research Centre of Excellence in Active Living for the Elderly | en
dc.description.version | Published version | en
dc.identifier.url | https://dl.acm.org/citation.cfm?id=2937048 | en
dc.identifier.rims | 193903 | en
item.fulltext | With Fulltext | -
item.grantfulltext | open | -
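
As background for the abstract above: AM-ART is an Adaptive Resonance Theory (ART) network whose input channels carry a pictorial snapshot plus its time, location, people, activity, and emotion context, and whose retrieval couples a bottom-up memory search with a top-down memory readout. The following is a minimal, generic fuzzy/fusion-ART-style sketch of that encode/search/readout cycle, not the authors' implementation: every identifier and parameter value is an assumption, complement coding is omitted, a single vigilance threshold is shared across channels, and the wandering-in-reminiscence mechanism is not modeled.

```python
# Illustrative fuzzy/fusion-ART-style memory sketch (an assumption-laden toy,
# not the authors' AM-ART code; see the paper at the URL above).
import numpy as np

# Context channels named in the abstract, plus the pictorial snapshot.
CHANNELS = ("time", "location", "people", "activity", "emotion", "snapshot")

class AutobioMemorySketch:
    """Hypothetical multi-channel categorical memory of life events."""

    def __init__(self, alpha=0.1, beta=1.0, rho=0.7):
        self.alpha = alpha  # choice parameter (assumed value)
        self.beta = beta    # learning rate; 1.0 = one-shot "fast" learning
        self.rho = rho      # vigilance threshold (assumed, shared by channels)
        self.nodes = []     # committed event (category) nodes

    def _choice(self, x, w):
        # Fuzzy ART choice function: |x ^ w| / (alpha + |w|), ^ = element-wise min.
        return np.minimum(x, w).sum() / (self.alpha + w.sum())

    def _match(self, x, w):
        # Fuzzy ART match function: |x ^ w| / |x|.
        return np.minimum(x, w).sum() / max(x.sum(), 1e-9)

    def search(self, cue):
        # Bottom-up memory search: rank nodes by total choice over the channels
        # present in the (possibly partial) cue, then accept the first node
        # that passes the vigilance test on every cued channel.
        ranked = sorted(
            range(len(self.nodes)),
            key=lambda j: sum(self._choice(v, self.nodes[j][c]) for c, v in cue.items()),
            reverse=True,
        )
        for j in ranked:
            if all(self._match(v, self.nodes[j][c]) >= self.rho for c, v in cue.items()):
                return j  # resonance: node j matches the cue
        return None       # mismatch reset on every node: nothing recalled

    def encode(self, event):
        # Store a full event: update the resonant node, or commit a new one.
        j = self.search(event)
        if j is None:
            self.nodes.append({c: np.asarray(event[c], float).copy() for c in CHANNELS})
            return len(self.nodes) - 1
        for c in CHANNELS:  # fuzzy ART learning rule
            w = self.nodes[j][c]
            self.nodes[j][c] = self.beta * np.minimum(event[c], w) + (1 - self.beta) * w
        return j

    def readout(self, cue):
        # Top-down memory readout: return every channel of the best-matching
        # node, completing the channels missing from a partial cue.
        j = self.search(cue)
        return None if j is None else self.nodes[j]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    mem = AutobioMemorySketch()
    event = {c: rng.random(4) for c in CHANNELS}
    mem.encode(event)
    # A partial cue (time + location only) recalls the full event, emotion included.
    recalled = mem.readout({"time": event["time"], "location": event["location"]})
    print(np.allclose(recalled["emotion"], event["emotion"]))  # True
```

In this toy, beta = 1.0 gives one-shot learning, which suits episodic events encoded on a single exposure, and the same search routine serves both exact and partial cues; the paper itself also evaluates noisy cues and models wandering in reminiscence, neither of which this sketch attempts.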
Appears in Collections:SCSE Conference Papers
Files in This Item:
File | Description | Size | Format
AAMAS2016.pdf | | 795.75 kB | Adobe PDF

Page view(s): 460 (updated on Mar 28, 2024)
Download(s): 124 (updated on Mar 28, 2024)

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.