Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/160695
Full metadata record
DC Field | Value | Language
dc.contributor.author | Li, Wei | en_US
dc.contributor.author | Zhu, Luyao | en_US
dc.contributor.author | Cambria, Erik | en_US
dc.date.accessioned | 2022-08-01T03:56:39Z | -
dc.date.available | 2022-08-01T03:56:39Z | -
dc.date.issued | 2021 | -
dc.identifier.citation | Li, W., Zhu, L. & Cambria, E. (2021). Taylor's theorem: a new perspective for neural tensor networks. Knowledge-Based Systems, 228, 107258. https://dx.doi.org/10.1016/j.knosys.2021.107258 | en_US
dc.identifier.issn | 0950-7051 | en_US
dc.identifier.uri | https://hdl.handle.net/10356/160695 | -
dc.description.abstract | Neural tensor networks have been widely used in a large number of natural language processing tasks such as conversational sentiment analysis, named entity recognition and knowledge base completion. However, the mathematical explanation of neural tensor networks remains a challenging problem, due to the bilinear term. According to Taylor's theorem, a kth order differentiable function can be approximated by a kth order Taylor polynomial around a given point. Therefore, we provide a mathematical explanation of neural tensor networks and also reveal the inner link between them and feedforward neural networks from the perspective of Taylor's theorem. In addition, we unify two forms of neural tensor networks into a single framework and present factorization methods to make the neural tensor networks parameter-efficient. Experimental results bring some valuable insights into neural tensor networks. | en_US
dc.description.sponsorship | Agency for Science, Technology and Research (A*STAR) | en_US
dc.language.iso | en | en_US
dc.relation | A18A2b0046 | en_US
dc.relation.ispartof | Knowledge-Based Systems | en_US
dc.rights | © 2021 Elsevier B.V. All rights reserved. | en_US
dc.subject | Engineering::Computer science and engineering | en_US
dc.title | Taylor's theorem: a new perspective for neural tensor networks | en_US
dc.type | Journal Article | en
dc.contributor.school | School of Computer Science and Engineering | en_US
dc.identifier.doi | 10.1016/j.knosys.2021.107258 | -
dc.identifier.scopus | 2-s2.0-85110103450 | -
dc.identifier.volume | 228 | en_US
dc.identifier.spage | 107258 | en_US
dc.subject.keywords | Neural Tensor Networks | en_US
dc.subject.keywords | Natural Language Processing | en_US
dc.description.acknowledgement | This research is supported by the Agency for Science, Technology and Research (A*STAR), Singapore under its AME Programmatic Funding Scheme (Project #A18A2b0046). | en_US
item.grantfulltext | none | -
item.fulltext | No Fulltext | -
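Note on the abstract above: the "bilinear term" it refers to is the tensor slice product that distinguishes a neural tensor network (NTN) from a plain feedforward layer, and the "factorization methods" make that tensor parameter-efficient. The sketch below is a rough illustration only, not the authors' code or exact formulation: it shows the standard NTN scoring function in the style of Socher et al. (2013) and one hypothetical low-rank factorization of the slice tensor; all function names, shapes, and the rank r are illustrative assumptions. Because the slice term e1ᵀW[k]e2 makes the pre-activation a second-order polynomial in the embeddings, it can be read as the quadratic term of a Taylor expansion, which is the intuition behind the paper's perspective.

```python
import numpy as np

def ntn_score(e1, e2, W, V, b, u):
    """Standard NTN score (Socher et al. style); shapes are illustrative.

    e1, e2 : entity embeddings of dimension d
    W      : slice tensor of shape (k, d, d) -- the bilinear term
    V      : feedforward weights of shape (k, 2d)
    b      : bias of shape (k,)
    u      : output weights of shape (k,)
    """
    bilinear = np.einsum('i,kij,j->k', e1, W, e2)       # e1^T W[k] e2 for each slice
    linear = V @ np.concatenate([e1, e2]) + b           # ordinary feedforward part
    return u @ np.tanh(bilinear + linear)

def ntn_score_lowrank(e1, e2, P, Q, V, b, u):
    """Hypothetical parameter-efficient variant: each slice W[k] ~= P[k] @ Q[k].T
    with P[k], Q[k] of shape (d, r), r << d, so d*d parameters become 2*d*r per slice."""
    bilinear = np.einsum('i,kir,kjr,j->k', e1, P, Q, e2)
    linear = V @ np.concatenate([e1, e2]) + b
    return u @ np.tanh(bilinear + linear)

if __name__ == "__main__":
    d, k, r = 8, 4, 2
    rng = np.random.default_rng(0)
    e1, e2 = rng.normal(size=d), rng.normal(size=d)
    W = rng.normal(size=(k, d, d))
    V = rng.normal(size=(k, 2 * d))
    b, u = rng.normal(size=k), rng.normal(size=k)
    print("full NTN score:   ", ntn_score(e1, e2, W, V, b, u))
    P, Q = rng.normal(size=(k, d, r)), rng.normal(size=(k, d, r))
    print("low-rank NTN score:", ntn_score_lowrank(e1, e2, P, Q, V, b, u))
```

The low-rank variant is only one plausible way to factorize the slice tensor; the paper presents its own factorization methods, which may differ.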
Appears in Collections:SCSE Journal Articles

Scopus™ citations: 12 (updated Mar 16, 2023)
Web of Science™ citations: 11 (updated Mar 18, 2023)
Page view(s): 27 (updated Mar 20, 2023)


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.