dc.contributor.author: Zhao, Rui
dc.contributor.author: Mao, Kezhi
dc.identifier.citation: Zhao, R., & Mao, K. (2017). Topic-Aware Deep Compositional Models for Sentence Classification. IEEE/ACM Transactions on Audio, Speech, and Language Processing, 25(2), 248-260.
dc.description.abstract: In recent years, deep compositional models have emerged as a popular technique for sentence representation learning in computational linguistics and natural language processing. These models typically train various forms of neural networks on top of pretrained word embeddings using a task-specific corpus. However, most of these works neglect the multi-sense nature of words in the pretrained word embeddings. In this paper, we introduce topic models to enrich the word embeddings with multiple word senses. Integrating the topic model with various semantic composition processes yields topic-aware convolutional neural networks and topic-aware long short-term memory networks. Unlike previous multi-sense word embedding models that assign multiple independent, sense-specific embeddings to each word, our proposed models are lightweight and offer flexible frameworks that regard word sense as the composition of two parts: a general sense derived from a large corpus and a topic-specific sense derived from a task-specific corpus. In addition, our proposed models focus on semantic composition rather than word understanding. With the help of topic models, we can integrate the topic-specific sense at the word level before composition and at the sentence level after composition. Comprehensive experiments on five public sentence classification datasets show that our proposed topic-aware deep compositional models achieve competitive or better performance than other text representation learning methods.
dc.format.extent: 13 p.
dc.relation.ispartofseries: IEEE/ACM Transactions on Audio, Speech, and Language Processing
dc.rights: © 2016 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. The published version is available at: [http://dx.doi.org/10.1109/TASLP.2016.2632521].
dc.subject: Machine learning
dc.subject: Natural language processing
dc.title: Topic-Aware Deep Compositional Models for Sentence Classification
dc.type: Journal Article
dc.contributor.school: School of Electrical and Electronic Engineering
dc.description.version: Accepted version
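The abstract describes representing each word as the composition of a general sense (a pretrained embedding from a large corpus) and a topic-specific sense (derived from a topic model trained on the task corpus), integrated at the word level before semantic composition. A minimal sketch of that word-level integration is below; all names, dimensions, and the random stand-ins for pretrained embeddings and topic-model outputs are assumptions for illustration, not the paper's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

vocab = ["bank", "river", "money"]
d_word, n_topics, d_topic = 8, 4, 3  # assumed toy dimensions

# General sense: pretrained word embeddings (random stand-ins here).
general_emb = {w: rng.normal(size=d_word) for w in vocab}

# Topic-model output: per-word topic distribution p(topic | word),
# as would come from, e.g., LDA fitted on the task-specific corpus.
word_topic = {w: rng.dirichlet(np.ones(n_topics)) for w in vocab}

# One vector per topic (random stand-ins for learned topic embeddings).
topic_emb = rng.normal(size=(n_topics, d_topic))

def topic_aware_embedding(word):
    """Word-level integration: concatenate the general embedding with a
    topic-specific sense, computed as the topic-distribution-weighted
    mixture of topic vectors."""
    topic_sense = word_topic[word] @ topic_emb  # shape (d_topic,)
    return np.concatenate([general_emb[word], topic_sense])

e = topic_aware_embedding("bank")
print(e.shape)  # (d_word + d_topic,) = (11,)
```

The enriched vectors would then feed a compositional model (e.g., a CNN or LSTM over the sentence), with a second, sentence-level topic integration applied after composition.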
