Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/168505
Title: Zero-shot text classification via self-supervised tuning
Authors: Liu, Chaoqun
Zhang, Wenxuan
Chen, Guizhen
Wu, Xiaobao
Luu, Anh Tuan
Chang, Chip Hong
Bing, Lidong
Keywords: Computer Science - Computation and Language
Computer Science - Artificial Intelligence
Engineering::Computer science and engineering::Computing methodologies::Document and text processing
Issue Date: 2023
Source: Liu, C., Zhang, W., Chen, G., Wu, X., Luu, A. T., Chang, C. H. & Bing, L. (2023). Zero-shot text classification via self-supervised tuning. 61st Annual Meeting of the Association for Computational Linguistics (ACL 2023).
Project: Alibaba-NTU-AIR2021B6 
MOE-T1-RS21/20 
Conference: 61st Annual Meeting of the Association for Computational Linguistics (ACL 2023)
Abstract: Existing solutions to zero-shot text classification either conduct prompting with pre-trained language models, which is sensitive to the choice of templates, or rely on large-scale annotated data of relevant tasks for meta-tuning. In this work, we propose a new paradigm based on self-supervised learning to solve zero-shot text classification tasks by tuning the language models with unlabeled data, called self-supervised tuning. By exploring the inherent structure of free texts, we propose a new learning objective called first sentence prediction to bridge the gap between unlabeled data and text classification tasks. After tuning the model to learn to predict the first sentence in a paragraph based on the rest, the model is able to conduct zero-shot inference on unseen tasks such as topic classification and sentiment analysis. Experimental results show that our model outperforms the state-of-the-art baselines on 7 out of 10 tasks. Moreover, analysis reveals that our model is less sensitive to prompt design. Our code and pre-trained models are publicly available at https://github.com/DAMO-NLP-SG/SSTuning.
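
As a rough illustration of the first sentence prediction objective described in the abstract, the sketch below shows one plausible way such training instances could be built from unlabeled paragraphs: the true first sentence is mixed with first sentences drawn from other paragraphs as distractors, and the model must pick it given the rest of the paragraph. The helper name, option count, and sample data are our own assumptions for illustration, not the authors' exact recipe.

import random

def make_fsp_example(paragraphs, idx, num_options=4, seed=0):
    """Build one first-sentence-prediction instance from paragraphs[idx].

    Each paragraph is a list of sentences. Returns (options, context, answer),
    where options mixes the true first sentence with distractor first
    sentences from other paragraphs, and answer is the correct option index.
    This framing is an assumption based on the abstract, not the paper's code.
    """
    rng = random.Random(seed)
    first, rest = paragraphs[idx][0], " ".join(paragraphs[idx][1:])
    # Candidate distractors: first sentences of all other paragraphs.
    distractors = [p[0] for i, p in enumerate(paragraphs) if i != idx]
    options = rng.sample(distractors, num_options - 1) + [first]
    rng.shuffle(options)
    return options, rest, options.index(first)

# Toy unlabeled corpus (illustrative only).
paragraphs = [
    ["The movie was a delight.", "Every scene sparkled.", "I would watch it again."],
    ["The senate passed the bill.", "Debate lasted three days."],
    ["The team won the final.", "Fans filled the streets."],
    ["Rain flooded the valley.", "Roads stayed closed for a week."],
]
options, context, answer = make_fsp_example(paragraphs, 0, num_options=3)
print(options, "| context:", context, "| answer index:", answer)

At inference time, the same multiple-choice format can be reused for an unseen classification task by treating the candidate labels (e.g. verbalized sentiment classes) as the options, which is what lets the tuned model run zero-shot.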
URI: https://hdl.handle.net/10356/168505
URL: https://2023.aclweb.org/
Schools: Interdisciplinary Graduate School (IGS) 
Organisations: Alibaba Group 
Research Centres: Alibaba-NTU Singapore Joint Research Institute
Rights: © 2023 Association for Computational Linguistics. All rights reserved. This paper was published in the Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (ACL 2023) and is made available with permission of the Association for Computational Linguistics.
Fulltext Permission: open
Fulltext Availability: With Fulltext
Appears in Collections:IGS Conference Papers

Files in This Item:
File: ACL_2023_Zero_shot_text_classification (3).pdf
Size: 687.71 kB
Format: Adobe PDF

