Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/153544
Title: Latent-optimized adversarial neural transfer for sarcasm detection
Authors: Guo, Xu
Li, Boyang
Yu, Han
Miao, Chunyan
Keywords: Engineering::Computer science and engineering
Issue Date: 2021
Source: Guo, X., Li, B., Yu, H. & Miao, C. (2021). Latent-optimized adversarial neural transfer for sarcasm detection. Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 5394-5407.
Project: AISG2-RP-2020-019 
NRF-NRFI05-2019-0002 
NRF-NRFF13-2021-0006 
NWJ2020-008 
A20G8b0102 
NSC-2019-011 
Conference: Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Abstract: The existence of multiple datasets for sarcasm detection prompts us to apply transfer learning to exploit their commonality. The adversarial neural transfer (ANT) framework utilizes multiple loss terms that encourage the source-domain and the target-domain feature distributions to be similar while optimizing for domain-specific performance. However, these objectives may be in conflict, which can lead to optimization difficulties and sometimes diminished transfer. We propose a generalized latent optimization strategy that allows different losses to accommodate each other and improves training dynamics. The proposed method outperforms transfer learning and meta-learning baselines. In particular, we achieve a 10.02% absolute performance gain over the previous state of the art on the iSarcasm dataset.
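The latent-optimization idea summarized in the abstract can be sketched roughly as follows. This is not the authors' released implementation: the module shapes, the single inner step size ETA, the use of a gradient-reversal layer for the adversarial term, and the choice of which loss drives the inner latent update are illustrative assumptions. The sketch only shows the core mechanism, a differentiable gradient step taken in the shared latent space before the outer losses are computed, so that the task and adversarial objectives are evaluated on features that have already accommodated one another.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

ETA = 0.1  # assumed inner step size for the latent update (illustrative)


class GradReverse(torch.autograd.Function):
    """Gradient-reversal layer: identity in the forward pass, negated gradient
    in the backward pass, giving the usual adversarial min-max between the
    shared encoder and the domain discriminator."""

    @staticmethod
    def forward(ctx, x):
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return grad_output.neg()


# Stand-in modules; the paper builds on BERT-based encoders.
encoder = nn.Sequential(nn.Linear(768, 256), nn.ReLU())  # shared feature extractor
task_head = nn.Linear(256, 2)                            # sarcastic vs. non-sarcastic
domain_head = nn.Linear(256, 2)                          # source vs. target discriminator

params = (list(encoder.parameters()) + list(task_head.parameters())
          + list(domain_head.parameters()))
optimizer = torch.optim.Adam(params, lr=1e-4)


def latent_optimized_step(x, task_labels, domain_labels):
    """One training step in which the outer losses are evaluated on latent
    features that have already taken a small gradient step, so the objectives
    accommodate each other instead of pulling in conflicting directions."""
    z = encoder(x)  # shared latent features

    # Inner (latent) update: move z along the task-loss gradient, keeping the
    # operation differentiable so the outer update can see through it.
    inner_loss = F.cross_entropy(task_head(z), task_labels)
    grad_z, = torch.autograd.grad(inner_loss, z, create_graph=True)
    z_new = z - ETA * grad_z

    # Outer losses on the updated latents.
    task_loss = F.cross_entropy(task_head(z_new), task_labels)
    adv_loss = F.cross_entropy(domain_head(GradReverse.apply(z_new)), domain_labels)

    loss = task_loss + adv_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()


# Toy usage: a mixed batch of 8 sentence embeddings drawn from both domains.
x = torch.randn(8, 768)
task_labels = torch.randint(0, 2, (8,))
domain_labels = torch.randint(0, 2, (8,))
print(latent_optimized_step(x, task_labels, domain_labels))
```

Because the inner step is kept in the computation graph (create_graph=True), the encoder's parameter update accounts for how the latent features would shift under one objective before the other objective is applied, which is one plausible reading of the "losses accommodate each other" behaviour described above.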
URI: https://hdl.handle.net/10356/153544
Schools: School of Computer Science and Engineering 
Research Centres: Joint NTU-UBC Research Centre of Excellence in Active Living for the Elderly (LILY) 
Rights: © 2021 Association for Computational Linguistics. This is an open-access article distributed under the terms of the Creative Commons Attribution License.
Fulltext Permission: open
Fulltext Availability: With Fulltext
Appears in Collections:SCSE Conference Papers

Files in This Item:
File: 2021.naacl-main.425.pdf
Size: 1.01 MB
Format: Adobe PDF
