Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/159793
Title: A randomized link transformer for diverse open-domain dialogue generation
Authors: Lee, Jing Yang
Lee, Kong Aik
Gan, Woon-Seng
Keywords: Engineering::Computer science and engineering::Computing methodologies::Artificial intelligence
Issue Date: 2022
Source: Lee, J. Y., Lee, K. A. & Gan, W. (2022). A randomized link transformer for diverse open-domain dialogue generation. 4th Workshop on NLP for Conversational AI at ACL 2022 (NLP4ConvAI 2022), 1-11. https://dx.doi.org/10.18653/v1/2022.nlp4convai-1.1
Conference: 4th Workshop on NLP for Conversational AI at ACL 2022 (NLP4ConvAI 2022)
Abstract: A major issue in open-domain dialogue generation is the agent’s tendency to generate repetitive and generic responses. The lack of response diversity has been addressed in recent years via the use of latent variable models, such as the Conditional Variational Auto-Encoder (CVAE), which typically involve learning a latent Gaussian distribution over potential response intents. However, due to latent variable collapse, training latent variable dialogue models is notoriously complex, requiring substantial modification to the standard training process and loss function. Other approaches proposed to improve response diversity also largely entail a significant increase in training complexity. Hence, this paper proposes a Randomized Link (RL) Transformer as an alternative to latent variable models. The RL Transformer does not require any additional enhancements to the training process or loss function. Empirical results show that, in terms of response diversity, the RL Transformer achieves performance comparable to that of latent variable models.
URI: https://hdl.handle.net/10356/159793
URL: https://aclanthology.org/volumes/2022.nlp4convai-1/
DOI: 10.18653/v1/2022.nlp4convai-1.1
Schools: School of Electrical and Electronic Engineering
Rights: © 2022 Association for Computational Linguistics. This is an open-access article distributed under the terms of the Creative Commons Attribution License.
Fulltext Permission: open
Fulltext Availability: With Fulltext
Appears in Collections:EEE Conference Papers

Files in This Item:
File: 2022.nlp4convai-1.1.pdf (245.48 kB, Adobe PDF)

SCOPUS Citations: 1 (updated on Mar 7, 2025)
Page view(s): 197 (updated on Mar 20, 2025)
Download(s): 53 (updated on Mar 20, 2025)


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.