A randomized link transformer for diverse open-domain dialogue generation
A major issue in open-domain dialogue generation is the agent’s tendency to generate repetitive and generic responses. This lack of response diversity has been addressed in recent years via latent variable models, such as the Conditional Variational Auto-Encoder (CVAE), which typically involve learning a latent Gaussian distribution over potential response intents. However, due to latent variable collapse, training latent variable dialogue models is notoriously complex, requiring substantial modification to the standard training process and loss function. Other approaches proposed to improve response diversity also largely entail a significant increase in training complexity. Hence, this paper proposes a Randomized Link (RL) Transformer as an alternative to latent variable models. The RL Transformer requires no additional changes to the training process or loss function. Empirical results show that, in terms of response diversity, the RL Transformer achieves performance comparable to that of latent variable models.
Main Authors: Lee, Jing Yang; Lee, Kong Aik; Gan, Woon-Seng
Other Authors: School of Electrical and Electronic Engineering
Format: Conference or Workshop Item
Language: English
Published: 2022
Subjects: Engineering::Computer science and engineering::Computing methodologies::Artificial intelligence; Randomized Link Transformer; Open-Domain Dialogue Generation
Online Access: https://hdl.handle.net/10356/159793 ; https://aclanthology.org/volumes/2022.nlp4convai-1/
Institution: Nanyang Technological University
id: sg-ntu-dr.10356-159793
record_format: dspace
spelling: record sg-ntu-dr.10356-159793, updated 2022-07-06T02:07:30Z
Conference: 4th Workshop on NLP for Conversational AI at ACL 2022 (NLP4ConvAI 2022)
Version: Published version
Record dates: deposited 2022-07-06T02:06:36Z; issued 2022
Type: Conference Paper
Citation: Lee, J. Y., Lee, K. A. & Gan, W. (2022). A randomized link transformer for diverse open-domain dialogue generation. 4th Workshop on NLP for Conversational AI at ACL 2022 (NLP4ConvAI 2022), 1-11. https://dx.doi.org/10.18653/v1/2022.nlp4convai-1.1
DOI: 10.18653/v1/2022.nlp4convai-1.1
Handle: https://hdl.handle.net/10356/159793
Proceedings: https://aclanthology.org/volumes/2022.nlp4convai-1/
Pages: 1-11
Rights: © 2022 Association for Computational Linguistics. This is an open-access article distributed under the terms of the Creative Commons Attribution License.
File format: application/pdf
institution: Nanyang Technological University
building: NTU Library
continent: Asia
country: Singapore
content_provider: NTU Library
collection: DR-NTU
language: English
topic: Engineering::Computer science and engineering::Computing methodologies::Artificial intelligence; Randomized Link Transformer; Open-Domain Dialogue Generation
description: A major issue in open-domain dialogue generation is the agent’s tendency to generate repetitive and generic responses. This lack of response diversity has been addressed in recent years via latent variable models, such as the Conditional Variational Auto-Encoder (CVAE), which typically involve learning a latent Gaussian distribution over potential response intents. However, due to latent variable collapse, training latent variable dialogue models is notoriously complex, requiring substantial modification to the standard training process and loss function. Other approaches proposed to improve response diversity also largely entail a significant increase in training complexity. Hence, this paper proposes a Randomized Link (RL) Transformer as an alternative to latent variable models. The RL Transformer requires no additional changes to the training process or loss function. Empirical results show that, in terms of response diversity, the RL Transformer achieves performance comparable to that of latent variable models.
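The latent-variable approach the abstract contrasts against can be made concrete. Below is a minimal, generic sketch of CVAE-style Gaussian latent sampling over response intents (the standard reparameterization trick) together with the KL term whose collapse makes training brittle. The module name, dimensions, and wiring are illustrative assumptions, not the paper's RL Transformer or any specific baseline's implementation.

```python
# Minimal sketch of CVAE-style Gaussian latent sampling over response
# intents, as described in the abstract. All names and dimensions are
# illustrative assumptions, not the paper's (or any baseline's) code.
import torch
import torch.nn as nn


class LatentResponseIntent(nn.Module):
    """Maps a dialogue-context encoding to a Gaussian latent over response intents."""

    def __init__(self, ctx_dim: int = 512, latent_dim: int = 64):
        super().__init__()
        self.to_mu = nn.Linear(ctx_dim, latent_dim)      # mean of q(z | context)
        self.to_logvar = nn.Linear(ctx_dim, latent_dim)  # log-variance of q(z | context)

    def forward(self, ctx: torch.Tensor):
        mu, logvar = self.to_mu(ctx), self.to_logvar(ctx)
        # Reparameterization trick: sample z differentiably from N(mu, sigma^2).
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        return z, mu, logvar


def kl_to_standard_normal(mu: torch.Tensor, logvar: torch.Tensor) -> torch.Tensor:
    """KL(q(z | context) || N(0, I)): the extra loss term latent variable models
    need. If it is driven to zero (latent variable collapse), z carries no
    intent information and responses revert to generic ones."""
    return -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp(), dim=-1).mean()


if __name__ == "__main__":
    enc = LatentResponseIntent()
    ctx = torch.randn(8, 512)  # batch of 8 context encodings (hypothetical sizes)
    z, mu, logvar = enc(ctx)
    print(z.shape, kl_to_standard_normal(mu, logvar).item())
```

Sampling a different z for each decoding pass is what yields different response intents in a CVAE; the abstract's point is that the RL Transformer aims for comparable diversity without this latent machinery or the modified loss.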
author2: School of Electrical and Electronic Engineering
author_facet: School of Electrical and Electronic Engineering; Lee, Jing Yang; Lee, Kong Aik; Gan, Woon-Seng
format: Conference or Workshop Item
author: Lee, Jing Yang; Lee, Kong Aik; Gan, Woon-Seng
author_sort: Lee, Jing Yang
title: A randomized link transformer for diverse open-domain dialogue generation
title_sort: randomized link transformer for diverse open-domain dialogue generation
publishDate: 2022
url: https://hdl.handle.net/10356/159793 ; https://aclanthology.org/volumes/2022.nlp4convai-1/
_version_: 1738844807354122240