Improving GAN training with probability ratio clipping and sample reweighting

Despite success on a wide range of vision problems, generative adversarial networks (GANs) often suffer from inferior performance due to unstable training, especially for text generation. To address this issue, we propose a new variational GAN training framework that enjoys superior training stability. Our approach is inspired by a connection between GANs and reinforcement learning under a variational perspective. This connection leads to (1) probability ratio clipping, which regularizes generator training to prevent excessively large updates, and (2) a sample re-weighting mechanism, which improves discriminator training by downplaying bad-quality fake samples. Moreover, our variational GAN framework can provably overcome the training issue in many GANs whereby an optimal discriminator cannot provide any informative gradient to the generator. By plugging the training approach into diverse state-of-the-art GAN architectures, we obtain significantly improved performance over a range of tasks, including text generation, text style transfer, and image generation.
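
The abstract names two mechanisms: a clipped probability ratio for the generator and a re-weighting of fake samples for the discriminator. As a rough illustration only, the PyTorch sketch below shows one plausible reading of each idea, borrowing a PPO-style clipped surrogate for the ratio term and a softmax over discriminator scores for the weights; the function names, the reward signal, and the exact weighting scheme are assumptions made for illustration, not the paper's implementation.

import torch
import torch.nn.functional as F

def clipped_generator_loss(log_p_new, log_p_old, reward, eps=0.2):
    # Clip the probability ratio p_new / p_old to [1 - eps, 1 + eps] so a single
    # generator update cannot move too far from the previous generator
    # (hypothetical PPO-style surrogate; 'reward' would come from the discriminator).
    ratio = torch.exp(log_p_new - log_p_old.detach())
    unclipped = ratio * reward
    clipped = torch.clamp(ratio, 1.0 - eps, 1.0 + eps) * reward
    return -torch.min(unclipped, clipped).mean()

def reweighted_discriminator_loss(d_real_logits, d_fake_logits, beta=1.0):
    # Weight each fake sample by a softmax over its discriminator score, so
    # low-quality fakes (confidently rejected ones) contribute less to the update.
    with torch.no_grad():
        w = torch.softmax(beta * d_fake_logits, dim=0) * d_fake_logits.numel()
    real_term = F.binary_cross_entropy_with_logits(
        d_real_logits, torch.ones_like(d_real_logits))
    fake_term = F.binary_cross_entropy_with_logits(
        d_fake_logits, torch.zeros_like(d_fake_logits), reduction="none")
    return real_term + (w * fake_term).mean()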

Bibliographic Details
Main Authors: WU, Yue, ZHOU, Pan, GORDON, Andrew Wilson, XING, Eric, HU, Zhiting
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2020
Subjects: Graphics and Human Computer Interfaces
Online Access:https://ink.library.smu.edu.sg/sis_research/8996
https://ink.library.smu.edu.sg/context/sis_research/article/9999/viewcontent/2020_NeurIPS_GAN.pdf
Institution: Singapore Management University
Collection: Research Collection School Of Computing and Information Systems
License: http://creativecommons.org/licenses/by-nc-nd/4.0/