Wasserstein divergence for GANs

In many domains of computer vision, generative adversarial networks (GANs) have achieved great success, among which the family of Wasserstein GANs (WGANs) is considered to be state-of-the-art due to the theoretical contributions and competitive qualitative performance. However, it is very challenging to approximate the k-Lipschitz constraint required by the Wasserstein-1 metric (W-met). In this paper, we propose a novel Wasserstein divergence (W-div), which is a relaxed version of W-met and does not require the k-Lipschitz constraint. As a concrete application, we introduce a Wasserstein divergence objective for GANs (WGAN-div), which can faithfully approximate W-div through optimization. Under various settings, including progressive growing training, we demonstrate the stability of the proposed WGAN-div owing to its theoretical and practical advantages over WGANs. Also, we study the quantitative and visual performance of WGAN-div on standard image synthesis benchmarks, showing the superior performance of WGAN-div compared to the state-of-the-art methods.
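The abstract describes a discriminator objective built from a Wasserstein divergence term plus a gradient-norm penalty that replaces the k-Lipschitz constraint. The sketch below is a hedged, minimal illustration of that kind of objective, not the authors' implementation: it assumes a toy linear critic D(x) = w·x (so the input gradient is just w and can be written analytically, with no autograd), and it uses k = 2, p = 6 as the penalty weight and power, which are the defaults reported for WGAN-div; the sign convention and sampling of interpolated points are simplifying assumptions.

```python
import numpy as np

def wgan_div_d_loss(w, real, fake, k=2.0, p=6.0):
    """Toy discriminator loss in the spirit of the WGAN-div objective.

    Assumes a linear critic D(x) = w . x, so dD/dx = w exactly; a real
    model would compute this gradient with autograd. k and p are the
    penalty weight and power (k=2, p=6 per the paper's reported defaults).
    """
    rng = np.random.default_rng(0)
    # Sample points between real and fake data for the penalty term,
    # mirroring the usual interpolation trick in gradient penalties.
    eps = rng.uniform(size=(real.shape[0], 1))
    x_hat = eps * real + (1.0 - eps) * fake
    # For a linear critic the input gradient at every x_hat is just w.
    grad = np.broadcast_to(w, x_hat.shape)
    # Gradient-norm penalty of order p with weight k (no k-Lipschitz clamp).
    penalty = k * np.mean(np.linalg.norm(grad, axis=1) ** p)
    # Divergence part: critic scores fake vs. real samples
    # (sign convention is an assumption here).
    div = np.mean(fake @ w) - np.mean(real @ w)
    return div + penalty
```

In practice the critic is a neural network and the input gradient would be obtained via automatic differentiation (e.g. `torch.autograd.grad` with `create_graph=True`); the linear critic here only serves to make the penalty term concrete and checkable by hand.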

Bibliographic Details
Main Authors: WU, J., HUANG, Zhiwu, THOMA, J., ACHARYA, D., VAN GOOL, L.
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University 2018
Subjects:
Online Access:https://ink.library.smu.edu.sg/sis_research/6402
https://ink.library.smu.edu.sg/context/sis_research/article/7405/viewcontent/Wasserstein_Divergence_for_GANs.pdf
Institution: Singapore Management University
id sg-smu-ink.sis_research-7405
record_format dspace
spelling sg-smu-ink.sis_research-7405 2021-11-23T02:09:55Z
title Wasserstein divergence for GANs
authors WU, J.; HUANG, Zhiwu; THOMA, J.; ACHARYA, D.; VAN GOOL, L.
date 2018-09-01T07:00:00Z
format text (application/pdf)
doi 10.1007/978-3-030-01228-1_40
url https://ink.library.smu.edu.sg/sis_research/6402
url https://ink.library.smu.edu.sg/context/sis_research/article/7405/viewcontent/Wasserstein_Divergence_for_GANs.pdf
license http://creativecommons.org/licenses/by-nc-nd/4.0/
collection Research Collection School Of Computing and Information Systems
language eng
publisher Institutional Knowledge at Singapore Management University
keywords GANs; Progressive growing; Wasserstein divergence; Wasserstein metric
subjects Databases and Information Systems; Graphics and Human Computer Interfaces
institution Singapore Management University
building SMU Libraries
continent Asia
country Singapore
Singapore
content_provider SMU Libraries
collection InK@SMU
language English
topic GANs; Progressive growing; Wasserstein divergence; Wasserstein metric
Databases and Information Systems
Graphics and Human Computer Interfaces
description In many domains of computer vision, generative adversarial networks (GANs) have achieved great success, among which the family of Wasserstein GANs (WGANs) is considered to be state-of-the-art due to the theoretical contributions and competitive qualitative performance. However, it is very challenging to approximate the k-Lipschitz constraint required by the Wasserstein-1 metric (W-met). In this paper, we propose a novel Wasserstein divergence (W-div), which is a relaxed version of W-met and does not require the k-Lipschitz constraint. As a concrete application, we introduce a Wasserstein divergence objective for GANs (WGAN-div), which can faithfully approximate W-div through optimization. Under various settings, including progressive growing training, we demonstrate the stability of the proposed WGAN-div owing to its theoretical and practical advantages over WGANs. Also, we study the quantitative and visual performance of WGAN-div on standard image synthesis benchmarks, showing the superior performance of WGAN-div compared to the state-of-the-art methods.
format text
author WU, J.
HUANG, Zhiwu
THOMA, J.
ACHARYA, D.
VAN GOOL, L.
author_sort WU, J.
title Wasserstein divergence for GANs
title_sort wasserstein divergence for gans
publisher Institutional Knowledge at Singapore Management University
publishDate 2018
url https://ink.library.smu.edu.sg/sis_research/6402
https://ink.library.smu.edu.sg/context/sis_research/article/7405/viewcontent/Wasserstein_Divergence_for_GANs.pdf
_version_ 1770575953477500928