Towards better data augmentation using Wasserstein distance in variational auto-encoder

The variational auto-encoder (VAE) compresses data into latent attributes and generates new data of different varieties. The VAE based on KL divergence has been considered an effective technique for data augmentation. In this paper, we propose the Wasserstein distance as a measure of distributional similarity for the latent attributes and show that it yields a superior theoretical lower bound (ELBO) compared with the KL divergence under mild conditions. Through multiple experiments, we demonstrate that the new loss function exhibits better convergence properties and generates artificial images that better aid image classification tasks.
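The abstract proposes replacing the KL-divergence term of the standard VAE objective with a Wasserstein distance between the latent posterior and the prior. The paper's exact formulation is not reproduced in this record, so the snippet below is only a minimal sketch of the idea, assuming a diagonal-Gaussian posterior q(z|x) = N(mu, diag(sigma^2)) and a standard-normal prior, for which the squared 2-Wasserstein distance has a simple closed form. The function names (kl_regularizer, w2_regularizer, vae_loss) and the beta weight are illustrative, not the authors' code.

```python
# Minimal sketch (not the authors' implementation): a VAE loss whose latent
# regularizer is either the usual KL divergence or a closed-form squared
# 2-Wasserstein distance between N(mu, diag(sigma^2)) and the prior N(0, I).
import torch
import torch.nn.functional as F


def kl_regularizer(mu, log_var):
    """KL( N(mu, diag(exp(log_var))) || N(0, I) ), summed over latent dims."""
    return 0.5 * torch.sum(mu.pow(2) + log_var.exp() - log_var - 1.0, dim=-1)


def w2_regularizer(mu, log_var):
    """Squared 2-Wasserstein distance between N(mu, diag(sigma^2)) and N(0, I).

    For diagonal Gaussians: W2^2 = ||mu||^2 + sum_i (sigma_i - 1)^2.
    """
    sigma = torch.exp(0.5 * log_var)
    return torch.sum(mu.pow(2) + (sigma - 1.0).pow(2), dim=-1)


def vae_loss(recon_x, x, mu, log_var, use_wasserstein=True, beta=1.0):
    """Reconstruction term plus either the Wasserstein or the KL latent penalty.

    recon_x and x are expected in [0, 1] with shape [batch, dim].
    """
    recon = F.binary_cross_entropy(recon_x, x, reduction="none").sum(dim=-1)
    reg = w2_regularizer(mu, log_var) if use_wasserstein else kl_regularizer(mu, log_var)
    return (recon + beta * reg).mean()
```

Unlike the KL term, which contains -log(sigma^2) and diverges as any sigma_i approaches zero, the closed-form W2 penalty stays finite there, which is sometimes cited as a reason Wasserstein-style penalties behave more smoothly near degenerate posteriors.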


Bibliographic Details
Main Authors: CHEN, Zichuan, LIU, Peng
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2021
Subjects: Finance; Finance and Financial Management
Online Access:https://ink.library.smu.edu.sg/lkcsb_research/7046
https://ink.library.smu.edu.sg/context/lkcsb_research/article/8045/viewcontent/2109.14795.pdf
Institution: Singapore Management University
Collection: InK@SMU, Research Collection Lee Kong Chian School of Business (SMU Libraries)
Published Date: 2021-09-01
License: http://creativecommons.org/licenses/by-nc-nd/4.0/