Generative flows with invertible attentions

Flow-based generative models have shown an excellent ability to explicitly learn the probability density function of data via a sequence of invertible transformations. Yet, learning attention mechanisms in generative flows remains understudied, even though attention has made breakthroughs in other domains. To fill the gap, this paper introduces two types of invertible attention mechanisms, i.e., map-based and transformer-based attentions, for both unconditional and conditional generative flows. The key idea is to exploit a masked scheme for these two attentions to learn long-range data dependencies in the context of generative flows. The masked scheme yields invertible attention modules with tractable Jacobian determinants, enabling their seamless integration at any position in a flow-based model. The proposed attention mechanisms lead to more efficient generative flows, owing to their capability of modeling long-range data dependencies. Evaluation on multiple image-synthesis tasks shows that the proposed attention flows result in efficient models and compare favorably against state-of-the-art unconditional and conditional generative flows.
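As a rough illustration of why a masked scheme keeps the Jacobian tractable, the sketch below is a minimal, hypothetical PyTorch coupling-style layer, not the authors' released code: the module name, the sigmoid-based positive scaling, and the mask convention are all assumptions made for the example. Attention weights are computed only from the masked (held-fixed) half of the input, so the Jacobian of the transformed half is triangular and its log-determinant reduces to a sum of log scales.

```python
import torch
import torch.nn as nn

class MaskedInvertibleAttention(nn.Module):
    """Illustrative coupling-style invertible attention (a sketch, not the
    paper's exact layer). A binary mask splits features into a held-fixed
    part x_a and a transformed part; the attention map is computed from
    x_a only, so the Jacobian is triangular with a tractable determinant."""

    def __init__(self, dim, mask):
        super().__init__()
        self.register_buffer("mask", mask)  # 1 = held fixed, 0 = transformed
        self.query = nn.Linear(dim, dim)
        self.key = nn.Linear(dim, dim)

    def forward(self, x):
        # x: (batch, tokens, dim); attention sees only the masked copy
        x_a = x * self.mask
        q, k = self.query(x_a), self.key(x_a)
        # Row-normalised attention map over tokens (long-range dependencies)
        attn = torch.softmax(q @ k.transpose(-2, -1) / x.size(-1) ** 0.5, dim=-1)
        # Map-based attention: per-entry positive scale in (0.5, 1.5),
        # a hypothetical choice that guarantees invertibility
        scale = attn @ torch.sigmoid(x_a) + 0.5
        y = x_a + (1 - self.mask) * (x * scale)
        # Tractable log|det J|: only transformed entries contribute
        logdet = ((1 - self.mask) * scale.log()).flatten(1).sum(-1)
        return y, logdet

    def inverse(self, y):
        # The masked half is untouched, so the scale can be recomputed exactly
        x_a = y * self.mask
        q, k = self.query(x_a), self.key(x_a)
        attn = torch.softmax(q @ k.transpose(-2, -1) / y.size(-1) ** 0.5, dim=-1)
        scale = attn @ torch.sigmoid(x_a) + 0.5
        return x_a + (1 - self.mask) * (y / scale)
```

Under these assumptions, a round trip x → forward → inverse recovers x up to floating-point error, and the returned log-determinant plugs directly into the change-of-variables log-likelihood, which is the property that lets such a module be placed at any position in a flow.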

Bibliographic Details
Main Authors: SUKTHANKER, Rhea Sanjay, HUANG, Zhiwu, KUMAR, Suryansh, TIMOFTE, Radu, VAN GOOL, Luc
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2022
DOI: 10.1109/CVPR52688.2022.01095
License: http://creativecommons.org/licenses/by-nc-nd/4.0/
Subjects: flow-based generative models; invertible attention; Artificial Intelligence and Robotics; Databases and Information Systems
Online Access: https://ink.library.smu.edu.sg/sis_research/7612
https://ink.library.smu.edu.sg/context/sis_research/article/8615/viewcontent/01_Generative_Flows_With_Invertible_Attentions_CVPR_2022_paper.pdf
Institution: Singapore Management University