Generative flows with invertible attentions
Flow-based generative models have shown an excellent ability to explicitly learn the probability density function of data via a sequence of invertible transformations. Yet learning attention in generative flows remains understudied, even though attention mechanisms have made breakthroughs in other domains. To fill the gap...
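For readers unfamiliar with the change-of-variables machinery the abstract refers to, the sketch below shows one common invertible building block of a normalizing flow (an affine coupling layer) and how its log-determinant enters the density. This is a minimal illustration assuming PyTorch; it is not the invertible attention module proposed in the paper, and the class and layer sizes are hypothetical.

```python
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    """A generic affine coupling layer: one invertible transform in a flow.
    Illustrative sketch only; not the paper's invertible attention."""

    def __init__(self, dim, hidden=64):
        super().__init__()
        self.half = dim // 2
        # Small network predicting a scale and shift for the second half
        # of the input, conditioned on the first half.
        self.net = nn.Sequential(
            nn.Linear(self.half, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.half)),
        )

    def forward(self, x):
        x1, x2 = x[:, :self.half], x[:, self.half:]
        log_s, t = self.net(x1).chunk(2, dim=1)
        log_s = torch.tanh(log_s)          # keep scales bounded for stability
        y2 = x2 * torch.exp(log_s) + t     # elementwise affine transform of x2
        # log|det J| of this transform is the sum of the log-scales.
        log_det = log_s.sum(dim=1)
        return torch.cat([x1, y2], dim=1), log_det

    def inverse(self, y):
        y1, y2 = y[:, :self.half], y[:, self.half:]
        log_s, t = self.net(y1).chunk(2, dim=1)
        log_s = torch.tanh(log_s)
        x2 = (y2 - t) * torch.exp(-log_s)  # exact inverse, no iterative solve
        return torch.cat([y1, x2], dim=1)

# Change of variables: log p(x) = log p_base(f(x)) + log|det df/dx|,
# accumulated over a sequence of such invertible layers.
```

Stacking layers like this keeps the exact likelihood tractable; the paper's contribution is to make attention itself one of these invertible, log-det-tractable components.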
Main Authors: SUKTHANKER, Rhea Sanjay; HUANG, Zhiwu; KUMAR, Suryansh; TIMOFTE, Radu; VAN GOOL, Luc
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2022
Online Access: https://ink.library.smu.edu.sg/sis_research/7612
https://ink.library.smu.edu.sg/context/sis_research/article/8615/viewcontent/01_Generative_Flows_With_Invertible_Attentions_CVPR_2022_paper.pdf
Institution: Singapore Management University
Similar Items
- Question-attentive review-level explanation for neural rating regression
  By: LE, Trung Hoang, et al.
  Published: (2024)
- Towards gradient-based time-series explanations through a spatiotemporal attention network
  By: LEE, Min Hun
  Published: (2024)
- RISurConv: Rotation invariant surface attention-augmented convolutions for 3D point cloud classification and segmentation
  By: ZHANG, Zhiyuan, et al.
  Published: (2024)
- Long-term leap attention, short-term periodic shift for video classification
  By: ZHANG, Hao, et al.
  Published: (2022)
- Managing the creative frontier of Generative AI: The novelty-usefulness tradeoff
  By: MUKHERJEE, Anirban, et al.
  Published: (2023)