Inception Transformer
Recent studies show that the Transformer has a strong capability for building long-range dependencies, yet is weak at capturing the high frequencies that predominantly convey local information. To tackle this issue, we present a novel and general-purpose Inception Transformer, or iFormer for short, that...
| Field | Value |
|---|---|
| Main Authors | SI, Chenyang; YU, Weihao; ZHOU, Pan; ZHOU, Yichen; WANG, Xinchao; YAN, Shuicheng |
| Format | text |
| Language | English |
| Published | Institutional Knowledge at Singapore Management University, 2022 |
| Online Access | https://ink.library.smu.edu.sg/sis_research/9026 ; https://ink.library.smu.edu.sg/context/sis_research/article/10029/viewcontent/2022_NeurIPS_inception.pdf |
| Institution | Singapore Management University |
Similar Items
- InceptionNeXt: When Inception meets ConvNeXt
  by: YU, Weihao, et al.
  Published: (2024)
- MetaFormer is actually what you need for vision
  by: YU, Weihao, et al.
  Published: (2022)
- MetaFormer baselines for vision
  by: YU, Weihao, et al.
  Published: (2023)
- Towards understanding why Lookahead generalizes better than SGD and beyond
  by: ZHOU, Pan, et al.
  Published: (2021)
- Understanding generalization and optimization performance of deep CNNs
  by: ZHOU, Pan, et al.
  Published: (2018)