Cross-Thought for sentence encoder pre-training
In this paper, we propose Cross-Thought, a novel approach to pre-training sequence encoders, which is instrumental in building reusable sequence embeddings for large-scale NLP tasks such as question answering. Instead of using the original signals of full sentences, we train a Transformer-based sequence encoder over a large set of short sequences, which allows the model to automatically select the most useful information for predicting masked words.
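As a rough illustration of the mechanism the abstract describes, the sketch below is a hypothetical PyTorch simplification, not the authors' released code: each short sequence is encoded independently, its first-position hidden state is taken as the sentence embedding, and each sequence then attends over the pooled embeddings of all sequences in the batch to recover masked words. The class name, layer sizes, the use of a single cross-attention layer, and the first-position pooling are all assumptions made for brevity.

import torch
import torch.nn as nn

class CrossThoughtSketch(nn.Module):
    """Hypothetical sketch of cross-sequence attention for masked-word recovery."""
    def __init__(self, vocab_size=30522, d_model=256, nhead=4, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        # Per-sequence Transformer encoder (positional encodings omitted for brevity).
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        # Cross-sequence attention: tokens of each sequence attend to the
        # sentence embeddings of all sequences in the batch.
        self.cross_attn = nn.MultiheadAttention(d_model, nhead, batch_first=True)
        self.lm_head = nn.Linear(d_model, vocab_size)

    def forward(self, token_ids):
        # token_ids: (num_sequences, seq_len), each row a short masked sequence.
        h = self.encoder(self.embed(token_ids))      # independent per-sequence encoding
        sent_emb = h[:, 0]                           # first-position state as sentence embedding
        # Broadcast the pool of sentence embeddings to every sequence.
        pool = sent_emb.unsqueeze(0).expand(h.size(0), -1, -1)
        ctx, _ = self.cross_attn(h, pool, pool)      # attend across sequences
        return self.lm_head(h + ctx)                 # logits for masked-word prediction

# Usage: predict masked tokens in a batch of short sequences.
model = CrossThoughtSketch()
ids = torch.randint(0, 30522, (8, 16))               # 8 short sequences of 16 tokens
logits = model(ids)                                   # (8, 16, vocab_size)

In this simplified reading, the cross-attention step is what lets the pre-training signal flow through the sentence embeddings themselves, so the embeddings are directly optimized to be useful context for other sequences rather than being a by-product of token-level language modeling.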
Main Authors: WANG, Shuohang; FANG, Yuwei; SUN, Siqi; GAN, Zhe; CHENG, Yu; LIU, Jingjing; JIANG, Jing
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2020
Online Access: https://ink.library.smu.edu.sg/sis_research/5602
https://ink.library.smu.edu.sg/context/sis_research/article/6605/viewcontent/EMNLP_2020a.pdf
Institution: Singapore Management University
Similar Items
- VLStereoSet: A study of stereotypical bias in pre-trained vision-language models
  by: ZHOU, Kankan, et al.
  Published: (2022)
- Can syntax help? Improving an LSTM-based Sentence Compression Model for New Domains
  by: WANG, Liangguo, et al.
  Published: (2017)
- Does BERT understand idioms? A probing-based empirical study of BERT encodings of idioms
  by: TAN, Minghuan, et al.
  Published: (2021)
- Translate-train embracing translationese artifacts
  by: YU, Sicheng, et al.
  Published: (2022)
- Plan-and-solve prompting: Improving zero-shot chain-of-thought reasoning by large language models
  by: WANG, Lei, et al.
  Published: (2023)