COSY: COunterfactual SYntax for cross-lingual understanding
Pre-trained multilingual language models, e.g., multilingual BERT, are widely used in cross-lingual tasks and yield state-of-the-art performance. However, such models suffer from a large performance gap between source and target languages, especially in the zero-shot setting, where the models ar...
Main Authors: YU, Sicheng; ZHANG, Hao; NIU, Yulei; SUN, Qianru; JIANG, Jing
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2021
Online Access: https://ink.library.smu.edu.sg/sis_research/6510 https://ink.library.smu.edu.sg/context/sis_research/article/7513/viewcontent/2021.acl_long.48.pdf
Institution: Singapore Management University
Similar Items
- Interventional training for out-of-distribution natural language understanding
  by: YU, Sicheng, et al.
  Published: (2022)
- Exploiting query logs for cross-lingual query suggestions
  by: GAO, Wei, et al.
  Published: (2010)
- Cross-lingual query suggestion using query logs of different languages
  by: GAO, Wei, et al.
  Published: (2007)
- Robustness and cross-lingual transfer: An exploration of out-of-distribution scenario in natural language processing
  by: YU, Sicheng
  Published: (2022)
- Personalized microblog sentiment classification via adversarial cross-lingual learning
  by: WANG, Weichao, et al.
  Published: (2018)