COSY: COunterfactual SYntax for cross-lingual understanding
Pre-trained multilingual language models, e.g., multilingual-BERT, are widely used in cross-lingual tasks, yielding state-of-the-art performance. However, such models suffer from a large performance gap between source and target languages, especially in the zero-shot setting, where the models are fine-tuned only on English but tested on other languages for the same task. We tackle this issue by incorporating language-agnostic information, specifically universal syntax such as dependency relations and POS tags, into language models, based on the observation that universal syntax is transferable across different languages. Our approach, named COunterfactual SYntax (COSY), includes the design of SYntax-aware networks as well as a COunterfactual training method to implicitly force the networks to learn not only the semantics but also the syntax. To evaluate COSY, we conduct cross-lingual experiments on natural language inference and question answering using mBERT and XLM-R as network backbones. Our results show that COSY achieves state-of-the-art performance on both tasks without using any auxiliary datasets.
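The abstract names two components: syntax-aware networks that inject universal POS tags and dependency relations into a multilingual encoder, and a counterfactual training scheme that forces the model to actually use that syntax. The record does not include implementation details, so the following is only a minimal sketch of the general idea, assuming syntax ids come from a Universal Dependencies parser (e.g., stanza's `upos`/`deprel`), a simple embedding-fusion head over mBERT/XLM-R token states, and a masked-syntax counterfactual pass; every module name, dimension, and the loss form below is an illustrative assumption, not the authors' code.

```python
# Hypothetical sketch of a syntax-aware head with counterfactual training.
# NOT the COSY authors' implementation; all design choices here are assumed.
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_POS = 17   # Universal Dependencies UPOS tagset size
NUM_DEP = 37   # Universal Dependencies relation count


class SyntaxAwareHead(nn.Module):
    """Fuses contextual token states (e.g., from mBERT/XLM-R) with
    language-agnostic syntax embeddings before classification."""

    def __init__(self, hidden: int = 768, syn_dim: int = 64, num_labels: int = 3):
        super().__init__()
        # Reserve one extra index per vocabulary as the "masked syntax" id
        # used by the counterfactual pass.
        self.pos_emb = nn.Embedding(NUM_POS + 1, syn_dim)
        self.dep_emb = nn.Embedding(NUM_DEP + 1, syn_dim)
        self.classifier = nn.Linear(hidden + 2 * syn_dim, num_labels)

    def forward(self, token_states, pos_ids, dep_ids):
        # token_states: (batch, seq, hidden); pos_ids/dep_ids: (batch, seq)
        fused = torch.cat(
            [token_states, self.pos_emb(pos_ids), self.dep_emb(dep_ids)], dim=-1
        )
        return self.classifier(fused.mean(dim=1))  # mean-pool over tokens


def counterfactual_syntax_loss(head, token_states, pos_ids, dep_ids, labels):
    # Factual pass: real universal syntax from the parser.
    factual = head(token_states, pos_ids, dep_ids)
    # Counterfactual pass: same semantics, syntax replaced by the masked id.
    cf = head(
        token_states,
        torch.full_like(pos_ids, NUM_POS),
        torch.full_like(dep_ids, NUM_DEP),
    )
    # Training on the difference rewards predictions that change when syntax
    # is removed, implicitly pushing the head to rely on the syntax branch.
    return F.cross_entropy(factual - cf.detach(), labels)


if __name__ == "__main__":
    # Smoke test with random tensors standing in for encoder outputs.
    head = SyntaxAwareHead()
    states = torch.randn(2, 10, 768)
    pos = torch.randint(0, NUM_POS, (2, 10))
    dep = torch.randint(0, NUM_DEP, (2, 10))
    loss = counterfactual_syntax_loss(head, states, pos, dep, torch.tensor([0, 2]))
    loss.backward()
    print(float(loss))
```

In a real setup the token states would come from the fine-tuned mBERT or XLM-R backbone mentioned in the abstract, and the factual/counterfactual contrast could take other forms (e.g., a KL term between the two passes); the subtraction above is just one common way to isolate the syntax contribution.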
Main Authors: YU, Sicheng; ZHANG, Hao; NIU, Yulei; SUN, Qianru; JIANG, Jing
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2021
Subjects: Computational linguistics; Natural language processing systems; Semantics; Databases and Information Systems
Online Access: https://ink.library.smu.edu.sg/sis_research/6510
https://ink.library.smu.edu.sg/context/sis_research/article/7513/viewcontent/2021.acl_long.48.pdf
Institution: Singapore Management University
Collection: Research Collection School Of Computing and Information Systems (InK@SMU)
DOI: 10.18653/v1/2021.acl-long.48
License: Creative Commons BY-NC-ND 4.0 (http://creativecommons.org/licenses/by-nc-nd/4.0/)