Contrastive learning approach to word-in-context task for low-resource languages

Bibliographic Details
Main Authors: LO, Pei-Chi, LEE, Yang-Yin, CHEN, Hsien-Hao, KWEE, Agus Trisnajaya, LIM, Ee-peng
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University 2023
Online Access:https://ink.library.smu.edu.sg/sis_research/8327
https://ink.library.smu.edu.sg/context/sis_research/article/9330/viewcontent/014_Contrastive_poster.pdf
Institution: Singapore Management University
Description
Summary: The word-in-context (WiC) task aims to determine whether a target word's occurrences in two sentences share the same sense. In this paper, we propose a Contrastive Learning WiC (CLWiC) framework to improve the learning of sentence/word representations and the classification of target word senses in the sentence pair when performing WiC on low-resource languages. In representation learning, CLWiC strengthens a pre-trained language model's ability to cope with low-resource languages using both unsupervised and supervised contrastive learning. The WiC classifier learning stage further fine-tunes the language model with a WiC classification loss under two classifier architecture options, SGBERT and WiSBERT, which use a single encoder and a dual encoder, respectively, to encode a WiC task instance. We evaluate models developed under the CLWiC framework on a new WiC dataset constructed for Singlish, a low-resource English creole used in Singapore, as well as on the standard English WiC benchmark dataset. Our experiments show that CLWiC-based models using both unsupervised and supervised contrastive learning outperform those not using contrastive learning, and this performance difference is more substantial on the Singlish dataset than on the English dataset. Unsupervised contrastive learning appears to improve WiC performance more than supervised contrastive learning. Finally, we show that a joint learning strategy achieves the best WiC performance.
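
To make the two ingredients described in the abstract concrete, below is a minimal PyTorch sketch, assuming a HuggingFace-style BERT encoder whose output exposes last_hidden_state. The loss shown is a standard InfoNCE objective of the kind commonly used for unsupervised contrastive learning (dropout-perturbed positives, in-batch negatives); the class names SingleEncoderWiC and DualEncoderWiC are illustrative stand-ins for SGBERT and WiSBERT, whose exact architectures are not detailed in this record.

# Hypothetical sketch of the two CLWiC ingredients: (1) an unsupervised
# contrastive (InfoNCE-style) loss over sentence embeddings, and
# (2) single-encoder vs. dual-encoder WiC classifier heads.
import torch
import torch.nn as nn
import torch.nn.functional as F


def info_nce_loss(z1: torch.Tensor, z2: torch.Tensor,
                  temperature: float = 0.05) -> torch.Tensor:
    """Unsupervised contrastive loss: z1[i] and z2[i] are two dropout-
    perturbed encodings of the same sentence (positives); all other
    pairs in the batch act as negatives."""
    z1 = F.normalize(z1, dim=-1)
    z2 = F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / temperature           # (B, B) cosine similarities
    labels = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, labels)


class SingleEncoderWiC(nn.Module):
    """Single-encoder variant (SGBERT-like): both sentences are packed
    into one input sequence and a binary head predicts same/different
    sense for the target word."""

    def __init__(self, encoder: nn.Module, hidden: int = 768):
        super().__init__()
        self.encoder = encoder                   # assumed BERT-style model
        self.head = nn.Linear(hidden, 2)         # same-sense vs. different-sense

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        cls = out.last_hidden_state[:, 0]        # [CLS] token representation
        return self.head(cls)


class DualEncoderWiC(nn.Module):
    """Dual-encoder variant (WiSBERT-like, Sentence-BERT style): each
    sentence is encoded independently; the head sees [u; v; |u - v|]."""

    def __init__(self, encoder: nn.Module, hidden: int = 768):
        super().__init__()
        self.encoder = encoder
        self.head = nn.Linear(3 * hidden, 2)

    def encode(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        return out.last_hidden_state[:, 0]

    def forward(self, ids1, mask1, ids2, mask2):
        u = self.encode(ids1, mask1)
        v = self.encode(ids2, mask2)
        return self.head(torch.cat([u, v, (u - v).abs()], dim=-1))

In a joint learning setup of the kind the abstract alludes to, the contrastive loss and the WiC classification loss would be combined (e.g., summed) during fine-tuning rather than applied in separate stages; the weighting between them is a design choice not specified here.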