BlendCSE: blend contrastive learnings for sentence embeddings with rich semantics and transferability


Bibliographic Details
Main Authors: Xu, Jiahao, Soh, Charlie Zhanyi, Xu, Liwen, Chen, Lihui
Other Authors: School of Electrical and Electronic Engineering
Format: Article
Language: English
Published: 2024
Online Access:https://hdl.handle.net/10356/173052
Institution: Nanyang Technological University
id sg-ntu-dr.10356-173052
record_format dspace
spelling sg-ntu-dr.10356-173052 2024-01-12T15:41:44Z BlendCSE: blend contrastive learnings for sentence embeddings with rich semantics and transferability
Submitted/Accepted version 2024-01-10T05:39:17Z 2024-01-10T05:39:17Z 2024 Journal Article
Citation: Xu, J., Soh, C. Z., Xu, L. & Chen, L. (2024). BlendCSE: blend contrastive learnings for sentence embeddings with rich semantics and transferability. Expert Systems With Applications, 238, 121909. https://dx.doi.org/10.1016/j.eswa.2023.121909
ISSN: 0957-4174
Online Access: https://hdl.handle.net/10356/173052
DOI: 10.1016/j.eswa.2023.121909
Scopus: 2-s2.0-85174899547
Journal: Expert Systems with Applications, 238, 121909 (en)
© 2023 Elsevier Ltd. All rights reserved. This article may be downloaded for personal use only. Any other use requires prior permission of the copyright holder. The Version of Record is available online at http://doi.org/10.1016/j.eswa.2023.121909. application/pdf
institution Nanyang Technological University
building NTU Library
continent Asia
country Singapore
content_provider NTU Library
collection DR-NTU
language English
topic Engineering::Electrical and electronic engineering
Semantic Sentence Embedding
Contrastive Learning
spellingShingle Engineering::Electrical and electronic engineering
Semantic Sentence Embedding
Contrastive Learning
Xu, Jiahao
Soh, Charlie Zhanyi
Xu, Liwen
Chen, Lihui
BlendCSE: blend contrastive learnings for sentence embeddings with rich semantics and transferability
description Sentence representation is one of the most fundamental research topics in natural language processing (NLP), as its quality directly affects performance on various downstream tasks. Recent studies of sentence representations have established state-of-the-art (SOTA) performance on semantic representation tasks. However, the embeddings produced by those approaches exhibit unsatisfactory transferability when applied to specific applications, and little work has studied the transferability of semantic sentence embeddings. In this paper, we first explore the transferability characteristics of sentence embeddings, and then present BlendCSE, a new sentence embedding model targeting rich semantics and transferability. BlendCSE blends three recent advanced NLP learning methodologies, namely continued learning on masked language modeling (MLM), contrastive learning (CL) with data augmentations (DA), and semantic supervised learning. The main objectives of BlendCSE are to capture token/word-level information, diversified linguistic properties, and sentence semantics, respectively. Empirical studies demonstrate that BlendCSE captures semantics comparably well on STS tasks, yet surpasses existing methods on various transfer tasks, yielding even stronger transferability on document-level applications. Ablation studies verify that the three learning objectives synergize well to capture semantics and transferability effectively.
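The abstract describes blending an MLM objective, an in-batch contrastive objective, and a supervised semantic objective. This record does not give the exact loss formulation, so the following is only a minimal numpy sketch of a standard InfoNCE contrastive term plus a hypothetical weighted sum of the three losses; the function names, the blending weights `w_cl`/`w_sup`, and the temperature value are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def info_nce(anchors, positives, temperature=0.05):
    """InfoNCE contrastive loss: each anchor's positive is the matching row
    in `positives`; the other rows in the batch act as negatives."""
    # Cosine similarities between all anchor/positive pairs, scaled by temperature.
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    sim = (a @ p.T) / temperature                  # shape: (batch, batch)
    # Cross-entropy with the diagonal (matched pair) as the target class.
    logsumexp = np.log(np.exp(sim).sum(axis=1))
    return float(np.mean(logsumexp - np.diag(sim)))

def blended_loss(l_mlm, l_cl, l_sup, w_cl=1.0, w_sup=1.0):
    """Hypothetical weighted blend of the three training objectives."""
    return l_mlm + w_cl * l_cl + w_sup * l_sup
```

With matched anchor/positive pairs the contrastive term is near zero, while mismatched pairs drive it up, which is the pressure that pulls augmented views of the same sentence together in embedding space.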
author2 School of Electrical and Electronic Engineering
author_facet School of Electrical and Electronic Engineering
Xu, Jiahao
Soh, Charlie Zhanyi
Xu, Liwen
Chen, Lihui
format Article
author Xu, Jiahao
Soh, Charlie Zhanyi
Xu, Liwen
Chen, Lihui
author_sort Xu, Jiahao
title BlendCSE: blend contrastive learnings for sentence embeddings with rich semantics and transferability
title_short BlendCSE: blend contrastive learnings for sentence embeddings with rich semantics and transferability
title_full BlendCSE: blend contrastive learnings for sentence embeddings with rich semantics and transferability
title_fullStr BlendCSE: blend contrastive learnings for sentence embeddings with rich semantics and transferability
title_full_unstemmed BlendCSE: blend contrastive learnings for sentence embeddings with rich semantics and transferability
title_sort blendcse: blend contrastive learnings for sentence embeddings with rich semantics and transferability
publishDate 2024
url https://hdl.handle.net/10356/173052
_version_ 1789483106400468992