Ranking-aware contrastive learning with large language models

Generating high-quality word and sentence representations is a foundational task in natural language processing (NLP). In recent years, various embedding methodologies have been proposed, notably those leveraging the capabilities of large language models for in-context learning. Research has shown that language model performance can be enhanced by integrating a query with multiple examples. Inspired by this research, this project explores the use of a contrastive learning framework combined with ranking knowledge to enhance the generation and retrieval of sentence embeddings, aiming to more accurately identify the most similar sentences in in-context learning scenarios. Subsequent experiments tested various ranking strategies within the contrastive learning framework, yielding novel insights and conclusions.
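The abstract describes combining a contrastive objective with ranking knowledge to train sentence embeddings for in-context example retrieval. The thesis's actual implementation is not part of this record; below is a minimal illustrative sketch in PyTorch, assuming an InfoNCE-style contrastive term plus a listwise ranking-distillation term. The function name ranking_aware_contrastive_loss, the teacher_scores input, and the weight alpha are hypothetical, not taken from the thesis.

    # Minimal sketch (not the thesis's actual method): a contrastive loss
    # augmented with a ranking-distillation term from a teacher ranker.
    import torch
    import torch.nn.functional as F

    def ranking_aware_contrastive_loss(anchor, candidates, teacher_scores,
                                       temperature=0.05, alpha=0.5):
        # anchor: (d,) query embedding
        # candidates: (n, d) candidate sentence embeddings
        # teacher_scores: (n,) relevance scores from a teacher ranker (assumed)
        sims = F.cosine_similarity(anchor.unsqueeze(0), candidates) / temperature  # (n,)
        # InfoNCE-style term: the teacher's top-ranked candidate is the
        # positive; the remaining candidates act as negatives.
        target = teacher_scores.argmax().unsqueeze(0)             # (1,)
        contrastive = F.cross_entropy(sims.unsqueeze(0), target)  # scalar
        # Listwise distillation term: KL divergence between the teacher's
        # ranking distribution and the student's similarity distribution.
        student_logp = F.log_softmax(sims, dim=-1)
        teacher_p = F.softmax(teacher_scores / temperature, dim=-1)
        distill = F.kl_div(student_logp, teacher_p, reduction="sum")
        return contrastive + alpha * distill

    # Usage with random placeholder tensors:
    anchor = torch.randn(768)
    candidates = torch.randn(8, 768)
    teacher_scores = torch.randn(8)
    loss = ranking_aware_contrastive_loss(anchor, candidates, teacher_scores)

In this sketch, the teacher's similarity ordering (e.g., derived from an LLM's judgments) softly supervises the student's ranking over the candidate list, while the contrastive term pulls the top-ranked candidate toward the anchor.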


Bibliographic Details
Main Author: Hu, Yuqi
Other Authors: Lihui Chen
Format: Thesis-Master by Coursework
Language: English
Published: Nanyang Technological University 2024
Subjects: Computer and Information Science; Ranking consistency; Contrastive learning; Ranking distillation
Online Access:https://hdl.handle.net/10356/177983
Institution: Nanyang Technological University
School: School of Electrical and Electronic Engineering
Degree: Master's degree
Citation: Hu, Y. (2024). Ranking-aware contrastive learning with large language models. Master's thesis, Nanyang Technological University, Singapore. https://hdl.handle.net/10356/177983
Collection: DR-NTU, NTU Library