Prompt-based learning for text classification in natural language processing


Bibliographic Details
Main Author: Xie, Yuanli
Other Authors: Mao, Kezhi
Format: Thesis-Master by Coursework
Language: English
Published: Nanyang Technological University 2025
Subjects:
Online Access:https://hdl.handle.net/10356/182694
Institution: Nanyang Technological University
Description
Summary: Prompt-based learning represents a novel paradigm in natural language processing (NLP) that enables the repurposing of pre-trained models for a wide range of downstream tasks without requiring additional supervised training. As a departure from traditional supervised learning, prompt-based learning leverages carefully designed prompts to guide model behavior, offering a flexible and efficient alternative for tasks such as text classification. This dissertation investigates the application of prompt-based learning to text classification, focusing on its effectiveness in optimizing the performance of large pre-trained models. Through a series of controlled experiments, it systematically examines the influence of different prompt designs on model accuracy and generalization. By analyzing these findings, this research contributes to the growing body of knowledge on prompt engineering and emphasizes the transformative potential of prompt-based learning in NLP.
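
To make the idea concrete, the following is a minimal sketch (not taken from the thesis) of cloze-style prompt-based classification: an input is wrapped in a prompt template containing a [MASK] slot, a verbalizer maps each class label to a candidate word for that slot, and the label whose word scores highest is returned. The template, verbalizer, and the keyword-overlap scorer are all hypothetical; in practice the scorer would be replaced by the [MASK]-position logits of a pretrained masked language model.

```python
# Illustrative sketch of prompt-based text classification (cloze style).
# All names here are hypothetical examples, not the thesis's actual setup.

# Prompt template: the input text plus a cloze question with a [MASK] slot.
TEMPLATE = "{text} This article is about [MASK]."

# Verbalizer: maps each class label to a word the model could predict at [MASK].
VERBALIZER = {"sports": "sports", "technology": "technology"}

# Toy keyword lists standing in for a pretrained masked language model;
# a real system would score label words with the MLM's [MASK] logits.
KEYWORDS = {
    "sports": {"team", "match", "goal", "player", "season"},
    "technology": {"software", "chip", "startup", "algorithm", "device"},
}

def mask_scores(prompt, label_words):
    """Stand-in scorer: count keyword overlap between prompt and each label word."""
    tokens = set(prompt.lower().replace(".", " ").split())
    return {w: len(tokens & KEYWORDS[w]) for w in label_words}

def classify(text):
    """Wrap the text in the prompt, score label words, return the best label."""
    prompt = TEMPLATE.format(text=text)
    scores = mask_scores(prompt, list(VERBALIZER.values()))
    best_word = max(scores, key=scores.get)
    # Invert the verbalizer to recover the class label from the predicted word.
    return next(label for label, word in VERBALIZER.items() if word == best_word)
```

Note that no gradient updates are involved: the pre-trained model is reused as-is, and only the prompt template and verbalizer are designed, which is the flexibility the summary refers to.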