Prompt sensitivity of transformer variants for text classification
This study investigates the sensitivity of three Transformer architectures, encoder-only (BERT), decoder-only (GPT-2), and encoder-decoder (T5), to different types of prompt modifications on text classification tasks. Using a fine-tuning approach, the models were evaluated...
| Main Author: | Ong, Li Han |
|---|---|
| Other Authors: | Wang Wenya |
| Format: | Final Year Project |
| Language: | English |
| Published: | Nanyang Technological University, 2024 |
| Online Access: | https://hdl.handle.net/10356/181519 |
| Institution: | Nanyang Technological University |
Similar Items
- ClusterPrompt: Cluster semantic enhanced prompt learning for new intent discovery
  by: LIANG, Jinggui, et al.
  Published: (2023)
- Deep learning techniques for hate speech detection
  by: Teng, Yen Fong
  Published: (2024)
- TrueGPT: can you privately extract algorithms from ChatGPT in tabular classification?
  by: Soegeng, Hans Farrell
  Published: (2024)
- Generative AI and education
  by: Chieng, Shannon Shuen Ern
  Published: (2024)
- Cost-sensitive online classification
  by: WANG, Jialei, et al.
  Published: (2012)