Prompt sensitivity of transformer variants for text classification
This study investigates the sensitivity of three Transformer architectures, encoder-only (BERT), decoder-only (GPT-2), and encoder-decoder (T5), to different types of prompt modifications on text classification tasks. Using a fine-tuning approach, the models were evaluated...
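The core procedure the abstract describes, wrapping one fixed input in several prompt templates and comparing a fine-tuned classifier's outputs, might look roughly like the sketch below; the checkpoint name, templates, and example text are illustrative assumptions, not the project's actual setup.

    # A minimal sketch of prompt-sensitivity evaluation, assuming a
    # Hugging Face text-classification model; replace "bert-base-uncased"
    # with a checkpoint actually fine-tuned for the task.
    from transformers import pipeline

    classifier = pipeline("text-classification", model="bert-base-uncased")

    text = "The battery lasts all day and the screen is gorgeous."

    # Paraphrased prompt templates wrapping the same input.
    templates = [
        "Review: {x}",
        "Classify the sentiment of this review: {x}",
        "{x} Overall, the sentiment is",
    ]

    # A prompt-sensitive model changes its label or confidence as the
    # template changes, even though the underlying content is fixed.
    for t in templates:
        pred = classifier(t.format(x=text))[0]
        print(f"{t!r:55} -> {pred['label']} ({pred['score']:.3f})")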
Main Author: Ong, Li Han
Other Authors: Wang Wenya
Format: Final Year Project
Language: English
Published: Nanyang Technological University, 2024
Online Access: https://hdl.handle.net/10356/181519
Similar Items
- ClusterPrompt: Cluster semantic enhanced prompt learning for new intent discovery
  by: LIANG, Jinggui, et al.
  Published: (2023)
- Deep learning techniques for hate speech detection
  by: Teng, Yen Fong
  Published: (2024)
- TrueGPT: can you privately extract algorithms from ChatGPT in tabular classification?
  by: Soegeng, Hans Farrell
  Published: (2024)
- Generative AI and education
  by: Chieng, Shannon Shuen Ern
  Published: (2024)
- Cost-sensitive online classification
  by: WANG, Jialei, et al.
  Published: (2012)