Prompt sensitivity of transformer variants for text classification

This study investigates the sensitivity of three Transformer architectures, encoder-only (BERT), decoder-only (GPT-2), and encoder-decoder (T5), to various types of prompt modification on text classification tasks. Using a fine-tuning approach, the models were evaluated...

Full description

Bibliographic Details
Main Author: Ong, Li Han
Other Authors: Wang Wenya
Format: Final Year Project
Language: English
Published: Nanyang Technological University 2024
Subjects:
Online Access: https://hdl.handle.net/10356/181519
Institution: Nanyang Technological University