Zero-shot text classification via self-supervised tuning
Existing solutions to zero-shot text classification either conduct prompting with pre-trained language models, which is sensitive to the choices of templates, or rely on large-scale annotated data of relevant tasks for meta-tuning. In this work, we propose a new paradigm based on self-supervised...
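The abstract's claim that prompting is sensitive to template choice can be made concrete with a short sketch. Below is a minimal illustration of the prompting baseline the abstract critiques, not the paper's proposed self-supervised tuning method; it uses the Hugging Face `transformers` zero-shot pipeline (an NLI-based classifier), and the model name, input text, labels, and templates are all illustrative assumptions.

```python
# Minimal sketch of prompt-based zero-shot classification, the baseline the
# abstract says is sensitive to template choice. Uses the Hugging Face
# `transformers` zero-shot pipeline (NLI-based), NOT the paper's
# self-supervised tuning method. Model, text, labels, and templates are
# illustrative assumptions.
from transformers import pipeline

classifier = pipeline("zero-shot-classification",
                      model="facebook/bart-large-mnli")

text = "The new graphics card doubles the frame rate at the same price."
labels = ["technology", "sports", "politics"]

# Two paraphrased templates; differing label scores across them illustrate
# the template sensitivity the abstract refers to.
for template in ["This example is about {}.", "The topic of this text is {}."]:
    result = classifier(text, labels, hypothesis_template=template)
    print(template, dict(zip(result["labels"], result["scores"])))
```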
Main Authors: Liu, Chaoqun; Zhang, Wenxuan; Chen, Guizhen; Wu, Xiaobao; Luu, Anh Tuan; Chang, Chip Hong; Bing, Lidong
Other Authors: Interdisciplinary Graduate School (IGS)
Format: Conference or Workshop Item
Language: English
Published: 2023
Online Access: https://hdl.handle.net/10356/168505 https://2023.aclweb.org/
Institution: Nanyang Technological University
Similar Items
- Improving zero-shot learning baselines with commonsense knowledge
  by: Roy, Abhinaba, et al.
  Published: (2023)
- Text mining with minimum human supervision
  by: Lim, Kewin Hong Kwan.
  Published: (2012)
- Zero-to-strong generalization: eliciting strong capabilities of large language models iteratively without gold labels
  by: Liu, Chaoqun, et al.
  Published: (2024)
- Zero-shot learning via category-specific visual-semantic mapping and label refinement
  by: Niu, Li, et al.
  Published: (2020)
- Modularized zero-shot VQA with pre-trained models
  by: CAO, Rui, et al.
  Published: (2023)