Augmenting low-resource text classification with graph-grounded pre-training and prompting
Text classification is a fundamental problem in information retrieval with many real-world applications, such as predicting the topics of online articles and the categories of e-commerce product descriptions. However, low-resource text classification, with few or no labeled samples, poses a serious concern for supervised learning. Meanwhile, many text data are inherently grounded on a network structure, such as a hyperlink/citation network for online articles, and a user-item purchase network for e-commerce products. These graph structures capture rich semantic relationships, which can potentially augment low-resource text classification. In this paper, we propose a novel model called Graph-Grounded Pre-training and Prompting (G2P2) to address low-resource text classification in a two-pronged approach. During pre-training, we propose three graph interaction-based contrastive strategies to jointly pre-train a graph-text model; during downstream classification, we explore prompting for the jointly pre-trained model to achieve low-resource classification. Extensive experiments on four real-world datasets demonstrate the strength of G2P2 in zero- and few-shot low-resource text classification tasks.
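The abstract describes a two-step approach: contrastively pre-train a graph encoder and a text encoder on the text-attributed network, then prompt the jointly pre-trained model for low-resource classification. As a rough illustration of the pre-training idea only (not the authors' released G2P2 code, which uses three graph interaction-based contrastive strategies), the minimal PyTorch sketch below aligns a toy one-layer GNN with a bag-of-embeddings text encoder through a single symmetric InfoNCE loss; all module names, dimensions, and the use of a single contrastive term are assumptions made for brevity.

```python
# Illustrative sketch only: CLIP-style text-node contrastive alignment between a
# graph encoder and a text encoder. Names and shapes are hypothetical; this is
# not the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TinyGNN(nn.Module):
    """One round of mean-neighbour aggregation followed by a linear projection."""
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.proj = nn.Linear(in_dim * 2, out_dim)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
        neigh = adj @ x / deg                       # mean of neighbour features
        return self.proj(torch.cat([x, neigh], dim=-1))


class TinyTextEncoder(nn.Module):
    """Bag-of-embeddings text encoder standing in for a transformer."""
    def __init__(self, vocab_size: int, out_dim: int):
        super().__init__()
        self.emb = nn.EmbeddingBag(vocab_size, out_dim, mode="mean")

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        return self.emb(token_ids)


def contrastive_loss(z_graph: torch.Tensor, z_text: torch.Tensor, tau: float = 0.07) -> torch.Tensor:
    """Symmetric InfoNCE: node i's graph embedding should match its own text."""
    zg = F.normalize(z_graph, dim=-1)
    zt = F.normalize(z_text, dim=-1)
    logits = zg @ zt.t() / tau                      # (N, N) similarity matrix
    labels = torch.arange(zg.size(0))
    return 0.5 * (F.cross_entropy(logits, labels) + F.cross_entropy(logits.t(), labels))


if __name__ == "__main__":
    n_nodes, feat_dim, vocab, dim = 8, 16, 100, 32
    x = torch.randn(n_nodes, feat_dim)                    # node features
    adj = (torch.rand(n_nodes, n_nodes) > 0.7).float()    # toy citation/purchase graph
    tokens = torch.randint(0, vocab, (n_nodes, 12))       # each node's associated text

    gnn, txt = TinyGNN(feat_dim, dim), TinyTextEncoder(vocab, dim)
    loss = contrastive_loss(gnn(x, adj), txt(tokens))
    print(f"toy pre-training loss: {loss.item():.4f}")
```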
Main Authors: WEN, Zhihao; FANG, Yuan
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2023
DOI: 10.1145/3539618.3591641
License: http://creativecommons.org/licenses/by-nc-nd/4.0/
Collection: Research Collection School Of Computing and Information Systems
Subjects: Text classification; graph neural networks; low-resource learning; pre-training; prompt-tuning; Artificial Intelligence and Robotics; Databases and Information Systems
Online Access: https://ink.library.smu.edu.sg/sis_research/8143
https://ink.library.smu.edu.sg/context/sis_research/article/9146/viewcontent/SIGIR23_G2P2.pdf
Institution: Singapore Management University
id
sg-smu-ink.sis_research-9146
record_format
dspace
institution
Singapore Management University
building
SMU Libraries
continent
Asia
country
Singapore
content_provider
SMU Libraries
collection
InK@SMU
language
English
topic
Text classification; graph neural networks; low-resource learning; pre-training; prompt-tuning; Artificial Intelligence and Robotics; Databases and Information Systems
description
Text classification is a fundamental problem in information retrieval with many real-world applications, such as predicting the topics of online articles and the categories of e-commerce product descriptions. However, low-resource text classification, with few or no labeled samples, poses a serious concern for supervised learning. Meanwhile, many text data are inherently grounded on a network structure, such as a hyperlink/citation network for online articles, and a user-item purchase network for e-commerce products. These graph structures capture rich semantic relationships, which can potentially augment low-resource text classification. In this paper, we propose a novel model called Graph-Grounded Pre-training and Prompting (G2P2) to address low-resource text classification in a two-pronged approach. During pre-training, we propose three graph interaction-based contrastive strategies to jointly pre-train a graph-text model; during downstream classification, we explore prompting for the jointly pre-trained model to achieve low-resource classification. Extensive experiments on four real-world datasets demonstrate the strength of G2P2 in zero- and few-shot low-resource text classification tasks.
format
text
author
WEN, Zhihao; FANG, Yuan
author_sort
WEN, Zhihao
title
Augmenting low-resource text classification with graph-grounded pre-training and prompting
publisher
Institutional Knowledge at Singapore Management University
publishDate
2023
url
https://ink.library.smu.edu.sg/sis_research/8143
https://ink.library.smu.edu.sg/context/sis_research/article/9146/viewcontent/SIGIR23_G2P2.pdf