Debunking rumors on Twitter with tree transformer
Rumors are manufactured with no respect for accuracy, yet they can circulate quickly and widely by "word-of-post" through social media conversations. A conversation tree encodes important information indicative of a rumor's credibility. Existing conversation-based techniques for rumor detection either strictly follow tree edges or treat all posts as fully connected during feature learning. In this paper, we propose a novel detection model based on a tree transformer that better exploits user interactions in the dialogue, where post-level self-attention plays the key role in aggregating intra- and inter-subtree stances. Experimental results on the TWITTER and PHEME datasets show that the proposed approach consistently improves rumor detection performance.
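To make the described mechanism concrete, below is a minimal PyTorch sketch of post-level self-attention over a conversation tree: one attention pass restricted to posts within the same subtree, followed by a fully connected pass that aggregates stances across subtrees. This is an illustration under assumptions, not the authors' implementation; the class name, dimensions, masking scheme, and four-way output are hypothetical.

```python
# Hypothetical sketch of tree-structured post-level self-attention
# (not the paper's released code). Post embeddings are assumed
# precomputed; a boolean mask blocks attention across subtrees in
# the first pass, and a second unmasked pass mixes subtree stances.
import torch
import torch.nn as nn

class TreeTransformerSketch(nn.Module):
    def __init__(self, dim=128, heads=4, num_classes=4):
        super().__init__()
        self.intra = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.inter = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.classify = nn.Linear(dim, num_classes)

    def forward(self, posts, subtree_mask):
        # posts: (1, n_posts, dim) embeddings for one conversation
        # subtree_mask: (n_posts, n_posts) bool; True = attention blocked
        # (i.e. the two posts belong to different subtrees)
        h, _ = self.intra(posts, posts, posts, attn_mask=subtree_mask)
        h, _ = self.inter(h, h, h)            # fully connected pass
        return self.classify(h.mean(dim=1))   # pool posts -> rumor class

# Toy conversation of 5 posts: root 0 with subtrees {1, 2} and {3, 4};
# cross-subtree attention is blocked in the intra pass.
posts = torch.randn(1, 5, 128)
mask = torch.zeros(5, 5, dtype=torch.bool)
mask[1:3, 3:5] = mask[3:5, 1:3] = True
logits = TreeTransformerSketch()(posts, mask)  # shape (1, 4)
```

The two-pass design mirrors the contrast drawn in the abstract: the masked pass honors tree structure rather than treating all posts as fully connected, while the second pass still lets evidence flow between subtrees.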
| Main Authors: | MA, Jing; GAO, Wei |
|---|---|
| Format: | text |
| Language: | English |
| Published: | Institutional Knowledge at Singapore Management University, 2020 |
| Subjects: | Databases and Information Systems |
| Collection: | Research Collection School Of Computing and Information Systems |
| License: | http://creativecommons.org/licenses/by-nc-nd/4.0/ |
| Online Access: | https://ink.library.smu.edu.sg/sis_research/5599 https://ink.library.smu.edu.sg/context/sis_research/article/6602/viewcontent/2020.coling_main.476.pdf |
| Institution: | Singapore Management University |