Is a pretrained model the answer to situational awareness detection on social media?
Social media can be valuable for extracting information about an event or incident on the ground. However, the vast amount of content shared and the linguistic variants of language used on social media make it challenging to identify important situational awareness content to aid in decision-making for first responders. In this study, we assess whether pretrained models can be used to address the aforementioned challenges on social media. Various pretrained models, including static word embeddings (such as Word2Vec and GloVe) and contextualized word embeddings (such as DistilBERT), are studied in detail. According to our findings, a vanilla DistilBERT pretrained language model is insufficient to identify situational awareness information. Fine-tuning with datasets of various event types, combined with vocabulary extension, is essential to adapt a DistilBERT model for real-world situational awareness detection.
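The abstract names two adaptations that proved necessary: extending DistilBERT's vocabulary with domain terms and fine-tuning it on data from several event types. As a rough illustration only, the sketch below shows what those two steps can look like with the Hugging Face transformers library; the added tokens, the toy tweets, the labels, and the hyperparameters are assumptions for illustration, not the authors' actual setup.

```python
# Minimal sketch, assuming Hugging Face transformers + PyTorch are installed.
# Illustrates vocabulary extension followed by fine-tuning DistilBERT as a
# binary "situational awareness vs. other" classifier.
import torch
from transformers import (DistilBertTokenizerFast,
                          DistilBertForSequenceClassification,
                          Trainer, TrainingArguments)

tokenizer = DistilBertTokenizerFast.from_pretrained("distilbert-base-uncased")
model = DistilBertForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2)

# Vocabulary extension: add out-of-vocabulary crisis terms (illustrative picks),
# then resize the embedding matrix so the new tokens get trainable vectors.
new_tokens = ["floodwatch", "evacuees", "#prayfor"]
tokenizer.add_tokens(new_tokens)
model.resize_token_embeddings(len(tokenizer))

# Toy labelled tweets; 1 = situational awareness content, 0 = other.
texts = ["Bridge on Main St is flooded, avoid the area", "Good morning everyone!"]
labels = [1, 0]
enc = tokenizer(texts, truncation=True, padding=True, return_tensors="pt")

class TweetDataset(torch.utils.data.Dataset):
    """Wraps tokenized tweets and labels for the Trainer API."""
    def __init__(self, encodings, labels):
        self.encodings, self.labels = encodings, labels
    def __len__(self):
        return len(self.labels)
    def __getitem__(self, idx):
        item = {k: v[idx] for k, v in self.encodings.items()}
        item["labels"] = torch.tensor(self.labels[idx])
        return item

# Fine-tuning: in the paper's setting the training set would span multiple
# event types (floods, earthquakes, etc.), not two example tweets.
trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="sa_model", num_train_epochs=3,
                           per_device_train_batch_size=8),
    train_dataset=TweetDataset(enc, labels),
)
trainer.train()
```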
Main Authors: | LO, Siaw Ling; LEE, Kahhe; ZHANG, Yuhao |
---|---|
Format: | text |
Language: | English |
Published: | Institutional Knowledge at Singapore Management University, 2023 |
Subjects: | pretrained models; situational awareness; BERT; fine tuning; vocabulary extension; Databases and Information Systems; Social Media |
Online Access: | https://ink.library.smu.edu.sg/sis_research/7761 https://ink.library.smu.edu.sg/context/sis_research/article/8764/viewcontent/HICSS_pretrained_models_final.pdf |
Institution: | Singapore Management University |
id | sg-smu-ink.sis_research-8764 |
---|---|
record_format | dspace |
spelling | sg-smu-ink.sis_research-8764 2023-03-31T00:44:10Z Is a pretrained model the answer to situational awareness detection on social media? LO, Siaw Ling LEE, Kahhe ZHANG, Yuhao Social media can be valuable for extracting information about an event or incident on the ground. However, the vast amount of content shared, and the linguistic variants of languages used on social media make it challenging to identify important situational awareness content to aid in decision-making for first responders. In this study, we assess whether pretrained models can be used to address the aforementioned challenges on social media. Various pretrained models, including static word embedding (such as Word2Vec and GloVe) and contextualized word embedding (such as DistilBERT) are studied in detail. According to our findings, a vanilla DistilBERT pretrained language model is insufficient to identify situation awareness information. Fine-tuning by using datasets of various event types and vocabulary extension is essential to adapt a DistilBERT model for real-world situational awareness detection. 2023-01-01T08:00:00Z text application/pdf https://ink.library.smu.edu.sg/sis_research/7761 https://ink.library.smu.edu.sg/context/sis_research/article/8764/viewcontent/HICSS_pretrained_models_final.pdf http://creativecommons.org/licenses/by-nc-nd/4.0/ Research Collection School Of Computing and Information Systems eng Institutional Knowledge at Singapore Management University pretrained models situational awareness BERT fine tuning vocabulary extension Databases and Information Systems Social Media |
institution | Singapore Management University |
building | SMU Libraries |
continent | Asia |
country | Singapore |
content_provider | SMU Libraries |
collection | InK@SMU |
language | English |
topic | pretrained models; situational awareness; BERT; fine tuning; vocabulary extension; Databases and Information Systems; Social Media |
description | Social media can be valuable for extracting information about an event or incident on the ground. However, the vast amount of content shared and the linguistic variants of language used on social media make it challenging to identify important situational awareness content to aid in decision-making for first responders. In this study, we assess whether pretrained models can be used to address the aforementioned challenges on social media. Various pretrained models, including static word embeddings (such as Word2Vec and GloVe) and contextualized word embeddings (such as DistilBERT), are studied in detail. According to our findings, a vanilla DistilBERT pretrained language model is insufficient to identify situational awareness information. Fine-tuning with datasets of various event types, combined with vocabulary extension, is essential to adapt a DistilBERT model for real-world situational awareness detection. |
format | text |
author | LO, Siaw Ling; LEE, Kahhe; ZHANG, Yuhao |
title | Is a pretrained model the answer to situational awareness detection on social media? |
publisher | Institutional Knowledge at Singapore Management University |
publishDate | 2023 |
url | https://ink.library.smu.edu.sg/sis_research/7761 https://ink.library.smu.edu.sg/context/sis_research/article/8764/viewcontent/HICSS_pretrained_models_final.pdf |
_version_ | 1770576469037154304 |