LOW-RESOURCE CLICKBAIT SPOILING FOR INDONESIAN USING MULTILINGUAL PRE-TRAINED LANGUAGE MODELS
Clickbait spoiling is a recently introduced task that aims to generate spoilers for posts or headlines containing clickbait. Previous research addressed the task for English by framing it as question answering. The best resulting model showed promising performance, with a BERTScore of 77.03 for the p...
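The abstract frames clickbait spoiling as extractive question answering over the linked article, evaluated with BERTScore. The sketch below illustrates that formulation with the Hugging Face transformers pipeline and the bert-score package; the checkpoint name, the Indonesian example text, and the reference spoiler are illustrative assumptions, not the models or data used in the thesis.

```python
# Minimal sketch of the QA formulation described above: the clickbait
# post plays the role of the question, the linked article is the
# context, and the extracted answer span is the spoiler.
# The checkpoint and the example text are illustrative assumptions.
from transformers import pipeline
from bert_score import score

qa = pipeline(
    "question-answering",
    model="deepset/xlm-roberta-large-squad2",  # assumed multilingual QA checkpoint
)

post = "Trik sederhana ini bikin baterai ponselmu awet seharian"
article = (
    "Menurut pakar, menurunkan kecerahan layar dan mematikan "
    "sinkronisasi latar belakang adalah cara paling efektif "
    "untuk menghemat baterai ponsel."
)

prediction = qa(question=post, context=article)
spoiler = prediction["answer"]

# Score the predicted spoiler against a reference spoiler with BERTScore,
# the metric cited in the abstract (multilingual BERT backbone assumed).
reference = ["menurunkan kecerahan layar dan mematikan sinkronisasi latar belakang"]
P, R, F1 = score([spoiler], reference, model_type="bert-base-multilingual-cased")
print(spoiler, round(F1.item(), 4))
```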
Main Author: Putu Intan Maharani, Ni
Format: Theses
Language: Indonesian
Online Access: https://digilib.itb.ac.id/gdl/view/80973
Institution: Institut Teknologi Bandung
Similar Items
- On the transferability of pre-trained language models for low-resource programming languages
  by: CHEN, Fuxiang, et al.
  Published: (2022)
- Clickbait: Fake News and Role of the State
  by: Ang, Benjamin, et al.
  Published: (2017)
- Spoiled brat: a profile.
  by: Caguiat, Elaine T.
  Published: (2008)
- “This Will Blow Your Mind”: examining the urge to click clickbaits
  by: Chua, Alton Yeow Kuan, et al.
  Published: (2022)
- Multilingual sentiment analysis: From formal to informal and scarce resource languages
  by: LO, Siaw Ling, et al.
  Published: (2017)