Active crowdsourcing for annotation
Crowdsourcing has shown great potential for obtaining large-scale labels cheaply across different tasks. However, obtaining reliable labels is challenging for several reasons, including noisy annotators and limited budgets. State-of-the-art approaches either suffer in some noisy scenarios,...
Main Authors: HAO, Shuji; MIAO, Chunyan; HOI, Steven C. H.; ZHAO, Peilin
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2015
Online Access: https://ink.library.smu.edu.sg/sis_research/3173
https://ink.library.smu.edu.sg/context/sis_research/article/4174/viewcontent/Active_Crowdsourcing_for_Annotation_accepted.pdf
Institution: Singapore Management University
Similar Items
- Online active learning with expert advice
  by: HAO, Shuji, et al.
  Published: (2018)
- SOAL: Second-order Online Active Learning
  by: HAO, Shuji, et al.
  Published: (2017)
- Online Passive Aggressive Active Learning and its Applications
  by: LU, Jing, et al.
  Published: (2014)
- Your cursor reveals: On analyzing workers' browsing behavior and annotation quality in crowdsourcing tasks
  by: LO, Pei-chi, et al.
  Published: (2023)
- Learning relative similarity from data streams: Active online learning approaches
  by: HAO, Shuji, et al.
  Published: (2015)