Your cursor reveals: On analyzing workers’ browsing behavior and annotation quality in crowdsourcing tasks
In this work, we investigate the connection between browsing behavior and task quality of crowdsourcing workers performing annotation tasks that require information judgements. Such information judgements are often required to derive ground truth answers to information retrieval queries. We explore...
| Main Authors: | LO, Pei-chi; LIM, Ee-peng |
| --- | --- |
| Format: | text |
| Language: | English |
| Published: | Institutional Knowledge at Singapore Management University, 2023 |
| Online Access: | https://ink.library.smu.edu.sg/sis_research/7950 ; https://ink.library.smu.edu.sg/context/sis_research/article/8953/viewcontent/Your_Cursor_Reveals_pvoa_cc_by.pdf |
| Institution: | Singapore Management University |
Similar Items

- Perspectives on crowdsourcing annotations for natural language processing
  by: Wang, A., et al.
  Published: (2013)
- Active crowdsourcing for annotation
  by: HAO, Shuji, et al.
  Published: (2015)
- Scalable urban mobile crowdsourcing: Handling uncertainty in worker movement
  by: CHENG, Shih-Fen, et al.
  Published: (2018)
- Trait motivations of crowdsourcing and task choice: A distal-proximal perspective
  by: Pee, Loo Geok, et al.
  Published: (2018)
- Multi-worker-aware task planning in real-time spatial crowdsourcing
  by: TAO, Qian, et al.
  Published: (2018)