Your cursor reveals: On analyzing workers’ browsing behavior and annotation quality in crowdsourcing tasks

In this work, we investigate the connection between browsing behavior and task quality of crowdsourcing workers performing annotation tasks that require information judgements. Such information judgements are often required to derive ground truth answers to information retrieval queries. We explore...

Full description

Bibliographic Details
Main Authors: LO, Pei-chi, LIM, Ee-peng
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2023
Subjects: Crowdsourcing; Machine Learning; Annotations; User Modeling; Empirical Study; Databases and Information Systems; Numerical Analysis and Scientific Computing
Online Access: https://ink.library.smu.edu.sg/sis_research/7950
https://ink.library.smu.edu.sg/context/sis_research/article/8953/viewcontent/Your_Cursor_Reveals_pvoa_cc_by.pdf
Institution: Singapore Management University
Language: English
id sg-smu-ink.sis_research-8953
record_format dspace
spelling sg-smu-ink.sis_research-8953
last_indexed 2023-08-15T01:21:14Z
date_available 2023-10-01T07:00:00Z
format text application/pdf
url https://ink.library.smu.edu.sg/sis_research/7950
doi info:doi/10.1109/ACCESS.2022.3212080
fulltext https://ink.library.smu.edu.sg/context/sis_research/article/8953/viewcontent/Your_Cursor_Reveals_pvoa_cc_by.pdf
license http://creativecommons.org/licenses/by/3.0/
collection Research Collection School Of Computing and Information Systems
language eng
publisher Institutional Knowledge at Singapore Management University
topic Crowdsourcing; Machine Learning; Annotations; User Modeling; Empirical Study; Databases and Information Systems; Numerical Analysis and Scientific Computing
institution Singapore Management University
building SMU Libraries
continent Asia
country Singapore
content_provider SMU Libraries
collection InK@SMU
language English
topic Crowdsourcing
Machine Learning
Annotations
User Modeling
Empirical Study
Databases and Information Systems
Numerical Analysis and Scientific Computing
description In this work, we investigate the connection between the browsing behavior and task quality of crowdsourcing workers performing annotation tasks that require information judgements. Such information judgements are often required to derive ground-truth answers to information retrieval queries. We explore the use of workers’ browsing behavior to directly determine the quality of their annotation results. We hypothesize that user attention is the main factor contributing to a worker’s annotation quality. To predict annotation quality at the task level, we model two aspects of task-specific user attention, namely general and semantic user attention. Both aspects can be modeled using different types of browsing behavior features, but previous research focuses mostly on the former. This work therefore proposes to model semantic user attention by capturing the worker’s understanding of task content using task-semantics-specific behavior features. We develop a web-based annotation interface to gather user behavior data while workers perform a knowledge path retrieval task. With the collected data, we train several prediction models using behavior features corresponding to different aspects of user attention, and conduct experiments on a set of annotation tasks performed by 51 Amazon Mechanical Turk workers. We show that the prediction model using both general and semantic user attention features achieves the best performance of nearly 75% accuracy.
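The abstract describes predicting per-task annotation quality from browsing-behavior features that capture general attention (e.g., dwell time, cursor movement) and semantic attention (focus on task-relevant content). As a rough illustration only, and not the authors' code: the feature names, thresholds, and rule-based predictor below are invented stand-ins for the paper's trained models. The sketch shows how such features might be extracted from cursor logs:

```python
# Hypothetical sketch of attention-feature extraction from cursor events.
# All names and thresholds here are illustrative assumptions, not the
# paper's actual features or models.
from dataclasses import dataclass
from math import hypot

@dataclass
class CursorEvent:
    x: float
    y: float
    t: float             # seconds since task start
    on_task_term: bool   # cursor hovering over task-relevant text

def extract_features(events):
    """General attention: total dwell time and cursor travel distance.
    Semantic attention: fraction of dwell time over task-relevant terms."""
    dwell = events[-1].t - events[0].t if len(events) > 1 else 0.0
    travel = sum(hypot(b.x - a.x, b.y - a.y) for a, b in zip(events, events[1:]))
    sem_time = sum(b.t - a.t for a, b in zip(events, events[1:]) if a.on_task_term)
    sem_frac = sem_time / dwell if dwell else 0.0
    return {"dwell_s": dwell, "travel_px": travel, "semantic_frac": sem_frac}

def predict_quality(features, sem_threshold=0.4, dwell_threshold=5.0):
    """Toy rule standing in for a trained classifier: a worker who spends
    enough time on the task AND focuses on task-relevant content is
    predicted to produce a good annotation."""
    return (features["dwell_s"] >= dwell_threshold
            and features["semantic_frac"] >= sem_threshold)

# attentive worker: slow pace, mostly hovering over task-relevant terms
attentive = [CursorEvent(10 * i, 5 * i, 1.2 * i, on_task_term=i % 3 != 0)
             for i in range(10)]
# rushed worker: finishes in under two seconds, never on task content
rushed = [CursorEvent(50 * i, 40 * i, 0.2 * i, on_task_term=False)
          for i in range(10)]

print(predict_quality(extract_features(attentive)))  # True
print(predict_quality(extract_features(rushed)))     # False
```

In the paper itself these hand-set thresholds would be replaced by models trained on the behavior data collected from the 51 Amazon Mechanical Turk workers.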
format text
author LO, Pei-chi
LIM, Ee-peng
author_facet LO, Pei-chi
LIM, Ee-peng
author_sort LO, Pei-chi
title Your cursor reveals: On analyzing workers’ browsing behavior and annotation quality in crowdsourcing tasks
title_short Your cursor reveals: On analyzing workers’ browsing behavior and annotation quality in crowdsourcing tasks
title_full Your cursor reveals: On analyzing workers’ browsing behavior and annotation quality in crowdsourcing tasks
title_fullStr Your cursor reveals: On analyzing workers’ browsing behavior and annotation quality in crowdsourcing tasks
title_full_unstemmed Your cursor reveals: On analyzing workers’ browsing behavior and annotation quality in crowdsourcing tasks
title_sort your cursor reveals: on analyzing workers’ browsing behavior and annotation quality in crowdsourcing tasks
publisher Institutional Knowledge at Singapore Management University
publishDate 2023
url https://ink.library.smu.edu.sg/sis_research/7950
https://ink.library.smu.edu.sg/context/sis_research/article/8953/viewcontent/Your_Cursor_Reveals_pvoa_cc_by.pdf
_version_ 1779156903522205696