Your Cursor Reveals: On Analyzing Workers’ Browsing Behavior and Annotation Quality in Crowdsourcing Tasks
In this work, we investigate the connection between browsing behavior and task quality of crowdsourcing workers performing annotation tasks that require information judgements. Such information judgements are often required to derive ground truth answers to information retrieval queries. We explore...
| Main Authors: | Pei-Chi Lo, Ee-Peng Lim |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2025-01-01 |
| Series: | IEEE Access |
| Online Access: | https://ieeexplore.ieee.org/document/9911618/ |
Similar Items
- Experts ou (foule de) non-experts ? la question de l'expertise des annotateurs vue de la myriadisation (crowdsourcing) [Experts or (a crowd of) non-experts? The question of annotator expertise from the crowdsourcing perspective]
  by: Karën Fort
  Published: (2017-02-01)
- On the construction of a large-scale database of AI-assisted annotating lung ventilation-perfusion scintigraphy for pulmonary embolism (VQ4PEDB)
  by: Amir Jabbarpour, et al.
  Published: (2025-07-01)
- Privacy-preserving task matching scheme for crowdsourcing
  by: SONG Fuyuan, et al.
  Published: (2025-05-01)
- Correction: On the construction of a large-scale database of AI-assisted annotating lung ventilation-perfusion scintigraphy for pulmonary embolism (VQ4PEDB)
  by: Amir Jabbarpour, et al.
  Published: (2025-08-01)
- Boosting Crowdsourced Annotation Accuracy: Small Loss Filtering and Augmentation-Driven Training
  by: Yanming Fu, et al.
  Published: (2024-01-01)