Tasks that require users to have expert knowledge are difficult to crowdsource. They are often too complex to be carried out by non-experts, and the available experts in the crowd are difficult to target. Adapting an expert task into a non-expert user task, thereby enabling the ordinary "crowd" to accomplish it, can be a useful approach. We studied whether a simplified version of an expert annotation task can be carried out by non-expert users. Users conducted a game-style annotation task on oil paintings, and the obtained annotations were compared with those from experts. Our results show significant agreement between the annotations produced by experts and non-experts, that users improve over time, and that aggregating users' annotations per painting increases their precision.
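The paper itself is not reproduced here; as a hypothetical illustration of the aggregation result described in the abstract, the Python sketch below pools several non-expert tag sets per painting by majority vote and measures precision against an expert gold set. All function names, the data layout, and the voting threshold are assumptions for this sketch, not the authors' actual method.

```python
from collections import Counter

def aggregate_by_majority(user_annotations, min_votes=2):
    """Pool per-painting tags from several non-expert users.

    user_annotations: dict mapping painting_id -> list of tag lists,
    one tag list per user. A tag is kept when at least `min_votes`
    users assigned it (the threshold is an assumption for this sketch).
    """
    aggregated = {}
    for painting_id, tag_lists in user_annotations.items():
        counts = Counter(tag for tags in tag_lists for tag in set(tags))
        aggregated[painting_id] = {t for t, n in counts.items() if n >= min_votes}
    return aggregated

def precision(predicted, gold):
    """Fraction of aggregated tags that also appear in the expert gold set."""
    return len(predicted & gold) / len(predicted) if predicted else 0.0

# Toy example: three users annotate one painting; experts provide the gold set.
users = {"painting_1": [["portrait", "dark"], ["portrait", "oil"], ["portrait", "dark"]]}
experts = {"painting_1": {"portrait", "dark"}}

agg = aggregate_by_majority(users)
print(precision(agg["painting_1"], experts["painting_1"]))  # 1.0 for this toy data
```

In this toy run the tag "oil" is proposed by only one user and is filtered out, so the pooled tag set matches the expert set exactly, mirroring the abstract's claim that aggregation raises precision over individual annotations.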
M. de Rijke (Maarten), T. Kenter (Tom), A.P. de Vries (Arjen), F.M.G. de Jong (Franciska), C. Zhai (ChengXiang), K. Hofmann (Katja), K. Radinsky (Kira)
COMMIT: Socially Enriched Access to Linked Cultural Media (P06)
European Conference on Information Retrieval
Human-Centered Data Analytics

Traub, M., van Ossenbruggen, J., He, J., & Hardman, L. (2014). Measuring the effectiveness of gamesourcing expert oil painting annotations. In M. de Rijke, T. Kenter, A.P. de Vries, F.M.G. de Jong, C. Zhai, K. Hofmann, & K. Radinsky (Eds.), Advances in Information Retrieval: Proceedings of the 36th European Conference on IR Research (ECIR 2014).