Crowdsourcing is a market of steadily growing importance on which both academia and industry increasingly rely. However, this market appears to be inherently infested with a significant share of malicious workers who try to maximise their profits through cheating or sloppiness, undermining the very merits crowdsourcing has come to represent. Based on previous experience as well as psychological insights, we propose the use of a game to attract and retain a larger share of reliable workers for frequently requested crowdsourcing tasks such as relevance assessment and clustering. In a large-scale comparative study conducted on recent TREC data, we investigate the performance of traditional HIT designs and a game-based alternative that achieves high quality at significantly lower pay rates while attracting fewer malicious submissions.
ACM
Annual ACM SIGIR Conference
Human-Centered Data Analytics

Eickhoff, C., Harris, C. G., de Vries, A., & Srinivasan, P. (2012). Quality through Flow and Immersion: Gamifying Crowdsourced Relevance Assessments. In Proceedings of the 35th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR 2012). ACM.