Relevance assessments of information retrieval results are often created by domain experts, whose expertise is typically expensive in terms of money or personal effort. The TREC 2011 Crowdsourcing Track aims to evaluate different strategies for crowdsourcing relevance judgements. This work describes the joint participation of Delft University of Technology and The University of Iowa. Using GeAnn, a term association game, we generate relevance judgements in an engaging way that encourages high-quality submissions, which would otherwise have to be motivated through rigid quality control mechanisms and additional incentives such as higher monetary rewards.
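The abstract does not spell out how game output is turned into judgements; as a minimal illustrative sketch (not GeAnn's actual pipeline), the following Python assumes that terms submitted by several players are aggregated by inter-player agreement and then compared against the topic keywords to produce a binary relevance label. The function name, parameters (`min_votes`, `threshold`), and the scoring rule are all hypothetical.

```python
from collections import Counter

def judge_relevance(player_terms, topic_terms, min_votes=2, threshold=0.3):
    """Aggregate term associations from several players into a binary
    relevance judgement for one topic-document pair.

    player_terms: list of term lists, one list per player session
    topic_terms:  set of keywords describing the TREC topic
    (Hypothetical aggregation, not the GeAnn implementation.)
    """
    # Count how many players independently produced each term;
    # agreement between players acts as an implicit quality filter.
    votes = Counter(t.lower() for terms in player_terms for t in set(terms))
    agreed = {term for term, n in votes.items() if n >= min_votes}

    # Score the document by the fraction of topic keywords covered
    # by the players' agreed-upon associations.
    topic = {t.lower() for t in topic_terms}
    overlap = len(agreed & topic) / len(topic) if topic else 0.0
    return overlap >= threshold, overlap


# Example: three players annotate one document for a topic on solar power.
players = [["solar", "energy", "panel"],
           ["sun", "solar", "power"],
           ["solar", "roof", "energy"]]
relevant, score = judge_relevance(players, {"solar", "energy", "power"})
print(relevant, round(score, 2))  # True 0.67
```

Requiring agreement between independent players (`min_votes`) is one plausible way such a game could encourage quality without the rigid control mechanisms mentioned above, since no single noisy submission can decide the label on its own.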
Text REtrieval Conference
Human-Centered Data Analytics

Eickhoff, C., Harris, C. G., Srinivasan, P., & de Vries, A. (2011). GeAnn at TREC 2011. In Proceedings of the 20th Text REtrieval Conference (TREC 2011).