The results of our exploratory study provide new insights into crowdsourcing knowledge-intensive tasks. We designed and performed an annotation task on a print collection of the Rijksmuseum Amsterdam, involving experts and crowd workers in the domain-specific description of depicted flowers. We created a testbed to collect annotations from flower experts and crowd workers, and analyzed these with regard to user agreement. The findings are promising, demonstrating how, for given categories, nichesourcing can provide useful annotations by connecting crowdsourcing to domain expertise.
ACM
International World Wide Web Conference
Human-Centered Data Analytics

Oosterman, J., Bozzon, A., Houben, G. J., Nottamkandath, A., Dijkshoorn, C., Aroyo, L., … Traub, M. (2014). Crowd vs Experts: Nichesourcing for Knowledge Intensive Tasks in Cultural Heritage. In Proceedings of the International World Wide Web Conference 2014 (WWW '14) (pp. 567–568). ACM.