In the age of fake news and filter bubbles, assessing the quality of information is a pressing issue: users need to understand the quality of the information they consume online. We report on an experiment aimed at understanding whether workers from the crowd can be a suitable alternative to experts for information quality assessment. Results show that the data collected through crowdsourcing appear reliable. Agreement with the experts is not complete, but in a task this complex and this dependent on the assessor’s background, that is expected and, to some extent, positive.

Additional Metadata
Series: CEUR Workshop Proceedings
Conference: International Workshop on News Recommendation and Analytics (INRA 2018), in conjunction with the 27th ACM International Conference on Information and Knowledge Management (CIKM 2018)
Citation: Maddalena, E., Ceolin, D., & Mizzaro, S. (2019). Multidimensional news quality: A comparison of crowdsourcing and nichesourcing. In Proceedings of the CIKM 2018 Workshops co-located with the 27th ACM International Conference on Information and Knowledge Management.