In the age of fake news and filter bubbles, assessing the quality of information is a compelling issue: it is important for users to understand the quality of the information they consume online. We report on an experiment aimed at understanding whether workers from the crowd can be a suitable alternative to experts for information quality assessment. Results show that the data collected by crowdsourcing appear reliable. Agreement with the experts is not perfect, but in a task that is so complex and so dependent on the assessor's background, this is expected and, to some extent, positive.


Maddalena, E., Ceolin, D., & Mizzaro, S. (2018). Multidimensional news quality: A comparison of crowdsourcing and nichesourcing. In Proceedings of the 6th International Workshop on News Recommendation and Analytics (INRA 2018), co-located with CIKM 2018. CEUR Workshop Proceedings.