In the age of fake news and filter bubbles, assessing information quality is a pressing issue: users need to understand the quality of the information they consume online. We report on an experiment aimed at understanding whether workers from the crowd can be a suitable alternative to experts for information quality assessment. Results show that the data collected via crowdsourcing appear reliable. Agreement with the experts is not complete, but in a task this complex and dependent on the assessor’s background, this is expected and, to some extent, positive.
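The abstract hinges on measuring agreement between crowd and expert judgments. A minimal sketch of how such agreement might be quantified is shown below, assuming each news item receives one aggregated crowd rating and one expert rating on a shared ordinal scale; the ratings and scale here are illustrative placeholders, not data or methods from the paper.

```python
# Sketch: quantifying crowd-expert agreement with weighted Cohen's kappa.
# The ratings below are illustrative, not data from the paper.
from sklearn.metrics import cohen_kappa_score

# One quality rating per news item, on a shared 1-5 scale
# (e.g., an aggregated crowd score vs. an expert score).
crowd_ratings = [4, 3, 5, 2, 4, 3, 1, 5]
expert_ratings = [4, 4, 5, 2, 3, 3, 2, 5]

# Quadratic weights penalize large disagreements more than
# near-misses, which suits ordinal quality scales.
kappa = cohen_kappa_score(crowd_ratings, expert_ratings, weights="quadratic")
print(f"Weighted Cohen's kappa: {kappa:.2f}")  # 1.0 would mean full agreement
```

Under common rules of thumb, a moderate-to-substantial kappa would be consistent with the abstract's observation that agreement is partial rather than full.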

Additional Metadata
Keywords: Information quality, Quality dimensions
Conference: International Workshop on News Recommendation and Analytics
Citation: Maddalena, E., Ceolin, D., & Mizzaro, S. (2018). Multidimensional news quality: A comparison of crowdsourcing and nichesourcing. In Proceedings of the 6th International Workshop on News Recommendation and Analytics (INRA 2018).