The amount of misinformation spread online every day is a major threat to society. Organizations and researchers are working to counter this misinformation plague. In this setting, human assessors are indispensable to correctly identify, assess, and/or revise the truthfulness of information items, i.e., to perform the fact-checking activity. Assessors, as humans, are subject to systematic errors that might interfere with their fact-checking activity. Among such errors, cognitive biases are those due to the limits of human cognition. Although biases help minimize the cost of making mistakes, they skew assessments away from an objective perception of information. Cognitive biases are therefore particularly frequent and critical, and they can cause errors with a huge potential impact, as they propagate not only in the community but also into the datasets used to train automatic and semi-automatic machine learning models to fight misinformation. In this work, we present a review of the cognitive biases that might occur during the fact-checking process. In more detail, inspired by PRISMA, a methodology used for systematic literature reviews, we manually derive a list of 221 cognitive biases that may affect human assessors. We then select the 39 biases that might manifest during the fact-checking process, group them into categories, and provide a description of each. Finally, we present a list of 11 countermeasures that researchers, practitioners, and organizations can adopt to limit the effect of the identified cognitive biases on the fact-checking activity.

Information Processing & Management
The eye of the beholder: Transparent pipelines for assessing online information quality, AI, Media & Democracy Lab
Centrum Wiskunde & Informatica (CWI), Amsterdam, The Netherlands

Soprano, M., Roitero, K., La Barbera, D., Ceolin, D., Spina, D., Demartini, G., & Mizzaro, S. (2024). Cognitive biases in fact-checking and their countermeasures: A review. Information Processing & Management, 61(3), 103672:1–103672:29. doi:10.1016/j.ipm.2024.103672