A key factor in reducing Research Waste is making the scientific process more efficient by taking previous studies into account when prioritizing, designing and interpreting new research. However, conventional methods for meta-analysis do not allow the size of a study series or the timing of a meta-analysis to depend on previous results in that series. Hence, these methods do not permit promising initial results to be more likely to develop into (large) series of studies than their disappointing counterparts, or conclusive studies to be more likely to trigger a meta-analysis than less noteworthy findings. Since efficient accumulation of scientific knowledge requires exactly such dependencies, they introduce Accumulation Bias, a term introduced in this paper to study all possible dependencies potentially involved in meta-analysis. Fortunately, all dependencies characterized by our Accumulation Bias framework are statistically manageable by testing meta-analyses with Safe Tests. These tests extend the Test Martingales introduced in Shafer, Shen, Vereshchagin & Vovk (2011) and include some, but certainly not all, Bayes Factor tests (e.g. the Bayesian t-test). Safe Tests remain valid under many efficient decision procedures that expose meta-analyses to new dependencies, and they are easily interpretable in comparison to standard null-hypothesis significance testing. Safe Tests also pave the way to interesting research into Safe Estimation that counteracts empirical phenomena such as "inflated", "Proteus" and "fading" effects in meta-analysis. We introduce Safe Tests for meta-analysis that are ready to use and thus allow valid assessment of previous studies to reduce Research Waste.
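The core idea behind Test Martingales can be illustrated with a minimal simulation sketch (not the paper's actual method): each study contributes a likelihood-ratio "e-value", study e-values multiply across the series, and Ville's inequality guarantees that the probability of the product ever exceeding 1/alpha under the null is at most alpha, even when the decision to run more studies depends on earlier results. The alternative mean `mu1`, the per-study sample size, and the "stop the series when results look disappointing" rule below are illustrative assumptions, not choices made in the paper.

```python
import random
import math

random.seed(1)
ALPHA = 0.05  # reject H0 when the accumulated e-value exceeds 1/ALPHA

def study_evalue(n, mu1=1.0):
    """Likelihood ratio of N(mu1, 1) vs N(0, 1) over one study of n
    observations, with the data drawn under the null H0: mean 0."""
    e = 1.0
    for _ in range(n):
        x = random.gauss(0.0, 1.0)            # the null is true
        e *= math.exp(mu1 * x - mu1**2 / 2)   # per-observation likelihood ratio
    return e

def accumulation_biased_series(max_studies=10):
    """Grow the study series only while results look 'promising'
    (accumulated e-value above 1), mimicking an Accumulation Bias
    dependency, and reject H0 when the product exceeds 1/ALPHA."""
    e_meta = 1.0
    for _ in range(max_studies):
        e_meta *= study_evalue(n=20)
        if e_meta >= 1 / ALPHA:
            return True   # false rejection, since H0 is true here
        if e_meta < 1.0:
            break         # disappointing series: no follow-up studies
    return False

sims = 2000
false_rejections = sum(accumulation_biased_series() for _ in range(sims))
print(false_rejections / sims)
```

Despite the result-dependent stopping rule, the empirical false-rejection rate stays below ALPHA, because the product of e-values is a nonnegative martingale under H0; a fixed-sample z-test applied to the same biased series would not enjoy this guarantee.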

Safe Bayesian Inference: A Theory of Misspecification based on Statistical Learning
6th World Conference on Research Integrity 2019

ter Schure, J. (2019, September 6). Research waste: why we need to rethink meta-analysis. F1000Research. doi:10.7490/f1000research.1117474.1.