Estimating the prevalence of missing experiments in a neuroimaging meta-analysis

Pantelis Samartsidis, Silvia Montagna, Angela R. Laird, Peter T. Fox, Timothy D. Johnson, Thomas E. Nichols

Research output: Contribution to journal › Article › peer-review

Abstract

Coordinate-based meta-analyses (CBMA) allow researchers to combine the results from multiple functional magnetic resonance imaging experiments with the goal of obtaining results that are more likely to generalize. However, the interpretation of CBMA findings can be impaired by the file drawer problem, a type of publication bias that refers to experiments that are carried out but are not published. Using foci per contrast count data from the BrainMap database, we propose a zero-truncated modeling approach that allows us to estimate the prevalence of nonsignificant experiments. We validate our method with simulations and real coordinate data generated from the Human Connectome Project. Application of our method to the data from BrainMap provides evidence for the existence of a file drawer effect, with the rate of missing experiments estimated as at least 6 per 100 reported. The R code that we used is available at https://osf.io/ayhfv/.
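The authors' actual implementation is in R at the OSF link above. As a minimal illustration of the zero-truncation idea only (not the paper's model, which is fit to BrainMap foci-per-contrast data), the sketch below fits a zero-truncated Poisson to simulated counts: zero-count "experiments" are treated as unpublished, the rate is recovered from the nonzero counts alone, and the implied rate of missing experiments per reported experiment is computed. All names here are illustrative.

```python
import numpy as np
from scipy.optimize import brentq

def fit_ztp(counts):
    """MLE of the rate of a zero-truncated Poisson.

    For a ZTP, E[X | X > 0] = lam / (1 - exp(-lam)), so the MLE
    solves mean(counts) = lam / (1 - exp(-lam)) for lam."""
    m = np.mean(counts)
    f = lambda lam: lam / (1.0 - np.exp(-lam)) - m
    # f -> 1 - m < 0 as lam -> 0+, and f(m) > 0, so the root is bracketed.
    return brentq(f, 1e-8, m)

# Simulate: true rate 3.0; experiments with zero foci go unreported.
rng = np.random.default_rng(0)
x = rng.poisson(3.0, 50_000)
observed = x[x > 0]                    # the "file drawer" hides the zeros

lam_hat = fit_ztp(observed)            # recovered from nonzero counts only
p0 = np.exp(-lam_hat)                  # estimated probability of a zero count
missing_per_reported = p0 / (1.0 - p0) # missing experiments per reported one

print(f"lambda_hat = {lam_hat:.3f}, missing per reported = {missing_per_reported:.3f}")
```

With these simulation settings the fitted rate lands near the true value of 3, implying roughly 5 missing experiments per 100 reported; the paper's estimate of at least 6 per 100 comes from a richer model fit to real BrainMap counts, not from this toy Poisson.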

Original language: English (US)
Pages (from-to): 866-883
Number of pages: 18
Journal: Research Synthesis Methods
Volume: 11
Issue number: 6
DOIs
State: Published - Nov 2020

Keywords

  • meta-analysis
  • neuroimaging
  • publication-bias
  • zero-truncated modeling

ASJC Scopus subject areas

  • Education

