Quantifying and addressing the prevalence and bias of study designs in the environmental and social sciences

Research output: Contribution to Journal/Magazine › Journal article › peer-review

  • A.P. Christie
  • D. Abecasis
  • M. Adjeroud
  • J.C. Alonso
  • T. Amano
  • A. Anton
  • B.P. Baldigo
  • R. Barrientos
  • J.E. Bicknell
  • D.A. Buhl
  • J. Cebrian
  • R.S. Ceia
  • L. Cibils-Martina
  • S. Clarke
  • J. Claudet
  • M.D. Craig
  • D. Davoult
  • A. De Backer
  • M.K. Donovan
  • T.D. Eddy
  • J.P.A. Gardner
  • B.P. Harris
  • A. Huusko
  • I.L. Jones
  • B.P. Kelaher
  • J.S. Kotiaho
  • A. López-Baucells
  • H.L. Major
  • A. Mäki-Petäys
  • B. Martín
  • C.A. Martín
  • P.A. Martin
  • D. Mateos-Molina
  • R.A. McConnaughey
  • M. Meroni
  • C.F.J. Meyer
  • K. Mills
  • M. Montefalcone
  • N. Noreika
  • C. Palacín
  • A. Pande
  • C.R. Pitcher
  • C. Ponce
  • M. Rinella
  • R. Rocha
  • M.C. Ruiz-Delgado
  • J.J. Schmitter-Soto
  • J.A. Shaffer
  • S. Sharma
  • A.A. Sher
  • D. Stagnol
  • T.R. Stanley
  • K.D.E. Stokesbury
  • A. Torres
  • O. Tully
  • T. Vehanen
  • C. Watts
  • Q. Zhao
  • W.J. Sutherland
Article number: 6377
Journal publication date: 11/12/2020
Journal: Nature Communications
Issue number: 1
Volume: 11
Number of pages: 11
Publication status: Published
Original language: English

Abstract

Building trust in science and evidence-based decision-making depends heavily on the credibility of studies and their findings. Researchers employ many different study designs that vary in their risk of bias to evaluate the true effect of interventions or impacts. Here, we empirically quantify, on a large scale, the prevalence of different study designs and the magnitude of bias in their estimates. Randomised designs and controlled observational designs with pre-intervention sampling were used by just 23% of intervention studies in biodiversity conservation, and 36% of intervention studies in social science. We demonstrate, through pairwise within-study comparisons across 49 environmental datasets, that these types of designs usually give less biased estimates than simpler observational designs. We propose a model-based approach to combine study estimates that may suffer from different levels of study design bias, discuss the implications for evidence synthesis, and consider how to facilitate the use of more credible study designs.
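To illustrate the general idea of combining study estimates that carry different levels of design-related bias, the sketch below implements a simple bias-adjusted inverse-variance pooling in Python. It is a hypothetical illustration only: the Study class, the design labels (e.g. RCT, BACI, CI, BA, After), and the DESIGN_BIAS_VAR values are assumptions made for demonstration and are not taken from the paper; the authors' actual model-based approach differs.

```python
# Minimal sketch: bias-adjusted inverse-variance pooling of study estimates.
# The design-specific bias variances below are illustrative placeholders,
# not values estimated in the paper.

from dataclasses import dataclass
from math import sqrt

@dataclass
class Study:
    effect: float   # estimated effect size
    se: float       # standard error of the estimate
    design: str     # study design label, e.g. "RCT", "BACI", "CI", "BA", "After"

# Hypothetical extra variance attributed to each design's risk of bias
# (larger for simpler observational designs).
DESIGN_BIAS_VAR = {"RCT": 0.00, "BACI": 0.02, "CI": 0.10, "BA": 0.15, "After": 0.25}

def pooled_effect(studies):
    """Inverse-variance weighted mean, inflating each study's variance
    by an assumed design-specific bias variance."""
    weights = [1.0 / (s.se ** 2 + DESIGN_BIAS_VAR[s.design]) for s in studies]
    est = sum(w * s.effect for w, s in zip(weights, studies)) / sum(weights)
    se = sqrt(1.0 / sum(weights))
    return est, se

if __name__ == "__main__":
    studies = [
        Study(0.40, 0.10, "RCT"),
        Study(0.55, 0.08, "BA"),
        Study(0.35, 0.12, "BACI"),
    ]
    est, se = pooled_effect(studies)
    print(f"pooled effect = {est:.3f} +/- {se:.3f}")
```

In this toy version, estimates from designs assumed to be more bias-prone are simply down-weighted; a full model-based synthesis would estimate such bias terms from data rather than fixing them in advance.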