VIDEO DOI: https://doi.org/10.48448/90g3-9927

technical paper

Peer Review Congress 2022

September 08, 2022

Chicago, United States

Comparing Numerical Results Between Preprints and Peer-Reviewed Publications of COVID-19 Trials

Keywords: pandemic science, preprints, research methods

Objective The COVID-19 pandemic led to a surge in the dissemination of preprints, driven by demand for faster and wider access to scientific knowledge. However, questions have been raised concerning the reliability of their results.1,2 The aim of this study was to compare numerical results extracted from preprints vs related peer-reviewed publications to inform their inclusion in living systematic reviews.

Design This cross-sectional study used data from the COVID-NMA (covid-nma.com) initiative, a living systematic review of randomized clinical trials (RCTs) evaluating preventive interventions, treatments, and vaccines for COVID-19. Pharmacological treatment RCTs originally posted as preprints and subsequently published in peer-reviewed journals were included. Trials that moved from interim to final analysis between sources were excluded. Effect size estimates extracted from the first preprint were compared with effect size estimates from the most recent peer-reviewed publication. Predefined COVID-NMA “critical outcomes” at 28 days3 were considered (ie, clinical improvement, World Health Organization Clinical Progression Score level 7 or above, all-cause mortality, incidence of any adverse events, and incidence of serious adverse events). The last search date was February 3, 2022.
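To illustrate the comparison described above, the following is a minimal sketch, not the COVID-NMA extraction pipeline itself, of how effect size estimates extracted from a preprint and its peer-reviewed publication could be checked for numerical discrepancies and for a change in the direction of effect. The data structure, tolerance, and example values are assumptions for illustration only.

```python
# Illustrative sketch (assumed structure, not the COVID-NMA pipeline):
# compare effect size estimates for one outcome reported in both sources,
# flagging numerical discrepancies and any change in direction of effect.

from dataclasses import dataclass

@dataclass
class Estimate:
    outcome: str      # e.g. "all-cause mortality (28 days)"
    effect: float     # ratio measure (e.g. risk ratio)
    ci_low: float
    ci_high: float

def compare(preprint: Estimate, published: Estimate, tol: float = 1e-9) -> dict:
    """Return discrepancy flags for one outcome reported in both sources."""
    numerical_discrepancy = (
        abs(preprint.effect - published.effect) > tol
        or abs(preprint.ci_low - published.ci_low) > tol
        or abs(preprint.ci_high - published.ci_high) > tol
    )
    # Direction of effect judged relative to the null value of 1 for ratio measures.
    direction_change = (preprint.effect - 1) * (published.effect - 1) < 0
    return {
        "outcome": preprint.outcome,
        "numerical_discrepancy": numerical_discrepancy,
        "direction_change": direction_change,
    }

# Hypothetical example values, for illustration only.
pre = Estimate("all-cause mortality (28 days)", 0.85, 0.60, 1.20)
pub = Estimate("all-cause mortality (28 days)", 0.87, 0.62, 1.22)
print(compare(pre, pub))
# {'outcome': 'all-cause mortality (28 days)', 'numerical_discrepancy': True,
#  'direction_change': False}
```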

Results A total of 425 RCTs were identified. Trials available only as peer-reviewed publications (n = 217 [51%]), available only as preprints (n = 85 [20%]), or unpublished (n = 16 [4%]) were excluded, as were trials that moved from interim to final analysis between sources (n = 11 [3%]), reported no review-specific outcomes (n = 4 [1%]), or evaluated nonpharmacological treatments (n = 3 [1%]). Eighty-nine RCTs (230 outcomes) first available as preprints and subsequently as peer-reviewed publications were included. The median delay between preprint posting and subsequent publication in a peer-reviewed journal was 112 days (range, 5-505 days). Seventy-two (81%) preprint-publication RCT pairs (168 outcomes) showed no discrepancies in the outcomes reported. Eight (9%) RCTs had numerical discrepancies in 15 of the 22 outcomes reported in both sources; no change in the direction of the effect size estimate between sources was found (Figure 8). Of these, 1 RCT also had 2 outcomes added in the peer-reviewed publication. Furthermore, among trials with no numerical discrepancies in the outcomes reported, 1 (1%) RCT had 2 outcomes missing from the peer-reviewed publication and 8 (9%) RCTs had at least 1 outcome added in the peer-reviewed publication compared with the preprint.

Conclusions Numerical results were similar between COVID-19 preprints and the related peer-reviewed publications for the majority of RCTs; however, some outcomes were added or deleted between sources. We could not assess whether preprint trials that were never published as peer-reviewed articles were problematic, or whether peer review prevented journal publication because of unsupported conclusions.

References 1. Flanagin A, Fontanarosa PB, Bauchner H. Preprints involving medical research—do the benefits outweigh the challenges? JAMA. 2020;324(18):1840-1843. doi:10.1001/jama.2020.20674

2. Carneiro CFD, Queiroz VGS, Moulin TC, et al. Comparing quality of reporting between preprints and peer-reviewed articles in the biomedical literature. Res Integr Peer Rev. 2020;5(1):16. doi:10.1186/s41073-020-00101-3

3. Boutron I, Chaimani A, Devane D, et al. Interventions for the treatment of COVID-19: a living network meta-analysis. Cochrane Database Syst Rev. Published online November 3, 2020. doi:10.1002/14651858.CD013770

Conflict of Interest Disclosures Mauricia Davidson is funded by a PhD grant from the Université Paris Cité. Isabelle Boutron is a member of the Peer Review Congress Advisory Board but was not involved in the review or decision for this abstract.

Funding/Support This project is funded by the Université Paris Cité.

Role of the Funder/Sponsor The funder had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the abstract; and decision to submit the abstract for presentation.
