poster

Peer Review Congress 2022

September 10, 2022

Chicago, United States

Adherence to Reporting Guidelines in Systematic Review Preprints and Their Corresponding Journal Publications

keywords:

reporting guidelines

preprints

peer review

Objective Previous research indicates that adherence to the Preferred Reporting Items for Systematic Reviews and Meta-analyses (PRISMA) checklist is suboptimal, with 9 items adhered to by fewer than 67% of published systematic reviews (SRs).1 It is unknown whether adherence is similar in the preprint literature and whether adherence is improved in peer-reviewed journal articles. This study compared adherence to reporting standards in preprint SRs and their corresponding journal publications.

Design SR preprints uploaded to medRxiv between database inception and December 8, 2021, were eligible if they included at least 1 meta-analysis and had a corresponding journal publication; the focus of the SR was not considered in the inclusion criteria. A random sample of 50 eligible preprints was drawn. The PRISMA 2020 for Abstracts Checklist2 was used to assess adherence to reporting standards and discrepancies in reporting between preprint-publication pairs. The number of meta-analyses published by each journal in the last 3 years was quantified as a proxy for journal resources to critique meta-analyses. The SRs were classified as adherent if at least 9 PRISMA items (75%) were reported and nonadherent if fewer than 9 items were reported.

Results Of 34,760 preprints posted on medRxiv as of December 8, 2021, 922 were SRs, 373 of which were published. Of these, 220 included a meta-analysis, and from these a random sample of 50 preprints was obtained. The included preprints were published in 38 unique journals (median impact factor, 3.5; IQR, 2.9-4.8) that had published a median of 34.5 (IQR, 4.5-387.0) SRs with meta-analysis in the last 3 years. Nineteen (38%) were conducted in the US or Canada, and 24 (48%) had registered protocols. Despite 80% of journal publications stating adherence to the PRISMA checklist, 31 (62%) were nonadherent compared with 36 (72%) of the corresponding preprints (odds ratio, 0.63; 95% CI, 0.25-1.59). The items most frequently unreported in preprint-publication pairs were details of included studies (46 [92%]), risk of bias (39 [78%]), and funder (37 [74%]). The mean journal impact factor for nonadherent preprint-publication pairs was similar to the mean journal impact factor for adherent preprint-publication pairs.
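For context, the reported odds ratio can be recovered from the counts above. Assuming all 50 preprint-publication pairs were assessed, 31 of 50 publications and 36 of 50 preprints were nonadherent, giving an unadjusted odds ratio of

\[
\mathrm{OR} = \frac{31/19}{36/14} = \frac{1.63}{2.57} \approx 0.63,
\]

which matches the reported point estimate; the abstract does not state the method used to compute the 95% CI (0.25-1.59).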

Conclusions In this sample of SR preprints, adherence to the PRISMA 2020 checklist was low and improved only slightly in corresponding journal publications. Additional analyses are needed to examine whether lack of vigilance on the part of journals, journal formatting requirements, priorities of journals and/or authors being out of sync with PRISMA guidelines, or other explanations account for the lack of improvement with full publication.

References

  1. Page MJ, Moher D. Evaluations of the uptake and impact of the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) statement and extensions: a scoping review. Syst Rev. 2017;6(1):263. doi:10.1186/s13643-017-0663-8
  2. Page MJ, McKenzie JE, Bossuyt PM, et al. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. BMJ. 2021;372:n71. doi:10.1136/bmj.n71

Conflict of Interest Disclosures None reported.

Funding/Support This work was funded by the Agency for Healthcare Research and Quality Effective Healthcare Program through a contract to the Scientific Resource Center (HHSA 2902017003C).

Additional Information The authors of this abstract are responsible for its content. Statements in the abstract do not necessarily represent the official views of or imply endorsement by the Agency for Healthcare Research and Quality, US Department of Health and Human Services.
