VIDEO DOI: https://doi.org/10.48448/rhp7-0f42

technical paper

Peer Review Congress 2022

September 10, 2022

Chicago, United States

Open Science Policies of Surgical Journals and the Use of Open Science Practices in Research Published in Surgical Journals

Objective Reproducibility and transparency are important considerations in medical research1; recent retractions of studies in several medical journals underscore the relevance of these issues.2 Many tools exist to promote research quality and transparency, including protocol preregistration sites for observational studies, EQUATOR Network reporting guidelines for most common study types, and preprint servers.3 However, the extent to which the surgical research ecosystem has adopted these tools is unknown. The purpose of this study was to describe the use of these quality-promoting practices in surgical research.

Design Use of 5 open science practices was measured (preprint publication before peer-reviewed publication; use of EQUATOR Network guidelines; study protocol preregistration before peer-reviewed publication; published peer review; and public accessibility of data, experimental methods, and/or code) in surgical journals and manuscripts. A distinction was made between preregistration of clinical trials in established trial registries (eg, ClinicalTrials.gov) and the emerging practice of preregistering outcomes and analysis plans for observational studies on newer platforms (eg, Protocols.io). The top 8 surgical journals by impact factor were included. A random sample of 240 research articles published from January 2019 to August 2021 in these journals (30 from each) was selected via random number generator and included in the study. The number of journals and studies that explicitly endorsed or used these practices was measured.
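The stratified sampling step (30 articles drawn at random from each of the 8 journals using a random number generator) can be illustrated with a minimal sketch. The journal names, pool sizes, and seed below are placeholders for illustration only, not details taken from the study.

```python
import random

# Illustrative sketch of the sampling design described above: 30 articles
# drawn at random from each journal's pool of eligible 2019-2021 articles.
# Journal names and pool sizes are hypothetical placeholders.
eligible_articles = {
    "Journal A": list(range(1, 501)),
    "Journal B": list(range(1, 421)),
    "Journal C": list(range(1, 350)),
    # ... remaining journals would be listed here ...
}

random.seed(42)  # arbitrary seed so the example is repeatable
sampled = {
    journal: random.sample(article_ids, k=30)
    for journal, article_ids in eligible_articles.items()
}

for journal, ids in sampled.items():
    print(journal, sorted(ids)[:5], "...")
```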

Results In their author guidelines, 7 of the 8 journals (88%) recommended the use of EQUATOR Network guidelines before journal submission. Five journals (63%) explicitly stated that they permitted submissions that were previously released as preprints. Only 3 journals (38%) recommended that authors preregister their protocols for observational studies. None published peer reviewer comments. Five journals (63%) explicitly recommended that authors make their methods, including all code, laboratory protocols, and data if possible, publicly available. Of 240 articles, 65 (27%) explicitly complied with the appropriate EQUATOR Network guideline. Only 30 observational studies (17%) preregistered their study protocols. None of the articles were posted on a preprint server before journal publication. Only 15 studies (6%) fully disclosed their methods by making code public or publishing a separate protocol (Table 21). Research in the International Journal of Surgery exhibited the highest use of open science practices; studies in that journal used a mean of 1.9 open science practices vs 0.4 in the other journals (P < .001). Journals that recommended (but did not explicitly require) compliance with an open science practice, such as EQUATOR Network guideline use, had higher levels of open science practice use in their research than journals that did not mention the practice at all (18% vs 4%; P < .001). There was a positive association between a journal's impact factor and the use of open science practices in its published studies (P < .001).
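The abstract reports group comparisons and an impact-factor association but does not name the statistical tests used. As a rough illustration only, the sketch below applies a Mann-Whitney U test and a Spearman correlation to synthetic per-article counts; the test choices, counts, and impact factors are assumptions for demonstration, not the authors' analysis or data.

```python
import numpy as np
from scipy import stats

# Synthetic toy data only; the study's per-article counts are not reproduced here.
rng = np.random.default_rng(0)
practices_highest_journal = rng.poisson(1.9, size=30)   # articles from the top-scoring journal
practices_other_journals = rng.poisson(0.4, size=210)   # articles from the remaining journals

# Compare per-article counts of open science practices between the two groups.
u_stat, p_value = stats.mannwhitneyu(practices_highest_journal, practices_other_journals)
print(f"Mann-Whitney U = {u_stat:.1f}, P = {p_value:.4g}")

# Association between journal impact factor and mean practice use per journal
# (impact factors and means below are placeholders, not the study's values).
impact_factors = [5.2, 6.1, 7.0, 7.8, 8.5, 9.3, 10.4, 12.8]
mean_practices = [0.3, 0.4, 0.5, 0.4, 0.6, 0.8, 1.1, 1.9]
rho, p_corr = stats.spearmanr(impact_factors, mean_practices)
print(f"Spearman rho = {rho:.2f}, P = {p_corr:.4g}")
```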



Conclusions Surgical research has been slow to adopt the open science practices spreading across academia, leaving the field potentially vulnerable to poor research quality. There are many opportunities for improvement. The responsibility falls on both researchers and journals to adopt these new tools strategically in order to promote the generation and dissemination of high-quality research.

References 1. Ioannidis JPA. Why most published research findings are false. PLoS Med. 2005;2(8):e124.

2. Ledford H, Van Noorden R. High-profile coronavirus retractions raise concerns about data oversight. Nature. 2020;582(7811):160.

3. PLOS. Transforming scientific communication through open science. Accessed January 26, 2022. https://plos.org/open-science/

Conflict of Interest Disclosures None reported.

Funding/Support Jayson S. Marwaha is supported by a grant from the National Library of Medicine/National Institutes of Health (T15LM007092-30) and the Biomedical Informatics and Data Science Research Training (BIRT) Program of Harvard University.

Additional Information Jayson S. Marwaha and Hao Wei Chen are co–first authors and Harlan M. Krumholz and Jeffrey B. Matthews are co–senior authors.
