Poster
Association of Peer Review with Completeness of Reporting, Transparency for Risk of Bias, and Spin in Diagnostic Test Accuracy Studies Published in Imaging Journals
Keywords: quality of reporting; peer review; bias
Objective To evaluate whether peer review of diagnostic test
accuracy (DTA) studies published by imaging journals is
associated with changes in completeness of reporting,
transparency of risk of bias, and spin, given that there is
limited evidence to support the concept that peer review
improves the completeness of research reporting.1,2
Design This retrospective cross-sectional study evaluated
articles published in the Journal of Magnetic Resonance
Imaging (JMRI; 2019 impact factor [IF], 4.0), the Canadian
Association of Radiologists Journal (CARJ; IF, 1.7), and
European Radiology (EuRad; IF, 4.1) before March 31,
2020.3 Initial submitted and final versions of manuscripts
were screened consecutively in reverse chronological order to
include a minimum of 23 articles per journal (based on a
power calculation). At least 30 eligible articles from each
journal were collected when available to account for potential
exclusions. Primary studies evaluating the diagnostic
accuracy of an imaging test in humans were included. Studies
exclusively reporting on prognostic or predictive tests were
excluded. Studies were evaluated independently by 2
reviewers blinded to version for completeness of reporting
using the Standards for Reporting Diagnostic Accuracy
Studies (STARD) 2015 and STARD for Abstracts guidelines,
transparency of reporting for risk of bias assessment based on
the Quality Assessment of Diagnostic Accuracy Studies-2
(QUADAS-2), and actual and potential spin using modified
published criteria. Two-tailed paired t-tests and paired
Wilcoxon signed-rank tests were used for comparisons;
P < .05 was considered statistically significant.
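The paired analyses described above can be sketched in Python with SciPy; the item counts below are illustrative placeholders, not the study's data, and the variable names are our own.

```python
# Minimal sketch: paired comparison of per-article STARD 2015 item counts
# between initial submission and final accepted version, mirroring the
# two-tailed paired t-test and paired Wilcoxon signed-rank test.
# The data are hypothetical, for illustration only.
from scipy import stats

# Hypothetical numbers of STARD items reported per article
initial = [15, 17, 16, 18, 16, 17, 15, 19, 16, 17]  # submitted versions
final = [16, 18, 16, 19, 17, 18, 16, 19, 17, 18]    # accepted versions

# Paired t-test (two-tailed by default)
t_stat, t_p = stats.ttest_rel(final, initial)

# Paired Wilcoxon signed-rank test (zero differences are dropped by default)
w_stat, w_p = stats.wilcoxon(final, initial)

print(f"paired t-test: t={t_stat:.2f}, P={t_p:.4f}")
print(f"Wilcoxon signed-rank: W={w_stat:.1f}, P={w_p:.4f}")
```

Both tests compare each article with its own earlier version, so between-article variability in baseline reporting quality does not inflate the error term.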
Results Of 692 diagnostic accuracy studies screened, 84
articles published in 2014 to 2020 from 3 journals were
included: JMRI, 30 articles; CARJ, 23; and EuRad, 31.
Reporting by STARD 2015 increased between initial
submissions and final accepted versions (mean reported
items 16.67 vs 17.47; change, 0.80; 95% CI, 0.25 to 1.17;
P = .002). Among STARD items, sources of funding and other
support (item 30.1) and role of funders (item 30.2) showed the
largest change, 0.32 (P < .001). No difference was found for
the reporting of STARD for Abstracts (5.28 vs 5.25; change,
−0.03; 95% CI, −0.15 to 0.11; P = .74); QUADAS-2 (6.08 vs
6.11; change, 0.03; 95% CI, −1.00 to 0.50; P = .92); actual spin (2.36
vs 2.40; change, 0.04; 95% CI, 0.00 to 1.00; P = .39); or
potential spin practices (2.93 vs 2.81; change, −0.12; 95% CI,
−1.00 to 0.00; P = .23) (Figure 20).
Conclusions This retrospective cross-sectional study found
that peer review was associated with a marginal improvement
in the completeness of full-text reporting but not with
improved abstract reporting, greater transparency for risk of
bias assessment, or reduced spin in published imaging DTA
studies. Considering that this study included
articles from only 3 radiology journals, the findings may not
be generalizable to other journals, other fields of DTA
research, or non-DTA study designs. Interventions such as
reviewer training and use of checklists should be evaluated.
References
1. Jefferson T, Rudin M, Brodney Folse S, Davidoff F. Editorial peer review for improving the quality of reports of biomedical studies. Cochrane Database of Systematic Reviews. 2020. doi:10.1002/14651858.MR000016.pub3
2. Bruce R, Chauvin A, Trinquart L, Ravaud P, Boutron I. Impact of interventions to improve the quality of peer review of biomedical journals: a systematic review and meta-analysis. BMC Medicine. 2016;14(1):1-16. doi:10.1186/s12916-016-0631-5
3. Clarivate Analytics. Journal Citation Reports. 2019 Journal Impact Factor. Accessed July 11, 2020. https://clarivate.com/blog/announcing-the-2019-journal-citation-reports/
Conflict of Interest Disclosures Mark Schweitzer, Yves Menu,
Michael Patlas, and Kelly D. Cobey have active affiliations with the
3 journals used as data sources; they had no role in data extraction,
analysis, or interpretation but reviewed and approved the work.
Michael Patlas reported an editorial honorarium from Springer
outside of the submitted work. No other disclosures were reported.
Funding/Support Funding support was received from the
Philips−Radiological Society of North America research seed
grant (RSNA Research & Education Foundation), Mitacs Research
Training Award, and the Department of Radiology MD Summer
Student Fund at the University of Ottawa. The conduct of the study
and the content of the manuscript were the sole responsibility of the
investigators; the content does not necessarily represent the official
views of the funders.
Role of the Funder/Sponsor The funders had no role in data
collection, analysis, interpretation, or manuscript composition.
Acknowledgments Sakib Kazi and Robert A. Frank contributed
equally to this work.