Peer Review Congress 2022

September 10, 2022

Chicago, United States

Rejection Rates for Manuscripts Uploaded to an Artificial Intelligence Driven Precheck Tool Compared With Manuscripts That Did Not Undergo a Precheck at a Multidisciplinary Medical Journal


publication metrics and performance indicators

artificial intelligence


Objective Online precheck tools flag common errors in grammar and formatting and are intended to help authors identify missing declarations and common language issues prior to first submission. The purpose of this study was to evaluate the use of an artificial intelligence–driven precheck tool and to examine its association with initial rejection rates.

Design This cohort study involved original research manuscripts submitted to Medicine, an open access multidisciplinary medical journal, during a 7-month period from June 2021 to January 2022. Prior to submission, authors were encouraged to upload their manuscript to an online artificial intelligence–driven precheck tool, which parses the precise meaning of phrases within a document and automatically captures both semantic and syntactic variations. The tool is configured to check language and grammar quality as well as the presence of ethics statements, conflict of interest declarations, and adherence to word count limits. The precheck tool offers 2 levels of feedback: a free basic report, which summarizes issues that the system suggests should be addressed prior to submission, and a premium check (costing US $29), which provides the author with a downloadable Word document containing all suggested changes in detail. Authors were not required to use the precheck tool, and the choice to purchase the premium report was entirely at the author's discretion. The resulting report was provided to the authors so that changes could be made prior to submission; the journal editors did not receive a copy. All manuscripts were also subjected to a technical check carried out by the editorial office prior to the assignment of editors or reviewers. Articles uploaded to the precheck tool platform were then cross-checked against all articles submitted to the journal's submission platform, allowing the journal to compare the proportions initially rejected (ie, decisions made prior to peer review) among the 3 groups.

Results Among 7904 submitted manuscripts, author selections for the 3 groups (no precheck, basic precheck, and premium precheck) and the numbers initially rejected are detailed in Table 51. In the no precheck group, 2073 of 6062 manuscripts (34.2%) were rejected following the technical check, compared with 333 of 1661 (20.1%) in the basic precheck group and 13 of 181 (7.3%) in the premium precheck group. Overall, the initial rejection rate was 15.4 percentage points lower among manuscripts that underwent prechecking than among those that did not (346 of 1842 [18.8%] vs 2073 of 6062 [34.2%]).
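The pooled comparison above follows directly from the per-group counts. A minimal sketch (using only the counts reported in this abstract; the grouping labels are illustrative, not part of the study's analysis code) that recomputes the rates:

```python
# Rejection counts (rejected, submitted) per group, as reported in the abstract.
groups = {
    "no precheck": (2073, 6062),
    "basic precheck": (333, 1661),
    "premium precheck": (13, 181),
}

for name, (rejected, total) in groups.items():
    print(f"{name}: {rejected}/{total} = {100 * rejected / total:.1f}%")

# Pool the basic and premium precheck groups.
rej = 333 + 13     # 346
tot = 1661 + 181   # 1842
print(f"any precheck: {rej}/{tot} = {100 * rej / tot:.1f}%")

# Difference vs the no precheck group, in percentage points.
diff = 100 * (2073 / 6062 - rej / tot)
print(f"difference: {diff:.1f} percentage points")
```

This reproduces the pooled comparison of 18.8% vs 34.2%, a difference of 15.4 percentage points.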

Conclusions The use of a precheck tool to assist authors in identifying language errors and missing manuscript elements prior to submission was associated with a decrease in initial manuscript rejections (Table 51).

Conflict of Interest Disclosures None reported.
