VIDEO DOI: https://doi.org/10.48448/jaq5-y973

technical paper

Peer Review Congress 2022

September 10, 2022

Chicago, United States

Data Sharing and Reanalysis for Main Studies Assessed by the European Medicines Agency

keywords:

data sharing and access

reproducible research

open science

Objective Transparency and reproducibility are expected to be normative practices in clinical trials used for decision-making on marketing authorizations for new medicines. A cross-sectional study was conducted to assess inferential reproducibility for main trials (sometimes referred to as pivotal trials) assessed by the European Medicines Agency.

Design Two members of the team (J.G., M.S.) independently identified all studies on new medicines, biosimilars, and orphan medicines granted approval by the European Commission between January 2017 and December 2019 and categorized as main studies in the European Public Assessment Reports (EPARs). Sixty-two of 292 eligible studies were randomly sampled. One team member (J.G.) identified the sponsors and sent a standardized request for the individual patient data (IPD) of these studies. Up to 3 reminder messages were sent. A dossier for each study was prepared containing the IPD, the protocol, and information on the conduct of the study. A second team member (M.S.), who had no access to the study reports, used the dossier to run an independent reanalysis of each trial. All results of these reanalyses were reported in terms of each study's conclusions, P values, effect sizes, and changes from the initial protocol. Two team members (J.G., F.N.) not involved in the reanalysis compared the results of the reanalyses with the published results of the trial.
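As an illustration of the sampling step, a simple random sample of 62 of the 292 eligible studies could be drawn as sketched below. This is a minimal sketch only: the study identifiers and the seed are placeholders, and the abstract does not describe the exact sampling procedure used.

    import random

    # Hypothetical illustration of the sampling step; the abstract does not
    # specify how the 62 studies were drawn from the 292 eligible ones.
    eligible = [f"study-{i:03d}" for i in range(1, 293)]  # placeholder identifiers
    rng = random.Random(42)          # arbitrary seed, kept fixed so the draw is reproducible
    sampled = rng.sample(eligible, 62)  # simple random sample without replacement
    print(len(sampled), sampled[:3])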

Results A total of 292 main studies in 173 EPARs were identified. Among the 62 studies randomly sampled, IPD were received for 10 trials (16%). The median number of days between data request and data receipt was 253 (IQR, 182-469). For these 10 trials, 23 distinct primary outcomes were identified, and the conclusions for all of them were reproduced in the reanalyses. Therefore, 10 of 62 trials (16%; 95% CI, 8%-28%) were reproduced. For the 52 studies without available data, reproducibility could not be assessed. Forty-eight of the 52 sponsors replied to the request. The reasons for nonsharing can be found in Table 22. There was no change from the original study protocol regarding the primary outcome in any of the 10 studies. Spin (ie, interpretation bias) was observed in the report of 1 study.
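The reported confidence interval can be checked with a standard exact binomial interval. The following is a minimal sketch assuming the Clopper-Pearson method (the abstract does not state which method was used); it reproduces 16% (95% CI, 8%-28%) for 10 of 62 trials.

    # Assumed method: Clopper-Pearson exact 95% CI for a binomial proportion.
    from scipy.stats import beta

    x, n = 10, 62                             # trials with IPD received / trials sampled
    point = x / n                             # ~0.161
    lower = beta.ppf(0.025, x, n - x + 1)     # ~0.080
    upper = beta.ppf(0.975, x + 1, n - x)     # ~0.276
    print(f"{point:.0%} (95% CI, {lower:.0%}-{upper:.0%})")  # 16% (95% CI, 8%-28%)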



Conclusions Despite their results supporting decisions that affect the health of millions of people across the European Union, most of the main studies used in EPARs lack transparency, as data were not shared with external researchers to assess reproducibility. The main limitation of this approach is the small amount of IPD obtained. Nonetheless, reanalyses of the few trials with available data showed complete inferential reproducibility. Further studies with a larger sample size are necessary to estimate the reproducibility of clinical trials included in marketing authorizations.

Conflict of Interest Disclosures David Moher is a member of the Peer Review Congress Advisory Board but was not involved in the review or decision for this abstract. No other disclosures were reported.

Funding/Support The project is funded by the Agence Nationale de la Recherche.

Role of the Funder/Sponsor The sponsor had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the abstract; or decision to submit the abstract for presentation.
