
Addressing the quality of submissions to ClinicalTrials.gov for registration and results posting: The use of a checklist

Posted on 2020-08-06 - 12:07
Background:

US Federal regulations since the late 1990s have required registration of some clinical trials, and submission of results for some of these trials, on a public registry, ClinicalTrials.gov. The quality of the submissions made to ClinicalTrials.gov determines the duration of the Quality Control review, whether the submission will pass the review (success), and how many review cycles it will take for a study to be posted. The success rate for all results submitted to ClinicalTrials.gov is less than 25%. To increase the success of investigators’ submissions and meet the requirements for registration and results submission in a timely fashion, the Johns Hopkins ClinicalTrials.gov Program implemented a policy of reviewing all studies for quality before submission. To standardize our quality review, minimize inter-reviewer variability, and provide a tool for training new staff, we developed a checklist.

Methods:

The Program staff learned from major comments received from ClinicalTrials.gov and reviewed the Protocol Registration and Results System review criteria for registration and results to fully understand how to prepare studies to pass Quality Control review. The comments and criteria were summarized into bulleted points and incorporated into a checklist used by Program staff to review studies before submission.

Results:

In the period before the introduction of the checklist, 107 studies were submitted for registration with a 45% (48/107) success rate, a mean (SD) of 18.9 (26.72) days in review, and 1.74 (0.78) submission cycles. Results for 44 records were submitted with an 11% (5/44) success rate, 115.80 (129.33) days in review, and 2.23 (0.68) submission cycles. In the period after the checklist, 104 studies were submitted for registration with an 80% (83/104) success rate, 2.12 (3.85) days in review, and 1.22 (0.46) submission cycles. Results for 22 records were submitted with a 41% (9/22) success rate, 39.27 (19.84) days in review, and 1.64 (0.58) submission cycles. Of the 44 results submitted prior to the checklist, 30 were Applicable or Probable Applicable Clinical Trials, with 10% (3/30) posted within 30 days as required by the National Institutes of Health. Of the 22 results submitted after the checklist, 17 were Applicable or Probable Applicable Clinical Trials, with 47% (8/17) posted within 30 days of submission. These pre- and post-checklist differences represented statistically significant improvements.

Conclusion:

The checklist has substantially improved our success rate and contributed to a reduction in review days and the number of review cycles. If Academic Medical Centers and industry adopt or create a similar checklist to review their studies before submission, the quality of submissions can be improved and the duration of review minimized.
