TY - JOUR
T1 - Assessment of North American Clinical Research Site Performance during the Start-up of Large Cardiovascular Clinical Trials
AU - Goyal, Akash
AU - Schibler, Tony
AU - Alhanti, Brooke
AU - Hannan, Karen L.
AU - Granger, Christopher B.
AU - Blazing, Michael A.
AU - Lopes, Renato D.
AU - Alexander, John H.
AU - Peterson, Eric D.
AU - Rao, Sunil V.
AU - Green, Jennifer B.
AU - Roe, Matthew T.
AU - Rorick, Tyrus
AU - Berdan, Lisa G.
AU - Reist, Craig
AU - Mahaffey, Kenneth W.
AU - Harrington, Robert A.
AU - Califf, Robert M.
AU - Patel, Manesh R.
AU - Hernandez, Adrian F.
AU - Jones, W. Schuyler
N1 - Funding Information:
Conflict of Interest Disclosures: Dr Granger reported receiving personal fees from Boehringer Ingelheim and Bristol Myers Squibb/Pfizer outside the submitted work. Dr Lopes reported receiving personal fees from Bayer (consulting), Boehringer Ingelheim (consulting), Bristol Myers Squibb (consulting), Daiichi Sankyo (consulting), GlaxoSmithKline, Medtronic (consulting), Merck (consulting), Pfizer (consulting), Portola (consulting), and Sanofi (consulting) and grants from Pfizer, Bristol Myers Squibb, GlaxoSmithKline, Medtronic, and Sanofi outside the submitted work. Dr Alexander reported receiving personal fees from AbbVie Data Safety Monitoring Board and Pfizer (honoraria), consulting fees from Bristol Myers Squibb Institutional Research Grant, and grants from Bayer Institutional Research Grant and CSL Behring Institutional Research Grant outside the submitted work. Dr Peterson reported receiving grants from Amgen, Janssen, Bristol Myers Squibb, and Esperion and personal fees from Cerner (consulting) and Livongo (consulting) outside the submitted work. Dr Rao reported receiving research funding from Bayer Institutional and Shockwave Institutional during the conduct of the study. Dr Green reported receiving grants from Boehringer Ingelheim/Lilly Alliance, Sanofi/Lexicon, Merck, Roche, and GlaxoSmithKline and personal fees from Boehringer Ingelheim/Lilly Alliance, Sanofi, AstraZeneca, Pfizer, Hawthorne Effect, Regeneron, and Novo Nordisk outside the submitted work. Dr Harrington reported receiving grants for randomized clinical trial planning from Janssen and Bristol Myers Squibb during the conduct of the study; personal fees from Gilead Scientific (consulting) and SignalPath (board member, randomized clinical trial software) outside the submitted work; and serving on the Board of Directors (unpaid) for the American Heart Association. Dr Califf reported receiving an employee salary from Verily Life Sciences and Google Health and serving on the Cytokinetics Board
Funding Information:
and Centessa Board during the conduct of the study. Dr Patel reported receiving grants from AstraZeneca, Bayer, Janssen, Procyrion Inc, and Heart Flow and serving on the advisory board for Bayer, Janssen, Mytonomy, and Procyrion Inc outside the submitted work. Dr Hernandez reported receiving personal fees from AstraZeneca, Amgen, Bayer, Boston Scientific, Cytokinetics, Myokardia, and Bristol Myers Squibb and grants from Boehringer Ingelheim, American Regent, Verily, and Merck outside the submitted work. No other disclosures were reported.
Funding Information:
Funding/Support: The individual studies were funded by industry sponsors, and this analysis was supported by the Duke Clinical Research Institute.
PY - 2021/7/23
Y1 - 2021/7/23
N2 - Importance: Randomized clinical trials (RCTs) are critical in advancing patient care, yet conducting such large-scale trials requires tremendous resources and coordination. Clinical site start-up performance metrics can provide insight into opportunities for improved trial efficiency but have not been well described. Objective: To measure the start-up time needed to reach prespecified milestones across sites in large cardiovascular RCTs in North America and to evaluate how these metrics vary by time and type of regulatory review process. Design, Setting, and Participants: This cohort study evaluated cardiovascular RCTs conducted from July 13, 2004, to February 1, 2017. The RCTs were coordinated by a single academic research organization, the Duke Clinical Research Institute. Nine consecutive trials with completed enrollment and publication of results in their target journal were studied. Data were analyzed from December 4, 2019, to January 11, 2021. Exposures: Year of trial enrollment initiation (2004-2007 vs 2008-2012) and use of a central vs local institutional review board (IRB). Main Outcomes and Measures: The primary outcome was the median start-up time (from study protocol delivery to first participant enrollment) as compared by trial year and type of IRB used. The median start-up time for the top 10% of sites was also reported. Secondary outcomes included time to site regulatory approval, time to contract execution, and time to site activation. Results: For the 9 RCTs included, the median site start-up time shortened only slightly over time from 267 days (interquartile range [IQR], 185-358 days) for 2004-2007 trials to 237 days (IQR, 162-343 days) for 2008-2012 trials (overall median, 255 days [IQR, 177-350 days]; P <.001). For the top 10% of sites, median start-up time was 107 days (IQR, 95-121 days) for 2004-2007 trials vs 104 days (IQR, 84-118 days) for 2008-2012 trials (overall median, 106 days [IQR, 90-120 days]; P =.04). The median start-up time was shorter among sites using a central IRB (199 days [IQR, 140-292 days]) than those using a local IRB (287 days [IQR, 205-390 days]; P <.001). Conclusions and Relevance: This cohort study of North American research sites in large cardiovascular RCTs found a duration of nearly 9 months from the time of study protocol delivery to the first participant enrollment; this metric was only slightly shortened during the study period but was reduced to less than 4 months for top-performing sites. These findings suggest that the use of central IRBs has the potential to improve RCT efficiency.
UR - http://www.scopus.com/inward/record.url?scp=85111431815&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85111431815&partnerID=8YFLogxK
U2 - 10.1001/jamanetworkopen.2021.17963
DO - 10.1001/jamanetworkopen.2021.17963
M3 - Article
C2 - 34297072
AN - SCOPUS:85111431815
SN - 2574-3805
VL - 4
JO - JAMA network open
JF - JAMA network open
IS - 7
M1 - 17963
ER -