| | Description | Representative Quote |
|---|---|---|
| **Advantage** | | |
| Incentivized recruitment | All sites receive the intervention and serve as their own control. | “To ask a site to engage in research for a year without receiving resources doesn’t seem possible.” (New York City) |
| Staggered resource allocation | Resources (eg, personnel) can be allocated over a longer period. | “You can shift resources from one sequence to another, which eases workforce logistical concerns.” (ESCALATES) |
| Statistical power | Power can be higher than parallel cluster randomized controlled trials under certain conditions. | “Stepped-wedge designs potentially have the power advantage over alternative designs, though the trade-off is the time issue.” (New York City) |
| **Challenge** | | |
| Time-sensitive recruitment | Recruitment must occur at all sites up front. | “Recruitment challenges were the main reason we chose the parallel cluster randomized trial and not a stepped-wedge design.” (Midwest) |
| Retention | Sites might drop out owing to a long lag time between recruitment and the start of the intervention. | “By the time all partnership agreements were signed and sites were randomized, 47 had dropped out of the study.” (Northwest) |
| Randomization requirements and practice preferences | It might be difficult to randomize sites into sequences, given that real-world practice priorities are often changing. | “The real-world environment does not really respect the randomization. Stepped-wedge designs are complicated by the fact that they have defined start and stop dates. That’s not how quality improvement works.” (North Carolina) |
| Achieving treatment schedule fidelity | It might be difficult to deliver the intervention as prescribed (eg, sites might cross-talk across sequences). | “We held weekly meetings with all facilitators to deliberately talk about cross-contamination and staying with the timeline.” (Oklahoma) |
| Intensive data collection | Sites might have difficulty contributing data for specified outcome measures for every time block of the implementation timeline. | “Practice burden is usually greater for SW-CRTs (compared to other designs) in terms of measurement because every practice has to report every measure for every time block.” (Southwest) |
| Hawthorne effect | Sites might modify their behavior before the intervention begins. | “We anticipated that the sites would start preparing (before the intervention started).” (Midwest) |
| Temporal trends | Effect of intervention might be confounded by underlying temporal trends. | “Ideally, we would use the stepped-wedge design in a scenario where there aren’t significantly different covariates across clusters” (Southwest) and “when an outcome isn’t already expected to be improving.” (New York City) |
ESCALATES = Evaluating System Change to Advance Learning and Take Evidence to Scale; SW-CRT = stepped-wedge cluster randomized trial.
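The conditional power advantage in the table can be made concrete with the widely cited Hussey and Hughes (2007) closed-form variance for cross-sectional SW-CRTs with fixed time effects. The sketch below is illustrative only: the design sizes (6 clusters, 7 periods, one observation per cluster-period) and variance components are assumed values, not figures from the studies quoted above.

```python
# Sketch: compare the variance of the treatment-effect estimate under a
# stepped-wedge vs a parallel cluster randomized design, using the
# Hussey & Hughes (2007) closed-form variance for cross-sectional SW-CRTs.
# All design sizes and variance components below are illustrative assumptions.

def hh_variance(X, sigma2, tau2):
    """Variance of the treatment effect under the Hussey-Hughes model.

    X      : list of lists; X[i][t] = 1 if cluster i is treated in period t
    sigma2 : within-cluster (residual) variance
    tau2   : between-cluster variance (ICC = tau2 / (tau2 + sigma2))
    """
    I = len(X)        # number of clusters
    T = len(X[0])     # number of time periods
    U = sum(sum(row) for row in X)                               # treated cells
    W = sum(sum(X[i][t] for i in range(I)) ** 2 for t in range(T))
    V = sum(sum(row) ** 2 for row in X)
    num = I * sigma2 * (sigma2 + T * tau2)
    den = (I * U - W) * sigma2 + (U**2 + I*T*U - T*W - I*V) * tau2
    return num / den

I_, T_ = 6, 7
# Stepped wedge: cluster i crosses over to intervention after period i.
sw = [[1 if t > i else 0 for t in range(1, T_ + 1)] for i in range(1, I_ + 1)]
# Parallel design: half the clusters receive the intervention in every period.
par = [[1] * T_ if i < I_ // 2 else [0] * T_ for i in range(I_)]

for tau2 in (0.0, 0.1, 1.0):
    v_sw = hh_variance(sw, sigma2=1.0, tau2=tau2)
    v_par = hh_variance(par, sigma2=1.0, tau2=tau2)
    print(f"tau2={tau2:.1f}  SW var={v_sw:.3f}  parallel var={v_par:.3f}")
```

With no between-cluster variation (tau2 = 0) the parallel design yields the smaller variance, but as the intraclass correlation grows the stepped wedge wins, consistent with the table's "under certain conditions" caveat.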