Commentary on “Process evaluation of the data-driven quality improvement in primary care (DQIP) trial: Active and less active ingredients of a multi-component complex intervention to reduce high-risk primary care prescribing”

Mar 13, 2017 | bpowell | Featured Articles

Commentary: The development of a robust evidence base for implementation strategies has been stymied by a number of factors. First, the inclusion of specific components in multifaceted implementation strategies is rarely justified theoretically, empirically, or pragmatically.1 Second, implementation strategies and their individual components are often poorly described.2,3 Third, there has been little emphasis on determining how strategy components work together in an additive or synergistic way.4 Finally, it has been difficult to determine which components of multifaceted implementation strategies are most important in driving improvements in implementation and clinical outcomes.5 These limitations complicate the interpretation of primary research, make it difficult to synthesize primary studies in systematic reviews and meta-analyses, and preclude replication in both research and practice.

Grant and colleagues6 attempted to address some of these limitations by conducting a process evaluation to identify the active and less active ingredients of a multicomponent quality improvement strategy to reduce high-risk prescribing in primary care. They developed a multifaceted quality improvement strategy that included professional education, financial incentives, and informatics components. The strategy was evaluated in a pragmatic cluster randomized controlled stepped-wedge trial in 33 primary care practices. It was found to reduce high-risk prescribing, and reductions were sustained in the year after financial incentives stopped. This article enhances the utility and interpretability of the trial, as the authors specify the multifaceted strategy in detail using published reporting standards.7 This goes a long way toward ensuring that the strategy is replicable and that the rationale for including the specific components is understood. Further, they interview the professionals who received the intervention to assess their perceptions of the multifaceted strategy and the phases (e.g., recruitment, implementation) in which the components were perceived to be more or less active. Ultimately, Grant and colleagues6 demonstrate that each of the included components was perceived as active, but at different stages of the implementation process. For example, financial incentives were perceived as important during recruitment and initial engagement, but seemed less important during later phases of implementation. The authors also make a theoretical contribution by applying Normalization Process Theory8 during the data collection and analysis phases of this process evaluation.

The study is a nice example of how process evaluations can add value to rigorously designed trials of implementation strategies, and it is novel in its assessment of the active ingredients of a multifaceted quality improvement strategy. Hopefully, other investigators will follow suit by justifying the selection of the components of multifaceted implementation strategies, reporting those strategies in detail, and paying more attention to how specific components exert their effects (which includes specifying and testing the multilevel mechanisms through which strategies influence implementation outcomes9).

Read the abstract.


1Davies P, Walker AE, Grimshaw JM. A systematic review of the use of theory in the design of guideline dissemination and implementation strategies and interpretation of the results of rigorous evaluations. Implement Sci. 2010;5(14):1-6. doi:10.1186/1748-5908-5-14.

2Michie S, Fixsen DL, Grimshaw JM, Eccles MP. Specifying and reporting complex behaviour change interventions: the need for a scientific method. Implement Sci. 2009;4(40):1-6. doi:10.1186/1748-5908-4-40.

3Proctor EK, Powell BJ, McMillen JC. Implementation strategies: Recommendations for specifying and reporting. Implement Sci. 2013;8(139):1-11. doi:10.1186/1748-5908-8-139.

4Weiner BJ, Lewis MA, Clauser SB, Stitzenberg KB. In search of synergy: Strategies for combining interventions at multiple levels. JNCI Monogr. 2012;44:34-41. doi:10.1093/jncimonographs/lgs001.

5Alexander JA, Hearld LR. Methods and metrics challenges of delivery-systems research. Implement Sci. 2012;7(15):1-11. doi:10.1186/1748-5908-7-15.

6Grant A, Dreischulte T, Guthrie B. Process evaluation of the data-driven quality improvement in primary care (DQIP) trial: Active and less active ingredients of a multi-component complex intervention to reduce high-risk primary care prescribing. Implement Sci. 2017;12(4):1-11. doi:10.1186/s13012-016-0531-2.

7Hoffmann TC, Glasziou PP, Boutron I, et al. Better reporting of interventions: Template for intervention description and replication (TIDieR) checklist and guide. BMJ. 2014;348(g1687):1-12. doi:10.1136/bmj.g1687.

8May CR, Mair F, Finch T, et al. Development of a theory of implementation and integration: Normalization process theory. Implement Sci. 2009;4(29). doi:10.1186/1748-5908-4-29.

9Williams NJ. Multilevel mechanisms of implementation strategies in mental health: Integrating theory, research, and practice. Adm Policy Ment Health Ment Health Serv Res. 2016;43(5):783-798. doi:10.1007/s10488-015-0693-2.