Commentary On “Five years’ experience of an annual course on implementation science: an evaluation among course participants”

Sep 8, 2017 | Rohit Ramaswamy | Featured Articles

Commentary: The importance of implementation science to address critical research-to-practice gaps is no longer in debate. From HIV research1 to pharmacy2 to global health,3 researchers have called for implementation research to improve the effectiveness and sustainability of healthcare and public health programs. As the field expands, the need to train implementation scientists who systematically use findings from implementation research, instead of just “rolling out” a program or intervention, becomes more urgent. However, to date, implementation science training has primarily targeted post-doctoral researchers. Academic programs to train graduate students in the field are only now being established, and their needs are different from those of researchers. A recent evaluation of current and prospective student needs in a master’s program in implementation science at the University of Heidelberg4 indicated that these students are more interested in using implementation science to solve practical problems than in theoretical academic training.

There is, therefore, a need to develop training programs that give students both a sound knowledge of the theories of implementation science and the skills to apply these theories in the field. If not, implementation science will simply replicate the research-to-practice gap that it is intended to address. At the same time, there is also a need to evaluate how students use these skills and the results they achieve in improving implementation and health outcomes. To date, the data available for evaluation are limited because academic implementation science training programs have only had a few cohorts.

The featured paper by Carlfjord and colleagues is the first to evaluate students from multiple cohorts; therefore, the authors have been able to assess the program across all levels of the Kirkpatrick learning evaluation model.5 At the lower levels of the model (student reaction and learning), the evaluation results are consistent with those from other evaluations of training programs reported in the literature. Uniformly, students found the content delivered useful and felt that they learned new skills from the course. More importantly, at the higher levels of the model (behavior and results), 60% of the respondents felt that the skills they had learned were useful in their work, and 37% felt that they used the skills in other contexts as well.

By confirming results from other training evaluations and by demonstrating preliminary evidence of use, this paper continues to make the case for the importance of flexible, practical training in implementation science. It also describes some emerging good practices in conducting training programs. Instead of imposing the use of a particular model, theory, or framework, a majority of training programs now allow students to select one and reflect on their choice. Similarly, the use of problem-based learning methods (e.g., self-study of the literature and sense-making through discussion and application to students’ own projects) allows learners to “connect the dots” for themselves and seems to be a good approach to training students in this emerging field.

More experience with training programs and more detailed evaluation is necessary to ultimately determine whether implementation science training programs make a difference outside of academic research. Carlfjord and colleagues did not observe the actual use that the students made of their training; the evaluation was limited to students’ self-assessment of the use of knowledge. Moreover, as a significant majority of the students were at the doctoral level, a greater percentage reported using their knowledge in their research than in other work. Despite these limitations, Carlfjord and colleagues have contributed to the growing body of literature demonstrating the importance and value of providing training in implementation science through academic graduate programs.

Read the abstract.


1El-Sadr WM, Philip NM, Justman J. Letting HIV Transform Academia – Embracing Implementation Science. N Engl J Med. 2014;370(18):1679-1681. doi:10.1056/NEJMp1314777.

2Curran GM, Shoemaker SJ. Advancing pharmacy practice through implementation science. Res Social Adm Pharm. 2017;13(5):889–891. http://www.rsap.org/article/S1551-7411(17)30512-0/fulltext.

3Ridde V. Need for more and better implementation science in global health. BMJ Glob Heal. 2016;1(2):e000115. doi:10.1136/bmjgh-2016-000115.

4Ullrich C, Mahler C, Forstner J, Szecsenyi J, Wensing M. Teaching implementation science in a new Master of Science Program in Germany: a survey of stakeholder expectations. Implement Sci. 2017;12(1):55. doi:10.1186/s13012-017-0583-y.

5Kirkpatrick DL, Kirkpatrick JD. Evaluating Training Programs: The Four Levels; 1994.