December 2019 Commentary: Reflections from D&I

Dec 16, 2019 | ldmartino | Commentary

I spent the first week of December with many of my colleagues attending the 12th Annual Conference on the Science of Dissemination and Implementation in Health (D&I). I started attending D&I as a PhD student in 2015, and each year I have had the great opportunity to observe the current "state of the science" in the field and how D&I continues to evolve. This year did not disappoint. The theme of "Raising the Bar on the Rigor, Relevance, and Rapidity of Dissemination and Implementation Science" echoed throughout the Conference, and several subthemes emerged. Adaptation to achieve better implementation and sustainability, system-level determinants of implementation, novel ways of assessing implementation strategies over time, and innovative new methods for analyzing complexity (e.g., coincidence analysis) were just a few of my highlights. Plenary sessions provided some key takeaways about study designs and measurement for D&I, including:

  • The need for greater integration of implementation science into the healthcare system, and use of the PRECIS-2 tool to achieve more pragmatic trials.
  • SMART (sequential, multiple assignment, randomized trial) designs can address the challenge of organizational heterogeneity by building adaptive interventions that guide implementation decisions over time. An important takeaway was that these designs are intended to guide decision-makers/implementers, not researchers.
  • We need to reduce the number of “use once, throw away” measures, develop measures in a more rapid and pragmatic fashion, and use theory when doing so.
  • Using common measures and frameworks has been critical for large multi-site initiatives, such as the VA QUERI program. This comment really resonated with me: in my role on several NIH-funded consortia, I have found it can be quite challenging to harmonize implementation science measurement across disparate sites and projects!

Relatedly, one of the featured articles in this month’s newsletter, by Stanick and colleagues1, focuses on the development of pragmatic measures for implementation. The authors generated stakeholder-driven rating criteria to assess whether a measure is pragmatic. The rating scale, called PAPERS (Psychometric and Pragmatic Evidence Rating Scale), was developed within the mental health context. It is intended for use both in developing new implementation measures and in assessing existing ones.

On another note, the UNC/RTI Consortium for Implementation Science held its annual D&I Networking Event at the Conference. For those of you who were able to join us, thank you for coming. We look forward to seeing you again in 2020 and hearing about what you have been up to! This was a great way to wrap up 2019. Sending best wishes to all for a happy, healthy holiday season and New Year.


1Stanick, C. F., Halko, H. M., Nolen, E. A., Powell, B. J., Dorsey, C. N., Mettert, K. D., . . . Lewis, C. C. (2019). Pragmatic measures for implementation research: Development of the Psychometric and Pragmatic Evidence Rating Scale (PAPERS). Translational Behavioral Medicine. doi:10.1093/tbm/ibz164