Topic 2: Establishing Usable Innovations

“Implementation is defined as a specified set of activities designed to put into practice an activity or program of known dimensions.  According to this definition, implementation processes are purposeful and are described in sufficient detail such that independent observers can detect the presence and strength of the ‘specific set of activities’ related to implementation.  In addition, the activity or program being implemented is described in sufficient detail so that independent observers can detect its presence and strength.  When thinking about implementation the observer must be aware of two sets of activities (innovation-level activity and implementation-level activity) and two sets of outcomes (innovation outcomes and implementation outcomes)”
 — Fixsen, Naoom, Blase, Friedman, & Wallace. (2005). Implementation Research: A synthesis of the literature.

Usable Innovation criteria assure that “the activity or program being implemented is described in sufficient detail.”

For example, to be useful to students and functional across thousands of educators and schools operating in locations across states, Implementation Teams need to know what to train, what to coach, and what performance to assess to make full and effective use of an effective practice.  Implementation Teams need to know WHAT is intended to be done (innovation components) so they can efficiently and effectively assure proper use of the innovation now and over time.

The PDSA Cycle

To establish usable innovations, Implementation Teams make intentional use of the plan, do, study, act (PDSA) cycle.  As an improvement cycle in the Active Implementation Frameworks, the PDSA trial-and-learning approach allows Implementation Teams to identify the essential components of the innovation itself.  For example, in highly interactive education settings, the PDSA approach can help Implementation Teams evaluate the benefits of components, retain effective components, and discard non-essential components of an innovation or standard practice.


Plan

Identify barriers or challenges, using data whenever possible, and specify the plan to move programs or interventions forward as well as the outcomes that will be monitored.

The “plan” is the innovation as educators intend it to be used in practice.

Do

Carry out the strategies or plan as specified to address the challenges.

The “plan” needs to be operationalized (what we will do and say to enact the plan) so it is doable in practice.  This compels attention to the core innovation components and provides an opportunity to begin developing a training and coaching process (i.e., here is how to do the plan) and to create a measure of fidelity (i.e., did we “do” the plan as intended?).

Study

Use the measures identified during the planning phase to assess and track progress.

As a few newly trained practitioners begin working with children and families, the budding fidelity measure can be used to interpret the outcomes in the “study” part of the PDSA cycle (e.g., did we do what we intended? Did doing what we intended result in the desired outcomes?).

Act

Make changes to the next iteration of the plan to improve implementation.

The Implementation Team uses the experience to help develop a new plan in which the essential components are better defined and operationalized.  In addition, the fidelity assessment is adjusted to more accurately reflect the essential components, and the items are modified to make the assessment more practical to conduct in the education setting.

Cycle

The PDSA process is repeated until the innovation is specified well enough to meet the usable innovation criteria.  At that point, the intervention is ready to be used by multiple educators, the fidelity assessment is deemed practical, and the correlation between the essential components and intended outcomes is high.
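The repeat-until-usable logic described above can be sketched as a simple loop. This is purely illustrative: the data model, the `trial` stand-in, and the stopping rule are hypothetical inventions for this sketch, not part of the Active Implementation Frameworks.

```python
# Hypothetical sketch of the iterative PDSA loop described above.
# All names and the "trial" logic are illustrative, not real measures.

def trial(component):
    # Stand-in for a real classroom trial of one component;
    # here, only components marked "essential" show benefit.
    return component.startswith("essential")

def pdsa_cycle(components):
    """One Plan-Do-Study-Act pass: keep components that show benefit."""
    plan = list(components)                            # Plan: the intended practice
    results = {c: trial(c) for c in plan}              # Do: enact each component
    retained = [c for c, ok in results.items() if ok]  # Study: which ones helped?
    return retained                                    # Act: the next plan drops the rest

def establish_usable_innovation(components):
    """Repeat PDSA until the set of components is stable (criteria met)."""
    while True:
        revised = pdsa_cycle(components)
        if revised == components:   # no further changes needed
            return revised
        components = revised

print(establish_usable_innovation(
    ["essential: coaching", "essential: fidelity checks", "optional: posters"]))
# → ['essential: coaching', 'essential: fidelity checks']
```

In practice, of course, each "cycle" is a round of training, observation, and fidelity measurement rather than a function call; the loop only makes the stopping condition explicit: iterate until the innovation meets the usable innovation criteria.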


Implementation Teams may employ the PDSA cycle many times over to arrive at a functional version of an innovation that is effective in practice and can be implemented with fidelity on a useful scale (e.g., Fixsen et al., 2001; Wolf et al., 1995).  Once the components of an innovation have been identified, functional analyses can be done to determine empirically the extent to which key components contribute to significant outcomes.  As noted previously, the vast majority of standard practices and innovations do not meet the Usable Innovation criteria.  Implementation Teams will need to make use of the PDSA improvement cycle to establish the essential innovation components before they can proceed with broader-scale implementation.