Fidelity Assessment

Figure: Implementation Drivers triangle. Left side: selection, training, coaching. Right side: systems intervention, facilitative administration, data systems. Base: leadership. Top: fidelity (circled).

Definition

Let's start at the top of the triangle with the first Competency Driver, Fidelity. Fidelity assessment measures the degree to which teachers or staff use an innovation or instructional practice as intended. Did we do what we said we would do?
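
To make this concrete, fidelity is often scored from an observation checklist as the percentage of core components delivered as intended. The sketch below is a minimal illustration of that idea only; the component names and the 80% benchmark are hypothetical placeholders, not items from any published fidelity tool.

```python
# Minimal sketch: fidelity as the share of core components delivered as intended.
# Component names and the 80% benchmark are hypothetical examples.

def fidelity_score(observed: dict) -> float:
    """Return the proportion of checklist components observed as intended."""
    return sum(observed.values()) / len(observed)

observation = {
    "followed lesson structure": True,
    "modeled the new skill": True,
    "provided guided practice": False,
    "gave corrective feedback": True,
}

score = fidelity_score(observation)
print(f"Fidelity: {score:.0%}")   # -> Fidelity: 75%
if score < 0.80:                  # illustrative, locally chosen benchmark
    print("Below benchmark: revisit training and coaching supports.")
```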

Assessing fidelity at the teacher/practitioner level is essential for interpreting outcomes. If we don’t assess fidelity, then we cannot:

  1. be sure an innovation was actually used,
  2. attribute outcomes to the use of the innovation, or
  3. know what to focus on to improve.

If outcomes are not what we’d hoped for, but we have no fidelity data, it’s difficult to develop an improvement plan. Are results poor because we chose the wrong innovation, or because the innovation is not yet being used as intended? We need answers to these questions in order to create a functional improvement plan.
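
One way to make the reasoning above explicit is to pair fidelity data with outcome data and see where each combination points. The sketch below assumes a hypothetical 80% fidelity benchmark and a simple "outcomes met" flag; both are illustrative placeholders, not standards from the fidelity literature.

```python
# Hedged sketch of the decision logic: fidelity data plus outcome data
# point to different improvement plans. Thresholds are hypothetical.

def improvement_focus(fidelity: float, outcomes_met: bool) -> str:
    if fidelity < 0.80 and not outcomes_met:
        return "Strengthen training, coaching, and other implementation supports first."
    if fidelity >= 0.80 and not outcomes_met:
        return "The innovation is being used as intended; revisit whether it fits the need."
    if fidelity < 0.80 and outcomes_met:
        return "Outcomes look good, but they cannot yet be attributed to the innovation."
    return "Sustain and monitor: used as intended, and outcomes are on track."

print(improvement_focus(fidelity=0.65, outcomes_met=False))
```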

Assessing fidelity also provides direct feedback regarding how well the other Implementation Drivers are functioning. Fidelity data, along with innovation outcomes, are a direct reflection of how well the Competency, Organization, and Leadership Drivers are working together to support teachers and staff as they attempt to use interventions or innovations.

Rationale

In 2011, a fidelity study from the U.S. Department of Education found that fewer than half (44.3%) of the research-based prevention programs examined met minimal fidelity standards. Because only 7.8% of the prevention programs were found to be research-based, it was estimated that only about 3.5% of all curriculum programs (44.3% of 7.8%) were both research-based and implemented with fidelity. As the study notes:

“This information suggests that a tremendous amount of resources, in classroom time for prevention programming alone, is being allocated to school-based prevention efforts that either lack empirical support for their effectiveness or are implemented in ways that diminish the desired effect.” - U.S. Department of Education¹

We have to do better. We can do better by developing organizations that support effective practice and use performance assessment as a positive tool to connect infrastructure supports to outcomes.

Key Functions

Through an active implementation lens, fidelity assessment focuses on how well the innovation is being implemented. Fidelity is not only about the individual educator; it also reflects the quality of the selection, training, and coaching systems.

Fidelity data are also impacted by the Organization Drivers. How well is the administration at the building level supporting the new program or innovation? What broader education system supports are in place, and what is hindering implementation? And how are fidelity and outcome data being used to make decisions that can improve fidelity?

Many studies indicate higher fidelity is positively correlated with better outcomes. In addition to providing feedback to teachers, measures of fidelity also provide useful feedback to principals, district superintendents, evaluators, coaches, and purveyors regarding implementation progress.

What impacts high fidelity? How do we support it?

Fidelity is not the burden a teacher bears, but rather a product of a thoughtful recruitment and selection process, effective training, and supervision and coaching systems that focus on strengths and build competence and confidence.

This means that fidelity assessment processes and fidelity data help inform and engage everyone, from district staff to instructional coaches to building administrators and teachers, as new skills are implemented and refined. Results can be strengths-based, reinforcing the progress that has been made. Likewise, results can responsively guide a staff and organizational development plan for improved practices and skills.

1. U.S. Department of Education, Office of Planning, Evaluation and Policy Development, Policy and Program Studies Service. (2011). Prevalence and implementation fidelity of research-based prevention programs in public schools: Final report (p. 58). Washington, D.C.: U.S. Department of Education.