Articles, Books and Reports

Allison Metz, Tamara Halle, Leah Bartley, A. Blasberg
Becci Akin, Stephanie Bryson, Mark Testa, Karen Blase, Tom McDonald, Heidi Melz
The field of child welfare faces an undersupply of evidence-based interventions to address long-term foster care. The Permanency Innovations Initiative (PII) is a five-year federal demonstration project intended to generate evidence for reducing long stays in foster care among youth who encounter the most substantial barriers to permanency. This article describes a systematic and staged approach to the implementation and evaluation of a PII project that included usability testing as one of its key activities. Usability testing is an industry-derived practice that analyzes early implementation processes and evaluation procedures before they are finalized. This article describes the iterative selection, testing, and analysis of nine usability metrics designed to assess three important constructs of the project's initial implementation and evaluation: intervening early, obtaining consent, and engaging parents. Results showed that seven of the nine metrics met a predetermined target. This study demonstrates how findings from usability testing influenced the initial implementation and formative evaluation of an evidence-supported intervention. Implications are discussed for usability testing as a quality improvement cycle that may contribute to better operationalized interventions and more reliable, valid, and replicable evidence.
Allison Metz, Karen Blase, Leah Bartley, Dawn Wilson, Phil Redmond, Karin Malm
This is the second brief in a series, Building a Post-Care Service System in Child Welfare: Lessons Learned from the Frontlines of Implementation Science in Catawba County. This brief describes how implementation science principles informed technical assistance strategies used in Catawba County to support the full and effective use of evidence-based and evidence-informed practices (EBPs/EIPs). Topics include building the capacity of local implementation teams, conducting stage-appropriate activities, and creating an implementation infrastructure to sustain new interventions.
Allison Metz, Douglas Easterling
This article focuses on two specific tools from implementation science: the practice profile and the Implementation Drivers Assessment. The practice profile answers the question, "What does the strategy require of particular foundation staff?" The Implementation Drivers Assessment explores the broader question, "What does the strategy require in the way of organizational change within the foundation?"
Sandra Naoom
Students cannot benefit from what they do not experience. Multiple reasons exist for why an intervention may not be delivered as it was designed. In this era of educational accountability and limited dollars to go around, understanding how an intervention is delivered in the classroom is key to understanding program outcomes. In order to assess whether a program has been implemented as intended, an assessment of fidelity is needed. However, assessing fidelity is complex given varying conceptual interpretations, which in turn foster inconsistent application of methods to measure the construct. Additionally, the methods for validating fidelity measures are still unclear. The current study evaluated the reliability and validity of the student Instructional Pedagogy (10 items) and Instructional Student Engagement (15 items) scores for use in assessing teachers' fidelity of implementation on the participant responsiveness component of fidelity. The sample consisted of over 5,000 responses to an online fidelity of implementation questionnaire from students and 242 teachers in Mathematics and Science across three school districts and 41 schools. Because students were nested within teachers, the data structure was multilevel, which warranted conducting the psychometric analyses within a multilevel framework. Instructional Pedagogy is represented by 10 items that measure three factors; multilevel confirmatory factor analysis was used to test a two-level model with three factors at the student level and three factors at the teacher level. Instructional Student Engagement is represented by 15 items that measure four factors; multilevel confirmatory factor analysis was used to test a two-level model with four factors at the student level and four factors at the teacher level. The psychometric results of the student questionnaire assessing the student engagement components of fidelity were mixed. Support for the factorial validity of the multilevel student models was mixed, with model fit indicating that some of the measured variables did not load strongly on their respective factors and that some of the factors lacked discriminant validity. Lastly, the correlations between students' and teachers' scores for both the observed and latent variables (ranging from -.15 to .72 in Mathematics and -.07 to .41 in Science) displayed limited convergent validity.
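As background for the two-level measurement models described above, a generic multilevel CFA (a sketch of the standard within/between decomposition, not necessarily the author's exact specification) splits the item responses of student i nested in teacher j into between- and within-level parts, with factors and loadings estimated at each level:

\begin{align}
  y_{ij} &= \nu + \Lambda_B\,\eta_j^{(B)} + \varepsilon_j^{(B)} + \Lambda_W\,\eta_{ij}^{(W)} + \varepsilon_{ij}^{(W)}\\
  \Sigma_B &= \Lambda_B \Psi_B \Lambda_B^{\top} + \Theta_B, \qquad
  \Sigma_W = \Lambda_W \Psi_W \Lambda_W^{\top} + \Theta_W, \qquad
  \Sigma_{\mathrm{total}} = \Sigma_B + \Sigma_W
\end{align}

Under this formulation, \eta^{(W)} and \eta^{(B)} would hold the three Instructional Pedagogy factors (or four Instructional Student Engagement factors) at the student and teacher levels, respectively, and model fit reflects how well \Lambda_W and \Lambda_B reproduce the pooled within-teacher and between-teacher covariance matrices.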
Allison Metz, Bianca Albers
Over the last 20 years, there has been a growing emphasis on developing and identifying evidence-based programs and practices for children and families, and within the last decade an increasing number of federally funded initiatives have been dedicated to replicating and scaling evidence-based programs in the hope of achieving socially meaningful impact. However, only recently have efforts to promote high-fidelity implementation received the attention needed to ensure evidence-based practices are used as intended and generate the outcomes they were designed to produce. In this article, we propose that the wide-scale implementation of evidence-based practices requires: (1) careful assessment and selection of the “what”; (2) a stage-based approach that provides adequate time and resources for planning and installation activities; (3) the co-creation of a visible infrastructure by a triad of key stakeholders, including funders and policymakers, program developers, and implementing sites; and (4) the use of data to guide decision-making and foster continuous improvement among grantees. Each of these strategies is explored in greater detail through the lens of the Teen Pregnancy Prevention (TPP) Program, a $100 million initiative overseen by the Office of Adolescent Health (OAH) in the U.S. Department of Health and Human Services.
Dean Fixsen, Vicky Scott, Karen Blase, Sandra Naoom, Lori Wagar
As the evidence-based movement has advanced in public health, changes in public health practice have lagged far behind, creating a science-to-service gap. For example, science has produced effective falls prevention interventions for older adults. It is now clearer WHAT needs to be done to reduce injury and death related to falls. However, issues have arisen regarding HOW to assure the full and effective use of evidence-based programs in practice.

Summary: Lessons learned from the science and practice of implementation provide guidance for how to change practices by developing new competencies, how to change organizations to support evidence-based practices, and how to change public health systems to align system functions with desired practices. The combination of practice, organization, and system change likely will produce the public health benefits that are the promise of evidence-based falls prevention interventions.

Impact on Public Health: For the past several decades, the emphasis has been solely on evidence-based interventions. Public health will benefit from giving equal emphasis to evidence-based implementation.

Impact on Industry: We now have over two decades of research on the effectiveness of falls prevention interventions. The quality of this research is judged by a number of credible international organizations, including the Cochrane Collaboration (http://www.cochrane.org/), the American and British Geriatrics Societies, and the Campbell Collaboration (http://www.campbellcollaboration.org/). These international bodies were formed to ponder and answer questions related to the quality and relevance of research. These developments are a good first step. However, while knowing WHAT to do (an evidence-based intervention) is critical, we also need to know HOW to implement the evidence effectively. Implementation, organization change, and system change methods produce the conditions that allow and support the full and effective use of evidence-based interventions. It is time to focus on the utilization of implementation knowledge in public health. Without this focus, the vast amount of new evidence being generated on the prevention of falls and related injuries among older adults will have little impact on their health and safety.