Articles, Books and Reports

Dean Fixsen, Karen Blase, Melissa Van Dyke
Evidence-based programs have struggled for acceptance in human service settings. Information gleaned from these experiences indicates that implementation is the missing link in the science-to-service chain. The science and practice of implementation are progressing and can inform approaches to full and effective uses of youth violence prevention programs nationally. Implementation Teams that know (a) innovations, (b) implementation best practices, and (c) improvement cycles are essential to mobilizing support for successful uses of evidence-based programs on a socially significant scale. The next wave of development in implementation science and practice is underway: establishing infrastructures for implementation to make implementation expertise available to communities nationally. Good science, implemented well in practice, can benefit all human services, including youth violence prevention.
Lauren Supplee, Allison Metz
Despite a robust body of evidence of the effectiveness of social programs, few evidence-based programs have been scaled for population-level improvement in social problems. Since 2010 the federal government has invested in evidence-based social policy by supporting a number of new evidence-based programs and grant initiatives. These initiatives prioritize federal funding for intervention or prevention programs that have evidence of effectiveness in impact research. The increased attention to evidence in funding decision making is promising; however, to maximize the potential for positive outcomes for children and families, communities need to select programs that fit their needs and resources, the programs need to be implemented with quality, and communities need ongoing support. Drawing on experiences scaling evidence-based programs nationally, the authors raise a number of challenges faced by the field to ensure high-quality implementation and discuss specific proposals, particularly for the research and university communities, for moving the field forward. Recommendations include designing and testing intervention and prevention programs with an eye toward scaling from the start, increasing documentation related to implementation of the programs, and working toward an infrastructure to support high-quality, effective dissemination of evidence-based prevention and intervention programs.
Lauren Supplee and Allison Metz
Drawing on experiences scaling evidence-based programs nationally, the authors of this Social Policy Report raise a number of challenges faced by the field to ensure high-quality implementation and discuss specific proposals, particularly for the research and university communities, for moving the field forward.
Allison Metz
The purpose of this piece is to provide the research and rationales behind Practice Profiles. To achieve outcomes and develop effective implementation supports, innovations need to be “teachable, learnable, doable, and assessable.” Practice Profile methodology facilitates the development of innovations and their necessary infrastructure. Specific training on NIRN’s Practice Profile methodology can be found on the Active Implementation Hub.
William Erchul, Caryn Ward
Dean Fixsen, Karen Blase, Allison Metz, Sandra Naoom
Melanie Barwick, Katherine Boydell, Elaine Stasilius, H. Ferguson, Karen Blase, Dean Fixsen
Children with emotional and behavioural disorders should be able to count on receiving care that meets their needs and is based on the best scientific evidence available; however, many do not receive these services. Implementation of evidence-based practice (EBP) relies, in part, on the research utilization practices of mental health care providers. This study reports on a survey of research utilization practices among 80 children's mental health (CMH) service provider organizations in Ontario, Canada. Methods: A web-based survey was distributed to 80 CMH service provider organizations, to which 51 executive directors and 483 children's mental health practitioners responded. Research utilization was assessed using questions with Likert-type responses based on the Canadian Health Services Research Foundation's Four-A's approach: access, assess, adapt, apply. Results: There was general agreement among executive directors and practitioners regarding the capacity of their organizations to use – access, assess, adapt, and apply – research evidence. Overall, both groups rated their organizations as using research information 'somewhat well.' The low response rate to the practitioner survey should be noted. Conclusion: These findings provide a useful benchmark from which changes in reported research utilization in the Ontario CMH sector can be tracked over time, as a function of EBP training and implementation initiatives, for instance. The need to improve access to research evidence should be addressed because it relates to the eventual implementation and uptake of evidence-based practices. Communities of practice are recommended as a strategy that would enable practitioners to build capacity in their adaptation and application of research evidence.
Dean Fixsen, Karen Blase, Rob Horner, George Sugai
Students cannot benefit from education practices they do not experience. While this seems obvious (and it is), education systems have yet to develop the capacity to help all teachers learn to make good use of evidence-based practices that enhance the quality of education for all students. The purpose of this brief is to provide a framework that state leadership teams and others can use to develop the capacity to make effective, statewide, and sustained use of evidence-based practices and other innovations.