Chapter 12

Dissemination and Implementation of Behavioral Treatments for Anxiety in ASD

Amy Drahota1,2, Colby Chlebowski2,3, Nicole Stadnick2,3, Mary Baker-Ericzén2,4 and Lauren Brookman-Frazee2,3,    1Michigan State University, East Lansing, MI, United States,    2Child & Adolescent Services Research Center, San Diego, CA, United States,    3University of California, San Diego, CA, United States,    4Rady Children’s Hospital, San Diego, CA, United States

Abstract

Anxiety is a common co-occurring condition in ASD that contributes to the complexity of a youth’s clinical presentation, functioning, and service needs. Although there is a rapidly growing body of evidence for the efficacy of cognitive behavioral therapy for anxiety in ASD, there is a well-documented gap between research-based interventions and routine care. This gap illustrates the potential limited public health impact of interventions developed in laboratory settings. In this chapter, we first provide recommendations for the consideration of anxiety intervention developers to accelerate the bidirectional translation between research and routine care. Second, we introduce dissemination and implementation science (and provide a glossary of implementation terms) as an additional method to address this gap by focusing on developing and testing strategies to improve the systematic process of implementing evidence-based interventions (EBIs) in routine care settings. Specifically, research-based implementation frameworks highlight the multiple phases of implementation (initial exploration and adoption through sustained delivery) and the multi-level contexts to be considered during implementation of EBIs in routine care settings (systems, service organizations, providers, children/families). Lastly, we discuss opportunities at each of these levels and phases of implementation to facilitate EBI implementation through the use of the ACT SMART Implementation Toolkit, a comprehensive implementation strategy developed for routine care settings delivering services to individuals with ASD.

Keywords

Autism spectrum disorder; evidence-based intervention; ACT SMART Implementation Toolkit; cognitive behavioral therapy; implementation; dissemination

Gaps Between Research and Routine Care for Anxiety in Youth With Autism

Anxiety is a common co-occurring condition in autism spectrum disorder (ASD) that contributes to the complexity of a youth’s clinical presentation, functioning, and service needs (Joshi et al., 2010). Although there is a rapidly growing body of evidence for the efficacy of cognitive behavioral therapy for anxiety in youth with ASD (Wood et al., 2015a), there is also a well-documented gap between research-based interventions and routine care. Evidence-based interventions (EBIs) for ASD have traditionally been difficult to transport from university laboratory settings to community settings (Garland et al., 2013; Dingfelder & Mandell, 2011), and recent small-scale studies have found that EBIs are not consistently implemented in routine, community settings (Brookman-Frazee et al., 2010). Contributing to this research-to-practice gap, youth with ASD and anxiety often receive services simultaneously from multiple community service systems targeting core and associated symptoms of ASD (Brookman-Frazee et al., 2009; Goin-Kochel et al., 2007). For example, providers from multiple disciplines deliver services to youth with ASD, including psychologists, behavioral therapists, educators, pediatricians, nurses, speech-language pathologists, occupational therapists, physical therapists, audiologists, neurologists, and social workers (McLennan et al., 2008), each with diverse and unique characteristics, such as education level, training, and attitudes toward delivering EBIs. In addition, each provider may use different terminology or identify different needs for intervention, bringing their own perspectives on community care for youth with ASD (Volkmar et al., 2011). For instance, mental health clinicians may have less specialized diagnostic or ASD intervention-specific training, which poses challenges to treating youth with ASD who have co-occurring psychiatric conditions, whereas community providers who specifically serve populations with ASD or developmental disabilities often are not trained to provide mental health services.

Additionally, the service systems in which these providers work vary in their policies, organizational structure, intervention utilization, and funding support. As a result, youth with ASD and co-occurring anxiety are often provided services from systems that operate in a balkanized fashion and all too frequently yield fragmented care (Christon et al., 2015; Cidav et al., 2013; Swiezy et al., 2008). A recent study found that clinicians across disciplines reported using or recommending intervention elements based on applied behavior analysis (ABA) principles (e.g., reinforcement, visual supports, and task analysis) most frequently for youth with ASD and co-occurring anxiety, with cognitive behavioral therapy (CBT) used or recommended the least (Christon et al., 2015). Even more surprising, clinicians reported providing and recommending play therapy significantly more often than CBT for youth with ASD, despite the efficacy literature supporting CBT for youth with ASD and co-occurring anxiety symptoms (see Scarpa et al., 2013 and Wood et al., 2011 for efficacy data for CBT).

While it is unclear whether clinicians do not provide or recommend CBT because of a lack of provider knowledge and training or because of a lack of readily available treatment models within the community, these findings demonstrate the gap between research and practice that has been a prominent discussion within the mental health literature for over a decade (Warren et al., 2010), and highlight the ongoing challenges in translating research knowledge into clinical practice (Chambers et al., 2013; Garland et al., 2013). Fortunately, recent paradigm shifts in the field of psychology have pushed researchers to look beyond the question of efficacy—the extent to which interventions achieve outcomes under ideal circumstances—to translational science, which focuses on testing the effectiveness of interventions, the success of community clinicians who use them, and the implementation of EBIs within community agencies and systems (Dingfelder & Mandell, 2011; Feldman, 2008; Wood et al., 2015b). Two time points are particularly pivotal for the translation of EBIs into community settings: (1) during intervention development and (2) when facilitating implementation of existing EBIs within community care settings.

Developing Effective Anxiety Interventions for Youth With Autism

In an effort to accelerate the bidirectional translation between research and routine care delivery, care must be taken at the intervention development stage to ensure the fit between the intervention and community settings (Dingfelder & Mandell, 2011). In order to do so, we offer the following recommendations to intervention developers:

1. Consider the potential end users of an intervention at the outset of intervention development and/or adaptation. What are the characteristics and needs of youth/families, clinicians, organizations, and systems that may impact intervention uptake, sustained delivery, and effectiveness? How can a given intervention address the needs of youth, clinicians, organizations, and systems? How might the intervention fit within the service context, and are adaptations to the intervention necessary? How may the clinical specialty (ASD vs mental health disorders) of clinicians impact the delivery of the intervention? What existing funding mechanisms are available to cover the cost of the intervention? Partnering with community stakeholders early in the intervention development and implementation process and systematically collecting data on the service context should help facilitate understanding of the needs and constraints of the service system (Drahota et al., 2016).

2. Assume adaptations are necessary and will occur. Ascertainment of efficacy research samples may result in relatively homogeneous samples that are not representative of the target population. Furthermore, the context of routine care is likely very different from tightly controlled research contexts. As such, assume that adaptations may be needed to facilitate fit and adoption. Which adaptations are fidelity-consistent versus fidelity-inconsistent (Wiltsey-Stirman et al., 2013)? Are there ways to simplify the intervention? Consider what the hypothesized core mechanisms of change for the intervention are, and pare down the intervention to include only those components (e.g., exposure, behavioral rehearsal).

3. Combine clinical interventions with implementation strategies. Translation from research to practice does not simply involve providing a treatment manual or conducting workshops with clinicians. Implementation strategies (Powell et al., 2012; Powell et al., 2015; Proctor et al., 2011; Drahota et al., 2014b) are used in combination with clinical interventions to support their systematic implementation in routine care. Consider dissemination and implementation frameworks and research-based implementation strategies at multiple levels to facilitate the adoption, initial uptake, and sustained use of EBIs in ASD community care settings.

Implementing Existing Anxiety Interventions With Youth With Autism

Born out of the need to address the substantial delay (up to 17 years) in translating research findings to practice and the small proportion (approximately 14%) of findings that ever reach practice, and thereby to increase the public health impact of EBIs (Balas, 1998; Balas & Boren, 2000), dissemination and implementation (D&I) science (see Table 12.1) is an emerging field that provides theoretical and empirical infrastructure for the promotion of EBI transportation to community settings (Rabin & Brownson, 2012). Specifically, dissemination is defined as “the targeted distribution of information and intervention materials to a specific public health or clinical practice audience. The intent is to spread knowledge and the associated evidence-based interventions” (National Institutes of Health Program Announcement, 2016). Implementation science “is the scientific study of methods to promote the systematic uptake of research findings and other evidence-based practices into routine practice, and thereby improve the quality and effectiveness of health services and care” (p. 1; Eccles & Mittman, 2006). Recognizing the complex interplay of variables integral to transporting an EBI to community practice, D&I science necessitates consideration of multiple levels of influence, including the youth and families accessing care, the clinicians delivering care, the agencies or organizations through which care is delivered, and the healthcare policy that affects service delivery (Bauer et al., 2015).

Table 12.1

Glossary of implementation science terms

Term Definition Reference
Adaptation Planned or purposeful changes to the design or delivery of an EBI, as opposed to modifications that may be unintentional deviations Wiltsey-Stirman et al. (2013)
Clinician (or Provider) Individuals providing a broad array of services in any usual care service sector (therapists, clinicians, case managers, paraprofessionals, educational staff)
De-Implementation To reduce the use of strategies and interventions that are not evidence-based, have been prematurely widely adopted, yield sub-optimal benefits for patients, or are harmful or wasteful National Institutes of Health Program Announcement (2016)
Dissemination Active, intentional efforts aimed to encourage specified groups to adopt an innovation, which can be a new clinical program or practice Greenhalgh et al. (2004)
Implementation Active, intentional efforts to embed a clinical program or practice (innovation) within an organization or service system Greenhalgh et al. (2004)
Implementation Model (or D & I Framework) A conceptual framework of distinct factors that are hypothesized to strongly influence the implementation of evidence-based practices in service systems Aarons et al. (2011), Tabak et al. (2012)
Implementation Science The scientific study of methods to promote the systematic uptake of EBIs into routine clinical care settings with the overarching aim of improving the quality and effectiveness of health services Eccles and Mittman (2006)
Implementation Strategies Methods or techniques used to enhance the adaptation, implementation, and sustainability of an evidence-based clinical program or practice Powell et al. (2015), Proctor et al. (2013)
Implementation Team Agency leaders, designated agency staff members (e.g., clinicians), and other stakeholders who work through the multiple phases of implementation often by meeting, completing activities, and conducting actions necessary for implementation and sustainment of an EBI  
Sustainment The (a) capacity to deliver and/or (b) maintain core elements of an intervention (e.g., remain recognizable or delivered at a sufficient level of fidelity or intensity to yield desirable results) after initial implementation support has been withdrawn Wiltsey-Stirman et al. (2012)

Several comprehensive, phased, and multilevel implementation frameworks have been developed (e.g., Aarons et al., 2011; Damschroder et al., 2009; Greenhalgh et al., 2004) that denote specific “inner” (e.g., consumer, provider characteristics) and “outer” (e.g., policy, fiscal environment) context factors that may facilitate or hinder the adoption, implementation, and sustainment of EBIs in community care settings. An overarching goal of D&I frameworks is to maximize the “fit” between the EBI and the service context in which it is to be implemented. This is done through comprehensive study of routine care practices and service systems (Drahota et al., 2012), including characteristics specific to the healthcare policy and legislative landscape and service system (outer context), and to the organization, providers, and individual with ASD (inner context). One promising method for facilitating “fit” between EBIs and organizational contexts is the development of strong, well-defined community–academic partnerships with community stakeholders (e.g., caregivers, service providers, clinicians, clinical supervisors, agency leaders) (Garland & Brookman-Frazee, 2015; Drahota et al., 2016).

Comprehensive understanding and ongoing monitoring of the contextual landscape of a service system is essential to facilitate the adoption, uptake, and sustained delivery of EBIs without compromising treatment adherence, and to facilitate the de-adoption of practices that are not evidence-based (Bauer et al., 2015). For example, Drahota and colleagues (2015) utilized a mixed-methods design to evaluate factors that hinder and facilitate EBI implementation in community settings for youth with ASD. Agency leaders reported that no existing systematic implementation process fits the ASD usual care setting, resulting in a lack of structure and consistency in implementation efforts. Further, agency leaders reported uncertainty about which implementation strategies best facilitate EBI adoption, uptake, and sustained use.

Utilizing a theory-based implementation framework and an ASD-specific implementation toolkit that addresses the unique client, provider, and contextual factors of ASD community service systems is a promising method for building upon existing EBIs while also accelerating D&I efforts for this population. The following section describes a D&I framework and corresponding implementation toolkit—the ACT SMART Implementation Toolkit—used to improve the implementation of EBIs within ASD community agencies.

Adapted EPIS Model of Implementation

The EPIS (Exploration, Preparation, Implementation, Sustainment) model of implementation (Aarons et al., 2011) is a multi-phased, multi-level framework of implementation processes that was used to guide the development of the ACT SMART Implementation Toolkit. Recent discussions with ASD community stakeholders yielded an adapted EPIS conceptual model to better fit the needs of ASD community agencies (Drahota et al., 2012). Taken together, the five phases of the adapted EPIS (see Table 12.2) provide useful guidance for consideration of both inner and outer contextual factors that may influence the implementation of an EBI within a specific service setting and organizational context.

Table 12.2

Adapted EPIS implementation framework (adapted from Aarons, Hurlburt, and Horwitz, 2011) with ACT SMART Implementation Toolkit steps and activities

Adapted EPIS phases; ACT SMART Implementation Toolkit: web-based interface (steps and activities) and facilitation meetings
Phase 1: Exploration Step 1: Conduct agency assessment Activity 1: Encourage staff participation in the organizational needs assessment Activity 2: Form implementation team, if needed

• 12 monthly 30–60 minute meetings

• Agency implementation team and ACT SMART facilitator collaborate to move through ACT SMART phases and activities

• Structured facilitation meetings to review steps, phases and activities; troubleshoot previous action items; introduce next steps and phases; and plan for future steps

Step 2: Evaluate receptivity to implementing new EBI
Phase 2: Adoption Decision Step 1: Identify appropriate EBI(s) Activity 1: Identify EBI(s) to meet agency need
Step 2: Evaluate EBI and provider factors Activity 1: EBI fit
Activity 2: EBI feasibility
Activity 3: Clinical value and research validity
Activity 4: Training requirements
Activity 5: Funding source
Activity 6: Benefit-cost estimator
Step 3: Adoption decision Activity 1: Synthesize and weigh factors Activity 2: Formally make an adoption decision
Phase 3: Preparation Step 1: Develop prospective adaptation plan Activity 1: Gather EBI materials
Activity 2: Evaluate possible adaptations to EBI
Activity 3: Adaptation planning worksheet
Step 2: Develop training plan Activity 1: Training plan worksheet
Step 3: Develop implementation plan Activity 1: Implementation plan worksheet
Phase 4: Implementation Step 1: Conduct adaptation plan Activity 1: Develop concrete tasks and establish due dates
Step 2: Conduct training plan
Step 3: Conduct implementation plan
Step 4: Task evaluation Activity 1: Evaluate tasks from Steps 1–3
Phase 5: Sustainment Step 1: Evaluate implementation success Activity 1: Synthesize task evaluations
Step 2: Evaluate current sustainment Activity 1: Identify current sustainment practices
Step 3: Develop sustainment plan Activity 1: Sustainment planning


Outer Context. Outer contextual factors include the sociopolitical context, policy, advocacy, funding, and inter-organizational networks. For example, recent healthcare legislation, namely the Patient Protection and Affordable Care Act (Patient Protection and Affordable Care Act, 2010), has a core goal of improving access to and quality of healthcare while lowering per capita costs. Moreover, mental health parity is highlighted within the Affordable Care Act and has significant potential to shape mental health service access, quality, and receipt for children with ASD. Policies such as these shape the outer context to facilitate greater mental health coverage for individuals, including youth with ASD who are also experiencing co-occurring anxiety disorders. As a result, service systems have been pushed by policy to provide evidence-based interventions to ameliorate anxiety symptoms and the related interference. A second example of important outer contextual factors for the ASD population is parent networks. Parent networks have played an important role in advocating for greater intervention services to treat the challenging behaviors and anxiety disorders often comorbid with ASD. The strength of ASD advocacy networks can be seen in the increase in public and private funding through organizations such as Autism Speaks and the Interagency Autism Coordinating Committee, which offer guidance to the National Institutes of Health on funding and research priorities.

Inner Context. The inner context of the EPIS varies by phase and commonly includes organizational-, leadership-, provider-, and client-level characteristics. For example, organizational culture—an organizational-level characteristic—has been defined as “the behavioral expectations that members of an organization are required to meet in their work environment” (Verbeke et al., 1998; Glisson et al., 2013), and is associated with an organization’s receptivity to a new EBI. That is, if the organizational culture is receptive to identifying new EBIs that benefit clinicians or clients, a new EBI is more likely to be implemented and sustained successfully, thereby increasing the quality of service delivery and positive patient outcomes (Aarons et al., 2012; Glisson & Green, 2006; Glisson & Hemmelgarn, 1998). The converse may be true as well; organizations with a culture of low receptivity to innovations are likely to experience greater barriers to adoption, uptake, and sustained use, even when the EBI meets client needs or is advantageous for the agency. Additional inner contextual factors important for implementation include staffing and staff characteristics, EBI fit with agency values and needs, and EBI adaptability and training availability (Aarons et al., 2012).

While the adapted EPIS provides guidance for the implementation of EBIs within ASD community settings by emphasizing the need to consider both outer and inner contextual factors prior to and throughout the implementation process (Aarons et al., 2011), it does not offer concrete tools to support agency leaders’ implementation of EBIs. As a result, the ACT SMART Implementation Toolkit was developed through a community–academic partnership to provide a systematized approach for progressing through the multi-phased EPIS (Table 12.2).

The ACT SMART Implementation Toolkit

The Autism Community Toolkit: Systems to Measure and Adopt Research-Based Treatments (ACT SMART Implementation Toolkit) (Drahota et al., 2014b) is an evidence-informed, comprehensive implementation toolkit designed to assist ASD agency leaders and clinicians to efficiently and effectively implement EBIs and de-implement unsupported strategies. It comprises a web-based implementation interface, with steps and activities that guide clinicians through the adapted EPIS phases, accompanied by monthly facilitation meetings in which ACT SMART facilitators provide consultation on the processes of implementation, de-implementation, and sustainment. Facilitation is a consultation method that emphasizes change through encouragement and action promotion (Kitson et al., 1998; Kitson & Harvey, 2015); it has been found to help staff change work practices and behaviors successfully (Rycroft-Malone et al., 2002; Stetler et al., 2006) and, in previous implementation studies, to improve the capacity of community-based agencies to plan, implement, and evaluate new EBIs (Hunter et al., 2009).

Phase 1: Exploration. The aims of the first phase of the ACT SMART Implementation Toolkit are to (1) identify practice and service delivery gaps within an organization by conducting a comprehensive organizational needs assessment to determine areas of strength and possible growth for the agency, (2) identify recommendations for next steps, and (3) assist with prioritizing agency goals. Factors important to consider when assessing an organization include client needs, the effectiveness of interventions currently delivered within the agency, perceived competency in delivering current interventions, staff attitudes toward innovations, staff training and attributes, agency resources, fidelity and performance measurement, organizational context, climate, and culture, and previously used implementation and sustainment strategies. A structured and systematic method of conducting the organizational needs assessment offers the greatest utility and ease. The ACT SMART Implementation Toolkit utilizes an assessment battery to achieve these goals. Numerous surveys are available for use and, fortunately, with the continued development of implementation science as a discipline, measures and assessment tools are being placed in web-based repositories for use by researchers and clinicians (see the National Cancer Institute’s Grid-Enabled Measures Database and the Society for Implementation Research Collaboration—Instrument Review Project).

The ultimate goal of Phase 1 is to determine an agency’s receptivity to implementing a new EBI. This is an important consideration because, if an agency is not receptive to implementing a new EBI, resources would be better spent improving the quality and organizational structure of the agency by recommending quality improvement literature and tools (cf. Belson, 2014; Ovretveit, 2014). For agencies receptive to implementing new EBIs, an agency implementation team is formed and an ACT SMART facilitator is assigned to guide the implementation team through the remainder of the ACT SMART Implementation Toolkit.

Phase 2: Adoption Decision. The goals of Phase 2 are for the implementation team to identify an EBI that matches agency-identified needs and to make an informed decision about whether to adopt the EBI for use within the agency. Using the information identified in Phase 1, ACT SMART facilitators guide agency implementation teams through three steps: (1) identify an appropriate EBI to meet the needs derived from the organizational assessment; (2) evaluate EBI characteristics to determine whether the intervention would meet client, clinician, and agency needs; and (3) synthesize this information to make a systematic decision about whether or not to adopt the EBI. Specifically, during Step 1, the ACT SMART facilitator supports implementation team members’ efforts to explore the literature and available EBI repositories to identify an intervention that will match the need identified through the organizational assessment. For example, if the organizational assessment indicates that the agency's clients with ASD are experiencing anxiety and that clinicians are not effectively treating anxiety symptoms, implementation teams may wish to explore the current literature and available manuals to meet this agency-level need. By reading the literature and obtaining anxiety CBT manuals, the implementation team will be able to evaluate important characteristics of the intervention before making a formal adoption decision (Step 3).

Once an EBI has been identified and intervention resources have been obtained, implementation team members evaluate treatment and provider factors (Step 2) that have been found to influence the EBI adoption decision. Adoption decision factors include EBI fit and feasibility with the agency’s context and structure, the clinical value and research validity of the EBI, training requirements and availability, EBI funding sources, and the anticipated benefits of delivering a new EBI relative to its costs. Although these factors were identified through interviews with agency leaders (Drahota et al., 2015), each factor may have variable influence over the final adoption decision for a given implementation team. Thus, the ratings for each of these factors are synthesized and weighted by implementation teams to assist with making a systematic and informed adoption decision. Once an adoption decision is made, implementation teams either move onward to Phase 3 or return to evaluate a different EBI for possible adoption.
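
The synthesis in Step 3 is, at its core, a weighted scoring exercise. The brief Python sketch below is purely illustrative and is not part of the ACT SMART Toolkit; it shows one way a team’s factor ratings might be combined into a single adoption score. The factor names follow the text above, but the 1–5 rating scale, the weights, and the decision threshold are hypothetical choices an implementation team would set for itself.

```python
# Illustrative sketch only: combining adoption-decision factor ratings.
# The factors mirror those named in the text; the 1-5 scale, weights,
# and decision threshold are hypothetical, not ACT SMART specifications.

# Team ratings for a candidate EBI (1 = poor, 5 = excellent).
ratings = {
    "fit_with_agency": 4,
    "feasibility": 3,
    "clinical_value_and_validity": 5,
    "training_requirements": 2,   # higher = requirements are easier to meet
    "funding_availability": 3,
    "benefit_vs_cost": 4,
}

# Weights reflect how much influence the team assigns to each factor;
# they need not be equal and will differ across agencies.
weights = {
    "fit_with_agency": 0.25,
    "feasibility": 0.20,
    "clinical_value_and_validity": 0.20,
    "training_requirements": 0.10,
    "funding_availability": 0.10,
    "benefit_vs_cost": 0.15,
}

def weighted_adoption_score(ratings: dict, weights: dict) -> float:
    """Return the weighted mean rating across adoption-decision factors."""
    total_weight = sum(weights.values())
    return sum(ratings[f] * weights[f] for f in ratings) / total_weight

score = weighted_adoption_score(ratings, weights)
print(f"Weighted adoption score: {score:.2f} out of 5")

# Hypothetical decision rule: adopt if the weighted score clears a threshold
# the team agreed on in advance; otherwise return to Step 1 and evaluate a
# different EBI.
ADOPTION_THRESHOLD = 3.5
print("Adopt" if score >= ADOPTION_THRESHOLD else "Re-evaluate another EBI")
```

In practice, the weighting and the final decision remain a team judgment made with the ACT SMART facilitator; the arithmetic simply makes the trade-offs explicit.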

Phase 3: Preparation. The goals of the preparation phase are to (1) plan for EBI adaptation prior to staff training and implementation; (2) develop the staff training plan; and (3) select and plan for the use of specific implementation strategies. Especially with complex interventions developed in university settings, adaptations will be needed for the successful implementation of an EBI in community agencies. The literature suggests that clinicians often do not use EBIs because of disbelief in the research base, difficulty with implementation, the requirement to change behaviors, manuals that cannot accommodate the complexity of usual care clinical populations, and a lack of infrastructure to support EBI implementation (Fixsen et al., 2005). Moreover, strict protocol adherence may be at odds with the implementation of EBIs in usual care settings (Aarons et al., 2012) by limiting the opportunity to tailor EBIs to clients’ needs and culture and to the community in which the service is being delivered (Nock et al., 2003; Lyon et al., 2014; Lau, 2006; Berwick, 2003). The purpose of EBI adaptation is to promote a better fit between the EBI as originally tested and both the service setting where it will be implemented and the target population it will serve (Aarons et al., 2011; Lundgren et al., 2011; McHugh et al., 2009). However, caution should be taken to ensure that adaptations do not interfere with core elements of the EBI (Aarons et al., 2012). Therefore, it is important to classify the types of EBI adaptations needed and to examine how these changes may potentially impact key outcomes in a broad set of circumstances and a variety of settings (Wiltsey-Stirman et al., 2012, 2013).

Specifically, implementation teams are asked to consider whether adaptations to the EBI will be necessary for its use within their agency and whether those adaptations apply to the content of the EBI and/or to the delivery context. Adaptations to content include, for example, tailoring or refining the intervention (e.g., changing terms or language, modifying worksheets); shortening session duration, frequency, or number; or adjusting the order of the intervention modules, topics, or segments. Adaptations to the context may include changing the setting (e.g., delivering the EBI in a school rather than a clinic) or changing the format in which the treatment is delivered (e.g., offering an individual treatment in a group or telephone format). Once implementation teams have considered the adaptations needed to deliver the EBI within their agency, they are asked to consider the reasons for making each adaptation, identify any specific concerns about the adaptation, identify how each concern will be addressed, and then specify how the adaptation will be made and by whom.
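
To make the adaptation-planning questions above concrete, the sketch below records a single planned adaptation as a structured entry. The fields mirror the considerations listed in the text (what will change, whether the change is to content or context, the reason, the concern, how the concern will be addressed, and who is responsible); the field names and the example values are hypothetical rather than drawn from the actual ACT SMART adaptation planning worksheet.

```python
from dataclasses import dataclass
from typing import Literal

# Illustrative sketch of one planned adaptation. Field names and the example
# below are hypothetical; they simply mirror the considerations described in
# the text, not the actual ACT SMART worksheet format.

@dataclass
class PlannedAdaptation:
    description: str                       # what will change
    target: Literal["content", "context"]  # EBI content vs delivery context
    reason: str                            # why the adaptation is needed
    concern: str                           # possible risk (e.g., drift from core elements)
    concern_mitigation: str                # how the concern will be addressed
    responsible_person: str                # who will make the change

example = PlannedAdaptation(
    description="Shorten CBT sessions from 90 to 60 minutes",
    target="content",
    reason="Agency billing structure only supports 60-minute visits",
    concern="Less time for graded exposure practice within sessions",
    concern_mitigation="Assign exposure practice as structured home activities",
    responsible_person="Clinical supervisor",
)

print(example)
```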

The Preparation phase also includes developing a comprehensive staff training plan. Training the staff who will be involved in delivering the EBI is a crucial step in the implementation process. The quality of staff training can greatly impact the implementation, effectiveness, and ongoing fidelity of the EBI being delivered, yet training is often a difficult task (Beidas et al., 2012; Drahota et al., 2014a). To this end, training planners, such as the ACT SMART Training Plan worksheet, can guide implementation teams through systematically planning each component of training.

The third step in the Preparation phase is to develop an implementation plan. Passive implementation efforts often lead to limited implementation or discontinuation of EBIs (Powell et al., 2015). Therefore, many evidence-based implementation strategies have been developed and researched to support the successful implementation of EBIs (Fixsen et al., 2005; Fixsen et al., 2009; Powell et al., 2012; Powell et al., 2015) (Table 12.3).

Table 12.3

Implementation and sustainment strategies grouped by domain

Relationship-Based Strategies The focus of relationship-based strategies is to identify agency leader characteristics, staff buy-in, and partnerships that encourage and support the use of the selected practice or strategy and implementation effort.

• Build agency leader characteristics to support innovation acceptance and use

• Build buy-in among agency clinicians and clients

• Develop relationships to support the implementation process

Financial Strategies These are financial strategies that help to incentivize the use of new practices or strategies and provide resources for training and ongoing support for clinical staff.

• Modify incentives

• Facilitate financial support

Restructuring Strategies These strategies include altering staffing, professional roles, the physical structure of the agency or service setting, intervention equipment, materials and resources, and data systems.

• Change staffing or professional roles

• Change physical structure or service setting

• Change or update data and reminder systems

Implementation Testing Strategies These strategies focus on systems that roll out or scale up interventions within an agency.

• Pilot test implementation effort on a small scale, gradually moving to system-wide roll out

• Model or simulate changes that will be implemented prior to system-wide implementation

• Implement changes in a recursive, cyclical fashion

Quality Management Strategies These strategies involve developing support networks or data systems that continually evaluate and enhance clients’ quality of care. These systems also ensure that new practices or strategies are delivered with fidelity.

• Develop performance and fidelity monitoring systems for the intervention

• Evaluate the implementation effort

Sustainment Strategies The focus of these strategies is to facilitate the continued use of new practices or strategies. Sustainment strategies are distinct from implementation strategies but may fall within similar domains, such as:

• Relationship-building sustainment strategies

• Financial sustainment strategies

• Quality management sustainment strategies.


Adapted from Powell et al. (2012) and Powell et al. (2015)

These strategies vary greatly in their level of involvement, time commitment, cost, and feasibility. Yet research suggests that when discrete and feasible implementation strategies are utilized, successful implementation outcomes are observed, such as knowledge and appropriate use of specific EBIs, intervention fidelity, provider and client satisfaction with the EBI, improved client outcomes, and improved organizational outcomes (Proctor et al., 2011). Therefore, it is critical for implementation teams to select and plan for the use of 1–3 specific implementation strategies to facilitate the initial implementation of the EBI.

Phase 4: Implementation. The purpose of Phase 4 is for implementation teams to track progress toward completing the adaptation, training, and implementation plans designed in the previous phase, and to evaluate their progress on, and satisfaction with, completing the action steps. In the ACT SMART Implementation Toolkit, implementation teams are asked to complete a brief evaluation survey after each action step has been completed or its scheduled due date has passed. The evaluation survey allows the implementation team to reflect on the progress being made. If progress is considered satisfactory, the implementation team may continue to follow the plans as previously designed. If progress is considered neutral or challenging, the evaluation, with facilitator assistance, guides the implementation team in considering changes to the adaptation, training, and implementation plans. This process is designed to fit the adaptive and dynamic nature of implementing an EBI within community-based agencies (Aarons et al., 2012).
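
As a rough illustration of this evaluate-and-adjust cycle, the sketch below tracks action steps from the three plans and flags any plan whose steps are overdue or rated neutral or challenging. The satisfactory/neutral/challenging ratings follow the text; the data structure, field names, and example tasks are hypothetical and are not the ACT SMART evaluation survey itself.

```python
from datetime import date

# Illustrative sketch of Phase 4 task tracking. The satisfactory/neutral/
# challenging ratings follow the text; everything else (field names, example
# tasks, dates) is hypothetical.

tasks = [
    {"plan": "training", "step": "Schedule CBT workshop for clinicians",
     "due": date(2017, 3, 1), "completed": True, "rating": "satisfactory"},
    {"plan": "adaptation", "step": "Revise anxiety worksheets to add visual supports",
     "due": date(2017, 3, 15), "completed": False, "rating": None},
    {"plan": "implementation", "step": "Set up fidelity self-monitoring forms",
     "due": date(2017, 4, 1), "completed": True, "rating": "challenging"},
]

def plans_needing_review(tasks, today):
    """Flag plans whose action steps are overdue or rated neutral/challenging."""
    flagged = set()
    for task in tasks:
        overdue = (not task["completed"]) and task["due"] < today
        difficult = task["rating"] in ("neutral", "challenging")
        if overdue or difficult:
            flagged.add(task["plan"])
    return flagged

# Plans flagged here would be revisited with the ACT SMART facilitator.
print(plans_needing_review(tasks, today=date(2017, 4, 2)))
```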

Phase 5: Sustainment. The purpose of the sustainment phase is to transition efforts from initial EBI implementation to ongoing EBI sustainment within an agency. Sustainment is defined as the (a) capacity to deliver and/or (b) maintain core elements of an intervention (e.g., remain recognizable or delivered at a sufficient level of fidelity or intensity to yield desirable results) after initial implementation support has been withdrawn (Wiltsey-Stirman et al., 2012). In the ACT SMART Implementation Toolkit, once the majority of the implementation tasks have been completed, the ACT SMART facilitator works with the implementation team to identify specific sustainment strategies that will facilitate the continued use of the EBI. These may include continued training in the EBI for new staff hires and ongoing support or supervision of the use of the EBI with current staff and providers. In addition, structural changes to the organization may be necessary to support the sustained use of the EBI and implementation strategies. For example, if fidelity monitoring was the implementation strategy selected to facilitate the uptake of an anxiety CBT protocol within an agency, providers will need dedicated time to self-monitor in a systematic manner in order to sustain both the use of the implementation strategy and the EBI. However, the sustainment period does not begin until active implementation has been completed and ACT SMART facilitators have concluded their meetings with the agency implementation team.

Conclusion

In conclusion, the ultimate goal for intervention developers, ASD and anxiety researchers, and community agency leaders and clinicians is to increase the quality of life for youth with ASD and co-occurring anxiety disorders and their families. There is a growing number of evidence-based interventions for anxiety in youth with ASD. However, as with other populations, there is a concerning gap between ASD interventions delivered in research studies and in routine care. Traditional unidirectional models of research-to-practice translation have resulted in minimal public health impact; however, practice-based approaches rooted in implementation science have the potential to bridge this gap. This chapter highlighted the importance of explicit attention to the routine care context in which an intervention could be used and to the process through which evidence-based interventions are developed, selected, implemented, and sustained in routine care. Based on the ASD services research and the broader dissemination and implementation science described in this chapter, we recommend that intervention developers consider implementation outcomes during the development of innovative interventions for youth with ASD, and that community agency leaders utilize a systematic process to facilitate the implementation and sustainment of EBIs within their agencies.

References

1. Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Administration and Policy in Mental Health and Mental Health Services Research. 2011;38(1):4–23.

2. Aarons GA, Green AE, Palinkas LA, et al. Dynamic adaptation process to implement an evidence-based child maltreatment intervention. Implementation Science. 2012;7(1):32.

3. Balas EA. Appropriate care to evidence-based medicine. Pediatric Annals. 1998;27:581–584.

4. Balas EA, Boren SA. Managing clinical knowledge for health care improvement. In: Bemmel J, McCray AT, eds. Yearbook of Medical Informatics 2000: Patient-Centered Systems. Stuttgart, Germany: Schattauer Verlagsgesellschaft; 2000.

5. Bauer MS, Damschroder L, Hagedorn H, Smith J, Kilbourne AM. An introduction to implementation science for the non-specialist. BMC Psychology. 2015;3:32.

6. Beidas RS, Edmunds JM, Marcus SC, Kendall PC. Training and consultation to promote implementation of an empirically supported treatment: A randomized trial. Psychiatric Services. 2012;63(7):660–665.

7. Belson D. Quality improvement methods for use in QUERI research proposals and grant projects. 2nd ed.; 2014.

8. Berwick DM. Disseminating innovations in health care. JAMA. 2003;289(15):1969.

9. Brookman-Frazee L, Baker-Ericzen M, Stahmer A, Mandell D, Haine RA, Hough RL. Journal of Mental Health Research in Intellectual Disabilities. 2009;2(3):201–219.

10. Brookman-Frazee L, Taylor R, Garland A. Characterizing community-based mental health services for children with autism spectrum disorders and disruptive behavior problems. Journal of Autism and Developmental Disorders. 2010;40(10):1188–1201.

11. Chambers D, Glasgow R, Stange K. The dynamic sustainability framework: Addressing the paradox of sustainment amid ongoing change. Implementation Science. 2013;8:117.

12. Christon LM, Arnold CC, Myers BJ. Professionals’ reported provision and recommendation of psychosocial interventions for youth with autism spectrum disorder. Behavior Therapy. 2015;46(1):68–82.

13. Cidav Z, Lawer L, Marcus SC, Mandell DS. Age-related variation in health service use and associated expenditures among children with autism. Journal of Autism and Developmental Disorders. 2013;43(4):924–931.

14. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implementation Science. 2009;4(1):50.

15. Dingfelder HE, Mandell DS. Bridging the research-to-practice gap in autism intervention: An application of diffusion of innovation theory. Journal of Autism and Developmental Disorders. 2011;41(5):597–609.

16. Drahota A, Aarons GA, Stahmer AC. Developing the Autism Model of Implementation for autism spectrum disorder community providers: Study protocol. Implementation Science. 2012;7(1):85.

17. Drahota A, Stadnick N, Brookman-Frazee L. Therapist perspectives on training in a package of evidence-based practice strategies for children with autism spectrum disorders served in community mental health clinics. Administration and Policy in Mental Health and Mental Health Research. 2014a;41(1):114–125.

18. Drahota A, Meza R, Martinez JI. The Autism-Community Toolkit: Systems to Measure and Adopt Research-Based Treatments. 2014b; www.actsmarttoolkit.com.

19. Drahota A, Martinez JI, Meza R, Brikho B, Gomez E, Stahmer AC, et al. ACT SMART Toolkit: Developing and pilot testing a comprehensive implementation strategy for ASD service providers. Paper presented at: 49th Annual Association for Behavioral and Cognitive Therapies Convention; 2015; Chicago, IL.

20. Drahota A, Meza R, Brikho B, et al. Community-Academic Partnerships: A systematic review of the state of the literature and recommendations for future research. Milbank Quarterly. 2016;94(1):163–214.

21. Eccles M, Mittman B. Welcome to implementation science. Implementation Science. 2006;1(1):1.

22. Feldman AM. CTS: A new discipline to catalyze the transfer of information. Clinical and Translational Science. 2008;1(1):1–2.

23. Fixsen DL, Naoom SF, Blase KA, Friedman RM, Wallace F. Implementation research: A synthesis of the literature. Tampa, FL: National Implementation Research Network; 2005.

24. Fixsen D, Blase K, Naoom S, Wallace F. Core implementation components. Research on Social Work Practice. 2009;19(5):531.

25. Garland AF, Brookman-Frazee L. Therapists and researchers: Advancing collaboration. Psychotherapy Research. 2015;25(1):95–107.

26. Garland AF, Haine-Schlagel R, Brookman-Frazee L, Baker-Ericzen M, Trask E, Fawley-King K. Improving community-based mental health care for children: Translating knowledge into action. Administration and Policy In Mental Health And Mental Health Services Research. 2013;40(1):6–22.

27. Glisson C, Green P. The effects of organizational culture and climate on the access to mental health care in child welfare and juvenile justice systems. Administration and Policy in Mental Health and Mental Health Services Research. 2006;33(4):433–448.

28. Glisson C, Hemmelgarn A. The effects of organizational climate and inter-organizational coordination on the quality and outcomes of children’s service systems. Child Abuse & Neglect. 1998;22(5):401–421.

29. Glisson C, Hemmelgarn A, Green P, Williams NJ. Randomized trial of the Availability, Responsiveness and Continuity (ARC) organizational intervention for improving youth outcomes in community mental health programs. Journal of the American Academy of Child & Adolescent Psychiatry. 2013;52(5):493–500.

30. Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O. Diffusion of innovations in service organizations: Systematic review and recommendations. Milbank Quarterly. 2004;82(4):581–629.

31. Goin-Kochel RP, Mackintosh VH, Myers BJ. Parental reports on the use of treatment and therapies for children with autism spectrum disorder. Research in Autism Spectrum Disorders. 2007;1(3):195–209.

32. Hunter SB, Chinman M, Ebener P, Imm P, Wandersman A, Ryan G. Technical assistance as a prevention capacity-building tool: A demonstration using the Getting To Outcomes framework. Health Education and Behavior. 2009;36(5):810–828.

33. Joshi G, Petty C, Wozniak J, et al. The heavy burden of psychiatric comorbidity in youth with autism spectrum disorders: A large comparative study of a psychiatrically referred population. Journal of Autism and Developmental Disorders. 2010;40(11):1361–1370.

34. Kitson A, Harvey G. Translating evidence into healthcare policy and practice: Single versus multi-faceted implementation strategies–Is there a simple answer to a complex question? International Journal of Health Policy and Management. 2015;4(3):123–126.

35. Kitson A, Harvey G, McCormack B. Enabling the implementation of evidence based practice: A conceptual framework. Quality and Safety in Health Care. 1998;7(3):149–158.

36. Lau A. The why’s, when’s, what’s, and how’s of cultural adaptation of evidence-based treatments. PsycEXTRA Dataset 2006.

37. Lundgren L, Krull I, Zerden LD, McCarty D. Community-based addiction treatment staff attitudes about the usefulness of evidence-based addiction treatment and CBO organizational linkages to research institutions. Evaluation and Program Planning. 2011;34(4):356–365.

38. Lyon AR, Lau AS, McCauley E, Vander Stoep A, Chorpita BF. A case for modular design: Implications for implementing evidence-based interventions with culturally diverse youth. Professional Psychology: Research and Practice. 2014;45(1):57–66.

39. McHugh RK, Murray HW, Barlow DH. Balancing fidelity and adaptation in the dissemination of empirically-supported treatments: The promise of transdiagnostic interventions. Behaviour Research and Therapy. 2009;47(11):946–953.

40. McLennan JD, Huculak S, Sheehan D. Brief report: Pilot investigation of service receipt by young children with autistic spectrum disorders. Journal of Autism and Developmental Disorders. 2008;38(6):1192–1196.

41. National Institutes of Health Office of Behavioral and Social Science Research. Dissemination and implementation 2016; http://obssr.od.nih.gov/scientific_areas/translation/dissemination_and_implementation/index.aspx. Accessed September 21, 2016.

42. Nock MK, Goldman JL, Wang Y, Albano AM, Jellinek MS. From science to practice: The flexible use of evidence-based treatments in clinical settings. Journal of the American Academy of Child & Adolescent Psychiatry. 2003;43(6):777–780.

43. Ovretveit J. Evaluating Improvement and Implementation for Health. McGraw-Hill Education; 2014.

44. Powell BJ, McMillen JC, Proctor EK, et al. A compilation of strategies for implementing clinical innovations in health and mental health. Medical Care Research and Review. 2012;69(2):123–157.

45. Powell BJ, Waltz TJ, Chinman MJ, et al. A refined compilation of implementation strategies: Results from the Expert Recommendations for Implementing Change (ERIC) project. Implementation Science. 2015;10:21.

46. Proctor EK, Silmere H, Raghavan R, et al. Outcomes for implementation research: Conceptual distinctions, measurement challenges, and research agenda. Administration and Policy in Mental Health and Mental Health Services Research. 2011;38(2):65–76.

47. Proctor EK, Powell BJ, McMillen J. Implementation strategies: Recommendations for specifying and reporting. Implementation Science. 2013;8(1):139.

48. Rabin BA, Brownson RC. Developing the terminology for dissemination and implementation research. In: Dissemination and Implementation Research in Health: Translating Science to Practice. New York, NY: Oxford University Press; 2012:23–52.

49. Rycroft-Malone J, Kitson A, Harvey G, et al. Ingredients for change: Revisiting a conceptual framework. Quality and Safety in Health Care. 2002;11(2):174–180.

50. Scarpa A, Wells A, Attwood T. Exploring feelings for young children with high-functioning autism or Asperger’s disorder: The STAMP treatment manual. Jessica Kingsley Publishers; 2013.

51. Stetler CB, Legro MW, Wallace CM, et al. The role of formative evaluation in implementation research and the QUERI experience. Journal of General Internal Medicine. 2006;21.

52. Swiezy N, Stuart M, Korzekwa P. Bridging for success in autism: Training and collaboration across medical, educational, and community systems. Child and Adolescent Psychiatric Clinics of North America. 2008;17(4):907–922.

53. Tabak RG, Khoong EC, Chambers DA, Brownson RC. Bridging research and practice: Models for dissemination and implementation research. American Journal of Preventive Medicine. 2012;43(3):337–350.

54. Verbeke W, Volgering M, Hessels M. Exploring the conceptual expansion within the field of organizational behaviour: Organizational climate and organizational culture. Journal of Management Studies. 1998;35(3):303–329.

55. Volkmar FR, Reichow B, Doehring P. Evidence-based practices and treatments for children with autism. New York: Springer Science Business Media, LLC; 2011.

56. Warren JS, Nelson PL, Mondragon SA, Baldwin SA, Burlingame GM. Youth psychotherapy change trajectories and outcomes in usual care: Community mental health versus managed care settings. Journal of Consulting and Clinical Psychology. 2010;78(2):144–155.

57. Wiltsey-Stirman S, Kimberly J, Cook N, Calloway A, Castro F, Charns M. The sustainability of new programs and innovations: A review of the empirical literature and recommendations for future research. Implementation Science. 2012;7(1):12.

58. Wiltsey-Stirman S, Miller CJ, Toder K, Calloway A. Development of a framework and coding system for modifications and adaptations of evidence-based interventions. Implementation Science. 2013;8(1):65.

59. Wood JJ, Fujii C, Renno P. Autism symptom severity during school recess: A preliminary randomized, controlled trial. Journal of Autism and Developmental Disorders. 2011;44(9):2264–2276.

60. Wood JJ, Ehrenreich-May J, Alessandri M, et al. Cognitive behavioral therapy for early adolescents with autism spectrum disorders and clinical anxiety: A randomized, controlled trial. Behavior Therapy. 2015a;46(1):7–19.

61. Wood JJ, McLeod BD, Klebanoff S, Brookman-Frazee L. Toward the implementation of evidence-based interventions for youth with autism spectrum disorders in schools and community agencies. Behavior Therapy. 2015b;46(1):83–95.
