To cite this article: Sarah Kaye, Diane DePanfilis, Charlotte Lyn Bright & Cathy Fisher (2012): Applying Implementation Drivers to Child Welfare Systems Change: Examples From the Field, Journal of Public Child Welfare, 6:4, 512–530.
To link to this article: http://dx.doi.org/10.1080/15548732.2012.701841


Journal of Public Child Welfare, Vol. 6:512–530, 2012 Copyright © Taylor & Francis Group, LLC ISSN: 1554-8732 print/1554-8740 online DOI: 10.1080/15548732.2012.701841

Applying Implementation Drivers to Child Welfare Systems Change: Examples From the Field

SARAH KAYE
Kaye Implementation & Evaluation, LLC, Baltimore, MD, USA

DIANE DePANFILIS, CHARLOTTE LYN BRIGHT, and CATHY FISHER
University of Maryland, Baltimore, MD, USA

Implementation science is a relatively young field of research that studies efforts to put improved policies or programs into routine practice. In recent years, the Children's Bureau has placed increasing emphasis on implementation of child welfare system reforms to promote improved outcomes for children, youth, and families. Using an implementation framework from the National Implementation Research Network, this article reviews the relevance of core drivers to child welfare and describes selected examples to illustrate how public agencies can apply and adapt concepts from implementation science to drive and sustain their system reforms.

KEYWORDS child welfare, implementation science, systems change

Received: 12/01/11; revised: 03/17/12; accepted: 05/06/12.
This article is based on work from the Atlantic Coast Child Welfare Implementation Center, supported by Cooperative Agreement number HHS-2008-ACF-ACYF-CO-0058 from the US DHHS, ACF, ACYF, Children's Bureau to the University of Maryland School of Social Work. The assertions and findings are not to be construed as those of the Children's Bureau.
Address correspondence to Sarah Kaye, Owner, Kaye Implementation & Evaluation, LLC, Baltimore, MD, USA. E-mail: [email protected]

Since the accountability provisions of the Adoption and Safe Families Act (ASFA) were developed in 1997, state child welfare agencies have been held accountable to national standards set and monitored by the Children's Bureau (CB) through the Child and Family Services Review (CFSR). Following analysis of population-level administrative data, statewide assessments, case record reviews, and interviews with stakeholders, jurisdictions develop Program Improvement Plans (PIPs), which outline initiatives they will implement to improve performance. Two rounds of reviews nationwide over a ten-year period reveal several advances and many areas where agencies continue to struggle to achieve desired outcomes (USDHHS/ACF/ACYF/CB, 2010).

The CB has learned that attempted improvements are most successful when they are part of a systems change, or "a large, cohesive, and comprehensive vision for change that permeates the child welfare system at all levels of responsibility and leadership" (USDHHS/ACF/ACYF/CB, 2008). Recognizing that systems-level improvements cannot be realized without successful implementation, CB has placed an increased focus on supporting implementation processes to promote systems change. At the 2nd National Child Welfare Evaluation Summit, CB described implementation as a "series of evidence-informed steps and strategies guided by theory and research-based frameworks that are expected to increase the likelihood that well-defined, promising practices will be adopted and spread with fidelity" (Deakins, Morgan, Nolan, & Shafer, 2011, p. 7). CB's commitment to implementation is evident in the incorporation of implementation language and principles into demonstration projects, evaluations of demonstrations, workforce leadership training, program improvement planning, and technical assistance (Deakins et al., 2011).

One of the most substantial CB investments in implementation was the addition of five regional Implementation Centers to the CB's Training and Technical Assistance Network in 2008. The Network is designed to provide information, training, and consultation to build the capacity of child welfare systems to prepare for Federal monitoring activities and, ultimately, support systemic change that yields better outcomes for children, youth, and families.
Implementation Centers were designed to fill a gap in the national training and technical assistance network by providing "in-depth and long-term consultation and support to States and Tribes . . . with a focus on the implementation of strategies that their child welfare systems have identified to improve the quality and effectiveness of child welfare services for children, youth, and families" (USDHHS/ACF/ACYF/CB, 2008). Collectively, the Implementation Centers have supported 25 projects with child welfare jurisdictions to implement a range of systems changes (McCrae, Graef, Armstrong, & DePanfilis, 2011), including direct practice interventions (n = 17), organizational interventions (n = 8), and improving child welfare practice in tribal jurisdictions (n = 7). These diverse projects are reflective of the kinds of reforms state and tribal child welfare agencies have initiated as part of their PIP or other improvement efforts.

Implementation Centers provide technical assistance to support states and tribes to use implementation frameworks to drive system changes, including direct practice and organizational interventions. Because most implementation research has been conducted on discrete evidence-based practices


(EBP) in clinical settings (Fixsen, Blase, Naoom, & Wallace, 2009), applying implementation science in the context of child welfare systems change requires creative application of implementation frameworks to make them "fit." Several theory- and research-supported implementation models and frameworks have been applied in child welfare, including the Breakthrough Series Collaborative (Miller & Ward, 2008) and the Getting to Outcomes Framework (Barbee, Christensen, Antle, Wandersman, & Cahn, 2011). The Atlantic Coast Child Welfare Implementation Center (ACCWIC) selected a framework developed by the National Implementation Research Network (NIRN) to inform our technical assistance approach because its "active efforts" to drive implementation appeared best aligned with the interests of the CB and state agencies to promote sustainable change. The framework describes eight implementation drivers that have demonstrated (through previous research across an array of interventions in varying contexts) a positive impact on sustainable implementation. Implementation drivers are core sets of strategies that support the implementation of a new practice model, policy, or program change.

Competency drivers focus on strengthening the workforce, including:

- Staff recruitment and selection,
- Training, and
- Coaching and supervision.

Organizational drivers focus on infrastructure, including:

- Decision support data systems,
- Facilitative administration (e.g., policies, procedures, and organizational structures), and
- Systems intervention (e.g., collaboration with external agencies).

Performance assessment focuses on collecting information about practice and outcomes and feeding it back to practitioners, supervisors, and managers. Leadership champions the change and supports all other implementation strategies.
Child welfare systems often undergo changes to their policies, programs, and practices but may not utilize research-supported implementation activities to facilitate a successful change process. Assisting agency leaders to plan for change involves helping them focus on all parts of their system (e.g., personnel, quality assurance, and training) as implementation drivers. This implementation framework is helpful for identifying concrete activities that child welfare agencies should plan and conduct to achieve the large, comprehensive, and cohesive systems changes called for by the CB. Leveraging relevant drivers helps ensure that implementation efforts are sustainable and embedded in the ongoing management of child welfare systems.


The purpose of this article is to illustrate the challenges and opportunities in applying the implementation drivers framework to child welfare systems change.

METHODS


We take a two-pronged approach to illustrate the application of implementation drivers in child welfare systems change. First, we review recent literature to synthesize challenges and strategies for leveraging drivers in child welfare. Second, we provide concrete examples of implementation drivers utilized by child welfare agencies as they implemented CB-supported systems change.

Literature Review

We used the original definitions of implementation drivers (Fixsen et al., 2005) and searched electronic databases to identify recent peer-reviewed articles that described an implementation driver, its use as an implementation strategy, or the evaluation of its impact on implementation success. Preference was given to studies conducted in public child welfare systems, when available.

Selection of Examples

The examples presented here were all drawn from implementation projects supported by one of the five CB-supported regional implementation centers, a member of the National Training and Technical Assistance Network. States applied to be selected as implementation project sites and received intensive technical assistance for at least two years. None of the projects followed a prescriptive "recipe" for implementation defined by an implementation framework. Instead, with TA from the Implementation Center, project leadership teams adapted and applied principles from the National Implementation Research Network framework (Fixsen et al., 2005) to develop strategies that were individualized for each project's unique goals and context. Implementation Center staff observed and documented these strategies by: (1) facilitating teams to plan implementation strategies in comprehensive implementation plans, and (2) communicating frequently (in person and electronically) with state project teams to support planned strategies, discuss implementation barriers, and strategize solutions.

To identify the best illustrations for this article, we reviewed reports from the Center's six projects and used a consensus-generating process to make the final selection. Table 1 describes the intervention and scope of implementation for each project supported by the Implementation Center. Illustrations were selected while projects were in the process of implementation; local site evaluations were still ongoing.

TABLE 1 List of Systems Change Projects Supported by the Implementation Center

A. State A is implementing differential response in designated Innovation Zones before expanding statewide.
B. State B is implementing a youth engagement initiative in several pilot jurisdictions before expanding statewide.
C. State C is implementing an organizational intervention to improve readiness and organizational capacity in target jurisdictions prior to implementing a family-centered practice model statewide.
D. State D is implementing a statewide technical assistance model as an organizational intervention designed to support counties across the state.
E. State E is implementing an in-home practice model that combines direct practice interventions to improve family engagement and individualized planning with organizational interventions to improve service array.
F. State F is implementing a safety assessment and management system statewide that structures the entire case management process from intake through case closure.

RESULTS

Results of the literature review reveal substantial gaps in implementation knowledge in the field of child welfare. Few studies systematically test implementation strategies; retrospective studies of implementation successes and barriers are more common. We define each implementation driver and synthesize previous research to discuss the challenges and strategies for leveraging it in child welfare. Then, we detail examples that illustrate how public child welfare agencies utilized or adapted each driver as an implementation strategy to support their intended systems change.

Leadership Driver

Organizational leaders provide the foundation for changes and adjustments to all other implementation drivers, described in the sections that follow. Leadership can be exemplified in many ways, including: demonstrating commitment to the implementation process, creating an implementation team, accepting suggestions for changing organizational practice, and dedicating resources to support implementation (Fixsen et al., 2005).

LEADERSHIP STRATEGIES

Recent studies in child welfare suggest that leadership is an important contributor to the success of implementation. Implementation strategies identified in the literature include:

- Agency leaders communicating and continually reinforcing a strong vision, priorities, and implementation goals (Aarons & Palinkas, 2007; Connolly & Smith, 2010);
- Managers assisting staff to translate training into "real world examples" (Aarons & Palinkas, 2007);
- Agency leaders "mandating" that practice change in accordance with the new model (Crea, Crampton, Abramson-Madden, & Usher, 2008);
- Agency leaders communicating confidence in organizational ability to create positive change (Connolly & Smith, 2010).


LEADERSHIP EXAMPLE

State F is implementing a new safety assessment and management system that spans intake through case closure. This is a new practice model, or overarching set of practices and principles that guide agency behaviors and responses (USDHHS/ACF/ACYF/CB, n.d.). The Commissioner of State F was involved at inception and remained active and engaged throughout implementation. He participated in early training sessions and visited local implementation sites to provide visible support. He engaged key stakeholder groups by communicating the rationale and organizational benefits of the new intervention model. His consistent participation on the implementation and management teams provided leadership direction and facilitated feedback between the state and field operations. He consistently sought fidelity and outcome data and ensured that data were available for managers throughout the state to inform decisions about how to improve implementation. These actions clearly demonstrated the importance of this initiative and reinforced numerous efforts to mobilize implementation drivers to support the state's goals for change.

Competency Drivers

SELECTION

The selection driver focuses on "who" is qualified to carry out the evidence-based practice or program (Fixsen et al., 2009). This driver emphasizes the importance of defining selection criteria for recruiting personnel who will be qualified to deliver the organizational or direct practice intervention.

Selection challenges. Finding the "right" staff can be particularly challenging in child welfare agencies, which are often constrained in their recruitment, selection, and retention of a highly qualified workforce. Cumbersome civil service requirements and dwindling budgets set salaries at levels that will not attract and retain the most qualified staff for this complex job—ultimately having a significant impact on the delivery of child welfare services (GAO, 2003).


Selection strategies. Partly prompted by studies of IV-E Education for Public Child Welfare Partnerships intended to recruit and retain a professional child welfare workforce, there has been considerable research on the personal and organizational factors that will improve the retention of child welfare staff (DePanfilis & Zlotnik, 2008). Findings from this body of research point to factors that should be considered in the recruitment and selection of staff who will deliver higher quality child welfare services and stay longer in those service settings. For example, a national study of child welfare agency recruitment efforts identified the following types of strategies used to recruit professionally trained child welfare staff: (1) education incentives, (2) training opportunities, and (3) recruitment incentives (Gomez, Travis, Ayers-Lopez, & Schwab, 2010). Higher levels of commitment to social work practice in child welfare have been shown to influence decisions of social workers to accept positions in child welfare and their intentions to stay in child welfare once employed (Faller, Grabarek, & Ortega, 2010).

Using findings from this extensive body of research on retention of staff, Ellett and colleagues (2009) designed and tested an Employee Selection Protocol for recruiting and selecting a qualified child welfare workforce. Potential applicants watch a video about the realities of child welfare and then explore their attitudes about work in public child welfare, their willingness to be on-call evenings and weekends, and their level of commitment to stay in public child welfare despite knowing that the work is stressful. The protocol also includes semi-structured interviews with candidates by an interview team, two writing samples, a set of on-site analyses of child welfare scenarios, and a juried process to identify potential candidates.
Often, child welfare system changes require working with existing staff, rather than selecting a new workforce for every new initiative. Assessing the attitudes of child welfare staff is essential to selecting the "right" staff for implementing a new practice—even if this means selecting among staff already in the child welfare service system. Aarons and Palinkas (2007) propose several factors that support successful implementation of an EBP in child welfare: (1) acceptability of the EBP to the worker; (2) staff training in the EBP; and (3) perceptions of staff about the level of organizational support for EBP implementation. Workers who demonstrate perseverance, experience, and flexibility are also more likely to implement successfully (Aarons & Palinkas, 2007).

Selection example. As in many child welfare agencies, civil service hiring procedures posed a barrier to selecting new practitioners specifically for the new safety model in State F. Instead, the state initially focused on criteria for the selection of managers, trainers, and supervisors who would support workforce development. Those criteria were then adapted and used to develop practice standards and core competencies for caseworkers and supervisors statewide. All training and practicums in State F focus on building these competencies. The state is making changes to its personnel policies to integrate these competencies and improve the likelihood that new employees will enter the workforce with attitudes and qualities that match this intervention.


TRAINING

The implementation of any new approach will require that agency staff and administration receive training on that particular organizational or direct practice intervention (Fixsen et al., 2005). Training should include adequate information to help practitioners understand why change is necessary and likely to be effective, as well as allow for exploration and feedback in employing new practices (Fixsen et al., 2009).

Training challenges. Many child welfare agencies place too much emphasis on the "train and pray" (Fixsen et al., 2005) approach without sufficient attention to other implementation drivers. Knowledge-building alone is not enough to ensure quality child welfare practice (Austin et al., 2006). For example, if training is not part of a carefully staged implementation plan, it can appear "hurried," and staff can experience a lengthy wait between being trained and being expected to implement the new practice. Another challenge is introducing new practice paradigms that staff have difficulty relating to their current jobs (Crea et al., 2008).

Training strategies. Training has been associated with a number of positive outcomes for child welfare professionals, including increased knowledge, skills, and attitudes (Leung & Cheung, 1998; Turcotte, Lamonde, & Beaudoin, 2009) and improved job performance (Antle, Barbee, & van Zyl, 2008). Training also has been found to decrease burnout (Lloyd, King, & Chenoweth, 2002) and improve recruitment and retention of child welfare workers (Mor Barak, Nissly, & Levin, 2001). Recent research suggests that child welfare staff members will likely respond best to training that (a) involves application as well as didactic explanation, and (b) is provided by competent, experienced, and flexible training staff (Aarons & Palinkas, 2007).

Training example. State A developed a training plan and curriculum designed to build middle managers' competencies to support implementation of a new safety model, which includes differential response. The training plan introduced new skills that were then applied and reinforced with coaching and feedback from trainers and peers. During training, participants built knowledge of NIRN's implementation framework; using differential response as an example, participants applied the framework and developed strategies to improve staff readiness, stage the implementation process, and drive implementation forward in a strategic manner. With feedback from trainers, participants developed a plan for working with staff and stakeholders on adopting the new intervention. Using the plans developed during training as a foundation, the managers will apply their knowledge and skills to the new differential response protocol.

COACHING AND SUPERVISION

Coaching goes beyond training to provide continuous information, consultation, and feedback as staff members implement practice changes (Fixsen et al., 2009). The coaching process may employ observation and feedback, teaching methods, role modeling, and processing of information (Burkhauser & Metz, 2009). Coaching allows staff members who are less experienced in a particular practice to develop their competence, and provides ongoing support for using new behaviors (Fixsen et al., 2005). In the context of public child welfare—and in the interest of developing agencies' capacity to sustain change efforts beyond initial implementation—the role of coach may best be filled by supervisors. Therefore, we have adapted the original NIRN framework of coaching to include clinical and consultative supervision.

Coaching challenges. One of the biggest challenges to using coaching as an implementation strategy may be that the culture of many child welfare agencies must change in order to prioritize coaching activities (Aarons, Hurlburt, & Horowitz, 2011; Burkhauser & Metz, 2009). Additionally, utilizing supervisors as coaches requires additional skill sets; supervisors must be proficient in the practice and understand how to apply concepts about coaching dynamics (Burkhauser & Metz, 2009).

Coaching strategies. There is some evidence to support the use of coaching as an implementation strategy in child welfare. One study identified coaching through mentorship and case review activities as the primary strategy for ensuring effective use of a new practice model (Barbee et al., 2011). Another found that staff who received individual feedback about their performance during one-on-one consultation were less likely to leave the agency than staff without such support (Aarons, Sommerfeld, Hecht, Silovsky, & Chaffin, 2009).
Coaching as an implementation strategy could become self-sustaining; as coaching participants gain more experience, they may eventually become appropriate coaches for less-experienced staff members (Barbee et al., 2011).

Coaching example. State C provides an example of how coaching can become a self-sustaining implementation strategy. The project implements an organizational intervention designed to improve organizational health and climate and increase readiness for the rollout of a family-centered practice model. Regional Directors received 3 months of coaching to build capacity in listening, engaging, establishing goals, and ensuring accountability to more effectively support staff development. With monthly peer-to-peer support, Directors were trained to build the same capacities in supervisors. As a result, both managers and supervisors have the skills and abilities necessary to serve as coaches—and can apply those skills during supervision to ensure that all staff have access to ongoing coaching and consultation to support implementation of family-centered practice.


Organizational Drivers


FACILITATIVE ADMINISTRATION

Facilitative administration refers to enabling factors at the organizational level that allow for implementation of evidence-supported or innovative practices (Metz, Collins, Burkhauser, & Bandy, 2008). Organizations demonstrating facilitative administration will ensure consistency in all elements of organizational policy and procedure to support practice implementation (Fixsen et al., 2009). Elements of facilitative administration include a demonstrable administrative commitment to the practice change, support for staff to implement the change, active solicitation of staff and stakeholder feedback, and efforts to ensure a common vision (Metz et al., 2008). Facilitative administration requires proactive, vigorous, and enthusiastic attention by the administration to reduce implementation barriers and create an administratively hospitable environment for practitioners (Fixsen et al., 2005).

Facilitative administration challenges. Recent studies in child welfare have identified implementation challenges when policies and procedures are not aligned with the intended practice change. For example, an agency attempting to implement Functional Family Therapy encountered substantial barriers because the existing client referral procedures did not generate appropriate referrals of children and families for the program (Zazzali et al., 2008). Another qualitative study of child welfare administrators explored the capacity of child welfare organizations to support practice innovations and determined that child welfare policy, leadership, and culture must shift to be more supportive of best practices (Jack et al., 2010).

Facilitative administration strategies. Facilitative administration has been used to support implementation in child welfare. Agencies that implemented family team decision-making modified operating procedures with "firewalls" to ensure that cases could not move forward unless team meetings were held (Crea et al., 2008).
Another state agency carefully selected members of the implementation team to ensure that multiple stakeholders could provide input during statewide implementation of its casework practice model (Barbee et al., 2011).

Facilitative administration example. State D designed and implemented a new state-level technical assistance model to support continuous improvement at the county level. State leadership used many of the principles of facilitative administration to promote a "hospitable environment" for implementation by actively seeking staff involvement to identify and reduce barriers. As the new TA model was developed, State D conducted focus groups to better understand "what" needed to change. During implementation, the state adapted the ADKAR (Awareness, Desire, Knowledge, Ability, Reinforcement) change management framework (Hiatt, 2006) to monitor staff awareness of the need for change, desire to change, and the knowledge and skills to make the changes needed to participate in the TA model. The implementation team reviewed ADKAR data on a regular basis and adjusted implementation strategies to address barriers and improve staff support.


Systems Intervention

The systems intervention driver includes strategies for working with external systems to ensure the availability of the financial, organizational, and human resources required to support implementation of the initiative (Fixsen et al., 2009). Working with external systems through interdisciplinary teams and collaborations has been promoted by the CB since the initial passage of the Child Abuse Prevention and Treatment Act (CAPTA) in 1974 (DePanfilis & Salus, 1992; Goldman & Salus, 2003). More recently, the CFSR process reinforced the importance of using statewide assessments and partnerships between service systems to improve outcomes for children and families (USDHHS/ACF/ACYF/CB, 1996).

SYSTEMS INTERVENTION CHALLENGES

Discontinuity between system partners poses a major challenge for linking service systems. Research indicates that services for children and families are not well delineated among multi-agency service systems (Jonson-Reid, 2011). Further, members of citizen review panels (required by CAPTA) and professionals involved in the child welfare agency often do not agree about the needs of children and families (Bryan, Collins-Camargo, & Jones, 2011).

Systems intervention strategies. Research supports two strategies for addressing these challenges. Openly and consistently sharing information among stakeholders can build trust among system partners (Crea, Crampton, Knight, & Paine-Wells, 2011). Developing strategic collaborations among stakeholders can unite multiple systems toward a common goal (Horwath & Morrison, 2011).

Systems intervention examples. Two states have made noteworthy efforts to engage members of the broader child welfare system in their implementation projects. As part of an initiative to build an array of services to meet the diverse needs of families served through in-home services, State E has engaged internal and external stakeholders from private child welfare providers to participate on the implementation team. Members from other organizations are included on workgroups to develop training and coaching materials, draft and disseminate communications about anticipated changes to the child welfare system, and assess gaps in community services and practices. As a result, multiple external perspectives are integrated into an array of other implementation strategies.

State B's project engaged youth as core stakeholders in the development and implementation of a youth engagement model. Youth Advisory Board members gave feedback about the model principles and practices.

Implementation Science in CWS Change

523

Youth were key participants in developing the communication video that introduced the initiative statewide. Youth participated in planning community stakeholders’ orientation sessions to the new model and assisted in training child welfare staff. Youth representatives were also active participants on the implementation teams in pilot regions. Integrating youth in these ways has demonstrated leadership’s commitment to listening to youth voices while encouraging youth-friendly practices and implementation strategies.

Downloaded by [Diane DePanfilis] at 15:16 24 September 2012

DECISION SUPPORT DATA SYSTEMS

Decision support data systems collect information necessary to assess the degree to which practice is changing as intended (i.e., fidelity), as well as the degree to which the intervention is achieving its intended outcomes for children and families (Fixsen et al., 2009). Implementation teams use data early in the implementation process to define the target population. As the implementation process continues, data are used to monitor the status of implementation by identifying practice areas that need additional attention to achieve practice standards.

Decision support data systems challenges. This driver is interrelated with leadership, performance assessment, and facilitative administration. For decision support data systems to be effective, teams must be knowledgeable about how to use data to adjust implementation strategies to enhance other drivers such as selection, training, and coaching. Integrating decision support data systems into practice and policy decisions requires a multifaceted approach: creating organizational cultures that support utilization of implementation data and supporting leadership to feed findings into policy development (Jack et al., 2010).

Decision support data systems strategies. Statewide automated child welfare information systems (SACWIS) (DeLeonardi & Yuan, 2000; English, Brandford, & Coghlan, 2000; Garnier & Poertner, 2000; Waldfogel, 2000) and quality assurance systems (Testa & Poertner, 2010) have been improving since ASFA, driven to some degree by ASFA requirements and the opportunities for earning incentives (Maza, 2000). With improved data quality, agencies are just beginning to go beyond CFSR reporting requirements to track the process of implementing system reforms and to evaluate the degree to which implementation efforts achieve their intended outcomes. For example, Cash and Berry (2003) propose a method for tracking the focus and intensity of service delivery in a placement prevention program. Other researchers have used administrative data systems to assess implementation of family team meetings (Crea et al., 2008; Stuczynski & Kimmich, 2010). Improving the use of data to inform implementation will require staff at all levels to become better users of data to drive their decisions and practice (Collins-Camargo, Sullivan, & Murphy, 2011), including preparing supervisors to use data on case outcomes more effectively (Moore, Rapp, & Roberts, 2000). Building a specialty in child welfare informatics within social work education may be one method for achieving this goal (Naccarato, 2010). Using portable computers in the home to enhance assessments and potentially track the service delivery process in "real time" is another area for future development (O'Connor, Laszewski, Hammel, & Durkin, 2011).

Decision support data systems example. State F has used both fidelity and outcome data systems to inform decisions about how to allocate resources and leverage implementation strategies. Expert consultants have collected fidelity data at regular intervals throughout the statewide implementation to assess whether the intervention was being implemented as intended (see further description under Performance Assessment). Implementation teams used aggregate fidelity data to determine geographic and substantive areas where additional training or coaching should be deployed to bring areas of low fidelity up to practice standards. Leadership adjusted the timing and staging of implementation across the state to allow jurisdictions struggling with initial components of the intervention model more time to master those practice areas before taking on additional components. In addition to fidelity data, State F has developed an increased capacity to routinely run real-time reports from its SACWIS system to monitor performance on child welfare process indicators as well as outcomes. All managers have been trained to be more effective users of data, and state and regional teams meet monthly to analyze and interpret trends.

Performance Assessment and Feedback Driver

The performance assessment and feedback driver includes mechanisms for observing organizational or clinical practice against well-defined expectations. Assessments are used to monitor practice over time and provide feedback to practitioners. Examples of performance assessments in child welfare include national standards defined by the CFSR, state or local performance indicators, accreditation standards, and performance appraisals of practitioners. When used at multiple levels, performance assessment can and should be used to inform the other implementation drivers.

PERFORMANCE ASSESSMENT CHALLENGES

Perhaps one of the most pressing challenges in leveraging the performance assessment and feedback driver is defining and measuring appropriate performance indicators. Evidence-based practices developed in clinical or university settings generally include defined fidelity criteria and assessment instruments that provide a basis for measuring performance. Many child welfare interventions, however, are less specified and often need substantial
work operationalizing practice principles into variables that can be readily observed and measured (Kaye & Osteen, 2011). Once defined, performance standards must be measured in a way that is fair, objective, and reliable (Wells & Johnson, 2001). Measurement of the CFSR national standards has been criticized for using methods that underestimate the performance of state child welfare systems (Courtney, Needell, & Wulczyn, 2004).


PERFORMANCE ASSESSMENT STRATEGIES

There is strong evidence to support the use of performance assessment and feedback as an intervention to improve staff performance (Pritchard, Harrell, DiazGranados, & Guzman, 2008). For example, a nationwide systems change initiative in New Zealand relied on the development of transparent performance measures to continually reinforce the expected changes and measure progress toward improvement (Connolly & Smith, 2010). Child welfare agencies embarking on systems change have several natural opportunities to embed performance assessment into current practice by aligning performance measures with the intervention being implemented (e.g., quality assurance reviews, clinical supervision, and secondary analysis of administrative data). Studies suggest that fidelity assessments can assist child welfare caseworkers in improving their implementation of family team meetings (Stuczynski & Kimmich, 2010) and can be integrated into performance appraisals and criteria considered for promotion (Crea et al., 2008). Administrative case reviews may offer a promising method for collecting data to monitor the quality of service delivery (Whitaker, 2011). In fact, recent efforts to integrate the collection and use of fidelity data have provided early evidence that case reviews can yield reliable and valid data about fidelity of practice while providing feedback to practitioners (Kaye & Osteen, 2011).

PERFORMANCE ASSESSMENT EXAMPLE

With support from the Implementation Center, State F designed a fidelity monitoring system in which fidelity reviews are conducted by selected managers, trainers, and supervisors with specialized training in the model as well as in coaching and consultation. After coaches reviewed case files on key fidelity criteria, caseworkers received targeted coaching on specific practices that needed improvement. This behaviorally oriented approach to practice improvement, supported by coaching, has helped meet the individual development needs of caseworkers in the field. As State F moves through the final phase of implementation, it is transferring the fidelity review process to its quality assurance staff to support long-term sustainability.


SUMMARY AND CONCLUSIONS

This article reviewed recent literature and provided examples to illustrate how NIRN's implementation drivers framework can be applied to systems change initiatives in child welfare agencies. Results of the literature review revealed substantial gaps in implementation knowledge in the field of child welfare. Few studies systematically tested implementation strategies; instead, retrospective studies of implementation successes and barriers were more common. Yet the literature demonstrated support for the relevance of the NIRN implementation drivers in supporting child welfare change initiatives. The literature also identified some unique challenges child welfare agencies may face when attempting to leverage the drivers as implementation strategies.

This article contributes to the scant research on this subject by placing existing studies into a conceptual framework that can then be used to organize future studies. Future scholarly work in this area should capitalize on the implementation drivers framework to describe implementation of change initiatives, similar to recent work using the Getting to Outcomes framework to describe implementation of practice models (Barbee et al., 2011). Evaluators should review and describe implementation strategies in evaluation designs, assess implementation strategies as part of process evaluations, and ultimately link variations in implementation variables to outcomes.

This article also provides "real world" examples that can be used (or adapted) by child welfare agencies seeking research-informed implementation strategies. While no single implementation framework is likely to provide the precise prescription necessary for successful change, using the implementation drivers framework to inform implementation strategies might substantially improve the effectiveness of systems changes.
Future policy and practice work in this area should continue to create and disseminate innovative strategies that apply implementation science to the unique organizational contexts and change initiatives that child welfare agencies undertake every day. To see a greater impact of system reforms in child welfare, we believe attention must be given both to which interventions are most effective in achieving outcomes for children, youth, and families, and to what it takes to effectively implement and sustain positive changes in our child welfare agencies.

NOTES

1. Counts are not mutually exclusive; projects could fit into more than one category.

2. For more information about the framework, see the monograph "Implementation Research: A Synthesis of the Literature" (Fixsen, Naoom, Blase, Friedman, & Wallace, 2005).


REFERENCES

Aarons, G. A., & Palinkas, L. A. (2007). Implementation of evidence-based practice in child welfare: Service provider perspectives. Administration and Policy in Mental Health and Mental Health Services Research, 34(4), 411–419.

Aarons, G. A., Hurlburt, M., & Horwitz, S. M. (2011). Advancing a conceptual model of evidence-based practice implementation in public service sectors. Administration and Policy in Mental Health and Mental Health Services Research, 38(1), 4–23.

Aarons, G. A., Sommerfeld, D. H., Hecht, D. B., Silovsky, J. F., & Chaffin, M. J. (2009). The impact of evidence-based practice implementation and fidelity monitoring on staff turnover: Evidence for a protective effect. Journal of Consulting and Clinical Psychology, 77(2), 270–280.

Adoption and Safe Families Act. (1997). P.L. 105-89.

Antle, B. F., Barbee, A. P., & van Zyl, M. A. (2008). A comprehensive model for child welfare training evaluation. Children and Youth Services Review, 30, 1063–1080.

Austin, M. J., Weisner, S., Schrandt, E., Glezos-Bell, S., & Murtaza, N. (2006). Exploring the transfer of learning from an executive development program for human services managers. Administration in Social Work, 30, 71–90.

Barbee, A. P., Christensen, D., Antle, B., Wandersman, A., & Cahn, K. (2011). Successful adoption and implementation of a comprehensive casework practice model in a public child welfare agency: Application of the Getting to Outcomes (GTO) model. Children and Youth Services Review, 33(5), 622–633.

Bryan, V., Collins-Camargo, C., & Jones, B. (2011). Reflections on citizen-state child welfare partnerships: Listening to citizen review panels. Children and Youth Services Review, 33, 612–621.

Burkhauser, M., & Metz, A. J. R. (2009, February). Using coaching to provide ongoing support and supervision to out-of-school time staff (Publication #2009-06). Washington, DC: Child Trends.

Cash, S. J., & Berry, M. (2003). Measuring service delivery in a placement prevention program: An application of an ecological model. Administration in Social Work, 27(3), 65–85.

Child Abuse Prevention and Treatment Act (1974, as amended). P.L. 93-247, 42 U.S.C. 5101 et seq.; 5116 et seq.

Collins-Camargo, C., Sullivan, D., & Murphy, A. (2011). Use of data to improve performance and improve outcome achievement by public and private child welfare staff. Children and Youth Services Review, 33, 330–339.

Connolly, M., & Smith, R. (2010). Reforming child welfare: An integrated approach. Child Welfare, 89(3), 9–31.

Courtney, M. E., Needell, B., & Wulczyn, F. (2004). Unintended consequences of the push for accountability: The case of national child welfare performance standards. Children and Youth Services Review, 26, 1141–1154.

Crea, T., Crampton, D., Abramson-Madden, A., & Usher, C. L. (2008). Variability in the implementation of Team Decisionmaking (TDM): Scope and compliance with the Family to Family practice model. Children and Youth Services Review, 30(11), 1221–1232.

Crea, T. M., Crampton, D. S., Knight, N., & Paine-Wells, L. (2011). Organizational factors and the implementation of Family to Family: Contextual elements of systems reform. Child Welfare, 90, 143–161.


Deakins, B., Morgan, O. J., Nolan, C. M., & Shafer, J. P. (2011). Revisiting research to practice: Children's Bureau perspectives on implementation and evaluation in child welfare initiatives. Paper presented at the 2011 National Child Welfare Evaluation Summit, Washington, DC.

DeLeonardi, J. W., & Yuan, Y. T. (2000). Introduction: Using administrative data. Child Welfare, 79, 437–443.

DePanfilis, D., & Salus, M. (1992). Child protective services: A guide for caseworkers (U.S. Government Printing Office No. 1992-625-670/60577). Washington, DC: National Center on Child Abuse and Neglect.

DePanfilis, D., & Zlotnik, J. L. (2008). Retention of child welfare staff in child welfare: A systematic review of research. Children and Youth Services Review, 30, 995–1008.

Ellett, A. J., Ellett, C. D., Ellis, J., & Lerner, B. (2009). A research-based child welfare employee selection protocol: Strengthening retention of the workforce. Child Welfare, 88, 49–68.

English, D. J., Brandford, C. C., & Coghlan, C. C. (2000). Data-based organizational change: The use of administrative data to improve child welfare programs and policy. Child Welfare, 79, 499–515.

Faller, K. C., Grabarek, M., & Ortega, R. M. (2010). Commitment to child welfare work: What predicts leaving and staying? Children and Youth Services Review, 32, 840–846.

Fixsen, D. L., Blase, K. A., Naoom, S. F., & Wallace, F. (2009). Core implementation components. Research on Social Work Practice, 19(5), 521–540.

Fixsen, D., Naoom, S., Blase, K., Friedman, R., & Wallace, F. (2005). Implementation research: A synthesis of the literature (FMHI Publication #231). Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, National Implementation Research Network.

Garnier, P. C., & Poertner, J. (2000). Using administrative data to assess child safety in out-of-home care. Child Welfare, 79, 597–613.

Goldman, J., & Salus, M. K. (2003). A coordinated response to child abuse and neglect: The foundation for practice. Washington, DC: U.S. Department of Health and Human Services.

Gomez, R. J., Travis, D. J., Ayers-Lopez, S., & Schwab, A. J. (2010). In search of innovation: A national qualitative analysis of child welfare recruitment and retention efforts. Children and Youth Services Review, 32, 664–671.

Government Accountability Office. (2003). Child welfare: HHS could play a greater role in helping child welfare agencies recruit and retain staff (GAO-03-357). Washington, DC: Author.

Hiatt, J. M. (2006). ADKAR: A model for change in business, government and our community. Loveland, CO: Prosci Learning Center Publications.

Horwath, J., & Morrison, T. (2011). Effective interagency collaboration to safeguard children: Rising to the challenge through collective development. Children and Youth Services Review, 33, 368–375.

Jack, S., Dobbins, M., Tonmyr, L., Dudding, P., Brooks, S., & Kennedy, B. (2010). Research evidence utilization in policy development by child welfare administrators. Child Welfare, 89(4), 83–100.

Jonson-Reid, M. (2011). Disentangling system contact and services: A key pathway to evidence-based children's policy. Children and Youth Services Review, 33, 598–604.


Kaye, S., & Osteen, P. J. (2011). Developing and validating measures for child welfare agencies to self-monitor fidelity to a child safety intervention. Children and Youth Services Review, 33, 2146–2151.

Leung, P., & Cheung, K. M. (1998). The impact of child protective service training: A longitudinal study of workers' job performance, knowledge, and attitudes. Research on Social Work Practice, 8, 668–684.

Lloyd, C., King, R., & Chenoweth, L. (2002). Social work, stress and burnout: A review. Journal of Mental Health, 11, 255–265.

Maza, P. L. (2000). Using administrative data to reward agency performance: The case of the Federal Adoption Incentive Program. Child Welfare, 79, 444–456.

McCrae, J., Graef, M., Armstrong, M., & DePanfilis, D. (2011, August). Developing a shared process measure for implementation projects in child welfare. Paper presented at the USDHHS/ACY/ACYF Children's Bureau 2011 National Child Welfare Evaluation Summit, Washington, DC.

Metz, A., Burkhauser, M., Collins, A., & Bandy, T. (2008, August). The role of organizational context and external influences in the implementation of evidence-based programs: An exploratory study (Report IV). Washington, DC: Child Trends. Retrieved from http://www.childtrends.org/Files/Child_Trends2008_12_17_FR_Implementation3.pdf

Miller, O., & Ward, K. (2008). Emerging strategies for reducing racial disproportionality and disparate outcomes in child welfare: The results of a National Breakthrough Series Collaborative. Child Welfare, 87(2), 30.

Moore, T. D., Rapp, C. A., & Roberts, B. (2000). Improving child welfare performance through supervisory use of client outcome data. Child Welfare, 79, 475–497.

Mor Barak, M. E., Nissly, J. A., & Levin, A. (2001). Antecedents to retention and turnover among child welfare, social work, and other human service employees: What can we learn from past research? A review and meta-analysis. Social Service Review, 75, 625–661.

Naccarato, T. (2010). Child welfare informatics: A proposed subspecialty for social work. Children and Youth Services Review, 32, 1729–1734.

O'Connor, C., Laszewski, A., Hammel, J., & Durkin, M. S. (2011). Using portable computers in home visits: Effects on programs, data quality, home visitors and caregivers. Children and Youth Services Review, 33, 1318–1324.

Pritchard, R. D., Harrell, M. M., DiazGranados, D., & Guzman, M. J. (2008). The productivity measurement and enhancement system: A meta-analysis. Journal of Applied Psychology, 93(3), 540–567.

Stuczynski, A., & Kimmich, M. (2010). Challenges in measuring the fidelity of a child welfare service intervention. Journal of Public Child Welfare, 4(4), 406–426.

Testa, M. F., & Poertner, J. (Eds.). (2010). Fostering accountability: Using evidence to guide and improve child welfare policy. New York, NY: Oxford University Press.

Turcotte, D., Lamonde, G., & Beaudoin, A. (2009). Evaluation of an in-service training program for child welfare practitioners. Research on Social Work Practice, 19, 31–41.

USDHHS/ACY/ACYF/Children's Bureau. (n.d.). Child welfare practice models. Retrieved from http://www.childwelfare.gov/management/reform/approaches/practicemodels.cfm


USDHHS/ACY/ACYF/Children's Bureau. (2010). Child welfare outcomes 2004–2007: Report to Congress. Washington, DC: Author. Retrieved from http://www.acf.hhs.gov/grants/closed/HHS-2008-ACF-ACYF-CO-0058.html

Waldfogel, J. (2000). Child welfare research: How adequate are the data? Children and Youth Services Review, 22, 705–741.

Wells, S., & Johnson, M. (2001). Selecting outcome measures for child welfare settings: Lessons for use in performance management. Children and Youth Services Review, 23, 169–199.

Whitaker, T. (2011). Administrative case reviews: Improving outcomes for children in out-of-home care. Children and Youth Services Review, 33, 1683–1708.

Zazzali, J. L., Sherbourne, C., Hoagwood, K. E., Greene, D., Bigley, M. F., & Sexton, T. L. (2008). The adoption and implementation of an evidence-based practice in child and family mental health services organizations: A pilot study of Functional Family Therapy in New York State. Administration and Policy in Mental Health and Mental Health Services Research, 35, 38–49.

CONTRIBUTORS

Sarah Kaye, PhD, is the owner of Kaye Implementation & Evaluation, LLC in Baltimore, MD.

Diane DePanfilis, PhD, MSW, is a Professor in the School of Social Work at the University of Maryland in Baltimore, MD.

Charlotte Lyn Bright, PhD, MSW, is an Assistant Professor in the School of Social Work at the University of Maryland in Baltimore, MD.

Cathy Fisher, MSW, is a Project Director in the School of Social Work at the University of Maryland in Baltimore, MD.