Am J Community Psychol (2012) 50:428–444 DOI 10.1007/s10464-012-9520-z

ORIGINAL PAPER

Ready, Willing, and Able: Developing a Support System to Promote Implementation of School-Based Prevention Programs

Paul D. Flaspohler • Cricket Meehan • Melissa A. Maras • Kathryn E. Keller

Published online: 23 May 2012
© Society for Community Research and Action 2012

Abstract While the number and scope of evidence-based health, education, and mental health services continue to grow, the movement of these practices into schools and other practice settings remains a complex and haphazard process. The purpose of this paper is to describe and present initial support for a prevention support system designed to promote high-quality implementation of whole school prevention initiatives in elementary and middle schools. The function and strategies of a school-based prevention support system are discussed, including key structures and activities undertaken to identify, select, and provide technical assistance to school personnel. Data collected over a 5-year period are presented, including evidence of successful implementation support for 5 different evidence-based programs implemented with fidelity at 12 schools and preliminary evidence of goal attainment. Findings suggest that the ongoing collection of information related to organizational readiness assists in the adoption and implementation of effective practices and initiatives and provides valuable insight into the development of results-oriented approaches to prevention service delivery. Problems, progress, and lessons learned through this process are discussed to frame future research and action steps for this school-based prevention support system.

Keywords Interactive systems framework · Prevention support system · School mental health · Capacity building · Best practice process · Technical assistance

P. D. Flaspohler (corresponding author) · C. Meehan
Department of Psychology and Center for School-Based Mental Health Programs, Miami University, Oxford, OH, USA
e-mail: [email protected]

M. A. Maras
Department of Educational, School and Counseling Psychology, University of Missouri, Columbia, MO, USA

K. E. Keller
The Health Foundation of Greater Cincinnati, Cincinnati, OH, USA

Introduction

In the promotion of health and well-being and the prevention of mental and physical health problems, there is increased focus on narrowing the gap between unmet mental health needs among children and adolescents and effective evidence-based services to address them (Rones and Hoagwood 2000; U.S. Department of Health and Human Services 1999; U.S. Public Health Service 2000). The initial promise of evidence-based programs to address these unmet needs has begun to wane, as programs that have been shown empirically to have the potential to be effective produce outcomes that are modest at best when disseminated broadly (Greenhalgh et al. 2004; Gottfredson and Bauer 2007; Social and Character Development Research Consortium 2010).

The purpose of this paper is to describe the development of a prevention support system designed to promote implementation of whole school, evidence-based violence prevention programs in elementary and middle schools. We first present the rationale for school-based prevention, emphasizing both the continuing need for prevention programming for school violence and other significant problems and the availability of evidence-based practices that have shown evidence of effectiveness in addressing these areas of need. Challenges in the dissemination and implementation of these evidence-based strategies are then addressed, concluding that systematic prevention support systems are needed to assure successful implementation (and adaptation) of evidence-based practices. Next, we review the recent emergence of dissemination/implementation frameworks and describe the Interactive Systems Framework for Dissemination and Implementation. Using findings from a 5-year study, we describe efforts to develop structures and processes for a prevention support system for the Health Foundation of Greater Cincinnati's Improving Student Behavioral Health: School-Wide Evidence-Based Prevention Programs initiative, a multi-year project designed to establish, evaluate, and sustain evidence-based programs within local elementary and middle schools. The prevention support system included: (1) a results-oriented grant making process designed to assure that participating schools have established the minimal level of readiness required to successfully implement evidence-based programs; and (2) technical support processes designed to build innovation-specific (program related) and general capacity (associated with planning, implementing, evaluating, improving, and sustaining all programs). We then present preliminary evidence that the prevention support system led to high-fidelity implementation of evidence-based programs in schools and positively impacted school climate according to student and staff reports. We conclude by unpacking several aspects of the prevention support system that we believe are critical to the success of this initiative: working with the ready, showing up with best practice process, mainstreaming evaluation and accountability, and building communities of practice.

Continuing Gap between Prevention Research and Practice

Despite evidence that the prevalence of school violence and other major problems among youth may have leveled off or declined, there is still a significant need for effective prevention programming. Recent data suggest 11.3 % of young people in this country, about 7.4 million youth altogether, have at least one diagnosable emotional, behavioral, or developmental condition; 40 % of these youth are diagnosed with two or more of these conditions (U.S. Department of Health and Human Services 2010). National surveillance data indicate that high school students continue to engage in a range of health-risk behaviors (e.g., substance use, unprotected sexual activity, suicide attempts, carrying a weapon at school, lack of physical activity) that contribute to the leading causes of morbidity and mortality among youth and adults (Centers for Disease Control and Prevention 2010). Fortunately, research has identified ameliorable risk and protective factors that can be impacted through preventive interventions to address many mental health conditions and reduce risky behaviors (Mrazek and Haggerty 1994). Investing in prevention continues to make sense as a cost-saving alternative to reactive approaches that treat problems after they have developed. As such, there is a growing need for effective prevention programs to reduce risk factors and promote healthy development among youth.

There is a growing emphasis in schools and communities on implementing evidence-based programs, the underlying assumption being that adopting evidence-based programs will result in positive outcomes. Fortunately, there are now a number of available evidence-based programs that have been evaluated and shown to have the potential to be effective. For example, the Center for the Study and Prevention of Violence (CSPV 2008) at the University of Colorado has identified 11 violence prevention and intervention programs that meet a strict scientific standard of program effectiveness in reducing adolescent violent crime, aggression, delinquency, and substance abuse. When implemented with fidelity (Durlak and DuPre 2008; Elliott and Mihalic 2004), school violence programs have been shown to improve school indicators such as academic achievement, attendance, and student behavior (Haynes et al. 1997; Olweus and Limber 2002; Rutter et al. 1979; Wingspread Conference 2004); intrapersonal indicators such as students' self-concept and overall sense of well-being and satisfaction (Anderson 1982; Haynes et al. 1997; Olweus 1994); and interpersonal indicators such as students' pro-social behavior and relationships (Comer 1981; Olweus 1994; Olweus and Limber 2002; Rutter et al. 1979; Wingspread Conference 2004). There is clear potential for these programs to positively impact the developmental trajectory of young people.

Evidence-based programs are not limited to school violence prevention. These programs are now so numerous that clearinghouses have been established to organize identified evidence-based programs and to facilitate identification of and access to these resources (e.g., What Works Clearinghouse, Blueprints for Violence Prevention, National Registry of Effective Programs). However, the availability of programs and evidence of their effectiveness does not guarantee that programs will be adopted and implemented with intended results. Without appropriate infrastructure supports, evidence-based school mental health practices often are not implemented with sufficient consistency or quality to obtain the outcomes documented during program development (Biglan and Taylor 2000; Feuer et al. 2002; Weist and Paternite 2006). Recent attention has focused on models and frameworks of the processes needed to transport evidence-based programs to schools and other settings.


Translation, Dissemination, and Implementation Research

Successful promotion and implementation of effective mental health practices in schools involves more than exclusively championing evidence-based approaches, most of which have not demonstrated evidence of effectiveness, durability, affordability, sustainability, and transportability beyond the highly controlled settings in which initial evidence of effectiveness was demonstrated. Contributing to the gap between evidence-based and current, everyday practice is the dearth of research to inform how best to promote diffusion, dissemination, and the processes of change in movement toward effective implementation of school mental health practices (Jensen et al. 1999; Paternite 2005; Schoenwald and Hoagwood 2001; Wandersman 2003; Wandersman and Florin 2003). Recognizing the need for this type of scholarship, there has been an increase in translation, dissemination, and implementation research in the last decade (Elliott and Mihalic 2004; Glasgow and Emmons 2007; Green 2001; Greenhalgh et al. 2004; Ringeisen et al. 2003; Schoenwald and Hoagwood 2001; Wandersman 2003). Notably, models and frameworks have emerged that unpack and promote understanding of dissemination and implementation. These include frameworks from the National Implementation Research Network (NIRN; Fixsen et al. 2005), the RE-AIM group (Glasgow et al. 2001), the Interactive Systems Framework for Dissemination and Implementation (ISF; Wandersman et al. 2008), and the Multilevel Quality Implementation Framework (Domitrovich et al. 2008). These frameworks and models are windows into the key attributes, facilitators, and challenges of promoting implementation. Each is heuristic and approximate, providing alternative views to guide research and practice development. The emerging models and frameworks for dissemination and implementation highlight the importance of both capacity support and innovation support in the process of promoting effective prevention. A review and comparison of these frameworks is beyond the scope of this paper (see Fixsen et al. 2005; Flaspohler et al. 2008b; Glasgow and Emmons 2007; Greenhalgh et al. 2004). The frameworks highlight the need for support systems to facilitate the movement of research into practice. In short, there need to be systems and processes in place to work with schools and school districts implementing evidence-based prevention practices. The ISF most closely approximates this need.

The Interactive Systems Framework for Dissemination and Implementation (ISF; Wandersman et al. 2008) is a heuristic framework for how to transfer preventive innovations from science to practice using a tiered systems approach. The Framework identifies three systems that are integral to the movement of innovation from research to practice: the Prevention Synthesis and Translation System, the Prevention Support System, and the Prevention Delivery System (see Fig. 1). The Prevention Synthesis and Translation System works to distill information generated through research and to prepare it for dissemination and implementation in the field. The primary activities of this system are to synthesize existing research and translate it for use by practitioners. The Prevention Support System is conceptualized as carrying out two primary support functions: innovation-specific support (innovation-specific capacity-building) and general support (general capacity-building). Innovation-specific capacity-building is assistance that is related to using a specific innovation. General capacity-building is intended to enhance the infrastructure, skills, and motivation of an organization, but it does not focus on a specific innovation. The Prevention Delivery System carries out the activities necessary when implementing programs and services. Each of the three systems identified above is crucial for the successful dissemination and implementation of innovations in practice.

Fig. 1 The interactive systems framework for dissemination and implementation (Wandersman et al. 2008)

The development of the ISF has led to substantive changes in research, funding, and practice in the prevention of youth violence and child maltreatment. The ISF is particularly useful when exploring the adoption, dissemination, and implementation of new programs, policies, practices, processes, knowledge, and innovations in the practice field (Wandersman et al. 2008). The ISF highlights the importance of prevention support systems in helping schools and other local agencies in the quality implementation of prevention programs. Although the framework was originally developed in relation to the prevention of youth violence and child maltreatment, it has direct application to the full spectrum of services promoting the development of healthy children and families.

Supporting Balance Between Fidelity and Adaptation

Schools and school personnel are not (or should not be) a passive part of the process of implementation. With adequate support and accountability, schools and communities are capable of identifying the local needs for prevention, promising innovations for addressing those needs, determining the "fit" between the innovations and the host organization, and deciding whether and how to adopt the innovation (Kraft et al. 2000). Although implementation fidelity has been linked to the attainment of positive and intended outcomes, strict fidelity to the developers' guidelines for implementation is often difficult to achieve in the complex and multifaceted contexts of schools. This tension between fidelity and adaptation has received attention. Durlak and DuPre (2008) identified contextual factors that influence strict implementation of an evidence-based program, including characteristics of innovations, providers, and communities, features related to organizational capacity, and training/technical assistance. Meta-analytic findings suggest that measured adaptations relevant to the local context can improve attainment of positive outcomes, warranting careful consideration in balancing fidelity and adaptation issues. The balance between implementation fidelity and program adaptation merits attention to these contextual factors. Nation et al. (2003) described principles that enhance the effectiveness of program implementation: effective programs were comprehensive, included varied teaching methods, provided sufficient dosage, were theory driven, provided opportunities for positive relationships, were appropriately timed, were socioculturally relevant, included outcome evaluation, and involved well-trained staff. Balancing the need to implement programs with fidelity while holding true to the needs of the local context increases the likelihood that programs will meet the local need and attain desired outcomes.

Capacity and Readiness in the Prevention Support System

The failure to translate and implement programs, and produce expected results, is often attributed to differences in capacity between research and practice contexts, and it is therefore imperative that barriers to and facilitators of the adoption of evidence-based programs are examined (Wandersman 2003). As such, real-world settings are now examined in terms of their "readiness for innovation." Such readiness is conceptually anchored in the literature of planned change (Fullan 1993, 2003; Zaltman and Duncan 1977; Zaltman et al. 1973), communication and diffusion of innovations (Havelock 1976; Rogers 2003), and knowledge transfer and change (Human Interaction Research Institute 1976). Research on implementation and readiness for change suggests that inattention to forces and factors that impact adoption seriously jeopardizes any project seeking to introduce a new idea into an organization such as a school district or building. Assessing readiness, therefore, becomes a crucial planning and surveillance activity. Similarly, communities or organizations might be at different stages of readiness for program implementation if examined along a number of dimensions. Rogers (2003) summarized key factors and processes that impact the rate of innovation adoption (e.g., relative advantage of the innovation, complexity of the innovation) and that seem to be consistently addressed by other change researchers. Given the vital role of community or organizational readiness in implementing evidence-based interventions with fidelity, there has been a significant emphasis in recent years on building capacity to increase community and organizational readiness.

In the ISF, the Prevention Support System assists and builds capacity among the individuals who implement the innovation. Training and technical assistance are the primary media of innovation and capacity support. Mitchell et al. (2002) define technical assistance as "intermediary support" to complete with quality any of the various tasks involved in prevention. While the importance of quality training and technical assistance to effective prevention has been widely acknowledged (Altman 1995; Backer 1991; Mitchell et al. 2002; Wandersman and Florin 2003), there has been relatively little empirical research on the process and quality of training and technical assistance as they relate to outcomes. Chinman et al. (2005) reviewed the sparse technical assistance literature, concluding that technical assistance to support an innovation requires at least a minimum level of general capacity and is likely to encounter resistance even when offered freely. Very little empirical research exists examining support for developing general capacities; however, Altman (1995) reports that a general capacity building approach was more successful and more beneficial to the community than was an attempt to develop a coalition focused on replicating a program. Further, using a systems dynamics computer simulation model, Homer and Milstein (2004) demonstrated how an exclusive focus on innovation-specific capacities (i.e. promoting programs aimed at fixing specific community problems) may deplete already sparse resources in communities with low levels of general capacity.

In summary, emerging models of dissemination and implementation suggest that dissemination and implementation failure may occur due to the absence of formal prevention support systems, and they point to the potential for these support systems to assist with promoting both innovation-specific capacities and general capacities to plan, implement, evaluate, and sustain innovations in practice settings. In the next section, we describe the structures and processes of a prevention support system designed to assist schools and school systems with implementation of evidence-based prevention programs, including a description of a results-oriented grant making process designed to assure selection of schools that are "ready" for implementation.

The Miami University Prevention Support System

The primary purpose of this paper is to illustrate efforts to develop a functional and effective prevention support system geared toward supporting implementation of primary prevention programs and services in schools. The prevention support system capitalizes on several innovations in order to identify schools that are ready and willing (i.e. have sufficient capacity at the outset to engage in the process). Guided by the ISF, the Miami University Prevention Support System (MU PSS) works with schools to build both innovation-specific and general capacities in order to assure that programs are implemented with quality and produce intended outcomes. Following a brief description of funding and staffing for the MU PSS, we describe the goals and activities of the MU PSS in four parts. First, we focus on efforts to recruit and select "ready and willing" schools to participate in the initiative. In the second and third sections, we describe the capacity building processes and activities undertaken with schools during the planning and implementation phases of the project. Finally, we describe efforts to monitor program fidelity and strategies to assure collection of evidence of effectiveness within and across participating schools.

Funding and Staffing

The MU PSS staffing structure was developed to support implementation of the Improving Student Behavioral Health: School-Wide Evidence-Based Prevention Programs initiative, based on recruiting five schools each year and supporting each of those schools over 4 years (1 year of planning followed by 3 years of implementation support). The MU PSS is staffed by a project coordinator (Meehan). Funding for this position varies from 50 to 75 % full time employment based on the number of schools participating at any given time. A part time (10-20 %) project director (Flaspohler), one half-time funded graduate assistant, and a team of six to eight unfunded undergraduate research assistants (primarily engaged in survey packaging and data entry) support the project coordinator. Other significant expenses for the MU PSS include travel (the project coordinator meets with each team monthly on site) and materials (the MU PSS produces all assessment materials). The planning and implementation grants provide up to $20,000 per year for participating schools. Funding from the grants is used primarily for program costs (training, technical assistance, materials, etc.), release time for teachers (i.e. costs of substitute teachers), travel, and other expenses.

Recruiting and Selecting Ready and Willing Schools

The Health Foundation of Greater Cincinnati invited elementary and middle schools within its 20-county catchment area to participate in the Improving Student Behavioral Health: School-Wide Evidence-Based Prevention Programs initiative. Five hundred seventy-eight (578) public and private schools serving a K-8 population were eligible to apply. A competitive "Request for Proposals" (RFP) process was developed to select five schools each year to participate. The Health Foundation announced the RFP to every school in the catchment area through direct and electronic mail. Several strategies, built upon previous efforts to identify ready schools (Flaspohler et al. 2008a, b), were designed to assure that schools selected for participation in the project were ready. These strategies included (1) initial contingencies to reduce the pool of potential applicants, (2) basic indicators of readiness and capacity, and (3) a results-oriented granting process.

Initial Contingencies

Any interested school was required to demonstrate readiness by sending a team of representatives to an RFP meeting. The team of representatives formed the basis of the Core Planning Team (CPT) that was expected to drive the planning and implementation process at each school. Three members of the CPT were required to attend the RFP meeting, and the CPT had to include a school administrator (typically a principal) and a classroom teacher. Expectations were communicated to schools in the funding announcement sent by mail and posted on the foundation's website. A school's ability to mobilize a CPT and commit to attending the planning grant RFP meeting was the first indication that some level of readiness and capacity existed. Schools that did not meet this minimum first requirement were not eligible to apply for funding. These requirements reduced the pool of applicants substantially, to include only those applicants who demonstrated the readiness deemed necessary to implement the overall process in their district.

Readiness

During the RFP meeting, participants were provided with an overview of the initiative, a description of expectations of participating schools, and a summary of the selection and implementation process, including the supports provided by the MU PSS. Teams that attended the RFP meeting were provided with the planning grant application and access to an online readiness assessment that included an adaptation of the Ohio Community Collaboration Model for School Improvement Readiness Assessment Tool (OCCMSI; Flaspohler et al. 2008a); the Collective Efficacy Scale (Goddard et al. 2000); and items from the Strengths-Based Practices Inventory (SBPI; Green et al. 2004). The assessment tools used in the readiness assessment measured perceptions about schools' current capacity and ability to implement a program (OCCMSI Readiness Assessment Tool), teachers' ability to work together effectively and efficiently (Collective Efficacy Scale), and positive and supportive practices among staff members (Strengths-Based Practices Inventory). The CPT was charged with distributing the online readiness assessment to all staff members at the school and promoting participation.

Survey responses were scored and summarized; each school received a brief report that included frequencies and descriptive statistics for individual items from the OCCMSI and SBPI, as well as scale scores and norms for the Collective Efficacy Scale. The CPT then used results from their readiness assessment in developing their planning grant application. Schools were encouraged to examine their survey data, identify their strengths and weaknesses, share the results of their survey with their staff, and develop an action plan to reinforce their strengths and address their weaknesses. The completion rate for the online readiness assessment (i.e. the number of staff who completed the survey divided by the number of staff in the school) was used as an additional indicator of a school's readiness, as it illustrated the ability of the CPT to mobilize personnel at their school.
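To make the indicator concrete: the completion rate and the per-item summaries in the readiness report reduce to simple computations over the survey responses. The sketch below shows one plausible form; the function names, record layout, and example numbers are ours for illustration and are not the actual MU PSS tooling.

```python
# Illustrative sketch (not the actual MU PSS tooling): computing the
# readiness-survey completion rate and a simple per-item summary.
# The record layout and example values are hypothetical.

def completion_rate(num_completed: int, num_staff: int) -> float:
    """Share of a school's staff who completed the readiness survey."""
    return num_completed / num_staff

def item_summary(responses: list[int]) -> dict:
    """Frequencies and descriptive statistics for a single survey item."""
    n = len(responses)
    mean = sum(responses) / n
    freqs: dict[int, int] = {}
    for r in responses:
        freqs[r] = freqs.get(r, 0) + 1
    return {"n": n, "mean": round(mean, 2), "frequencies": freqs}

# Example: 38 of 52 staff responded; one OCCMSI item on a 5-point scale.
print(f"completion rate: {completion_rate(38, 52):.0%}")  # 73%
print(item_summary([4, 5, 3, 4, 4, 2, 5]))
```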

Planning Grant Application

The final contingency of the recruitment and selection process was the submission of the planning grant application. The application was designed to allow schools to provide evidence of readiness and capacity that would predict later successful program adoption and implementation. Each school was expected to use data gathered from the readiness assessment to support their argument for funding, along with any other data that they could access (e.g., school disciplinary data, evidence of committed professional development funds, strong parent involvement in school). Their applications included: organizational/demographic information; identification of problems, opportunities, and needs in their building; core and larger planning team membership; a logic model; a plan for process evaluation; and a budget. Each application was reviewed based on the following criteria: commitment to the planning process, commitment to implementing an evidence-based program, strength of overall argument, staff survey completion rates, readiness assessment results, prevalence of behavioral health problems, percentage of students identified as being severely emotionally disturbed (SED), percentage of students who qualify for the Free/Reduced Lunch program, and budget. In addition to providing information about school readiness to implement primary prevention programs, the data collection and use in the recruitment and selection phase established data use as a key element of success in the initiative as a whole.

Figure 2 provides a summary of the winnowing process used to select ready and willing schools. The figure illustrates the number of schools that attended the RFP meeting, participated in the readiness assessment, completed a planning grant application, and were selected to receive planning grant funding.

Fig. 2 The winnowing process to select ready and willing schools to participate in the Improving Student Behavioral Health: School-Wide Evidence-Based Prevention Programs initiative

Attendance at the initial RFP meeting has been stable, averaging 17 schools per year. This represents about 3 % of eligible schools, making the RFP contingency the biggest cut in the winnowing process. We suspect several factors influence attendance at the RFP meeting. The most influential factor is likely awareness of the initiative. Although the Health Foundation of Greater Cincinnati uses multiple marketing strategies (e.g., direct mail, electronic mail) to promote the initiative, it is uncertain whether these strategies are effective in capturing the attention of busy teachers, administrators, and other school-serving professionals. Among schools that are aware, we suspect many self-select out for different reasons, including perceptions regarding qualifications (e.g., this is intended for lower income schools), need (e.g., we don't need a prevention program), or burden (e.g., this will be too much work). Approximately half of the teams that attend the initial RFP meeting submit planning grant applications; about a third of the schools that opt out do so immediately after the RFP meeting, and the other two-thirds drop out after receiving the readiness assessment report. It may be that the readiness assessment report raises awareness of barriers to implementation or otherwise leads school teams to opt out of the implementation process. Several schools submitted planning grants but did not receive funding. These schools did not differ from funded schools structurally or in terms of level of need. Most received lower scores on indicators of readiness or had difficulty integrating data into their applications. Reviewers suggested unfunded schools did a less compelling job of "telling their story" within the application.

Each year, at the culmination of the recruitment and selection phase, five schools are chosen to participate. Fifteen schools (12 from Ohio, 3 from Kentucky) have received planning grants. These schools vary considerably on a number of characteristics. Schools vary in ethnic diversity from less than 1 % minority to 100 % minority (predominantly African-American). The schools vary in size from small (208 students) to large (862 students). Free/Reduced Lunch percentages range from 19 to 92 %. Ohio State Report Cards were available for the 12 Ohio schools.¹ Six of the seven group types from the Typology of Ohio School Districts are represented by funded schools, including high poverty rural/agricultural (3), rural/small town (1), high poverty major urban (4), and urban/suburban school districts (2). Only very high median income, very low poverty districts are not represented among participating schools. Ohio schools' academic designations ranged from Excellent with Distinction (1) to Academic Emergency (1), with most (7) designated as Effective. Seventy-five percent (9 out of 12) of the schools made Adequate Yearly Progress, with Performance Indices ranging from 62.5 to 102.0. Thirty-three percent (4 out of 12) of the Ohio schools have a "High" poverty status, twenty-five percent (3 out of 12) have a "Medium-High" status, twenty-five percent (3 out of 12) have a "Medium-Low" status, and seventeen percent (2 out of 12) have a "Low" poverty status. These data suggest the recruitment and selection process affords participation to a diverse array of schools.

¹ Data from the Kentucky School Report Cards are not directly comparable.

Capacity Building During Program Planning

Upon award of planning grants, each cohort of five schools begins a structured planning process designed to build capacity and support the development of a successful implementation grant proposal. This planning process was based on the first five steps of the Getting to Outcomes model (Chinman et al. 2004): (1) assess needs and resources, (2) identify goals and objectives, (3) select an evidence-based program, (4) adapt the program to fit the local context, and (5) develop a plan for program implementation and evaluation. During this phase, all CPTs participated in a series of quarterly meetings focused on the stages of the planning process, and these regular meetings afforded the development of a larger community of practice involving all participating schools. Each CPT also received regular, on-site independent consultation from the MU PSS. The following section describes the stages of the planning process with explicit focus on how the MU PSS worked to build capacity to assure high-quality implementation of prevention programs.

Needs Assessment

Schools are more likely to implement a prevention program successfully if the program matches an identified need within the school. The first phase of the planning process was therefore to gather empirical evidence to illustrate the priority needs within a particular school. The CPTs received training and ongoing technical assistance to conduct a comprehensive needs and resource assessment with students, school personnel, and parents. The student needs assessment tool included the Collaborative for Academic, Social, and Emotional Learning (CASEL) Implementation Toolkit Survey (Devaney et al. 2006), the Strengths and Difficulties Questionnaire (SDQ; Goodman 1997), and the Brief Multidimensional Students' Life Satisfaction Scale (BMSLSS; Seligson et al. 2003). In all, students completed 105 items using a variety of response options (e.g., a 5-point Likert scale from "Strongly Agree" to "Strongly Disagree", a 4-point Likert scale from "No, never" to "Yes, all of the time"). The student needs assessment tool yielded scale scores for student perceptions of student skills (seven domains including self-management, problem solving, and relationship skills), school climate (ten domains including classroom climate, discipline, bullying, and care and support), strengths and difficulties (five domains including prosocial behavior, emotional symptoms, and peer problems), and life satisfaction. School staff completed a 68-item needs assessment tool that included portions of the CASEL implementation toolkit, the OCCMSI Readiness Assessment, and life satisfaction items. The staff needs assessment yielded multiple scales assessing student behavior (e.g., bullying, disruptive behavior, anxiety), school climate (e.g., support, orientation to innovation, parent involvement), and readiness. Parents completed a 31-item needs assessment adapted from the CASEL implementation toolkit. Individual items assessed parents' perceptions of various aspects of the school, including how the school values parent involvement, how teachers treat students, and the care and support provided by school staff. The MU PSS provided technical assistance including data entry, cleaning, and preliminary analyses of the needs assessment. Each CPT received a report summarizing the results of the needs assessment; the report included frequencies and descriptive statistics for items and scales. The MU PSS worked with each CPT individually to assist with interpretation of the needs assessment results.
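Because each instrument yields scale scores built from fixed sets of items, the per-school summaries amount to averaging item responses within each domain and then across respondents. A minimal sketch under that assumption follows; the item-to-domain mapping and sample values are invented for illustration and do not reproduce the published CASEL, SDQ, or BMSLSS scoring keys.

```python
# Minimal sketch of scale scoring for a needs-assessment report.
# The item-to-domain mapping below is invented for illustration and is
# NOT the published CASEL, SDQ, or BMSLSS scoring key.

SCALES = {
    "self_management": ["item01", "item02", "item03"],
    "classroom_climate": ["item04", "item05"],
}

def scale_scores(respondent: dict[str, int]) -> dict[str, float]:
    """Average the items belonging to each domain for one respondent."""
    return {
        scale: sum(respondent[i] for i in items) / len(items)
        for scale, items in SCALES.items()
    }

def school_report(respondents: list[dict[str, int]]) -> dict[str, float]:
    """Mean scale score across all respondents at a school."""
    per_person = [scale_scores(r) for r in respondents]
    return {
        scale: sum(p[scale] for p in per_person) / len(per_person)
        for scale in SCALES
    }

sample = [
    {"item01": 4, "item02": 3, "item03": 5, "item04": 2, "item05": 3},
    {"item01": 5, "item02": 4, "item03": 4, "item04": 3, "item05": 4},
]
print(school_report(sample))  # e.g. {'self_management': 4.17, 'classroom_climate': 3.0}
```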


Selecting an Evidence-Based Prevention Program

The MU PSS provided in-depth consultation to each CPT to assist with selecting and implementing an evidence-based prevention program based on the school's unique needs, resources, and context. Results from the needs and resource assessment revealed three common areas of need across all participating schools: concerns regarding school climate (e.g., student-staff relationships, sense of safety), disruptive behaviors (e.g., classroom disruptions, bullying), and emotional symptoms among students (e.g., anxiety, depression). In addition to the common needs identified among the participating schools, some school-specific areas were identified during the needs assessment, including behavioral issues (such as bullying, teasing, and conduct problems), concerns about students' safety while at school, students' emotional functioning and well-being, relationship skills, pro-social behavior, and problem-solving skills. Each school received individualized, on-site consultation to assist with identifying and determining the most relevant needs within their building. Two factors constrained choice in selecting a program or service: (1) programs or services must be universal (i.e. delivered to an entire population regardless of need) and (2) programs or services must be evidence-based (i.e. developed and evaluated using prevailing scientific conventions). The MU PSS directed CPTs to standard resources for evidence-based prevention programs (e.g., What Works Clearinghouse, Blueprints for Violence Prevention, National Registry of Effective Programs). In addition, the MU PSS encouraged CPTs to consider other factors that could influence the quality of implementation of an evidence-based program. Two examples of those factors were whether the selected program would fit with already-existing programs and services and whether the selected program should be a curricular or school climate program. We discuss these factors in more detail below.

Fit with Existing Programs and Services

Throughout the planning process, many of the schools identified already-existing efforts targeting non-academic barriers to learning as potential confounding influences on the adoption of a new primary prevention program. Efforts such as Positive Behavior Intervention and Supports (PBIS), character education programs, social skills programs, and classroom behavior-management strategies were identified in some of the participating schools. In these cases, significant amounts of resources, time, and effort were already being exerted to address a specified need. In order to successfully adopt an additional program, schools with existing programs/interventions/strategies expressed a desire to link the new program with the current efforts.

Schools have been encouraged to integrate and coordinate interventions designed to address health, mental health, and educational needs to facilitate effective and efficient models of school supports (Weist 2005; Atkins et al. 2010; Doll and Cummings 2008). One of the major functions of frameworks and/or models such as Positive Behavior Intervention and Supports (PBIS; Sugai et al. 2000), Response to Intervention (RTI; Fuchs and Fuchs 2006), and the Coordinated School Health Program (CSHP; Allensworth and Kolbe 1987) is to align student needs with an organized system of interventions. As such, an important consideration that influenced the selection of the evidence-based primary prevention program to be adopted was the match between already-existing strategies and the new program. Schools were encouraged to integrate the duties of their already existing committees with the goals of the CPTs to ensure that these concerns were addressed and to limit redundancy in the work being done by staff members.

Feasibility: Curricular Versus School Climate Program

Another important consideration was the availability (or lack thereof) of classroom time to facilitate program components. Many evidence-based primary prevention programs have been developed as a supplemental classroom curriculum in which various student skills and topics are taught in an effort to complement the standard academic content. A benefit of such curriculum-based programming is the standardization of social-emotional-behavioral content, ensuring that each student participating in the program receives the same level and form of instruction. A downside of curriculum-based programming is the need for substantial amounts of classroom time devoted to the program. A second type of evidence-based primary prevention program targets change in the climate of a school. A benefit of school climate programs is that they influence all aspects of a school's culture, including the norms, values, and expectations within the building, and ultimately become an integral part of the culture. A downside is that school climate programs take a significant amount of time and effort to become established and often require schools to develop and create compatible activities, tasks, policies, and procedures that match the overall goals and objectives of the program. Schools participating in the initiative took these considerations into account during the selection of their programs.

Although the preceding discussion has focused on all schools, it is important to examine how individual schools (with their idiosyncratic needs) arrived at the decision to select a particular program. To illustrate this process, two case studies from the current project describe how schools examined their circumstances and selected the prevention program that best fit their situation (see Table 1).

Table 1 Case examples of the process used by two participating schools to select their program

One of the schools participating in the initiative, an urban school (grades PK-8), identified bullying, teasing, and other problematic behaviors as its primary area of need. This school desired a program that would reduce and prevent bullying and other aggressive behaviors among its student body. The CPT recognized that an already-existing character education program should be taken into consideration as a match to the newly selected program, so the teachers and staff who served on the character education program committee also served as the CPT. A second consideration of the CPT was the lack of classroom time within the school's academic schedule for curriculum-based lesson plans. Based on these considerations, the school selected the Olweus Bullying Prevention Program, a school climate-changing bullying prevention program that allowed the school to incorporate the character education social values into the design of the new program's anti-bullying school rules, policies, and procedures.

A small rural elementary school (grades 1-5) identified relationship skills and emotional symptoms as its primary areas of need. This school had previously implemented a school climate-changing anti-bullying program and had successfully reduced its levels of bullying behaviors. Its CPT had extensive knowledge about evidence-based primary prevention programming and desired a program that would complement the existing program already embedded in the school's culture. Recognizing the benefit of additional social-emotional-behavioral programming, school administrators agreed to carve out significant amounts of classroom instruction time to be devoted to the new program. This school also expanded the role of the existing anti-bullying committee to include implementation of the new program. The school adopted the Promoting Alternative Thinking Strategies program, a curriculum-based program targeting improvement of student relationships and emotional functioning.

Across three cohorts, five evidence-based prevention programs were selected by participating schools. Two are curriculum-based programs: Promoting Alternative Thinking Strategies (PATHS; Kusche and Greenberg 1994) and LifeSkills (Botvin et al. 1990). Three are school climate programs: the Olweus Bullying Prevention Program (Olweus 1993), Caring School Communities (Battistich and Schaps 1996), and PeaceBuilders (Flannery et al. 2003).

Implementation Grant Application

The MU PSS supported the CPTs through the process of completing an application for a non-competitive implementation grant. The MU PSS worked with each team to synthesize needs assessment results (and other data sources such as school disciplinary data, academic achievement data, etc.) and develop their argument for selecting a specific evidence-based primary prevention program.

Once schools selected their program, the MU PSS offered technical assistance and support (e.g., development of a logic model and evaluation plan) as they wrote their implementation grant applications. Their applications included: organizational/demographic information; identification of problems, opportunities, and needs in their building; support for their selection of a prevention program; implementation team and larger team membership; a logic model; a plan for process and outcome evaluation; and a budget. Implementation grants were reviewed by a team organized by the funder. To date, only three schools (of 15) have been unable or unwilling to advance to the implementation stage. One school produced an implementation grant that failed to meet minimum criteria for acceptability. Another school underwent redesign (new principal and staff) after failing to meet minimum academic standards and did not submit an implementation grant application. A third school produced a successful implementation grant but withdrew from the project after being notified that it was merging with another school.

Capacity Building During Program Implementation

Upon award of the implementation grants, CPTs became core implementation teams (CITs). The MU PSS facilitated a series of shared learning sessions with all CITs to discuss the general and program-specific capacities that needed to be developed and maintained in order to implement and sustain their selected programs.

General Capacities

The MU PSS encouraged teams to develop general capacities in order to support the implementation of prevention programming, including professional development opportunities for school staff to learn about specific content areas that could assist them in addressing non-academic barriers to learning. In particular, the MU PSS engaged CITs in discussions about social marketing, cultural competency, and family engagement. These discussions were designed to increase CITs' knowledge about issues pertinent to the implementation of a school-based prevention program and to increase their abilities to engage various stakeholders throughout the implementation process. All participating CITs continued to meet quarterly with the MU PSS. In addition to the training on specific content pertinent to the project goals, each meeting was structured to provide teams with the opportunity for information exchange. CITs had the opportunity to share problems and concerns they were experiencing with regard to the grant project, progress they were making in achieving their goals and objectives, and lessons learned through the process. The MU PSS emphasized the importance of this shared learning process within the community of practice. In addition, the MU PSS assisted schools with garnering buy-in from key groups such as staff, students, and parents; data collection, analysis, and synthesis activities; communication of progress made, objectives met, and goals achieved; and movement toward sustainability after funding ends. Each of these general capacities supported the core implementation teams' progress toward successful program implementation and, hypothetically, sustainability.

Program-Specific Capacities

The MU PSS met regularly on site with each core implementation team to discuss how site-specific technical assistance and training objectives were being achieved. Schools contracted with external trainers/providers to obtain school-specific training and technical assistance for their selected programs, and the MU PSS conducted regular site visits and meetings with each CIT to ensure that schools were receiving adequate program-specific support from each of their respective trainers/providers. The MU PSS also attended program trainings and workshops, allowing it to interact directly with program trainers and to gain perspective on how best to adapt programs to fit local conditions. This level of support provided teams with proactive and reactive assistance with program-specific issues related to training and technical assistance, including assistance with arranging program-specific training and technical assistance for their intervention. By accruing experience with trainers, programs, and support systems, the MU PSS developed a knowledge base that promoted more effective training strategies. The MU PSS worked with schools to coordinate trainings in order to reduce costs and to mediate challenges that emerged between schools and the providers of training in any specific evidence-based program. For example, the MU PSS worked with trainers to help schools identify appropriate and theoretically sound cultural adaptations for programs, and when issues arose at schools, the MU PSS mediated as CIT members negotiated with program trainers. Overall, the role of the MU PSS regarding program-specific capacities was to hold the external trainers/providers accountable for providing core implementation teams with the tools, resources, and knowledge that they needed to be successful during implementation.


Evaluation and Accountability

The MU PSS assisted each CIT in meeting its evaluation and accountability requirements. CITs defined process evaluation and outcome evaluation plans in the form of a logic model submitted with their implementation grant. CITs were held accountable for achieving an acceptable level of implementation fidelity to the program they selected and for achieving specific positive social, emotional, and behavioral outcomes defined in the implementation grants. The MU PSS provided both proactive and reactive (problem-based) training and technical assistance to support schools in developing evaluation plans, collecting and using data, and meeting accountability requirements. The following sections describe the role of the MU PSS in assisting core implementation teams with implementation fidelity and outcome evaluations.

Fidelity

Faithful implementation is critical to obtaining program success. Implementation fidelity refers to the degree to which teachers and other school staff implement the selected program the way the program developers intended. A primary role of the MU PSS in this project was to build the capacity of schools to assess the extent to which they were implementing their selected program with fidelity. Of the five evidence-based programs selected by the schools, only one (the Olweus Bullying Prevention Program) had a comprehensive implementation fidelity measure.²

² One of the other prevention programs provided a fidelity tool that assessed individual class lessons but did not assess other components of the program.

The OBPP Schoolwide Implementation Checklist was designed to assess several domains of implementation, including: (1) buy-in from administrators and educators, (2) participation in program training, (3) procurement of program materials, (4) implementation of core program components, (5) assessment of program outcomes, and (6) achievement of program goals and objectives. In order to assess the implementation fidelity of the other evidence-based programs, the MU PSS developed a comparable implementation fidelity checklist for each of those programs, using the Olweus tool as a model. Each program's core components, outcomes, goals, and objectives were included in its implementation fidelity checklist. The MU PSS assisted schools in completing their respective implementation fidelity checklists at the end of each year of implementation. Checklist items were rated on a three-point scale ("completed", "making good progress", or "progress needed"), with a "not applicable" option for each item. Items rated "completed" received two points, items on which the school was "making good progress" received one point, and items rated "progress needed" received zero points. Items marked "not applicable" were removed from the analysis of fidelity for that year (e.g., pre-program training items in subsequent years). Implementation fidelity percentages were calculated by dividing the total number of implementation fidelity points achieved by the total number of points possible and converting that score to a percentage. Figure 3 identifies the implementation fidelity percentages (by selected program) following one, two, and three years of program implementation.

Fig. 3 Implementation fidelity percentages by program and year of implementation. Out of 12 schools implementing programs, n indicates the number of schools that have implemented each program by year
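The scoring rule described above maps directly onto a small computation: two points per "completed" item, one point for "making good progress", zero for "progress needed", with "not applicable" items dropped from both numerator and denominator. A sketch under those stated rules follows; the example ratings are hypothetical, not an actual OBPP checklist.

```python
# Sketch of the fidelity-percentage calculation described above:
# "completed" = 2 points, "making good progress" = 1, "progress needed" = 0;
# "not applicable" items are dropped from the denominator entirely.

POINTS = {"completed": 2, "making good progress": 1, "progress needed": 0}

def fidelity_percentage(ratings: list[str]) -> float:
    """Points achieved divided by points possible, as a percentage."""
    scored = [r for r in ratings if r != "not applicable"]
    if not scored:
        raise ValueError("no applicable checklist items")
    achieved = sum(POINTS[r] for r in scored)
    possible = 2 * len(scored)
    return 100 * achieved / possible

# Hypothetical year-two checklist: a training item is now "not applicable".
ratings = ["completed", "completed", "making good progress",
           "progress needed", "not applicable"]
print(f"{fidelity_percentage(ratings):.1f}%")  # 62.5%
```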

As evidence of the importance of technical assistance in enhancing schools' capacity to be successful, 100 % of schools involved in this project monitored their implementation fidelity on a yearly basis, and all met a minimum threshold for implementation to achieve outcomes (60 % according to Botvin 2000) in their first year of implementation. Schools implementing curriculum-based programs (e.g., PATHS) tended to achieve higher levels of fidelity faster and to maintain higher levels of fidelity compared with schools implementing school climate programs (e.g., Olweus). This likely results from the fewer components and more prescriptive nature of curriculum-based programs. The more organic school climate programs (e.g., Olweus, Caring School Communities) showed differential patterns of implementation fidelity. Some (Caring School Communities, PeaceBuilders) showed upward trajectories, reflecting the developmental nature of these programs (schools adding core components strategically and systematically rather than implementing the program all at once). Others (Olweus) showed declining patterns of fidelity, which we believe reflects increased ownership and customization of the program to fit the context of individual schools. Durlak and DuPre (2008) showed that implementation fidelity is not an all or nothing proposition and that implementation thresholds may vary. A key role of the MU PSS involved working with CITs at each school to examine their fidelity assessments and determine whether and how to intervene in order to increase the probability of achieving outcomes. Self-report fidelity measures may be susceptible to biased reporting; however, schools participating in this initiative were held accountable both for high quality implementation and for producing outcomes. Our experience and data suggest that participating schools are willing to report implementation challenges and to work to address those challenges for the sake of achieving outcomes.

Outcomes Evaluation

As part of the implementation grant, the MU PSS assisted each school with identifying goals and measurement plans in order to be able to demonstrate outcomes. Goals were chosen based on priorities within each school and typically included changes in student skills (e.g., peer relations, emotional vocabulary) and behaviors (e.g., bullying, aggression, substance abuse) as well as teacher/staff perceptions (e.g., school climate) and behaviors (e.g., time on task, time spent redirecting students). Outcome measurement plans included student and staff reports on program-specific tools (e.g., the PATHS Student Evaluation, the Olweus Bully-Victim Questionnaire), common outcome tools (e.g., the SDQ), and other forms of data (e.g., office referrals for negative behaviors). For example, one school (implementing PATHS) identified reductions in aggressive behaviors, increases in cognitive and emotional skills, and reductions in classroom disruptions as anticipated outcomes of the program. The assessment plan included multiple informants (students, teachers) and multiple measures: the PATHS student evaluation, the CASEL implementation toolkit staff survey, and the Dynamic Indicators of Basic Early Literacy Skills (a widely used measure of early reading skills; Good and Kaminski 2002).

The MU PSS assisted core implementation teams in developing, administering, and interpreting outcome evaluations. Specifically, the MU PSS developed customized evaluation tools, assisted teams with the development of an evaluation procedure, tabulated survey results, and provided teams with preliminary evaluation reports. Each school engaged in both common and program-specific outcomes assessment through data collected annually from students, school staff, and parents (if desired). The common elements that the MU PSS evaluated at all schools included classroom social climate, disruptive behaviors, and emotional problems. Although the evidence-based programs selected by schools differed on some specific elements, it was anticipated that implementation of a primary prevention program would foster improvement in these common areas. The MU PSS entered and analyzed the data collected annually on common and program-specific outcomes for each school and prepared a comprehensive outcomes report summarizing each school's data. The MU PSS then met with each team to provide guidance for interpreting the data and to assist with communicating results to interested parties (i.e. their staff, the funder, and other potential funders).
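Because every school contributed the same common outcome scales each year, the cross-year portion of the outcomes report is essentially a comparison of annual scale means. The fragment below sketches such a comparison; the scale names mirror the common elements named above, but the years and values are hypothetical, not data from the initiative.

```python
# Sketch of a year-over-year comparison for the common outcome scales
# (classroom social climate, disruptive behaviors, emotional problems).
# Years and values are hypothetical, for illustration only.

annual_means = {
    "classroom_climate":    {2008: 3.1, 2009: 3.4, 2010: 3.6},
    "disruptive_behaviors": {2008: 2.8, 2009: 2.5, 2010: 2.3},
    "emotional_problems":   {2008: 2.2, 2009: 2.1, 2010: 2.0},
}

def change_from_baseline(series: dict[int, float]) -> float:
    """Difference between the latest and earliest annual scale means."""
    years = sorted(series)
    return series[years[-1]] - series[years[0]]

for scale, series in annual_means.items():
    print(f"{scale}: {change_from_baseline(series):+.1f} since baseline")
```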

Conclusion of Ready, Willing, and Able

The purpose of this paper was to describe the structures and process of the Miami University Prevention Support System (MU PSS), an innovative prevention support system designed to support high-quality implementation of universal prevention programs in schools. Grounded in the Interactive Systems Framework model, the MU PSS focused primarily on recruiting "ready" schools, building general capacities, and assisting the development of program/innovation-specific capacities in a limited number of schools. As evidence of success, we presented data showing that participating schools implemented a variety of prevention programs with high fidelity while tracking outcomes. This paper concludes with a discussion of several characteristics or operational principles that we believe are critical to the successful implementation of programs: systematic evaluation of existing capacity and readiness in schools or organizations, focusing on the best practice process rather than solely on evidence-based practices, building general evaluation capacity to mainstream evaluation practice and accountability efforts, and fostering communities of practice to support ongoing prevention efforts. Stemming from lessons learned during the current project, each of these principles can be considered a decision-making point for the work of other prevention support systems.

Work with the Ready

A key consideration in developing a prevention support system involves defining the scope and range of action for the system. In any system, resources are limited and the scope of intervention is finite; funders and support systems face the challenge of distributing resources efficiently and effectively so that limited resources yield the greatest benefit for the investment. This intervention was designed to maximize the probability of successful implementation of prevention programs in a small number of schools. Consistent with previous research (Flaspohler et al. 2008a, b), recruitment strategies were designed to assure that selected schools demonstrated a minimum level of initial capacity to participate in the project. The approach to assessing capacity and readiness combined traditional measurement tools with a set of demonstrations, or behavioral indicators, of readiness and capacity. Triangulating findings from the measurement tools and behavioral indicators allows us to be confident of these minimal capacities in participating schools. The behavioral indicators (e.g., capacity to muster a team for the RFP meeting) may be the most important elements of the screening process; the greatest reduction in the size of the pool occurs through the RFP meeting. Early results from our program suggest (with few exceptions) that we are successful in recruiting "ready" schools.

Beyond the initial hope of maximizing the probability of successful implementation, there are other proximal and distal benefits presumed to emerge from working with a limited number of "ready" schools. In the short term, engagement of a large percentage of school staff through readiness surveys alerts teachers and staff that change, in the form of new prevention programming, may be imminent. Feedback from the readiness and capacity assessments sheds light on organizational strengths and challenges that may be useful for school administrators, whether or not their schools are selected to participate. In the long term, working with a limited number of ready schools maximizes the probability of successful implementation, assuring the development of successful local models of prevention programs. Through diffusion, local models may create additional incentive for these prevention programs and the processes that support them.

Working with the ready may have drawbacks. Strategies that screen out schools without sufficient preliminary readiness may lead to the allocation of resources away from the areas of highest need: the schools and communities with the highest need are likely to have lower levels of initial capacity and readiness and may not be selected to participate. However, the limited research available on supporting implementation of evidence-based programs suggests that a minimum level of initial capacity must be present to assure successful uptake of training and technical assistance (Chinman et al. 2005). Establishing a prevention support system requires careful consideration of how best to manage available resources. Our approach provides one model that we believe is well suited to supporting implementation of evidence-based prevention programs in schools.
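To illustrate the triangulation logic described in this section, the sketch below combines a survey-based readiness score with behavioral indicators to screen hypothetical applicants. The field names, cutoffs, and the requirement that all indicators converge are illustrative assumptions; the MU PSS did not publish a scoring formula.

# A minimal sketch of triangulating survey data with behavioral indicators
# to screen applicant schools. All data, field names, and cutoffs are hypothetical.
from dataclasses import dataclass

@dataclass
class Applicant:
    name: str
    readiness_survey: float      # mean staff readiness score, e.g., on a 1-5 scale
    staff_response_rate: float   # proportion of staff completing the survey
    team_at_rfp_meeting: bool    # behavioral indicator: mustered a team for the RFP meeting

def meets_minimum_capacity(a: Applicant,
                           survey_cutoff: float = 3.5,
                           response_cutoff: float = 0.70) -> bool:
    """A school passes only if survey findings AND behavioral indicators converge."""
    return (a.readiness_survey >= survey_cutoff
            and a.staff_response_rate >= response_cutoff
            and a.team_at_rfp_meeting)

applicants = [
    Applicant("School A", 4.1, 0.85, True),
    Applicant("School B", 3.9, 0.60, True),   # survey fine, weak staff engagement
    Applicant("School C", 3.2, 0.90, False),  # could not muster a team for the RFP meeting
]

for a in applicants:
    print(f"{a.name}: {'invite' if meets_minimum_capacity(a) else 'screen out'}")

Requiring convergence across sources, rather than averaging them, mirrors the point above that a behavioral demonstration such as mustering a team can outweigh an otherwise acceptable survey score.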

Show up with Best Practice Process

The activities of a prevention support system should be both systematic and flexible. A second key ingredient of an effective prevention support system is the emphasis on best practice process over best practice program, and striking the necessary balance between best practice process structures and the relationship-building activities undertaken by the PSS. Borrowing from the "best practices" vernacular typically used to describe empirically supported programs or interventions, "best practice process" refers to a systematic decision-making process and set of strategies that can be used to drive implementation of evidence-based practices. That is, a best practice process offers a guide for how evidence-based practices should be implemented rather than what or which evidence-based practices should be implemented (Splett and Maras 2011). Examples of best practice processes include Getting to Outcomes (Chinman et al. 2004), Communities that Care (Hawkins and Catalano 1992), Prosper (Spoth et al. 2004), and Partnerships for Success (Julian 2006).

Implementation of any evidence-based program requires behavioral change on the part of individuals in the host setting (e.g., school, community-based organization). Successful behavior change strategies require attending to both skills and motivation; thus, we would argue that prevention support systems must include both structural elements (e.g., best practice process) and relational elements (Wandersman and Florin 2003). As the structural element, best practice process supplies both the "what?" and "how?" of implementation. Showing up, the relational element, serves to foster motivation among individuals in the host setting to engage in best practice process, supports them during skill acquisition, and continually draws attention to the "why?" underlying what they are doing. Regular meetings between members of the prevention support system and the host setting afford opportunities to develop strong supportive relationships, and, arguably, each relationship established between the prevention support system and the host setting is as important as any step in the best practice process. That is, the relationship and the process are both necessary, while neither is sufficient to assure successful implementation.

All best practice process models emphasize needs and resource assessment and matching intervention to need as critical steps in the process of moving an evidence-based program to a new setting. Using local data to ground the selection of interventions functions at both the structural and motivational levels by undergirding the essential "why?" question of behavior change. Communities and schools are much more likely to change their behavior (e.g., implement a new program) when they can link that behavior change to a clearly and empirically established need in the local community. Local data may also play a critical role in supporting the development of the local community partnerships necessary to sustain programs after external support is exhausted.

A key advantage of adopting a best practice process approach over the traditional, innovation-driven approach to implementing evidence-based programs is that the prevention support system is not allied with or beholden to any particular program or intervention. Rather, the prevention support system is committed to the structural and relational elements that are necessary to facilitate successful implementation of any innovation. The MU PSS was designed to assure that we were not allied with any particular innovation or evidence-based program, freeing the MU PSS from advocating for one intervention or another. This flexibility affords greater local control over programming choices, allowing schools to select interventions after establishing evidence of need. In addition, the Prevention Support System's independence from any particular program affords an impartial view of both programs and context, enhancing the credibility of the MU PSS when serving in a mediating role between schools and the innovation trainers.


Mainstream Evaluation and Accountability

We consider the mainstreaming of evaluation and accountability to be the single most important task of the prevention support system. Mainstreaming occurs when something becomes part of an organization's daily professional responsibilities. The MU PSS works to ensure the establishment and maintenance of an evaluation culture within participating schools, making evaluation a part of good professional practice rather than a marginalized activity. Federal, state, and local legislation demands evaluation and accountability in social service and educational activity (e.g., No Child Left Behind Act of 2001); however, our experience suggests that the skills and motivation required to fulfill accountability requirements or to maximize the potential for evaluation use are rarely present in schools.

The centrality of evaluation as a prevention support activity is evident in all aspects of the intervention. The MU PSS is designed to assist each school in meeting the accountability demands of the funder. By design, the MU PSS operates independently of the funder, working with each school to meet the funder's requirements. The MU PSS assists each school in developing evaluation plans, collecting and analyzing data, and completing the evaluation reports expected by the funder. The MU PSS scaffolds evaluation skill development within each core team, with the aim of leaving each participating school capable of sustaining evaluation activities without assistance.

Expectations and incentives for data collection, analysis, and use are established early and reinforced regularly. Even before schools are selected to participate, they are compelled to collect and use data. The cycle of data collection and use repeats throughout the process: readiness and capacity data are used in planning grants, needs and resource assessment data are used in implementation grants, and fidelity and outcomes data are used in annual reports to the funder. Participating schools are thus socialized into an evaluation culture in which data inform decisions both at each school and about each school.

While evaluation is acknowledged as a critical ingredient of successful prevention programs (Nation et al. 2003), it tends to be underemphasized in traditional dissemination and implementation efforts. Of particular concern was the paucity of user-friendly evaluation tools and procedures for many of the evidence-based programs (for example, only the Olweus Bullying Prevention Program provides tools and procedures for monitoring implementation fidelity). Linking evaluation activities to the acquisition and maintenance of funding is one strategy to incentivize evaluation; prevention support systems will need to continue innovating to promote evaluation use.
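As a concrete illustration of the annual reporting that core teams were scaffolded to produce, the following minimal sketch summarizes pre/post change on one common measure (an SDQ-style total difficulties score, where lower is better). The data, the pairing of scores, and the effect-size convention are hypothetical assumptions, not the MU PSS's actual reporting tools.

# A minimal sketch of a pre/post summary for an annual outcomes report.
# All scores are hypothetical matched pairs for one school (lower = fewer difficulties).
from statistics import mean

pre_scores  = [14, 11, 18, 9, 16, 13, 12, 15]
post_scores = [11, 10, 15, 9, 13, 12, 10, 12]

def prepost_summary(pre, post):
    """Mean change and a simple standardized effect (mean change / pre SD)."""
    assert len(pre) == len(post), "scores must be matched pairs"
    diffs = [b - a for a, b in zip(pre, post)]
    pre_mean = mean(pre)
    pre_sd = (sum((x - pre_mean) ** 2 for x in pre) / (len(pre) - 1)) ** 0.5
    return {"mean_pre": pre_mean,
            "mean_post": mean(post),
            "mean_change": mean(diffs),
            "effect_size": mean(diffs) / pre_sd if pre_sd else float("nan")}

s = prepost_summary(pre_scores, post_scores)
print(f"Pre M = {s['mean_pre']:.1f}, Post M = {s['mean_post']:.1f}, "
      f"change = {s['mean_change']:+.1f} (d = {s['effect_size']:.2f})")

A summary of this kind, produced by the core team itself rather than by the support system, is one operational marker that evaluation has been mainstreamed.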



Building Communities of Practice

A final ingredient believed to play a role in supporting implementation involves the development of communities of practice at multiple levels of the intervention. A community of practice is a group of people who share a concern or a passion for something they do and who learn how to do it better as they interact regularly (Wenger et al. 2002). Throughout this initiative, the MU PSS supported the development of communities of practice at the school level, among schools participating in the initiative, and between these schools and the broader national health and mental health communities of practice. By fostering regular interactions, shared activities, and common goals within each group, these communities of practice emerged as integral elements of the intervention.

Each participating school developed a local community of practice by forming an interdisciplinary core team to participate in planning and implementation processes. The interdisciplinary team was designed to promote buy-in and support from teachers, administrators, and other school-serving professionals. The MU PSS provided assistance to core teams in navigating relational, structural, or political barriers existing within schools.

Providing grantees with an opportunity to learn from and advise one another helped to promote a second level of community of practice, established among core planning/implementation teams within and across the three cohorts. The MU PSS facilitated quarterly meetings with core teams and provided structured time to exchange information about successes and challenges, to engage in shared problem solving, and to become models and supports for one another. The multi-year structure of the initiative afforded schools in the early cohorts the opportunity to provide guidance to schools in later cohorts. Having worked with multiple schools across multiple years, the MU PSS was in a unique position to aggregate lessons learned through the project and to continually build new knowledge that could assist schools in overcoming challenges and barriers.

Professional development opportunities and the chance to share lessons learned at the local level with a national audience empowered grantees to understand the broader importance of the work they were doing. Each participating school received funding to participate in professional development activities, including attendance at national conferences, and grantees were encouraged to present at conferences (Meehan et al. 2007, 2009), sharing both general and program-specific lessons learned. Interacting with the national communities of practice allowed team members to be a part of something bigger than the work in their own schools.

Overall, the time spent at the different levels of community of practice is integral to the success of the intervention, fueling the motivation of participants and expanding the range of perspectives that can be brought to bear on problem solving. Despite its value, however, participating in communities of practice has some downsides. Participants must spend time away from classroom or direct service activities that are valued within the school context. In addition to the burden of time commitments, it can be financially draining for schools (especially when outside funding is not available) to provide substitutes to fill in for staff on leave. In times of tight budgets, participating schools experienced pressure over spending funds to attend conferences or quarterly meetings when many schools are cutting personnel and essential services. It is essential, however, to balance these challenges against team members' need for time to engage in reflective practice.

Final Thoughts

It is important to note that the MU PSS is a well-resourced prevention support system model: the Health Foundation of Greater Cincinnati provided ample funding to support the activities of the MU PSS, schools/grantees, external trainers/training, foundation staff, and the external evaluation team. In light of the recent economic downturn, it is unlikely that comparable foundation support and funding will be readily available to most schools seeking to implement prevention programming. As such, it is essential to find ways to institutionalize the work of prevention support systems so that all schools have access to critical support services. Current resource expenditures at the local, state, and federal levels focused on the dissemination of evidence-based prevention programs in schools should be evaluated for potential cost-saving measures. As prevention support systems continue to evolve and strengthen, it is imperative that all stakeholders, including funders, researchers, practitioners, and policy-makers, expand their understanding of the function and potential of models like the MU PSS in supporting effective and efficient prevention practices in schools and other organizations.

In summary, there is little evidence to suggest that the physical, mental, and behavioral health of children in schools is improving dramatically, and the high level of unmet needs among young people in this country necessitates making prevention a primary focus in schools. However, research suggests that current models for supporting the implementation of evidence-based programs in schools are not producing the positive student outcomes expected. This paper described the activities and ingredients of the MU PSS, a prevention support system based on the Interactive Systems Framework model, and provided initial evidence of how the PSS makes schools ready, willing, and able to implement universal prevention.

References

Allensworth, D. D., & Kolbe, L. J. (1987). The comprehensive school health program: Exploring an expanded concept. Journal of School Health, 57(10), 409–412.

Altman, D. G. (1995). Sustaining interventions in community systems: On the relationship between researchers and communities. Health Psychology, 14(6), 526–536.
Anderson, C. S. (1982). The search for school climate: A review of the research. Review of Educational Research, 52, 368–420.
Atkins, M., Hoagwood, K., Kutash, K., & Seidman, E. (2010). Toward the integration of education and mental health in schools. Administration and Policy in Mental Health and Mental Health Services Research, 37(1–2), 40–47.
Backer, T. E. (1991). Drug abuse technology transfer. Rockville, MD: National Institute on Drug Abuse.
Battistich, V., & Schaps, E. (1996). Prevention effects of the Child Development Project: Early findings from an ongoing multisite demonstration trial. Journal of Adolescent Research, 11, 12–35.
Biglan, A., & Taylor, T. (2000). Why have we been more successful in reducing tobacco use than violent crime? American Journal of Community Psychology, 28, 269–302.
Botvin, G. J. (2000). Preventing drug abuse in schools: Social and competence enhancement approaches targeting individual-level etiologic factors. Addictive Behaviors, 25, 887–897.
Botvin, G. J., Baker, E., Dusenbury, L., Tortu, S., & Botvin, E. (1990). Preventing adolescent drug abuse through a multimodal cognitive-behavioral approach: Results of a 3-year study. Journal of Consulting and Clinical Psychology, 58, 437–446.
Center for the Study and Prevention of Violence (CSPV). (2008, March 19). Blueprints for violence prevention. Retrieved March 19, 2008, from http://www.colorado.edu/cspv/blueprints/.
Centers for Disease Control and Prevention. (2010). Youth risk behavior surveillance—United States, 2009. Surveillance Summaries, June 4, 2010. MMWR 2010; 59 (No. SS-5).
Chinman, M., Hannah, G., Wandersman, A., Ebener, P., Hunter, S. B., Imm, P., et al. (2005). Developing a community science research agenda for building community capacity for effective preventive interventions. American Journal of Community Psychology, 35, 143–157.
Chinman, M., Imm, P., & Wandersman, A. (2004). Getting to outcomes 2004: Promoting accountability through methods and tools for planning, implementation, and evaluation. Santa Monica, CA: RAND Corporation Technical Report. Available at: http://www.rand.org/publications/TR/TR10/.
Comer, J. P. (1981). Societal change: Implications for school management. Washington, DC: National Institute of Education.
Devaney, E., O'Brien, M. U., Resnik, H., Keister, S., & Weissberg, R. P. (2006). Sustainable schoolwide social and emotional learning: Implementation guide and toolkit. Chicago: Collaborative for Academic, Social, and Emotional Learning.
Doll, B., & Cummings, J. (2008). Why population-based services are essential for school mental health, and how to make them happen in your school. In B. Doll & J. Cummings (Eds.), Transforming school mental health services: Population-based approaches to promoting the competency and wellness of children. Thousand Oaks, CA: Corwin Press.
Domitrovich, C. E., Bradshaw, C. P., Poduska, J., Hoagwood, K., Buckley, J., Olin, S., et al. (2008). Maximizing the implementation quality of evidence-based preventive interventions in schools: A conceptual framework. Advances in School Mental Health Promotion, 1, 6–28.
Durlak, J. A., & DuPre, E. P. (2008). Implementation matters: A review of research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology, 41(3–4), 327–350.
Elliott, D. S., & Mihalic, S. (2004). Issues in disseminating and replicating effective prevention programs. Prevention Science, 5(1), 47–52.

Feuer, M. J., Towne, L., & Shavelson, R. J. (2002). Scientific culture and educational research. Educational Researcher, 31(8), 4–17.
Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation research: A synthesis of the literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network (FMHI Publication #231).
Flannery, D. J., Liau, A. K., Powell, K. E., Vesterdal, W., Vazsonyi, A. T., Shenyang, G., et al. (2003). Initial behavior outcomes for the PeaceBuilders universal school-based violence prevention program. Developmental Psychology, 39(2), 292–308.
Flaspohler, P., Anderson-Butcher, D., Bean, J., Burke, R., & Paternite, C. E. (2008a). Readiness and school improvement: Strategies for enhancing dissemination and implementation of expanded school mental health practices. Advances in School Mental Health Promotion, 1, 16–27.
Flaspohler, P., Maras, M., Duffy, J., Wandersman, A., & Stillman, L. (2008b). Unpacking capacity: The intersection of research to practice and community centered models. American Journal of Community Psychology, 41(3–4), 182–196.
Fuchs, L. S., & Fuchs, D. (2006). Identifying learning disabilities with RTI. Perspectives, 32, 39–43.
Fullan, M. (1993). Change forces: Probing the depths of educational reform. Levittown, PA: The Falmer Press, Taylor and Francis Inc.
Fullan, M. (2003). Change forces with a vengeance. New York: Falmer Press.
Glasgow, R. E., & Emmons, K. M. (2007). How can we increase translation of research into practice? Types of evidence needed. Annual Review of Public Health, 28, 413–433.
Glasgow, R. E., McKay, H. G., Piette, J. D., & Reynolds, K. D. (2001). RE-AIM framework for evaluating interventions: What can it tell us about approaches to chronic illness management? Patient Education and Counseling, 44, 119–127.
Goddard, R. D., Hoy, W. K., & Woolfolk-Hoy, A. (2000). Collective teacher efficacy: Its meaning, measure, and impact on student achievement. American Educational Research Journal, 37, 479–508.
Good, R. H., & Kaminski, R. A. (2002). Dynamic indicators of basic early literacy skills (6th ed.). Eugene, OR: Institute for the Development of Educational Achievement.
Goodman, R. (1997). The strengths and difficulties questionnaire: A research note. Journal of Child Psychology and Psychiatry and Allied Disciplines, 38(5), 581–586.
Gottfredson, D. C., & Bauer, E. L. (2007). Interventions to prevent youth violence. In L. S. Doll, S. E. Bonzo, J. A. Mercy, & D. A. Sleet (Eds.), Handbook of injury and violence prevention. New York, NY: Springer.
Green, L. W. (2001). From research to "best practices" in other settings and populations. American Journal of Health Behavior, 25(3), 165–178.
Green, B. L., McAllister, C. A., & Tarte, J. (2004). The strengths-based practices inventory: A measure of strengths-based practices for social service programs. Families in Society: A Journal of Contemporary Human Services, 85(3), 326–334.
Greenhalgh, T., Robert, G., Macfarlane, F., Bate, P., & Kyriakidou, O. (2004). Diffusion of innovations in service organizations: Systematic review and recommendations. The Milbank Quarterly, 82(4), 581–629.
Havelock, R. G. (1976). Planning for innovation through dissemination and utilization of knowledge. Ann Arbor: Institute for Social Research, The University of Michigan.
Hawkins, J. D., & Catalano, R. F. (1992). Communities that care: Action for drug abuse prevention. San Francisco: Jossey-Bass, Inc.


Haynes, N. M., Emmons, C., & Ben-Avie, M. (1997). School climate as a factor in student adjustment and achievement. Journal of Educational and Psychological Consultation, 8, 321–329.
Homer, J., & Milstein, B. (2004). Optimal decision making in a dynamic model of community health. Paper presented at the Hawaii International Conference on System Sciences, Waikoloa-Kona, Hawaii.
Human Interaction Research Institute. (1976). Putting knowledge to use: A distillation of the literature regarding knowledge and change. Rockville, MD: National Institute of Mental Health.
Jensen, P., Hoagwood, K., & Trickett, E. (1999). Ivory towers or earthen trenches? Community collaborations to foster "real world" research. Applied Developmental Science, 3, 206–212.
Julian, D. (2006). A community practice model for community psychologists and some examples of the application of community practice skills from the partnerships for success initiative in Ohio. American Journal of Community Psychology, 37, 21–27.
Kraft, J. M., Mezoff, J. S., Sogolow, E. D., Neumann, M. S., & Thomas, P. A. (2000). A technology transfer model for effective HIV/AIDS interventions. Science and Practice, 12(5), 7–20.
Kusche, C., & Greenberg, M. (1994). PATHS: Promoting alternative thinking strategies. South Deerfield, MA: Developmental Research Programs Inc.
Meehan, D. C., Tripp, L., Clements, C., Kaeff, J., Puckett, A., Vesey, J., Keller, K., & Flaspohler, P. D. (2009). Lessons learned with the implementation of a bullying prevention program. Workshop presented at the 6th annual International Bullying Prevention Association Conference, Pittsburgh, PA.
Meehan, C., Vietze, D., & Keller, K. (2007). School-wide evidence-based prevention practices. Symposium presented at the Ohio School Based Health Care Association's School-Based Health Centers: Integrating Mental and Physical Health for the Care of Children and their Families symposium, Blue Ash, OH.
Mitchell, R. E., Florin, P., & Stevenson, J. F. (2002). Supporting community-based prevention and health promotion initiatives: Developing effective technical assistance systems. Health Education & Behavior, 29(5), 620–639.
Mrazek, P. J., & Haggerty, R. J. (1994). Reducing risks for mental disorders: Frontiers for preventive intervention research. Washington, DC: National Academy Press.
Nation, M., Crusto, C., Wandersman, A., Kumpfer, K., Morrisey-Kaner, E., et al. (2003). What works in prevention: Principles of effective prevention programs. American Psychologist, 58, 449–456.
No Child Left Behind Act. (2001). PL. 107–110, 115 Stat. 1425.
Olweus, D. (1993). Bullying at school: What we know and what we can do. Cambridge, MA: Blackwell.
Olweus, D. (1994). Bullying at school: Long-term outcomes for the victims and an effective school-based intervention program. In L. R. Huesmann (Ed.), Aggressive behavior: Current perspectives (pp. 97–130). New York: Plenum.
Olweus, D., & Limber, S. (2002). Blueprints for violence prevention: Bullying prevention program. Boulder, CO: University of Colorado, Institute of Behavioral Science.
Paternite, C. E. (2005). School-based mental health programs and services: Overview and introduction to the special issue. Journal of Abnormal Child Psychology, 33, 657–663.
Ringeisen, H., Henderson, K., & Hoagwood, K. (2003). Context matters: Schools and the "research to practice gap" in children's mental health. School Psychology Review, 32, 153–168.
Rogers, E. M. (2003). Diffusion of innovation. New York: Free Press.
Rones, M., & Hoagwood, K. (2000). School-based mental health services: A research review. Clinical Child and Family Psychology Review, 3, 223–241.


Rutter, M., Maughan, N., Mortimore, P., Ouston, J., & Smith, A. (1979). Fifteen thousand hours: Secondary schools and their effects on children. Cambridge, MA: Harvard University Press.
Schoenwald, S. K., & Hoagwood, K. (2001). Effectiveness, transportability, and dissemination of interventions: What matters when? Psychiatric Services, 52(9), 1190–1197.
Seligson, J., Huebner, E. S., & Valois, R. F. (2003). Preliminary validation of the brief multidimensional students' life satisfaction scale (BMSLSS). Social Indicators Research, 61, 121–145.
Social and Character Development Research Consortium. (2010). Efficacy of schoolwide programs to promote social and character development and reduce problem behavior in elementary school children (NCER 2011–2001). Washington, DC: National Center for Education Research, Institute of Education Sciences, U.S. Department of Education.
Splett, J. W., & Maras, M. A. (2011). Closing the gap in school mental health: A community-centered model for school psychology. Psychology in the Schools, 48(4), 385–399. doi:10.1002/pits.20561.
Spoth, R., Greenberg, M., Bierman, K., & Redmond, C. (2004). Prosper community-university partnership model for public education systems: Capacity-building for evidence-based, competence-building prevention. Prevention Science, 5(1), 31–39.
Sugai, G., Sprague, J. R., Horner, R. H., & Walker, H. M. (2000). Preventing school violence: The use of office discipline referrals to assess and monitor school-wide discipline interventions. Journal of Emotional and Behavioral Disorders, 8, 94–101.
U.S. Department of Health and Human Services. (1999). Mental health: A report of the surgeon general—Executive summary. Rockville, MD: U.S. Department of Health and Human Services, National Institutes of Health, National Institute of Mental Health.
U.S. Department of Health and Human Services, Health Resources and Services Administration, Maternal and Child Health Bureau. (2010). The National Survey of Children's Health 2007. Rockville, MD: U.S. Department of Health and Human Services.
U.S. Public Health Service. (2000). Report on the surgeon general's conference on children's mental health: A national action agenda. Washington, DC: U.S. Government Printing Office.
Wandersman, A. (2003). Community science: Bridging the gap between science and practice with community-centered models. American Journal of Community Psychology, 31, 227–242.
Wandersman, A., Duffy, J., Flaspohler, P., Noonan, R., Lubell, K., Stillman, L., et al. (2008). Bridging the gap between prevention research and practice: An interactive systems framework for building capacity to disseminate and implement innovations. American Journal of Community Psychology, 41(3–4), 171–181.
Wandersman, A., & Florin, P. (2003). Community interventions and effective prevention. American Psychologist, 58, 441–448.
Weist, M. D. (2005). Fulfilling the promise of school-based mental health: Moving toward a public mental health promotion approach. Journal of Abnormal Child Psychology, 33, 735–741.
Weist, M. D., & Paternite, C. E. (2006). Building an interconnected policy-training-practice-research agenda to advance school mental health. Education and Treatment of Children, 29(1), 173–196.
Wenger, E., McDermott, R., & Snyder, W. (2002). Cultivating communities of practice: A guide to managing knowledge. Cambridge, MA: Harvard Business School Press.
Wingspread Conference. (2004). Wingspread declaration on school connections. Journal of School Health, 74, 233–234.
Zaltman, G., & Duncan, R. (1977). Strategies for planned change. New York: Wiley.
Zaltman, G., Duncan, R., & Holbek, J. (1973). Innovations and organizations. New York: Wiley.
