Psychology in the Schools, Vol. 52(8), 2015 View this article online at wileyonlinelibrary.com/journal/pits
© 2015 Wiley Periodicals, Inc. DOI: 10.1002/pits.21861
DATA-DRIVEN DELIVERY OF IMPLEMENTATION SUPPORTS IN A MULTI-TIERED FRAMEWORK: A PILOT STUDY

LISA M. HAGERMOSER SANETTI
University of Connecticut

MELISSA A. COLLIER-MEEK
University of Massachusetts—Boston
For multi-tiered systems of support, such as Response-to-Intervention and Positive Behavior Interventions and Supports, to effectively impact student outcomes, interventions delivered across the tiers must be implemented as planned (i.e., with adequate treatment integrity). However, research suggests that most school personnel struggle to deliver interventions with treatment integrity, which negatively impacts the potential effectiveness of these interventions. Numerous strategies to support treatment integrity have been developed, but no guidance has been provided regarding how to efficiently and effectively use them. The purpose of this study was to conduct a pilot evaluation of these strategies delivered through a Multi-Tiered Implementation Supports framework; that is, proactive, feasible treatment integrity strategies were initially delivered to all implementers and, based on their responsiveness, increasingly intensive implementation supports were provided as needed. Results suggest that (a) all teachers responded to these supports, but response magnitude differed across teachers and supports; (b) higher levels of treatment integrity generally were associated with fewer disruptive behaviors; and (c) the duration of these support strategies increased across tiers. Future directions for research and implications for the feasible provision of implementation support in schools are described. © 2015 Wiley Periodicals, Inc.
Author note: The research reported here was supported by the Institute of Education Sciences, U.S. Department of Education, through Grant R324A10005 to the University of Connecticut. The opinions expressed are those of the authors and do not represent the views of the Institute or the U.S. Department of Education. Correspondence to: Lisa M. H. Sanetti, University of Connecticut, Department of Educational Psychology, U-3064, Storrs, CT 06269-3064. E-mail: [email protected]

Treatment integrity (i.e., the extent to which an intervention is delivered as planned) has gone from being largely ignored two decades ago to being regularly acknowledged as a critical aspect of education intervention research and practice (Cochrane & Laux, 2008; Sanetti & Kratochwill, 2009a). Although treatment integrity is relevant for all student interventions, the increased focus on this topic has been hastened by the widespread adoption of multi-tiered systems of support as a delivery model for academic and behavioral interventions (e.g., Response-to-Intervention, Positive Behavior Interventions and Supports; Kilgus, Collier-Meek, Johnson, & Jaffery, 2014). In this delivery model, high-quality curricula are delivered to all students, and regular assessment ensures that those students who are not making adequate progress receive interventions in a targeted and then individualized manner. Data-based decisions about the intensity (or tier) of student intervention are based on a student's response to a research-based intervention implemented as planned (National Center on Response to Intervention, 2010); that is, educators must simultaneously evaluate the extent to which an intervention has been implemented as well as the student's response to determine the appropriate level of support for a student. Unfortunately, research results consistently demonstrate that implementers display a variety of treatment integrity patterns, and many, if not most, implementers struggle to consistently deliver interventions in the absence of systematic follow-up (Noell, Witt, Gilbertson, Ranier, & Freeland, 1997; Sanetti, Fallon, & Collier-Meek, 2013). When interventions are delivered without sufficient treatment integrity, they are not only less effective in improving student outcomes (Fryling, Wallace, & Yassine, 2012), but it is also inappropriate to make decisions about their impact within a multi-tiered system of support (Kilgus et al., 2014; Noell & Gansle, 2006).
These findings point to the importance of ensuring that research-based interventions are implemented as planned; lack of treatment integrity may be the biggest hurdle in realizing the full potential of tiered delivery models. As noted by Noell and Gansle (2006), educators can build a tiered model for service delivery according to best practices (e.g., screening, progress monitoring, data systems, decision rules), but if the interventions are delivered insufficiently, the model will just be a "hollow shell" (p. 34) that does not actually allow students to receive targeted or individualized intervention at the earliest point of need. To fulfill the promise of multi-tiered systems of support and provide students the opportunity to improve based on appropriately intense tiers of support, implementers need to be supported in ways that facilitate higher levels of treatment integrity.

To meet this need, over the past 15 years there has been a rapid increase in the development and evaluation of strategies to support implementers' treatment integrity. Table 1 provides a brief list of implementation support strategies that have been shown, in at least one study, to improve treatment integrity. Thus, from the early 1990s to today, we have gone from scant data regarding the delivery of school-based interventions to a consensus that educators demonstrate a variety of treatment integrity patterns (many of which suggest a need for support to maximize student outcomes), and from no research-based strategies to a plethora of strategies.

With this increase in knowledge about treatment integrity, however, comes the challenge of determining which support strategy to implement, when, and for whom. In an attempt to address this challenge, we propose organizing treatment integrity support strategies within a Multi-Tiered Implementation Supports (MTIS) framework to facilitate decisions about efficient and effective strategy use. Employing an MTIS framework aligns with best practices in professional development, which indicate that both initial high-quality training and ongoing support are often needed (Joyce & Showers, 2002). Further, the MTIS framework acknowledges that implementers require different levels of support to be successful, which has been noted in initial investigations of increasingly intensive classroom management support to promote teachers' rates of praise statements (Myers, Simonsen, & Sugai, 2011; Simonsen et al., 2014). Last, the MTIS framework was developed with the understanding that research-based treatment integrity support strategies have different intensities (e.g., one-time meeting, ongoing sessions) and may be appropriate at different stages of implementation (i.e., prior to delivery, during implementation) or for different implementation issues (e.g., missing one intervention step, low levels of all intervention steps; Sanetti, Kratochwill, Collier-Meek, & Long, 2014).

We propose that within MTIS, treatment integrity support strategies can be organized by their characteristics and delivery format. At Tier 1, strategies should be feasible, widely relevant, and easily embedded into typical consultation. After Tier 1 strategies are provided, treatment integrity data should be formatively collected and evaluated to determine whether additional implementation support is necessary. For those implementers whose treatment integrity data indicate they need additional support, a Tier 2 implementation support strategy may be selected.
Tier 2 strategies generally provide more intensive treatment integrity support that occurs over one or two sessions. For those implementers whose treatment integrity data indicate they need still more support, a Tier 3 implementation support strategy may be selected. Tier 3 strategies provide the most intensive support, which typically occurs on an ongoing (e.g., weekly, response-dependent) basis. Figure 1 illustrates how currently available treatment integrity support strategies might be organized within an MTIS framework, with consideration not only of strategy intensity (e.g., time required, resource use), but also current level of empirical support.
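To make the data-driven logic of the framework concrete, the following minimal sketch (in Python) illustrates how formative treatment integrity data might drive escalation across the three tiers. The function names and the responsiveness criterion are hypothetical placeholders, not part of the published framework; the specific decision rule used in this study is described in the Method section.

```python
# Illustrative sketch of MTIS escalation logic; not the authors' software.
# The responsiveness criterion below is a placeholder loosely modeled on
# the 80% adherence rule described later in the Method section.

TIERS = [
    "Direct Training",          # Tier 1: feasible, delivered to all implementers
    "Implementation Planning",  # Tier 2: more intensive, one or two sessions
    "Participant Modeling",     # Tier 3: most intensive, ongoing as needed
]

def adequate_response(adherence_scores, criterion=80.0, window=5):
    """Hypothetical responsiveness check on per-session adherence (%)."""
    recent = adherence_scores[-window:]
    return len(recent) >= window and all(s >= criterion for s in recent)

def next_support(current_tier, adherence_scores):
    """Return the next support to deliver, or None to continue monitoring."""
    if adequate_response(adherence_scores):
        return None  # adequate treatment integrity; no escalation needed
    if current_tier + 1 < len(TIERS):
        return TIERS[current_tier + 1]
    return None  # already at the most intensive tier

# Example: adherence declines after Tier 1 training, so Tier 2 is indicated.
print(next_support(0, [88.0, 82.0, 74.0, 70.0, 65.0]))  # Implementation Planning
```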
PURPOSE OF STUDY
The purpose of this study was to conduct a pilot evaluation of the effect of implementation supports delivered through an MTIS framework on teachers' treatment integrity of a classroom management plan (CMP).

Table 1
Research-Supported Implementation Supports

Intervention manual: Detailed, written description of the intervention and steps for implementation (Randall & Biggs, 2008).
Test driving interventions: Implementer tries interventions before choosing the most acceptable intervention for ongoing implementation (Dart, Cook, Collins, Gresham, & Chenier, 2012).
Direct training: Series of training activities, including an introduction to the intervention, consultant modeling, implementer practice, and feedback (Sterling-Turner et al., 2002).
Treatment planning protocol: Standardized three-step process to define an intervention, develop a treatment integrity assessment, and create a treatment integrity self-assessment form (Sanetti & Kratochwill, 2009b).
Implementation planning: Detailed logistical planning related to intervention implementation (i.e., Action Planning) and proactive identification and development of solutions to address barriers (i.e., Coping Planning; Sanetti, Kratochwill, & Long, 2013).
Instructional coaching: Intensive, differentiated activities conducted by an empathetic, communicative, and skilled coach (Knight, 2007).
Intervention scripts: Written instructions and language for the implementer to use during implementation (Ehrhardt, Barnett, Lentz, Stollar, & Reifin, 1996).
Role play: After the consultant models the intervention, the implementer practices the intervention using actual intervention scenarios and then receives feedback (Trevisan, 2004).
Participant modeling: In vivo, the consultant models intervention implementation, and the implementer practices with support and independently (Tschannen-Moran & McMaster, 2009).
Motivational interviewing: Talking strategy founded in the consultant's positive regard and empathetic use of open-ended questions, change talk, reflective listening, and summarizing (Rosengren, 2009).
Self-monitoring: Checklist completed by the implementer during or after implementation (Simonsen et al., 2013).
Prompts: Proactive reminders to implement components of an intervention (Petscher & Bailey, 2006).
Video support: Video taken during implementation for implementer self-monitoring or consultant feedback (Pelletier, McNamara, Braga-Kenyon, & Ahearn, 2010).
Performance feedback: Verbal and/or graphic feedback around implementation and student outcomes; may also include other components, such as practice, prompting, or self-monitoring (Noell et al., 1997).

In the MTIS model for this study, all implementers received Direct Training as a Tier 1 treatment integrity support strategy. This was chosen as a Tier 1 strategy because high-quality direct training that includes didactic training as well as role-play and feedback has been shown to be a prerequisite for high levels of treatment integrity (Joyce & Showers, 2002; Sterling-Turner, Watson, & Moore, 2002). Then, Implementation Planning was provided to participants whose data suggested they needed additional support. This was chosen as a Tier 2 strategy because it is designed to explicitly define the logistics of intervention delivery, increase the contextual fit of the intervention in the classroom, and problem solve identified barriers to implementation; it can be completed during a single consultation meeting and has emerging support for its effectiveness (Sanetti, Collier-Meek, Long, Kim, & Kratochwill, 2014). Participant Modeling was provided to participants whose data suggested they needed further support.
This was chosen as a Tier 3 strategy because it is designed to increase implementer skill and confidence related to delivering an intervention; it is completed during an out-of-class consultation meeting as well as an in-class demonstration of the implementation of specific intervention steps not being adequately implemented, and it has systematic research support (Tschannen-Moran & McMaster, 2009).

FIGURE 1. Implementation supports by level of intensity and level of research support.

In addition to evaluating increasingly intensive implementation supports, this study examined the impact of these supports, and subsequent treatment integrity, on student outcomes (i.e., rates of disruptive behavior). Treatment integrity support duration data, as a marker of feasibility, are also reviewed.

METHOD

Participants and Setting

Participants were 6 elementary school teachers from three suburban public schools in the Northeast. Teachers requested consultation to address classroom behavior and support their classroom management practices. All participation was voluntary. Teacher A was a Caucasian male, taught 22 fifth-grade students, held general education certification, had a Master's degree plus additional credits, and had 13 years of teaching experience. Teacher B was a Caucasian female, taught 17 fifth-grade students, held general education certification, had a Master's degree, and had 13 years of teaching experience. Teacher C was a Caucasian female, taught 14 fourth-grade students, held general education certification, had a Bachelor's degree, and had 5 years of teaching experience. Teacher D was a Caucasian female, taught 14 fourth-grade students, held both general and special education certifications, had a Master's degree plus additional credits, and had 18 years of teaching experience. Teacher E was a Caucasian female, taught 16 kindergarten students, held general education certification, had a Bachelor's degree, and had 5 years of teaching experience. Teacher F was a Caucasian female, taught 20 third-grade students, held general education certification, had a Master's degree plus additional credits, and had 13 years of teaching experience.
Teachers A, E, and F taught in one school, Teachers B and C taught in a second school, and Teacher D taught in a third school, in which 23.9%, 40.3%, and 21.3% of students, respectively, were eligible for free or reduced-price lunch. Teachers received gift cards at the conclusion of the study to acknowledge their participation.

The consultants were five school psychology graduate students (four female, one male). The consultants had completed coursework and had prior experience related to consultation, behavior assessment, and evidence-based classroom management. The consultants used protocols for all consultation and implementation support strategy meetings to standardize implementation across cases (see the Procedural Integrity section and Table 2). Additional graduate students, who had received behavior assessment coursework and training, completed observations with the consultants for inter-observer agreement. Consultants participated in weekly supervision with a licensed psychologist throughout the study.

Table 2
Steps of Implementation Supports by Tier

Tier 1: Direct Training
1. Consultant provided didactic training for each intervention step.
2. Consultant modeled implementation of these steps.
3. Teacher practiced (i.e., role played) implementation of the intervention.
4. Consultant provided positive and corrective feedback.
5. Consultant engaged teacher in additional practice, if needed.
6. Consultant and teacher discussed generalization.

Tier 2: Implementation Planning
1. Action planning:
   a. Teacher and consultant reviewed each intervention step to decide whether modifications were needed to increase contextual fit.
   b. Teacher and consultant identified the "who," "where," "how often," and "with what" of implementation for each intervention step.
2. Coping planning:
   a. Teacher identified up to four barriers to implementation.
   b. Consultant and teacher identified strategies to address each barrier.

Tier 3: Participant Modeling
Session 1 (outside of instructional time):
1. Consultant reviewed the rationale for the intervention and the importance of treatment integrity.
2. Consultant reviewed the intervention.
3. Consultant and teacher developed a plan for conducting the in-vivo practice.
Session 2 (during instructional time):
1. Consultant modeled the intervention step.
2. Teacher practiced the intervention step.
3. Consultant provided the teacher with feedback.
4. Steps 1–3 repeated for each relevant intervention step.

Measures

Treatment Integrity. Direct observation measures were used to evaluate teachers' implementation of a CMP. The format of the measures was standardized across cases, but there were slight differences in content based on teachers' specific CMPs. For adherence, each intervention step was rated as (a) not implemented, (b) implemented with deviation, (c) implemented as planned, or (d) no opportunity for implementation. In addition, the applicability of each step was rated (i.e., "yes, applicable" or "no, not applicable").
An intervention step was considered applicable if the teacher could have been expected to implement it during the observation period. After each observation period, the percentage of adherence was calculated as the number of CMP steps implemented as planned divided by the total number of applicable steps (a worked example of this calculation appears at the end of this Measures section). Observations occurred two to three times per week. A second rater was present for 30.85% of sessions (31.06% during Direct Training, 27.37% during Implementation Planning, 26.67% during Participant Modeling); across all cases, inter-observer agreement was 98.95% (SD = 0.91; 97.91% during Direct Training, 100% during Implementation Planning, 100% during Participant Modeling).

Student Outcomes. Systematic direct observation of disruptive behavior was used to evaluate student outcomes. Disruptive behavior was defined as any action that interrupts the classroom activity (e.g., being out of seat, playing with objects, or talking about things unrelated to classroom instruction). Two to three times per week, consultants collected frequency counts of instances of disruptive behavior exhibited by students during timed 15-minute observations. A second rater was present for 29.09% of sessions (30.55% during Student Outcome Baseline, 31.06% during Direct Training, 28.08% during Implementation Planning, 26.66% during Participant Modeling); across all cases, inter-observer agreement was 95.65% (SD = 4.39; 95.20% during Student Outcome Baseline, 96.01% during Direct Training, 95.54% during Implementation Planning, 94.72% during Participant Modeling).

Implementation Support Duration. Each implementation strategy meeting was audiotaped. After each meeting, the consultant listened to the audio file and timed the meeting.
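As promised above, here is a worked example of the adherence calculation. This is a minimal sketch (in Python) with hypothetical ratings, not the scoring software used in the study; excluding "no opportunity" ratings from the denominator is our reading of the measure description, which counts only applicable steps.

```python
# Minimal sketch of the per-observation adherence calculation (hypothetical data).
# Each CMP step receives one of the four ratings described above; the
# denominator includes only steps that were applicable during the observation.

IMPLEMENTED = "implemented as planned"
DEVIATION = "implemented with deviation"
NOT_IMPLEMENTED = "not implemented"
NO_OPPORTUNITY = "no opportunity"

def percent_adherence(ratings, applicability):
    """Adherence (%) = steps implemented as planned / applicable steps x 100."""
    applicable = [r for r, applies in zip(ratings, applicability)
                  if applies and r != NO_OPPORTUNITY]  # assumption: exclude (d)
    if not applicable:
        return None  # no step could have been implemented this session
    return 100.0 * applicable.count(IMPLEMENTED) / len(applicable)

# Example: a 10-step CMP with one inapplicable step and one "no opportunity"
ratings = [IMPLEMENTED] * 6 + [DEVIATION, NOT_IMPLEMENTED,
                               NO_OPPORTUNITY, IMPLEMENTED]
applies = [True] * 9 + [False]
print(percent_adherence(ratings, applies))  # 75.0 (6 of 8 applicable steps)
```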
Consultation and CMP Development

The consultation process in this study conformed to a problem-solving, or behavioral, consultation approach (Kratochwill & Bergan, 1990). After consent was obtained, the case was assigned to a consultant and a Problem Identification Interview was scheduled. During this meeting, the consultant and teacher discussed classroom behavior, relevant antecedents and consequences, and the teacher's current classroom management practices. Following this meeting, the consultant observed the students' behavior and the teacher's classroom management practices on three to four occasions during instructional times the teacher identified as challenging. Based on data from these observations, the consultant drafted a CMP. All CMPs were based on best practices (Epstein, Atkins, Cullinan, Kutash, & Weaver, 2008) and included strategies to (a) increase classroom structure; (b) regularly use a small number of positively stated expectations; (c) actively engage students; (d) encourage appropriate behavior; and (e) systematically discourage inappropriate behavior. Although all plans addressed each of these areas, consultants individualized strategies within these categories based on baseline observations and teacher report of current classroom management practices and classroom behavior. All teachers already used some classroom management strategies. Thus, teachers' pre-existing strategies that were aligned with best practices were maintained in the CMP, and strategies that were close to alignment with best practices were adjusted to increase alignment. When teachers' strategies were not aligned with best practices or no strategies were evident, the consultant added relevant strategies to the CMP.

After developing a draft of the CMP, the consultant and teacher met for the Problem Analysis Interview to review the results of the observations and discuss the CMP. The teacher and consultant came to consensus on the specific strategies in the CMP, and the consultant provided Direct Training (see the Tier 1: Direct Training section), after which the teachers began to deliver the CMP.
Throughout CMP implementation, the consultants observed teachers' treatment integrity two to three times per week. Based on these data, increasingly intensive implementation supports (i.e., Implementation Planning, Participant Modeling) were delivered as needed. After a teacher implemented the CMP with high levels of adherence for at least 10 data points after Direct Training, or five data points after subsequent support (i.e., Implementation Planning and/or Participant Modeling) that improved teacher adherence, the consultant and teacher completed a Treatment Evaluation Interview. During this meeting, the consultant provided a summary report that included narratively and graphically represented data on treatment integrity and student outcomes, and the consultant and teacher discussed whether the goals of consultation were met and whether additional implementation supports were required. Further, the consultant provided the teacher with the social validity measures, which the teacher completed after the meeting.

Design and Tiered Implementation Supports

To evaluate the effect of increasingly intensive implementation supports on teacher treatment integrity, a nonconcurrent multiple baseline design across implementers was employed. All participants received Direct Training immediately prior to implementation. When at least five data points had been collected in the Direct Training phase and direct observation data indicated that teacher adherence was low (i.e., 2 days below 80% per week), an increasingly intensive implementation support was provided (i.e., Implementation Planning and then, if needed, Participant Modeling; see Table 2). Although there is no agreed-upon criterion for sufficient treatment integrity, the decision rule employed here was based on criteria that have been successfully employed to evaluate individual implementation supports in prior research (e.g., Noell et al., 1997; Sanetti et al., 2013). Phase lengths were systematically varied to ensure staggering of the delivery of the tiered implementation supports for those teachers who needed additional support.

Tier 1: Direct Training. During Direct Training, a consultant led the teacher through five training and practice activities in the teacher's classroom. First, the consultant provided didactic training for each intervention step. Second, the consultant modeled the implementation of these steps. Third, the teacher practiced (i.e., role played) implementation of the intervention. Fourth, the consultant provided positive and corrective feedback to the teacher about the practice. Fifth, as needed, the consultant engaged the teacher in additional practice of specific intervention steps until mastery was achieved. The Direct Training sessions ended with a discussion about generalizing the practiced intervention steps.

Tier 2: Implementation Planning. Teachers who demonstrated lower levels (i.e., below 80%) of, or decreasing trends in, their adherence after Direct Training were eligible to receive Implementation Planning. During Implementation Planning, a consultant led the teacher through action planning and coping planning. Action planning involved logistical planning around the intervention and its implementation. Specifically, the teacher and consultant reviewed each intervention step to (a) determine whether any modifications were necessary to make the step more appropriate for the context and (b) identify the specific implementation behaviors (i.e., who, where, how often, and with what the steps would be implemented). Coping planning involved proactive barrier identification.
Specifically, the teacher was asked to identify up to four major barriers to implementation, and together the consultant and teacher identified strategies to address each barrier (see Sanetti, Kratochwill, & Long, 2013, for a more detailed description). Within 48 hours after Implementation Planning, the consultant provided the teacher with a brief report that summarized the agreed-upon intervention logistics and the plan to address anticipated barriers.

Tier 3: Participant Modeling. Teachers who demonstrated lower levels (i.e., below 80%) of, or decreasing trends in, their adherence after Implementation Planning were eligible to receive Participant Modeling.
Participant Modeling occurred across two sessions. The first session occurred during a free period preferred by the teacher. During this meeting, the consultant (a) reviewed the rationale for the intervention and the importance of adequate treatment integrity, (b) described participant modeling, (c) reviewed the intervention, and (d) developed a plan for conducting the in-vivo practice (e.g., when the practice would occur, which intervention steps would be practiced). The second session occurred during classroom instruction. During this session, the consultant modeled an intervention step, the teacher practiced the step, and the consultant provided feedback. After this process was repeated for all intervention steps, the teacher practiced the intervention steps independently. Immediately following the in-vivo practice or at a separate meeting time, depending on the teacher's schedule, the consultant and teacher reviewed the in-vivo practice and discussed skill generalization.

Procedural Integrity

All consultant meetings and implementation support sessions were audiotaped and reviewed for procedural integrity.

Consultation Meetings. A consultation guide and consultation checklists aligned with the three interviews central to behavioral consultation (i.e., problem identification, problem analysis, treatment evaluation; Kratochwill & Bergan, 1990) were developed to standardize consultation across consultants. Following each consultation meeting, consultants rated the presence or absence of essential components per the consultation checklists. In addition, a second rater reviewed all audiotaped sessions and provided ratings. Across all meetings, consultants indicated they completed an average of 100% (SD = 0) of the essential components, and average inter-rater agreement was 99.74% (SD = 0.44).

Implementation Supports. Each implementation support strategy (i.e., Direct Training, Implementation Planning, Participant Modeling) had a structured protocol to standardize delivery across consultants (see www.implementationscience.uconn.edu/prime/resources/). Following each implementation support meeting, consultants completed a treatment integrity assessment on which components of the strategy were rated for adherence. A second rater reviewed all audiotaped sessions and provided ratings. Across all meetings, consultants indicated they completed an average of 100% (SD = 0) of components; average inter-rater agreement was 100% (SD = 0).

RESULTS

Treatment Integrity

Findings indicate that all teachers responded to implementation supports, but that the magnitude of response differed across teachers and implementation supports (see Table 3 and Figure 2). Teachers' adherence data are described by implementation support below.

Tier 1: Direct Training. After Direct Training, teachers implemented the CMPs with average adherence ranging from 45.96% (SD = 14.89) to 92.13% (SD = 10.08). Teachers E and F demonstrated high and moderately variable levels of adherence after receiving Direct Training only (Teacher E mean adherence = 86.26%, SD = 9.53; Teacher F mean adherence = 92.13%, SD = 10.08). These data suggested they did not require more intensive implementation supports. Teachers A and B demonstrated initially high and moderate levels of adherence, respectively, with decreasing trends across time. Teacher C demonstrated an initially moderate level of adherence, with increasing variability across time.
Teacher D demonstrated a moderate level of adherence, with a slowly decreasing trend across time. Treatment integrity data for Teachers A through D suggested the need for more intensive implementation supports.
Table 3
Means and Standard Deviations for Teachers' Percent Treatment Integrity of Classroom Management Plans and Students' Frequency of Disruptive Behavior by Phase

Teacher treatment integrity (percent adherence)
Teacher A: Direct Training M = 70.97%, SD = 13.13; Implementation Planning M = 88.91%, SD = 7.56
Teacher B: Direct Training M = 45.96%, SD = 14.89; Implementation Planning M = 65.24%, SD = 9.45; Participant Modeling M = 75.76%, SD = 9.38
Teacher C: Direct Training M = 50.45%, SD = 11.03; Implementation Planning M = 62.97%, SD = 10.11; Participant Modeling M = 84.00%, SD = 15.16
Teacher D: Direct Training M = 67.33%, SD = 12.54; Implementation Planning M = 95.60%, SD = 4.69
Teacher E: Direct Training M = 86.26%, SD = 9.53
Teacher F: Direct Training M = 92.13%, SD = 10.08

Student disruptive behavior (frequency per observation)
Classroom A: Baseline M = 10.08, SD = 4.14; Direct Training M = 8.01, SD = 2.93; Implementation Planning M = 1.97, SD = 1.50
Classroom B: Baseline M = 12.00, SD = 1.05; Direct Training M = 12.32, SD = 7.79; Implementation Planning M = 7.81, SD = 3.28; Participant Modeling M = 7.12, SD = 3.10
Classroom C: Baseline M = 12.35, SD = 10.70; Direct Training M = 11.97, SD = 4.68; Implementation Planning M = 5.46, SD = 3.11; Participant Modeling M = 5.97, SD = 2.55
Classroom D: Baseline M = 11.47, SD = 6.00; Direct Training M = 8.15, SD = 4.16; Implementation Planning M = 3.35, SD = 1.10
Classroom E: Baseline M = 6.65, SD = 3.07; Direct Training M = 4.66, SD = 4.24
Classroom F: Baseline M = 2.00, SD = 2.02; Direct Training M = 0.73, SD = 0.77

Note. Phases a teacher or classroom did not receive are omitted.
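For readers interested in how the phase-change decisions behind these data could be operationalized, the following minimal sketch (in Python) encodes the decision rule from the Design and Tiered Implementation Supports section: at least five data points per phase, with escalation when adherence falls below 80% on 2 days within a week. The function name and the data are hypothetical.

```python
# Minimal sketch of this study's phase-change rule (hypothetical data).
# Escalate only after a phase contains at least five observations and some
# week includes two or more sessions with adherence below 80%.

def needs_more_support(sessions, criterion=80.0, min_points=5):
    """sessions: list of (week_number, percent_adherence) tuples."""
    if len(sessions) < min_points:
        return False  # keep collecting formative data
    low_days_per_week = {}
    for week, adherence in sessions:
        if adherence < criterion:
            low_days_per_week[week] = low_days_per_week.get(week, 0) + 1
    return any(count >= 2 for count in low_days_per_week.values())

# Example: two low-adherence sessions in week 3 trigger the next tier.
phase = [(1, 92.0), (1, 88.0), (2, 85.0), (3, 72.0), (3, 64.0)]
print(needs_more_support(phase))  # True
```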
Tier 2: Direct Training and Implementation Planning. All 4 teachers who completed Implementation Planning had a subsequent increase in their treatment integrity data; however, the magnitude of the increases and the resulting levels of adherence varied substantially. Treatment integrity data for Teachers A and D increased to high levels and became less variable after Implementation Planning. Specifically, Teacher A's adherence increased immediately by 17.94% and variability decreased by 5.57, whereas Teacher D's adherence immediately increased by 28.27% and variability decreased by 7.85. These data suggested that Teachers A and D did not warrant more intensive implementation supports. Teacher B's adherence immediately increased by 19.18% and remained stable with minimal variability. Teacher C's adherence demonstrated a slightly increasing trend with variability similar to the previous phase. These treatment integrity data for Teachers B and C, although improved, still suggested the need for more intensive supports.

Tier 3: Direct Training, Implementation Planning, and Participant Modeling. The 2 teachers who completed Participant Modeling had a subsequent increase in their treatment integrity data. After completing Participant Modeling, Teacher B's average adherence increased immediately by 10.52% to 75.76%, and Teacher C's average adherence increased by 21.03% to 84.00%. These data suggested Teacher C did not warrant more intensive implementation supports, but that Teacher B might benefit from additional supports.

FIGURE 2. Teachers' Percent Adherence to Classroom Management Plans Across Sessions.

Student Outcomes

Findings across classroom disruptive behavior ratings indicate that, in general, increasing levels of implementation supports and subsequently higher levels of treatment integrity were associated with decreases in the number of disruptive behaviors (see Table 3). After Direct Training and Implementation Planning, Teacher A's students exhibited decreases of 2.07 and 6.04 instances of disruptive behavior, respectively.
After Direct Training, Teacher B's students exhibited a slight increase of 0.32 instances of disruptive behavior, whereas after Implementation Planning and Participant Modeling, Teacher B's students exhibited decreases of 4.51 and 0.69 instances of disruptive behavior, respectively. After Direct Training and Implementation Planning, Teacher C's students exhibited decreases of 0.38 and 6.51 instances, respectively; however, after Participant Modeling, students exhibited a slight increase of 0.51 instances of disruptive behavior.
After Direct Training and Implementation Planning, Teacher D's students exhibited decreases of 3.32 and 4.80 instances of disruptive behavior, respectively. After Direct Training, Teacher E's students exhibited a decrease of 1.99 instances and Teacher F's students exhibited a decrease of 1.27 instances of disruptive behavior.

Implementation Support Duration

Duration data were collected for each implementation support session as a marker of feasibility. The six Direct Training strategies took an average of 22.38 minutes (SD = 5.46) to complete, the four Implementation Planning strategies took an average of 53.75 minutes (SD = 12.33) to complete, and the two Participant Modeling strategies took an average of 80.40 minutes (SD = 31.07) to complete.

DISCUSSION

To ensure that students receive the proactive, high-quality intervention supports promised in multi-tiered systems of support, treatment integrity must be evaluated and supported (Noell & Gansle, 2006; Sanetti & Kratochwill, 2009a). Available data suggest teachers demonstrate a variety of patterns of treatment integrity, many of which indicate a need for implementation support (Simonsen et al., 2013). In response, a number of such supports have been developed and evaluated (Sanetti & Kratochwill, 2009a). As these supports vary significantly in terms of intensity and the stage of intervention delivery at which they are most appropriate, organizing implementation supports in a multi-tiered framework may facilitate efficient and systematic support delivery. The purpose of this pilot study was to evaluate the (a) influence of increasingly intensive implementation supports on teachers' treatment integrity, (b) influence of increased levels of treatment integrity on student outcomes, and (c) feasibility of their delivery.

The varied levels of treatment integrity exhibited by the six teachers in this investigation demonstrate the need for differential supports for implementers. All teachers received the Tier 1 support of Direct Training, after which two teachers demonstrated adequate treatment integrity, requiring no further support. Four teachers' treatment integrity data did not demonstrate an adequate response, and they received the Tier 2 support of Implementation Planning. Two of these teachers' treatment integrity data then demonstrated an adequate response, requiring no further support; however, two teachers' data did not demonstrate an adequate response, and they received the Tier 3 support of Participant Modeling. The differential responsiveness of these teachers to varied implementation supports suggests that, like students receiving interventions to bolster their outcomes, implementers may require distinct types and intensities of support to be successful. Further, teacher- and school-level variables (e.g., education, years of experience, school) were reviewed to determine whether any patterns were associated with teachers' differential responses. The only pattern found was that the two teachers who responded to Tier 1 support and one of the teachers who responded to Tier 2 support were from the same school. Thus, it is possible that school-level variables (e.g., school-level supports, climate, characteristics of the student population) made it more likely for teachers in that context to respond to initial implementation supports.

In general, improved levels of treatment integrity demonstrated by teachers as a result of increasingly intensive implementation supports were associated with lower levels of disruptive behavior.
Students in all but one classroom (Classroom B) demonstrated lower levels of disruptive behavior from baseline to after Direct Training, and students in Classrooms A, B, C, and D demonstrated even lower levels of disruptive behavior after Implementation Planning. The impact of Participant Modeling on disruptive behavior is less clear; one classroom had a slight decrease in disruptive behavior, whereas the other had a slight increase in these behaviors.

Notably, the teachers who required higher levels of implementation support, and whose average treatment integrity during each phase was lower than that of the other teachers, had students who demonstrated higher levels of disruptive behavior in each phase.
Teachers E and F required only one support strategy and had students who demonstrated the lowest levels of disruptive behavior, whereas Teachers B and C required three support strategies and had students who demonstrated the highest levels of disruptive behavior in each phase. It may be that the CMP, a Tier 1 behavioral strategy, needs to be implemented at a higher level to impact student outcomes, or that more challenging students in Classrooms B and C contributed to the high rates of disruptive behavior and served as a barrier to implementation. Overall, the demonstration provides support for the relationship between treatment integrity and student outcomes and for the idea that implementation supports delivered through MTIS can improve student behavior. To be both efficient and targeted, the delivery of these supports can build on this multi-tiered logic to provide foundational, universal support and be responsive to the unique patterns of treatment integrity data exhibited by implementers within MTIS.

The initial results related to treatment integrity and student outcomes, although limited, suggest that MTIS may be a promising way to provide practitioners with a systematic framework for delivering treatment integrity promotion strategies. For these research-based implementation support strategies to be applied in school practice, however, the feasibility of their delivery must be considered. Although school psychologists report that their administrators support their consultation with teachers to serve students, the majority also report that time constraints and other responsibilities (e.g., assessments) are a barrier both to engaging in consultation and to assessing and promoting treatment integrity during consultation (Cochrane & Laux, 2008). The duration data in the current study suggest that not only did the complexity of the implementation supports increase across tiers, but the duration of the support delivery also increased. On average, 33% of the teachers needed less than 30 minutes, 33% needed a little over 1 hour, 17% needed about 2.5 hours, and 17% likely needed more than 2.5 hours of support to demonstrate adequate levels of treatment integrity. Although this is still a considerable amount of time for a practitioner to spend supporting implementation, the use of tiered supports allows that time to be spent with the teachers most in need of support, as opposed to blindly providing standardized supports to all teachers in the absence of treatment integrity data.

Limitations and Future Research

This initial assessment of MTIS has several limitations. The nonconcurrent multiple baseline design was employed due to the realities of research within practice settings. This investigation should be evaluated as an initial pilot study in which teachers demonstrated different levels of treatment integrity and responded differently to increasingly intensive implementation supports. Research to more stringently evaluate teachers' responsiveness to MTIS and the teacher- and school-level variables that might influence their responsiveness is needed. Further, a convenience sample was used for this pilot study; the teachers and their varied student populations may not be representative. As such, the generalizability of the findings may be somewhat limited. Future research may evaluate whether there is a relationship between teacher demographics or school characteristics and level of treatment integrity and intensity of support required. In addition, only one student intervention was evaluated here.
Future research could evaluate teachers' treatment integrity and their response to MTIS when delivering varied student interventions (e.g., academic interventions, individual interventions). Only three types of implementation support strategies were evaluated, although other supports were described in the Introduction. From this initial study, it is not clear whether these strategies in particular, or simply repeated supports, were responsible for increased teachers' treatment integrity. Although we suggest appropriate tiers for research-based implementation supports, additional research is needed to understand how different treatment integrity promotion strategies function within MTIS.
To extend our understanding of the systematic provision of implementation supports within MTIS, future research could evaluate the impact of additional types of supports on teachers' treatment integrity. No long-term follow-up data are available for the cases, so it is not possible to evaluate whether the teachers' treatment integrity did or did not sustain over time. Additional research is needed on how teachers maintain implementation after receiving support through MTIS. Further, due to the end of the school year, we were not able to evaluate another, more intensive implementation support for Teacher B; however, the data patterns in Classroom B suggest a need for future research on (a) threshold levels of treatment integrity, (b) student behavior as a barrier to implementation, and (c) the effectiveness of using the degree of student behavior as an indicator of the need to use a more intensive implementation support earlier. Only rates of disruptive behavior were collected to evaluate student outcomes. Subsequent research may evaluate the impact of MTIS and teacher treatment integrity on varied types of student outcome data. Finally, future research is needed that evaluates these strategies and includes feasibility-related adaptations (e.g., different methods of delivering support) and data (e.g., duration). For example, in previous investigations, Implementation Planning has taken less time (e.g., Sanetti et al., 2014) or been completed independently by implementers (Long, Sanetti, & Lark, 2014), demonstrating the adaptability of this strategy. Likewise, Participant Modeling, although the most time-intensive support in this study, can be considered relatively time efficient compared with the often-recommended strategy of ongoing Performance Feedback, which has been found to take practitioners up to 2.5 hours to prepare for and conduct each session (Sanetti et al., 2013).

Implications for Practice

School psychologists acknowledge the importance of treatment integrity data while reporting that they do not regularly assess it and experience barriers to doing so, primarily a lack of available time (Cochrane & Laux, 2008). As school psychologists are encouraged to collect treatment integrity data and provide supports as needed, multiple articles have focused on how to provide Performance Feedback, which can be a highly intensive implementation support (Collier-Meek, Fallon, Sanetti, & Maggin, 2013; Sanetti, Fallon, & Collier-Meek, 2011), and the empirical support for a plethora of other implementation supports is emerging. MTIS provides a systematic process for the delivery of more feasible implementation supports, with more intensive strategies, such as Performance Feedback, provided only when data suggest they are needed. Because treatment integrity support could begin with strategies such as Direct Training or Implementation Planning, which are both relatively brief strategies that occur within one meeting, MTIS might make it possible for school psychologists to deliver treatment integrity evaluation and support. Grounded in the implementation supports, prevention science, professional development, and adult behavior change literatures, MTIS may provide a more feasible way to support treatment integrity.

REFERENCES

Cochrane, W. S., & Laux, J. M. (2008). A survey investigating school psychologists' measurement of treatment integrity in school-based interventions and their beliefs about its importance. Psychology in the Schools, 45, 499–507.

Collier-Meek, M. A., Fallon, L. M., Sanetti, L. M. H., & Maggin, D. M. (2013). Focus on implementation: Strategies for problem-solving teams to assess and promote treatment fidelity. Teaching Exceptional Children, 45, 52–59.

Dart, E. H., Cook, C. R., Collins, T. A., Gresham, F. M., & Chenier, J. S. (2012). Test driving interventions to increase treatment integrity and student outcomes. School Psychology Review, 41, 467–481.

Ehrhardt, K. E., Barnett, D. W., Lentz, F. E., Jr., Stollar, S. A., & Reifin, L. H. (1996). Innovative methodology in ecological consultation: Use of scripts to promote treatment acceptability and integrity. School Psychology Quarterly, 11, 149–168.

Epstein, M., Atkins, M., Cullinan, D., Kutash, K., & Weaver, R. (2008). Reducing behavior problems in the elementary school classroom: A practice guide (NCEE #2008-012). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education. Retrieved from http://ies.ed.gov/ncee/wwc/publications/practiceguides

Fryling, M. T., Wallace, M. D., & Yassine, J. N. (2012). Impact of treatment integrity on intervention effectiveness. Journal of Applied Behavior Analysis, 45, 449–453.
Joyce, B., & Showers, B. (2002). Student achievement through staff development (3rd ed.). Alexandria, VA: Association for Supervision and Curriculum Development.

Kilgus, S. P., Collier-Meek, M. A., Johnson, A. H., & Jaffery, R. (2014). Applied empiricism: Ensuring the validity of response-to-intervention decisions. Contemporary School Psychology, 18, 1–12.

Knight, J. (2007). Instructional coaching: A partnership approach to improving instruction. Thousand Oaks, CA: Corwin Press.

Kratochwill, T. R., & Bergan, J. R. (1990). Behavioral consultation in applied settings. New York, NY: Plenum.

Long, A. C. J., Sanetti, L. M. H., & Lark, C. R. (2014). Examining the promise of self-administered implementation planning: Promoting treatment integrity in an alternative high school setting. Manuscript in preparation.

Myers, D., Simonsen, B., & Sugai, G. (2011). Increasing teachers' use of praise with a response to intervention approach. Education and Treatment of Children, 34, 35–59.

National Center on Response to Intervention. (2010). Essential components of RTI: A closer look at response to intervention. Retrieved from http://www.rti4success.org

Noell, G. H., & Gansle, K. A. (2006). Assuring the form has substance: Treatment plan implementation as the foundation of assessing response to intervention. Assessment for Effective Intervention, 32, 32–39.

Noell, G. H., Witt, J. C., Gilbertson, D. N., Ranier, D. D., & Freeland, J. T. (1997). Increasing teacher intervention implementation in general education settings through consultation and performance feedback. School Psychology Quarterly, 12, 77–88.

Pelletier, K., McNamara, B., Braga-Kenyon, P., & Ahearn, W. H. (2010). Effect of video self-monitoring on procedural integrity. Behavioral Interventions, 25, 261–274. doi: 10.1002/bin.316

Petscher, E. S., & Bailey, J. S. (2006). Effects of training, prompting, and self-monitoring on staff behavior in a classroom for students with disabilities. Journal of Applied Behavior Analysis, 39, 215–226.

Randall, C. M., & Biggs, B. K. (2008). Enhancing therapeutic gains: Examination of fidelity to the model for the intensive mental health program. Journal of Child and Family Studies, 17, 191–205.

Rosengren, D. B. (2009). Building motivational interviewing skills: A practitioner workbook. New York, NY: The Guilford Press.

Sanetti, L. M. H., Collier-Meek, M. A., Long, A. C. J., Kim, J., & Kratochwill, T. R. (2014). Using implementation planning to increase teachers' adherence and quality to behavior support plans. Psychology in the Schools, 51, 879–895.

Sanetti, L. M. H., Fallon, L. M., & Collier-Meek, M. A. (2011). Treatment integrity assessment and intervention by school-based personnel: Practical applications based on a preliminary study. School Psychology Forum, 5, 87–102.

Sanetti, L. M. H., Fallon, L. M., & Collier-Meek, M. A. (2013). Increasing teacher treatment integrity through performance feedback provided by school personnel. Psychology in the Schools, 50, 134–150.

Sanetti, L. M. H., & Kratochwill, T. R. (2009a). Toward developing a science of treatment integrity: Introduction to the special series. School Psychology Review, 38, 445–459.

Sanetti, L. M. H., & Kratochwill, T. R. (2009b). Treatment integrity assessment in the schools: An evaluation of the Treatment Integrity Planning Protocol (TIPP). School Psychology Quarterly, 24, 24–35.

Sanetti, L. M. H., Kratochwill, T. R., Collier-Meek, M. A., & Long, A. C. J. (2014). PRIME: Planning Realistic Implementation and Maintenance by Educators. Storrs, CT: University of Connecticut. Retrieved from implementationscience.uconn.edu

Sanetti, L. M. H., Kratochwill, T. R., & Long, A. C. J. (2013). Applying adult behavior change theory to support mediator-based intervention implementation. School Psychology Quarterly, 28, 47–62.

Simonsen, B., MacSuga, A. S., Fallon, L. M., & Sugai, G. (2013). Teacher self-monitoring to increase specific praise rates. Journal of Positive Behavior Interventions, 15, 3–13. doi: 10.1177/1098300712440453

Simonsen, B., MacSuga-Gage, A. S., Briere, D. E., Freeman, J., Myers, D., Scott, T., & Sugai, G. (2014). Multi-tiered support framework for teachers' classroom management practices: Overview and case study of building the triangle for teachers. Journal of Positive Behavior Interventions, 16, 179–190.

Sterling-Turner, H. E., Watson, T. S., & Moore, J. W. (2002). The effects of direct training and treatment integrity on treatment outcomes in school consultation. School Psychology Quarterly, 17, 47–77.

Trevisan, M. S. (2004). Practical training in evaluation: A review of the literature. American Journal of Evaluation, 25, 255–272.

Tschannen-Moran, M., & McMaster, P. (2009). Sources of self-efficacy: Four professional development formats and their relationship to self-efficacy and implementation of a new teaching strategy. Elementary School Journal, 110, 228–245.