Prev Sci (2011) 12:411–422 DOI 10.1007/s11121-011-0233-6
Integrating Triple P into Existing Family Support Services: A Case Study on Program Implementation Rhonda Breitkreuz & David McConnell & Amber Savage & Alec Hamilton
Published online: 13 July 2011
© Society for Prevention Research 2011
Abstract The purpose of this paper is to present a case study of “evidence-based” program uptake and implementation. The process of integrating Triple P (levels 2 and 3) into existing family support centers in Alberta, Canada, was examined. We conducted ten individual interviews with directors, and ten group interviews, involving a total of 62 practitioners across ten Triple P pilot sites. Key findings show that there was variability in the approach and extent to which Triple P was integrated into family support centers. Five key factors impacting the integration process emerged from the interviews. These were: (1) the level of development of pre-existing support services; (2) the degree of “fit” between the Triple P program approach and existing agency practice, including the perceived suitability/unsuitability for some client groups; (3) practitioner perceptions of the adaptability of the program; (4) rules about who can and who cannot use Triple P resources; and (5) training and sustainability issues. In addition to identifying specific factors, this study was able to provide some insight as to why and how these factors were significant, thereby adding to the literature on knowledge/program dissemination processes.

Keywords Triple P · Program implementation · Knowledge dissemination · Evidence-based programs · Parenting program

R. Breitkreuz (*)
Human Ecology, University of Alberta, Edmonton, AB T6G 2N1, Canada
e-mail: [email protected]

D. McConnell · A. Savage · A. Hamilton
Family and Disability Studies Initiative, University of Alberta, 11487 89 Ave, Edmonton, AB T6G 2M7, Canada
Introduction

The Triple P Positive Parenting Program, a behavior-based parent training and support program, was developed in Australia and has been widely implemented in many countries, including Canada, the United States, New Zealand, the Netherlands, and Germany. Triple P comprises five levels of intervention, ranging from multi-media strategies designed to improve parent access to high-quality parenting information, through to multi-modal parent training with enhancements for high-risk families. Triple P has a well-structured and systematic strategy of program dissemination that includes practitioner training, accreditation, and use of proprietary resources. The dissemination of Triple P is based on an ecological model informed by self-regulatory and systems-contextual approaches (Sanders and Turner 2005). It has been identified as an effective, relatively inexpensive, research-supported program, with a substantial and growing number of randomized controlled trials providing evidence of its effectiveness (Sanders et al. 2002, 2009; Sanders and Turner 2005; Seng et al. 2006). One particularly interesting aspect of Triple P is that it is being introduced into existing organizations with diverse organizational cultures, staff of varying educational backgrounds, and wide-ranging geographic and cultural settings. As such, the melding of this program into existing organizations provides an interesting window into the uptake of evidence-based programs in naturalistic settings with pre-established workplace cultures, programs, and staff. Yet, to date, this aspect of Triple P has received little scholarly attention. Situated within this context, the purpose of this paper is to contribute to the literature on knowledge dissemination by presenting a case study of one Triple P implementation process in the Province of Alberta, Canada.
We investigate the process of integrating levels 2 and 3 of the Triple P system (hereafter referred to simply as Triple P) into existing family support centres in Alberta, including influences on program uptake and utilization.
Background

A growing body of literature examining the dissemination of evidence-based programs, as well as other knowledge dissemination strategies within the social, health, and behavioral sciences, suggests that there are multiple influences on the uptake of research-supported programs. This research suggests that key influences on the extent to which new programs are taken up include: organizational support (Seng et al. 2006), including support from front-line staff (Rapp et al. 2010); compatibility (Addis 2002); adaptability (Addis and Krasnow 2000); practitioner confidence (Aarons and Palinkas 2007); opportunity to trial the program (Rogers 1995); and evidence that the new program will meet existing needs (Berwick 2008). In short, evidence of efficacy is typically not enough to ensure adoption. Critical to uptake is support from the top levels of the organization (Seng et al. 2006). Management has to be open to the innovation and willing to invest in the change process, including practitioner training and program resources (Addis 2002). For this to occur, there is typically some perceived or demonstrated cost advantage (Linney 1990). However, evidence of efficacy and “top-down” support are still not sufficient conditions for the successful dissemination of research products and utilization of evidence-supported interventions (Schinke et al. 1991). One key determinant of successful dissemination activity, in terms of program uptake, is the compatibility or fit of the program with currently felt needs and the beliefs and values of the potential adopter, where the potential adopter could be an organization and/or an individual practitioner (Berwick 2008; Landry et al. 2006). When a good fit exists, programs are more likely to be accepted and integrated into practice. In contrast, when the values of the organization are seen to be put at risk by the proposed program, integration is less likely.
“Mis-fit” occurs, for example, when the implementation of pre-packaged, manualized programs is perceived, rightly or wrongly, as detracting from the therapeutic relationship or as antithetical to client-centered practice: turning professionals into technicians rather than caring human beings (Addis 2002; Addis and Krasnow 2000). Another key influence on the uptake of evidence-supported programs is the perceived simplicity (ease of adoption) and adaptability of the program. Dissemination is more likely to succeed when the program is simple, flexible, and adaptable to different adoption settings. This
includes, but is not limited to, the perceived adaptability of the program for different client groups and particular client needs. Local adaptation of evidence-supported programs is, however, controversial. Proponents of strict program fidelity point to evidence suggesting that tailoring may reduce program efficacy (e.g., Kumpfer et al. 2002). Diffusion research shows, however, that insistence on rigid adherence may be a barrier to successful dissemination. Programs (and other innovations) that are successfully disseminated are almost always adapted in some way (Berwick 2003). Skillful competence appears to be a more realistic goal than rigid, technical adherence (Addis and Krasnow 2000). The training experience for a new evidence-supported program and, in turn, practitioner confidence in his or her newly developed implementation skills have also been identified as important determinants of program uptake and sustained use. In the child welfare context, Aarons and Palinkas (2007) found that practitioners were more likely to “buy in” to a new program if the rationale for implementation was clear; if the trainers demonstrated respect for the practitioners’ experiences and were responsive to their concerns; and if the trainer was perceived by practitioners to have expertise. In addition, diffusion research suggests that practitioners may need to trial the program and observe the benefits for themselves before an evidence-supported program or innovation is fully integrated into their helping repertoire (Rogers 1995). Addis (2002) also notes that learning a new program often requires practitioners to step out of their comfort zone, so opportunities to try out new interventions and to receive support from colleagues are often vital for practitioners to develop confidence in their implementation skills. Moreover, positive client feedback may be the single most important determinant of whether a program is fully adopted and sustained (Sanders et al. 2009).
Informed by this previous research, we conducted individual and group interviews with the directors and staff in ten Triple P sites in Alberta to identify factors that were facilitators and barriers to implementation. Importantly, in addition to identifying specific factors, this study was also able to provide some insight as to why and how these factors were significant, thereby adding to the literature on knowledge dissemination processes.
Triple P

The primary objective of Triple P is to improve the overall health, resourcefulness, and independence of families by enhancing parental knowledge, skills, and confidence. A key assumption of Triple P is that enhanced parenting will lead to healthy child development and a reduced incidence of child abuse, mental illness, and
behavioral problems. The program is based on a multilevel approach that includes five intervention levels. In the Triple P approach, level 1 includes a media campaign intended for all parents interested in learning about parenting and child development. Levels 2 and 3 (detailed below) offer interventions by primary care practitioners for specific behavior problems. Level 4 is for parents of children with more severe behavioral problems and entails eight to ten sessions. Finally, level 5 offers intensive parent training programs addressing broader family issues such as relationship conflict and parental depression, anger, and stress. It is usually provided to parents who have already taken, or are currently in, a level 4 program (Triple P 2010).
Triple P (Levels 2 and 3)

Levels 2 and 3 of Triple P, currently being implemented in Alberta, are designed for use in primary care settings with parents who seek professional guidance and support to deal with common, discrete child behavior problems (e.g., tantrums, whining) and challenging (but typical) child developmental transitions (e.g., toilet training). Selected Triple P (i.e., level 2) is available in two formats. The first is a brief, one- to two-session intervention that provides early anticipatory developmental guidance to parents of children with mild behavioral difficulties or developmental issues, with the aid of tip sheets and videotapes that demonstrate specific parenting skills. Additionally, Selected Triple P can be offered as a seminar series covering three specific positive parenting topics. The seminars are used both to promote awareness of Triple P and to provide information to parents. Each seminar includes a presentation, a question and answer period, distribution of parenting tip sheets, and an opportunity for parents to consult with practitioners to make individual inquiries and request further assistance. Primary Care Triple P (i.e., level 3) is a four-session intervention designed for children with mild to moderate behavior problems and includes active skills training for parents.

Implementing Triple P in Alberta

In 2007, Alberta Children and Youth Services (ACYS) implemented a pilot of levels 2 and 3 of the Triple P program in 19 family support centers in three regions of the Province of Alberta. The pilot sites included urban and rural locations. ACYS limited training in the pilot to levels 2 and 3 of the Triple P system, determining that these would provide the most appropriate levels of intervention in the non-targeted settings of family support centers. The Triple P pilot sites were expected to integrate Triple P programming into current parent education services, replacing programs that addressed similar issues but were identified by ACYS as lacking an evidence base. Triple P International was contracted to provide training and accreditation for 60 family support center staff in level 2 (provision of parenting advice through seminars and brief consultations with parents) and level 3 (narrow-focus parent skills training) in 2007–2008. Staff from the agencies participating in this evaluation received Triple P training and accreditation in two waves: the first cohort was trained in Fall 2007, and the second cohort in Fall 2008. Staff participated in four consecutive days of training in Triple P levels 2 and 3, followed by an accreditation session 6 weeks after training.

Methods
During the period of May through July 2009, interviews were conducted with ten directors of family support agencies piloting Triple P, and group interviews were conducted with 62 practitioners (including Triple P accredited and non-accredited staff and directors) at each of the ten Triple P “pilot” sites in both rural and urban areas throughout the Province of Alberta. All of the family support centers piloting Triple P had five universal pre-existing core services: parent education, early learning and care (e.g., drop-in playgroup activities), developmental screening, family support (e.g., community kitchen, clothing exchange), and information and referral.

All interviews were conducted at the participating agencies by a doctoral student who was also an experienced practicing psychologist with demonstrated interviewing and group facilitation skills. Participation was voluntary and written informed consent was obtained. Each group interview consisted of three to nine practitioners, two or three of whom were Triple P trained. There was an average of six practitioners per group interview, and these interviews took between 60 and 90 min to complete. The participating practitioners were all women, and most had college diplomas or undergraduate degrees in early childhood development or social work. The work experience of the practitioners ranged from 3 years to over 20 years of employment in family support services. The participants also had a range of ethnic and racial backgrounds, including those of Aboriginal origin, reflecting the population of Alberta.

A semi-structured interview format was employed. Semi-structured interviews are designed to seek information about a particular topic, covering various domains of knowledge, while still maintaining the flexibility of an unstructured interview (Richards and Morse 2007). The areas of interest for the purposes of these interviews
included: information about the local community and the parents accessing the service; the organization, its mission, and the range of services it provided; the experience of implementing Triple P within the organization; and the strengths and limitations of Triple P from the perspective of the practitioners. Using this approach, an interview guide was developed to shape the course of the interview and ensure that particular areas of interest were considered, but this was used more as an aide-mémoire than a rigid interview protocol. The aide-mémoire was adapted over the course of the interviews as the concurrent data analysis revealed data collection needs (e.g., divergent findings or emerging themes that required further exploration). Issues and emerging insights garnered from earlier interviews were also brought up in later interviews for verification. Using this approach, the interviewer was free to probe at certain points to elicit more in-depth information and to ask questions in a responsive manner (Bernard 2000). This style allowed consistent data to be collected, while leaving room for important and enriching data to emerge (Mayan 2009).

Detailed field notes were made by the interviewer following each group and individual interview. The field notes included summaries of the interview as well as brief documentation of particularly salient points. With participant consent, each interview was digitally recorded and then transcribed verbatim. Transcripts were checked for accuracy. The interviewer then completed the preliminary analysis, identifying recurring themes using the constant-comparison method (Strauss and Corbin 1998). This method involves comparing two or more descriptions of experiences or key events and looking for similarities and differences across participants (Miles and Huberman 1994).
These descriptive themes were summarized into a preliminary report that detailed specific information about the implementation process of Triple P, how the program was being utilized, what practitioners noticed about the program, how it had helped or hindered practitioners’ ability to fulfill the mandate of the organization, and their perceptions of what had worked well or not worked well in the implementation and delivery of Triple P programming. The first author then conducted a secondary thematic analysis of the interview data to ensure the rigor of the preliminary analysis, refine, develop and expand on recurring themes, and search for and analyze “negative cases” (i.e., any inconsistencies) (Simons et al. 2008). This was done through an iterative process of reading the transcripts and writing analytical notes throughout the process, searching for commonalities and noting any points of divergence. By comparing chunks of coded data to look for commonalities, a process called “axial coding” (Strauss and Corbin 1998), interrelationships between codes were
discovered, and these codes were merged to create comprehensive themes. Through this in-depth analysis, two overarching themes were identified in the data: 1) Triple P adds value to existing services; and 2) there were facilitators and barriers to the integration of Primary Care Triple P into existing services. Under these comprehensive themes, key factors were delineated, as described in the findings below.
Findings

Primary Care Triple P: Adding Value to Existing Services

Overall, staff at the Triple P pilot sites indicated that although it was still “early days,” the experience of implementing Triple P into their organizations had been positive. The consensus was that Triple P is enabling staff to “do what they do” more efficiently and more effectively. One practitioner describes her success with the program: “I have had a few really great experiences with it—very, very positive.”

Interview participants generally indicated that although Triple P was not a radical departure from existing services, there were still a number of ways that Triple P was enhancing agency services. These included: 1) the high quality of Triple P resources; 2) a change in how services were offered; 3) enhanced credibility; and 4) improved linkages with other agencies.

First, practitioners described how the high-quality Triple P resources, and the structured and systematic nature of the program, were optimizing teaching time and effectiveness. One staff member captures this sentiment:

Triple P has [packaged] good parenting well…Once upon a time we had this filing cabinet full of resources. And so I’d have a client come and then I’d have to go back to my filing cabinet, and I knew something on toilet training was in there and I’d get it out. And there’s five things on toilet training that would work for it. And I would write up that sheet and then I’d give it to them. Whereas now, I can go straight to Triple P toilet training.

Another practitioner describes the efficiency of Triple P, portraying the “no nonsense” approach of the program:

With Triple P you’re the expert coming in saying, okay…this is what you need to do. We don’t have a lot of time for small talk… With home visitation you can be messing around for a very long time to get that same solution. It’s efficient. It’s effective. It’s fast, but it’s not that warm and fuzzy…
In this quote, the practitioner hints that there was something of a trade-off between efficiency and rapport-building with the client, yet still describes the effectiveness of this approach.

Second, although Triple P had not necessarily changed what family support practitioners did, for at least some practitioners it had transformed how they did it. Triple P was enhancing efficiency through a systematized process of support and service delivery. One participant explains: “There was nothing new, nothing I had not seen before, nothing I hadn’t come across before… I think it’s in how the Triple P provider approaches as systematically as you do with the forms and the tracking—that makes that program unique.” Triple P was also described as well structured and simple to implement. Another participant summed up Triple P as “a great little package that is easy to deliver.”

Third, practitioners perceived that the accreditation process, and the Triple P emphasis on evidence-based practices, gave them more credibility. Having a “structured,” “defined,” “research-based” program meant that staff could draw on a larger body of evidence to demonstrate that these techniques worked. As one participant said, “this isn’t just something airy-fairy… the key is to stay evidence-based.” The structure of the program, perhaps ironically, facilitated a more individualized case plan for the client:

What’s different about this program, and I appreciate this a lot, is that I am not spending an hour talking at the parent. I am spending an hour working with the parent…So it’s much more interactive. It’s much more focused on the parent. It’s their program. It’s about them, where they’re at, and meeting them where they’re at. I just guide them through it.

Triple P also provided a rationale for presenting and sticking with a particular solution to a problem. One practitioner describes this advantage: “It was excellent because you could say, we have been trained in this.
This is an evidence-based program and these are the things that if we follow with, it will work. You know, we have to stick to it.” In short, offering an evidence-based program was valued and appreciated by organization staff. Fourth, directors and practitioners reported that Triple P was enhancing linkages with other agencies. One interview participant indicated that they were receiving referrals because “word of mouth is we are doing a good job.” Similarly, another indicated that the health unit had been a very positive source of referrals “because they heard that we are making a difference in those clients’ lives.” Evidence of enhanced inter-agency relationships and increased referrals was described by one agency director:
“We seem to be getting a lot of parents that are being referred from Child Welfare to do Triple P as well.” Others also observed that their relationship with child and youth protection services had never been more positive. As one participant indicated: “We haven’t had, I don’t believe, as much of a relationship with Social Services as we have now.”

In summary, there were numerous aspects of Triple P that were value-added for existing organizations offering family support and parenting advice: efficiency; a systematic approach; more credibility; and enhanced relationships with other service organizations.

Facilitators and Barriers to Integrating Primary Care Triple P into Existing Agencies

There was variability in the way and extent to which Triple P was integrated into existing family support services. Five key factors impacting the integration process emerged from the group interviews. These were: (1) the level of development of pre-existing services; (2) the degree of “fit” between the Triple P program approach and existing agency practice, including philosophical approach, methods of delivery, and the perceived suitability/unsuitability for some client groups; (3) practitioner perceptions of how free they were to adapt the program; (4) rules about who could and could not use Triple P resources; and (5) training and sustainability issues.

Level of Development of Agency Services

One factor that seemed particularly salient in the implementation of Triple P was the level of service/program development before Triple P was introduced. If the agency was already conducting a variety of quality programs, Triple P seemed to be more readily accepted and implemented as another “tool in the tool box” of resources. This is eloquently described by one practitioner:

You know how they say how you build something… you put the big rocks in first and then the smaller rocks and then the gravel and then the sand and then the water.
So we already had the big rocks and probably some of the smaller rocks. I think we are probably at the point where Triple P adds the gravel.

Notably, a few group interview participants pointed out that the existing skills of agency staff contributed greatly to the success of Triple P implementation. One staff member stated: “without the skills and the experience level that the staff bring to Triple P, we would not have the success that we have.” With a well-established agency, staff already had
pre-existing relationships with parents, and were consequently able to “market” Triple P to parents more effectively. This point is described by a practitioner: “So that relationship is important and then it makes them kind of able to participate…so we are getting the parents who have already built a relationship with [staff] and love them.”

Practitioners saw the approach they took to their routine agency programming overall as key to legitimizing Triple P. When programs were not stigmatizing or threatening, parents were open to additional programs such as Triple P. One staff member described this as “normalizing getting help.” Staff believed that relationships between parents and staff needed to develop within a destigmatizing context in order for their agency mandates to be fulfilled. One staff member indicated that problems were normalized through relationships between staff and families in programs such as drop-in playgroups. Another staff member echoes this sentiment:

It’s through the relationships that we have established with our families. You know, you are not coming here because you’re having parenting difficulties or because you’re isolated or because you’re in a difficult relationship. You are coming here to play with your child. And because they have a relationship with all of us, if these things come up, they are more willing to talk to us. You know, there is not a lot of stigma coming here.

Programs such as playgroups served as a “foot in the door” so that parenting concerns could then be addressed in a non-threatening manner.

If, on the other hand, the agency was still in the process of “getting traction” in its overall programming, Triple P appeared more difficult to implement. There were a number of reasons for this: lack of adequate infrastructure, insufficient staffing, and the inability to coordinate yet another program in an already struggling organization.
Not surprisingly, implementing Triple P into an organization that was struggling to survive was difficult at best. This appeared to be the biggest struggle in agencies where there was a hub with various satellite sites in remote locations. One practitioner describes some of the challenges: “We have never established our center really closely out there. We hired one staff, she stayed for almost a year before she went on—and then we’ve had staff short, you know…” In short, having a well-established family support program with pre-existing rapport with parents enhanced the likelihood of successful implementation and uptake of Triple P. In contrast, an agency struggling to staff its programs, particularly in rural and remote areas, had more difficulty implementing a new program.
Degree of “Fit” Between Triple P and Current Agency Practices

Three key issues with Triple P were identified by some practitioners as not fitting well with existing agency approaches and needs: the behavioral approach of Triple P, the lecture style of Triple P seminars, and the “mis-fit” of Triple P for some client groups.

Theoretical Approach

Some agency practitioners expressed discomfort with the underlying theory of Triple P: Behavioral Family Intervention (BFI). They indicated that this behavior modification approach ran counter to their training in early childhood development and attachment theory. One participant said that she was “still struggling with how attachment [fits] in Triple P.” Other participants indicated discomfort with the use of “time out” and “cry out” sessions. For instance, one practitioner said: “I am not in love with some of the cry time and time out stuff.” Others indicated that these techniques were far too prevalent in the Triple P materials, stating that “on every single tip sheet it gets back to the time outs and the quiet times.” These concerns with the approach were expressed by practitioners in half of the centers in our study, as indicated in the following quote:

I found the behavior modification stuff, the rewards and time outs, that kind of stuff was too prevalent in it for me—I did not feel comfortable with it. And I finished the training and I went back to my employer and said, I don’t really like this program. I can’t see myself using it—I will use parts of it. There is a lot of stuff I will use but the few things in it that I didn’t like I really don’t like and I feel strongly about.

In addition to being concerned with the philosophy of behavioral approaches, participants indicated that this blanket approach did not seem to recognize the individuality of different children and families:

I struggle with time outs…I don’t necessarily believe in them.
And for me it was a bit of a hard—it’s a hard sell… and I also don’t think it works with every child although Triple P would absolutely disagree with me. I think that’s pigeon-holing people and I think you need to find out what works…so that’s my struggle. And I had a hard time presenting that.

Because their concerns about the content ran contrary to their theoretical approach, some practitioners chose not to use some of the Triple P resources. One participant describes how she dealt with this: “And there is one piece in that video, and it seems to me it’s the crying it out [part]. And then you leave and you just let them cry….But I remember saying ‘ladies, I’m not even going to really play this for you…’ I cannot promote something I am completely against.”
Seminar Style

Most staff perceived Triple P seminars to be very useful for parents. One staff member indicated that the Triple P seminars had been “hugely, hugely attended, more successful…than any of the [other] parenting courses.” However, some also found the lecture style difficult. This was due to two key issues: their own personal fears of public speaking, and their general preference for a process-oriented workshop style of group work over a lecture-type seminar. Concerns were raised about the clinical nature of the seminar approach versus a more process-oriented workshop style of facilitation. This perspective was described by one participant: “I found the validity of doing workshop based programs compared to seminar based, it appeared to me that the participants got more out of it compared to a seminar.” Part of the concern with seminars seemed to be not knowing their impact on parents. One participant summed this up, stating: “Because you spend the whole time just giving information…you’d never have a chance to see how it works.”

The Suitability/Unsuitability of Triple P for Some Client Groups

Practitioners reported that Triple P did not work well for English as a second language (ESL) families and was not appropriate for their clients with multiple or more complex needs. One participant who worked in an agency serving many immigrant families explained that because ESL families are struggling with language, the Triple P material, although good, “needed to be simplified.” Another participant indicated that it was a “big challenge” to get through the seminar material with an ESL group. Participants also noted that Triple P was not suitable for their clients who had more complex needs. They indicated that if they screened a family and found more than one or two issues, Triple P (levels 2 and 3) would not be appropriate.
One practitioner indicated: “So they need to be at a place where they feel they can focus…whereas if there is too much other stuff going on in their life, they likely don’t have time to track things and make a chart and—you know.” In summary, the theoretical orientation of Triple P, the seminar style of delivery, and the unsuitability of Triple P (levels 2 and 3) for some client groups were identified by some practitioners as areas of concern for Triple P implementation in their agencies.
“Permission” to Adapt Triple P

The way in which Triple P staff were trained influenced the extent to which Triple P was implemented “by the book” or adapted to meet the specific needs of the parents at a particular site. Practitioners who had participated in one
wave of Triple P training used words like “rigid” and “inflexible” to describe the program. Conversely, practitioners who had participated in another wave of training viewed Triple P as a flexible and adaptable program. Although some practitioners described adapting some of the actual material, most adaptations pertained to how the material was delivered. Practitioners sometimes changed the wording of materials or provided additional examples to ensure that the information was clear to clients. Yet another adaptive approach was to combine Triple P resources with other pre-existing program resources, as one participant explained:

So I just gathered some information when I was here, put a package together for them and fired it off. And there was some Triple P and there was some Active Parenting in it, put it all together. Because if I have information that they are looking for, I am doing them a disservice not to pass it on. So I turned a blind eye…

Concerns related to the perceived adaptability of the program were closely tied to understandings of who could or could not utilize Triple P resources.

Rules About Who Can Use Triple P Resources

A number of practitioners described frustration with “the rules” about who could use the tip sheets and other Triple P resources. This sentiment is well captured by one participant: “Thou shalt not give out a tip sheet unless you are an accredited Triple P facilitator….” Another participant suggested that it would be helpful to post the tip sheets on the wall instead of keeping them locked away, accessible only to Triple P accredited staff. Another staff member indicated that the resources were fabulous and beneficial to parents but inaccessible to staff who were not accredited.
She voiced her frustration with what she called the “Triple P police.” Yet another used hyperbole to express her viewpoint about how the guidelines for usage of Triple P resources created a barrier:

If your tip sheets are all hidden in this metal cupboard—and you need a swipe card and a key to get into it right, and then show your accreditation pass and put your thumb print in, and it opens—then it makes it seem like scary and something that’s unapproachable…whereas if it’s up on our wall and parents are reading it and they’re interested and are asking questions, it will get utilized, and it will just become a normal part of what we are, what we do, what we offer.

This practitioner is suggesting that, in addition to issues with access to resources, there is a kind of prohibition around the resources that is not constructive.
Although practitioners were concerned about breaking the rules of Triple P, it was sometimes difficult to resist the temptation to use the high-quality Triple P resources. One non-accredited practitioner described how she “cheated” by using a Triple P video, although, as she explained, she didn’t call it Triple P because she would “get into trouble”:

I cheated a little with the video. There was a group that already exists that does a parent topic once a month. And you know, they wanted positive parenting. So I just brought the “Every Parent’s Survival Guide” video. And we just played it. And I paused it at good spots and we discussed it, and played a little bit more and we discussed it. And it was actually really successful.

Similarly, one director indicated that in her agency, staff creatively incorporated tip sheets into their general programming in addition to giving them out to select parents. This way, the information could be more broadly utilized.
Training and Sustainability

Participants described the training as “interesting” and “worthwhile” while at the same time indicating that it was “intense,” “stressful,” “overwhelming,” and “difficult.” While some said it was the process of training itself that was difficult, others described the anticipation of the accreditation process as the pressure point. Staff were unequivocal, however, that despite how challenging the processes of training and accreditation were, the end result was beneficial. One participant summed up the sentiment well in comparing the training experience to childbirth: “It’s like being pregnant, right? You give birth to the baby. You really don’t want to do that again but you like the end result.” Perhaps a more salient issue around training was concern about the sustainability of Triple P due to staff turnover. A number of organizations had lost a Triple P trained worker, and these workers had not been replaced. There were concerns about how agencies would be able to continue offering Triple P, because training was not offered very frequently and participants were aware that training was a costly process. Staff suggested that a train-the-trainer model would help to ensure the continuity of the program.

But you know, like for example, if [staff member] was to leave, or if [staff member] was to leave…then we have lost that piece of the program, because there is nobody else…again, it’s just that whole turnover…I think there should be a training trainer. So that even if there was one or two people from each [family support centre] that were trained as trainers…there might be somebody in another [agency] that could still come in and train the staff.

Concerns were also raised about the expense of the program, and how much time and energy it took from staff to implement Triple P. When asked whether Triple P had added to the agency, one director said:

It absolutely added. I wouldn’t argue that. But at the same time, you know, you are using the staff you have. And so if you are adding programs to their list then you have to subtract programs somewhere else, right? So you know, in that sense, it’s a bit of a balancing act to just weave it in with what we do and make sure everybody has a balanced piece of the program…I think we have actually been working far beyond our capacity.

Because the programs the organizations offered were provided by pre-existing staff, the burden of adding yet another program to the workload was worrisome, and staff shortages were a concern. Clearly, the sustainability of the Triple P program was a prominent concern for staff.

Discussion
From the perspective of Triple P International (the proprietor), Alberta Children and Youth Services (the customer), and many, though not all, of the participating family support agencies and practitioners, the dissemination of Triple P in the Province of Alberta could be viewed as a success. The proprietor of Triple P has a major share in the international market for parent training programs, and was successful in engaging the interest of ACYS in piloting the program in Alberta. ACYS was in turn successful in engaging the participation of non-government family support agencies and practitioners, although it is unclear whether or to what extent their participation was truly voluntary. And many family support practitioners reported success in engaging parent-clients in the Triple P program. This case study identified several factors that were key to this success, and which may be applicable or transferable to other programs and knowledge dissemination projects: Triple P as evidence-based practice; the organizational or workplace context; and high-quality resources.

Key Success Factors

The branding of Triple P as evidence-based practice appears to be one of the keys to its successful dissemination and implementation in Alberta. In popular evidence-based practice discourse, evidence generated by “hard science” (i.e., randomized controlled trials) is privileged, and the legitimacy of other sources and forms of evidence, such as practice-based experience, is reduced or discounted (Clegg 2005; La Caze 2009). The influence of this popular
discourse in preparing the groundwork for the dissemination and implementation of Triple P in Alberta was apparent at two levels of the dissemination chain. First, it appears that ACYS instigated the pilot of Triple P in Alberta despite there being little or no evidence that existing programs were ineffective. Rather, it seems that the branding of Triple P as evidence-based practice, and the fact that existing services were not branded as such (despite having many of the same basic ingredients as Triple P), was sufficient justification for the pilot. At the next level, a number of interview participants explained that although the Triple P program was not radically different from pre-existing programs, the fact that it was “evidence-based” gave them more credibility in the eyes of clients and other service providers. The reported result was increased inter-agency cooperation, client referrals, and client engagement. These findings are similar to those reported by Dean et al. (2003), who found that linkages with external agencies were strengthened through Triple P implementation. The next key success factor identified in this study was the organizational or workplace context. Most of the participating agencies were established (e.g., in terms of programming and community presence) and stable (e.g., in terms of staffing), and these agencies found it easier to integrate Triple P into the services they offered. Sanders et al. (2009) found that lack of confidence in parent consultation work, lack of time due to after-hours appointments, and lack of knowledge or skill in behavioral family intervention were barriers to use of Triple P. It may be that more established family support centers in our study were able to provide additional time and support to staff implementing Triple P, thereby making the transition to the new program less taxing. 
A related point is that stable agencies were able to capitalize on parent trust earned through other “non-stigmatizing” programs to engage parents in Triple P. Programs such as drop-in playgroups were an effective means of outreach to parents in the community and an effective medium for promoting Triple P and bringing parents into the Triple P fold. The third key success factor was the high-quality Triple P resources. These resources effectively translated knowledge from research into user-friendly materials for practitioners and their clients. Practitioners in this study highlighted “efficiency gains” related to quick access to the high-quality material available through Triple P. Having these educational resources in hand was time-saving, and the systematic nature of the intervention ensured that time was used effectively. The way that Triple P was able to consolidate and distribute research-based information for practitioners shows the importance of this aspect of evidence-based practice. Relevant here
is the discipline of health informatics (Spring 2007). This term refers, in part, to how information is stored and managed so that it is readily available when it is needed. Informatics goes beyond resource management, however, to also consider electronic record keeping of client information, professional practice guidelines, and systematic reviews of the evidence. It is, in short, about “the technological systems infrastructure that provides decision support” (Spring 2007, p. 616). Findings from our study suggest that exploring ways to facilitate access to high-quality parenting materials could be helpful for family support centers. Health informatics could be one way to enhance access in a sustainable way. The advantage of programs like Triple P is that they provide consistent access to relevant, high-quality, and up-to-date information for practitioners, facilitating better use of time and enhancing the quality of services that organizations in various locales can provide.

Ongoing Tensions and Threats to the Sustainability of “Evidence-Based” Programs

Although the dissemination and implementation of Triple P in Alberta may be described as a success from at least some vantage points, this case study also sheds light on some of the central tensions, conflicts, and potential barriers inherent in the dissemination of Triple P. In this study, central tensions existed between the interests of the proprietor and the interests of end-users, particularly in regard to training and sustainability; between the requirement of program fidelity and the practical need for adaptation or tailoring to best meet the needs of local populations and clients; between the theoretical underpinnings of the program and the theoretical positions and practice experience of practitioners; and between program scope and client needs. Insights gained from better understanding these tensions could be useful in the implementation of similar standardized programs.
Proprietor and End-User Interests

One point of tension surrounded the interests of proprietors and end-users in regard to practitioner training. On the one hand, it was clear that practitioners in our study perceived the training they received from Triple P as worthwhile. This is not surprising, as training in a naturalistic setting offers hands-on skills that didactic approaches cannot teach as effectively (Spring 2007). On-site training provided group skill development with peers to enhance common understanding. Furthermore, the requirement that practitioners demonstrate their skills through initial testing and repeated practice through an accreditation process, although challenging for practitioners, was
perceived as a way to better ensure program credibility. In combination, these aspects of the training process appeared to be key components of successfully preparing practitioners to implement a new program in an existing organization. On the other hand, the Triple P training and accreditation model, which involved “bringing a trainer in from the outside,” was viewed as a significant threat. Because the training was so costly, contracting agencies required the best possible, long-lasting training for their substantial investment. It is here that we see a considerable pressure point in our study. Program sustainability was a significant concern for agency directors and practitioners generally, and more so for those working in environments with high staff turnover. Our findings suggest that there are inherent tensions between the interests of agencies in their efforts to sustain a pre-packaged, manualized program over the long term, and the interests of a corporation attempting to build a business that guarantees program integrity and consistency backed by research evidence. There is no easy resolution of this tension between the needs of individual agencies and those of the program owners. However, this study suggests that innovative solutions will be required to reconcile these competing needs. One such solution may be a train-the-trainer approach. Indeed, several participants suggested that a train-the-trainer model would be more responsive to agency needs and would promote program sustainability. Findings from other implementation studies suggest that train-the-trainer models can be an effective way to increase the likelihood of successful program uptake and sustainability (Corelli et al. 2007; McLellan et al. 2009). Introducing such a model inevitably brings new challenges related to program consistency and quality, and trade-offs for standardized programs are inevitable.
However, in order to keep programs running over the long term—an interest shared by both the corporations disseminating them and the agencies adopting them—alternative approaches to the implementation of evidence-based programs will need to be considered to enhance sustainability.

Fidelity vs. Adaptation

Another tension related to the issue of sustainability was the perceived adaptability of the Triple P manualized program. In our study, some practitioners perceived Triple P to be rigid while others perceived it to be flexible and adaptable. This appeared to be an effect of training, and in particular of the way the material was presented in each training wave. The extent to which a program is perceived as adaptable is important. Maintaining program adherence is a legitimate concern for program developers; yet, it is also critical to recognize the importance of adaptations, in a particular context, that enhance rather than diminish the integrity of a program. Distinguishing competent adaptation from rigid adherence is critical to successful program implementation, but not easy to discern (Addis 2002). Indeed, Berwick (2003) indicates that adaptations to programs are imperative to successful implementation. He posits that “innovations are more robust to modification than their inventors think, and local adaptation, which often involves simplification, is nearly a universal property of successful dissemination” (Berwick 2003, p. 1971). The results of this study suggest that such adaptations may be an important component of program implementation; however, we also posit that these adaptations should be scientifically evaluated in order to more clearly understand their implications for program outcomes.

Theory-Fit

The fit or mis-fit between practitioners’ theoretical orientations or preferred approaches and the theory and approach of Triple P was another factor that appeared to create tension in the implementation of Triple P. Specifically, some practitioners preferred a more relationship-based approach over the seminar style of Triple P, and some felt that Triple P was “too behavioral” and/or “too problem-focused.” Ogden et al. (2005) reported similar findings in their study of a parenting program implemented in Norway. They found that, in addition to having to change their theoretical orientation, practitioners were somewhat resistant to the new approach because the change in direction implied a critique of their previous practice. Here, we observe a tension between the valorization of evidence-based practice (Clegg 2005) and practitioners’ experiences of “tried and true” methods of program delivery, known to them to be effective through substantial experiential learning.
It seems that introducing “new wine into old wineskins” can create bumps in an implementation process and warrants careful consideration.

Program Scope and Client Needs

Another component of “fit” that appeared to affect Triple P implementation was the extent to which the program met some clients’ needs. In particular, agencies serving families with multiple or complex needs, or ESL clients, raised concerns about the suitability of Triple P for their particular client groups. This concern may in part reflect the levels of Triple P offered in this implementation process: levels 2 and 3 are designed for discrete and/or minor child behavioral problems, not for more complex behavioral problems or families with considerable dysfunction. Concerns raised about the suitability of the program for ESL families, however, require further investigation. Diversity in clients served, particularly in immigrant-receiving nations such as Canada and the U.S., must be carefully considered when implementing a manualized program in a particular context.
Limitations

There are several limitations to this study. First, when interviews are used as a primary research tool, researcher effect must be considered (Kvale 1996): participants may shape their answers to reflect what they think the interviewer wants to hear. Because this study examined the implementation of a pilot program in family support centers throughout Alberta, it is possible that staff believed the interviewer wanted to hear positive things about Triple P and framed their responses accordingly. Using group interviews also has its limitations. Although group interviews can enhance discussion by enriching the details of an experience or event from multiple standpoints, there is also a risk that power dynamics within the group will curtail some points of view while exaggerating others. In these group interviews, practitioners and directors were present, as were Triple P accredited staff and non-accredited staff. It is possible that some practitioners felt pressure to censor their comments because their boss was there, or because Triple P trained staff whom they did not wish to offend were present. Alternatively, Triple P staff may have felt it necessary to over-emphasize the contributions of non-accredited staff to the successful implementation of Triple P in the sites. Finally, it must be acknowledged that this is a relatively small case study of a program implemented in many countries throughout the world. The extent to which these findings can be generalized must be carefully assessed, recognizing that experiences of implementation will vary across national, geographic, and organizational contexts.
Conclusion

Sanders and Turner (2005) attribute the success of Triple P to a variety of factors, including but not limited to the following: the quality of the intervention; the flexibility of the Triple P system; strategic alliances with organizations, including the identification and support of an internal advocate to ensure that program adoption is supported by management; a “just right” (i.e., not too onerous) approach to practitioner training that includes active skills training; development of peer support and supervision networks; and built-in evaluation mechanisms or “feedback loops” to reinforce success and foster continuous quality improvement. The findings from our study support some of Sanders and Turner’s (2005) assertions. In particular, the quality of the program, the value of partnership with existing organizations, and the value of high-quality resources were evident in the interviews we conducted with directors and staff of Triple P sites in Alberta. However, our study findings also suggest that there are remaining issues to be resolved in the successful implementation of Triple P into existing organizational structures. The sustainability of the program in the face of high training costs, the requirement that the trainer come from Triple P International, and high staff turnover is an issue that merits further consideration. The inability to sustain the program due to staff attrition could result in what Schinke et al. (1991) call “maintenance failure” (cited in Sanders and Turner 2005). In addition, the perceived flexibility and adaptability of the program were ongoing concerns for staff. Finally, the behavioral approach was somewhat disconcerting for practitioners educated in different theoretical schools, such as attachment approaches, as was the suitability of the program for some client groups. Developers of evidence-based programs must be cognizant of this concern, to avoid an unintended outcome of “one size fits none” in program delivery. Future studies of the implementation process of Triple P and other research-based programs will continue to shed light on how to facilitate the successful adoption of a new program into an existing organization. Like Addis (2002), we suggest that careful consideration of voices from the trenches will provide valuable insight into program implementation that will ultimately serve to strengthen evidence-based practice and best assist the clients these programs aim to serve.

Acknowledgement The work was supported by a grant from the Alberta Centre for Child, Family and Community Research. We would like to thank Laura Hedlin for her research assistance on this manuscript. We would also like to thank two anonymous reviewers who provided valuable feedback on this paper.

References
Aarons, G. A., & Palinkas, L. A. (2007). Implementation of evidence-based practice in child welfare: Service provider perspectives. Administration and Policy in Mental Health and Mental Health Services Research, 34, 411–419. Addis, M. E. (2002). Methods for disseminating research products and increasing evidence-based practice: Promises, obstacles, and future directions. Clinical Psychology: Science and Practice, 9, 367–378.
Addis, M. E., & Krasnow, A. D. (2000). A national survey of practicing psychologists’ attitudes toward psychotherapy treatment manuals. Journal of Consulting and Clinical Psychology, 68, 331–339. Bernard, H. R. (2000). Social research methods: Qualitative and quantitative approaches. Thousand Oaks, CA: Sage. Berwick, D. M. (2003). Disseminating innovations in health care. Journal of the American Medical Association, 289, 1969–1975. Berwick, D. M. (2008). The science of improvement. Journal of the American Medical Association, 299, 1182–1184. Clegg, S. (2005). Evidence-based practice in educational research: A critical realist critique of systematic review. British Journal of Sociology of Education, 26, 415–428. Corelli, R., Fenlon, C., Kroon, L., Prokhorov, A., & Hudmon, K. (2007). Evaluation of a train-the-trainer program for tobacco cessation. American Journal of Pharmaceutical Education, 71, 1–9. Dean, C., Myors, K., & Evans, E. (2003). Community-wide implementation of a parenting program: The south east Sydney positive parenting project. Australian e-Journal for the Advancement of Mental Health, 2. Retrieved December 1, 2010 from http://www.reachoflouisville.com/meath/meath/Community-wide%20Implementation%20of%20a%20Parenting%20Program%20The%20South%20East%20Sydney%20Positive%20Parenting%20Project.pdf Kvale, S. (1996). Interviews: An introduction to qualitative research interviewing. Thousand Oaks, CA: Sage. Kumpfer, K. L., Alvarado, R., Smith, P., & Bellany, N. (2002). Cultural sensitivity and adaptation in family-based prevention interventions. Prevention Science, 3, 241–246. La Caze, A. (2009). Evidence-based medicine must be…. The Journal of Medicine and Philosophy, 34, 509–527. Landry, R., Amara, N., Pablos-Mendes, A., Shademani, R., & Gold, I. (2006). The knowledge-value chain: A conceptual framework for knowledge translation in health. Bulletin of the World Health Organization, 84, 597–601. Linney, J. A. (1990).
Community psychology into the 1990s: Capitalizing opportunity and promoting innovation. American Journal of Community Psychology, 18, 1–17. Mayan, M. (2009). Essentials of qualitative inquiry. Walnut Creek, CA: Left Coast Press. McLellan, J., Leon, T., Haffey, S., & Barker, L. (2009). Exporting a Canadian parenting education program to the Dominican Republic. Public Health Nursing, 26, 183–191.
Miles, M., & Huberman, A. (1994). Qualitative data analysis: A sourcebook of new methods (2nd ed.). Thousand Oaks, CA: Sage. Ogden, T., Forgatch, M. S., Askeland, E., Patterson, G. R., & Bullock, B. M. (2005). Implementation of parent management training at the national level: The case of Norway. Journal of Social Work Practice, 19, 317–329. Rapp, C. A., Etzel-Wise, D., Marty, D., Coffman, M., Carlson, L., Asher, D., et al. (2010). Barriers to evidence-based practice implementation: Results of a qualitative study. Community Mental Health Journal, 46, 112–118. Richards, L., & Morse, J. M. (2007). Read me first for a user’s guide to qualitative methods (2nd ed.). Thousand Oaks, CA: Sage. Rogers, E. M. (1995). Diffusion of innovations (4th ed.). New York: Free Press. Sanders, M. R., & Turner, K. M. (2005). Reflections on the challenges of effective dissemination of behavioral family intervention: Our experience with the Triple P—Positive Parenting Program. Child and Adolescent Mental Health, 10, 158–169. Sanders, M. R., Turner, K. M., & Markie-Dadds, C. (2002). The development and dissemination of the Triple P—Positive Parenting Program: A multi-level, evidence-based system of parenting and family support. Prevention Science, 3, 173–189. Sanders, M. R., Prinz, R. J., & Shapiro, C. J. (2009). Predicting utilization of evidence-based parenting interventions with organizational, service-provider and client variables. Administration and Policy in Mental Health and Mental Health Services Research, 36, 133–143. Schinke, S. P., Botvin, G. J., & Orlandi, M. A. (1991). Substance abuse in children and adolescents: Evaluation and intervention (Vol. 22). Thousand Oaks, CA: Sage. Seng, A., Prinz, R., & Sanders, M. (2006). The role of training variables in effective dissemination of evidence-based parenting interventions. International Journal of Mental Health Promotion, 8, 19–27. Simons, L., Lathlean, J., & Squire, C. (2008).
Shifting the focus: Sequential methods of analysis with qualitative data. Qualitative Health Research, 18, 120–132. Spring, B. (2007). Evidence-based practice in clinical psychology: What it is, why it matters; what you need to know. Journal of Clinical Psychology, 64, 611–631. Strauss, A., & Corbin, J. (1998). Basics of qualitative research: Techniques and procedures for developing grounded theory (2nd ed.). Thousand Oaks, CA: Sage. Triple P. (2010). What is Triple P? Retrieved December 4, 2010 from http://www27.triplep.net/?pid=29.