Adolescent Series (4th in the Series)

Enhancing Quality Interventions Promoting Healthy Sexuality (EQUIPS): A Novel Application of Translational Research Methods

Matthew Chinman, Ph.D.1, Joie Acosta, Ph.D.2, Patricia Ebener, B.A.3, Jennifer Driver, B.S.4, Jamie Keith, M.S.5, and Dana Peebles, M.P.H.4

Abstract

Translational research is expanding, in part, because Evidence-Based Programs or Practices (EBPs) are not adopted in many medical domains. However, little translational research exists on EBPs that are prevention programs delivered in nonclinical, community-based settings. These organizations often have low capacity, which undermines implementation quality and outcomes. Rigorous translational research is needed in these settings so that, within a single study, capacity, implementation quality, and outcomes are measured and the links between them tested. This paper overviews the study Enhancing Quality Interventions Promoting Healthy Sexuality (EQUIPS), which tests how well a community-based setting (Boys & Girls Clubs) conducts an EBP called Making Proud Choices, which aims to prevent teen pregnancy and sexually transmitted infections, with and without an implementation support intervention called Getting To Outcomes. The study design is novel in that it: assesses Getting To Outcomes' impact on capacity, implementation quality, and outcomes simultaneously and in both study conditions; will assess sustainability by measuring capacity and fidelity a year after the Getting To Outcomes support ends; and will operate on a large scale similar to many national initiatives. Many studies have not incorporated all these elements, and thus EQUIPS could serve as a model for translational research in many domains. Clin Trans Sci 2013; Volume 6: 232–237

Keywords: prevention, methodology, translational research

Introduction

Translational research is increasing in popularity given the mounting evidence that Evidence-Based Programs or Practices (EBPs) have been slow to be adopted in many medical domains.1-3 Translational research within medicine has shown that strategies more active than simply disseminating information are needed to increase the use of EBPs. For example, a recent meta-analysis of 23 studies found that "practice facilitation," or having external consultants promote the use of EBPs, increased the adoption of a variety of EBPs within primary care settings.4 However, despite a few exceptions,5,6 there is little translational research on EBPs that are prevention programs delivered in nonclinical settings by community-based organizations, which tend to have low capacity. In addition, because certain medical EBPs are narrowly defined (e.g., hemoglobin tests), merely tracking their adoption may be a sufficient marker for whether the EBP is being "used." In contrast, certain prevention programs—e.g., for the prevention of teen pregnancy and sexually transmitted infections (STIs)—must not only be adopted, but planned, implemented, and evaluated with quality in order to achieve outcomes similar to those found in the original research studies. Therefore, translational research is needed that focuses on these low-capacity settings and on empirical links beyond implementation strategies and adoption, namely links between implementation strategy ➝ capacity ➝ implementation quality (i.e., fidelity) ➝ individual outcomes. This paper presents an overview of a study that attempts to make these links and was originally presented as part of the symposium, "From Genes to Community: Exploring Translational Science in Adolescent Health Research," held at the University of Pittsburgh School of Medicine.
The study is called Enhancing Quality Interventions Promoting Healthy Sexuality, or EQUIPS, and is an example of a "T3" study—using the nomenclature of Khoury et al.,7 adopted by NIH8—in which the express purpose is to increase the uptake and implementation of EBPs in practice. The study tests how well a community-based setting (Boys & Girls Clubs) carries out an evidence-based prevention program called Making Proud Choices (MPC), which aims to prevent teen pregnancy and STIs, with and without an implementation support intervention called Getting To Outcomes. Although only in its second year, the study could serve as a model for translational research in a wide range of domains in community-based settings.

1RAND Corporation, Pittsburgh, PA, USA; 2RAND Corporation, Washington, DC, USA; 3RAND Corporation, Santa Monica, CA, USA; 4Georgia Campaign for Adolescent Pregnancy Prevention, Atlanta, GA, USA; 5Alabama Campaign to Prevent Teen Pregnancy, Montgomery, AL, USA.

Correspondence: Matthew Chinman ([email protected]) DOI: 10.1111/cts.12031 WWW.CTSJOURNAL.COM

Teen sexual health

The teen pregnancy rate in the United States for girls aged 15–19 years (39.1 per 1,000) continues to be the highest among industrialized nations.9 In 2008, 733,000 young women 15–19 years old became pregnant; most of these pregnancies were unintended.10 Sexually active teens are at high risk for contracting STIs.11 For example, women aged 15–24 have rates of Chlamydia five times greater than the overall average rate for women. Early child-bearing puts adolescents and their children at risk for additional negative consequences. Compared with women who wait until age 20 or 21 to have children, teen mothers are more likely to drop out of school, require public assistance, and live in poverty.12,13

Challenges implementing EBPs

During the last decade, over 50 EBPs addressing the prevention of teen pregnancy and STIs have been shown through research to improve outcomes, including increased use of condoms, reduced numbers of sexual partners, and delayed first sexual intercourse.14 Yet communities often face difficulty in adopting and implementing these EBPs with the fidelity needed to achieve the outcomes demonstrated by researchers,15,16 yielding a "gap" between research and practice at the local level. This gap17,18 results from limited resources, complex EBPs, and a lack of capacity—the knowledge, attitudes, and skills—needed to adopt and implement "off-the-shelf" EBPs. Distributing information about effective EBPs—e.g., through reports and online registries—is a popular method intended to improve prevention outcomes.19 While there have been exceptions,20 merely distributing information about EBPs does not appear to have the desired impact—in part because it does not sufficiently affect the capacity that communities need to plan and implement EBPs. This gap exists in many other fields.

Lessons learned from prior efforts to increase the use of EBPs in teen sexual health

The Centers for Disease Control and Prevention (CDC) has conducted several large-scale initiatives to help community-based organizations adopt and implement EBPs to prevent teen pregnancies and STIs. The first was to build capacity among community coalitions (i.e., community stakeholders who collaborate to solve a problem) in the Community Coalition Partnership Program.15 However, this technical assistance (TA, e.g., annual site visits, workshops, and telephone/email consultation focusing on coalition management) yielded few new EBPs being started.15,16 In the CDC's second effort, the Coalition Capacity Building for Teen Pregnancy Prevention cooperative agreement, CDC funded three national organizations to provide TA to five state-level organizations (also CDC-funded) that, in turn, would help 5–10 local organizations per state adopt EBPs to prevent teen pregnancy. This tiered capacity-building system helped several local organizations adopt EBPs and engage in higher-quality implementation and self-evaluation of those EBPs. In its most recent effort, the CDC built upon the tiered-system approach by explicitly using implementation theory to guide how the capacity building was organized and delivered, and Getting To Outcomes (GTO) to guide the actions of the community practitioners. Studies of implementation theory have shown that factors at both the individual level (e.g., training, skills, efficacy, involvement in decision making, and satisfaction of teachers) and at the organizational level (e.g., active support of leadership) predict successful implementation of EBPs,21-23 and thus are important factors for capacity-building efforts to address.
For example, one implementation theory—Diffusion of Innovations—states that the more complex the intervention, the less likely it is to be adopted and implemented.24 Therefore, CDC translated a number of studies and tools into the manual Promoting Science-based Approaches to Teen Pregnancy Prevention using Getting To Outcomes (PSBA-GTO).25 This manual was then provided to the state-level TA providers and community-based organizations to make it easier for communities to digest the EBP literature.26 While there was some evidence that community-based organizations were better able to adopt, implement, and self-evaluate EBPs,27 no systematic evaluation was conducted. These CDC initiatives and other literature yielded several important lessons. First, communities need both information and direct support to improve the capacity needed for implementing EBPs. Second, the support needs to be targeted directly at program staff in existing community-based organizations (not ad hoc coalitions) for greater impact and sustainability of capacity-building efforts. Third, a tiered system for delivering TA can be a more efficient use of resources for structuring capacity building. Fourth, using implementation theory to guide how the capacity building is organized and delivered is important. As described below, EQUIPS incorporates these lessons, systematically testing the impact on capacity, fidelity, and teen sexual health outcomes in a single randomized controlled trial (RCT).

Methods

Design overview

EQUIPS is a 5-year, multi-state, cluster RCT to assess how a capacity-building intervention called GTO affects three variables of interest: capacity to adopt and implement an EBP, fidelity of EBP implementation, and the sexual health outcomes of middle school youth. The trial is comparing 16 Boys & Girls Clubs (BGC) implementing the MPC program28 in the fashion typical of community settings with 16 Boys & Girls Clubs implementing MPC augmented with GTO. After 2 years, the GTO support will end at the 16 MPC+GTO sites, but the study will continue to assess MPC fidelity for another year. Guided by Greenberg's conceptual framework,29 Figure 1 shows the EQUIPS logic model of factors that influence program implementation. Program Context includes the capacity of individual staff. The GTO intervention will build the capacity of individual practitioners to implement EBPs. Capacity will be measured by a Capacity Interview. Planned Program refers to the content, structure, and format of the MPC program. In typical community settings, the Planned Program and Program Context influence EBP adoption and implementation, reflected in a certain level of fidelity (e.g., dose, adherence, delivery quality, target population responsiveness), which in EQUIPS is being measured through objective observations of MPC sessions. A third factor, Implementation Support, includes planning activities and the availability of quality technical support, including training, ongoing coaching, and consultation. In EQUIPS, Implementation Support is the GTO capacity-building intervention. According to Greenberg, the factors within Program Context and Planned Program can be influenced by Implementation Support to improve implementation and subsequently enhance program outcomes, which is being assessed through a Youth Survey.

Figure 1. Study conceptual model.

Study sites

The delivery of MPC, the use of GTO, and data collection are taking place at 32 BGC sites in Atlanta, Georgia (16 locations) and Alabama (16 locations). Since 1906, BGCs have worked with young people from disadvantaged economic, social, and family circumstances. Although there is some variability across sites, each site has its own facility and a small number of staff and part-time volunteers (7–10). Traditional BGC programming takes place after school and in summer and ranges from unstructured time in large common rooms and gyms to specific programming focused on leadership and character education, health and wellness, and academic tutoring. At the 16 sites of the BGC of Metro Atlanta, there are approximately 1,450 members aged 11–14 years. The ethnic/racial make-up is similar across sites: about 81% Black, 10% Latino, 2% White, and 7% mixed race or other. In Alabama, the 16 locations are spread over three Clubs that serve about 2,550 middle school youth. The ethnic/racial make-up is similar across the Alabama sites, ranging from 70 to 95% Black, 1 to 22% White, and 4 to 8% Latino.

Making Proud Choices and Getting To Outcomes

MPC is a well-established pregnancy and HIV/STI risk-reduction EBP with multiple trials demonstrating its effectiveness. Drawing on Social Cognitive Theory,30 the Theory of Reasoned Action,31 and the Theory of Planned Behavior,32 MPC aims to influence adolescents' knowledge and beliefs about risk, efficacy, and control to change behavior.28,33 MPC stresses the role of sexual responsibility, community, and pride in making safer sexual choices. The program promotes abstinence first, but also provides information and skills needed for safer-sex decision-making and practices (e.g., condom use). Several RCTs have demonstrated the intervention's efficacy in decreasing sexual-risk behavior and influencing the cognitive mechanisms driving behavior change.20,28,33-35 Community-based tests of MPC20 and of very similar precursor programs34 showed that MPC has not achieved the same outcomes as in the original research. Thus, EQUIPS presents an excellent opportunity to assess the impact of a capacity-building intervention like GTO on an EBP like MPC as implemented in community-based organizations.

The GTO intervention builds capacity for implementing EBPs by strengthening the knowledge, attitudes, and skills needed to choose, plan, implement, evaluate, and sustain those EBPs. GTO does this by posing ten questions or "steps" (Table 1) that must be addressed in order to obtain positive results, and then provides practitioners with the guidance necessary to answer those questions with quality—i.e., to perform each step as close to the ideal as possible. Implementation of these ten steps is facilitated by three types of assistance: the GTO manual of text and tools originally published by the RAND Corporation36 and then applied to teen pregnancy (PSBA-GTO), face-to-face training, and onsite TA. GTO has been adapted to a number of other content areas, including drug prevention,36 underage drinking prevention,37 positive youth development,38 and homelessness.39 In all these domains, the goal is to work with organization leadership to integrate the practices GTO targets into routine operations, closing the gap between research and practice. Consistent with social cognitive theories of behavioral change,30,32,40 exposure to GTO training and TA leads to more knowledge about performing GTO-related activities, which leads to more positive attitudes towards these activities, which in turn leads to the execution of more GTO-related behaviors. These behaviors support the successful implementation of EBPs,41 reflected in improved fidelity and, ultimately, improved outcomes. With practitioners of drug prevention programs, GTO has been found to improve the capacity of individual practitioners and the performance of prevention programs in both quasi-experimental42 and randomized controlled trials.43 EQUIPS is the first opportunity to extend this research to individual youth outcomes.

Table 1. The 10 GTO steps and information presented in the manual.

1. What are the needs to address? GTO Step #1 provides information about conducting a needs assessment.
2. What are the goals & objectives? GTO Step #2 has worksheets for creating measurable goals & objectives.
3. Which evidence-based programs can be useful in reaching the goals? GTO Step #3 overviews the importance of using evidence-based programming.
4. What actions need to be taken so the selected program fits the community context? GTO Step #4 prompts readers to reduce duplication and facilitate collaboration with other programs.
5. What capacity is needed for the program? GTO Step #5 prompts readers to ensure there is sufficient organizational capacity to conduct the program.
6. What is the plan for this program? GTO Step #6 presents information and tools to plan activities.
7. How will implementation be assessed? GTO Step #7 provides information/tools to do process evaluation.
8. How well did the program work? GTO Step #8 presents information/tools to do outcome evaluation.
9. How will continuous quality improvement strategies be incorporated? GTO Step #9 prompts practitioners to reassess Accountability Questions 1–8 after completing the program to improve the program.
10. If the program is successful, how will it be sustained? GTO Step #10 presents several ideas to consider when attempting to sustain an effective program.

Organization of the GTO capacity-building system

EQUIPS uses a tiered GTO capacity-building system that consists of two half-time TA providers, one local to Atlanta and one local to Alabama, who deliver the three types of assistance: a manual of text and tools, face-to-face training, and onsite TA.

Training. TA staff provide 12–18 hours of training for all the BGC staff slated to facilitate MPC in both conditions (MPC and MPC+GTO). Facilitators learn the content, unique features, and evaluation findings of MPC as well as effective facilitation skills such as answering sensitive questions and addressing challenging participant behavior. Facilitators also learn about the importance of fidelity and which adaptations to the curriculum are consistent with MPC. This training is repeated annually for both continuing and new staff. At the same time, all BGC staff in the MPC+GTO group receive GTO training and the PSBA-GTO manual. The goals of these sessions are to help the BGC staff understand the GTO process, including the use of TA. Although GTO training is always spread over several sessions in which two GTO steps are addressed, there has been some variation in the training delivery as dictated by the needs of individual sites. For example, in some sites GTO Steps 1&2 and 3&4 are addressed via webinars, while other sites are using face-to-face meetings. In all cases, Steps 5&6, 7&8, and 9&10 are being addressed via face-to-face training.
Each training involves walking BGC staff through GTO's steps, as applied to MPC, and providing concrete examples of how each GTO Step can be successfully addressed with tools from the PSBA-GTO manual. TA providers will continue GTO training for the MPC+GTO group in the second year, supplemented with process and outcome evaluation data from the first year's implementation as additional information for discussion and TA.

Technical assistance. The two TA providers are supporting 16 MPC+GTO sites (eight each) for 2 years. The type of TA provided in GTO is "facilitation," a consultation method that emphasizes change in work practices through encouragement and action promotion.44 Facilitation has been shown to improve EBP implementation as compared to manuals only and/or 1-day training workshops.45 Consistent with several other TA processes,46,47 the GTO-based TA involves three structured steps: 1) an initial diagnosis of program functioning, 2) development of a logic model, and 3) development of a plan for how the TA and program staff are to make improvements. After that, TA focuses on how to use each GTO step to guide implementation of the MPC program during and in between regular (often biweekly) meetings between TA and BGC staff. For example, TA providers are working with BGC staff to complete several of the GTO-based forms to plan MPC at their site. BGC staff begin their planning forms during GTO trainings, finish the forms on their own, and then receive feedback about the forms at the TA meetings. Also, TA providers have been using data from the teen sexual health outcome survey (pre- and post-administrations) and fidelity monitoring conducted at the BGC sites, along with their own observations of MPC, to guide continuous quality improvement of MPC. As part of the tiered system, a TA supervisor has been supporting the TA providers. The TA supervisor reviews completed tools, responds to TA field notes (TA Monitoring Form, see below), and participates in weekly phone calls with TA providers to discuss site-specific issues.

Delivery of MPC

BGCs have started implementing MPC with the goal of enrolling 30 middle school youth per site over the 2-year intervention period. Consistent with the community-based replication of MPC conducted by Jemmott et al.,20 all BGC sites receive MPC training, materials, and about $3,000 a year to support implementation. In the MPC-only group, BGC staff use the training and the materials and implement MPC on their own. In the MPC+GTO group, BGC staff use the GTO process to plan, implement, evaluate, and sustain MPC, assisted by the TA providers and the PSBA-GTO manual. It is hypothesized that having an opportunity to actively use newly learned skills will help BGC staff to incorporate GTO into their regular work process. This is supported by research showing that active participation and practice of newly learned skills is the most effective way to internalize information.48,49

Measures and data collection

Below is a brief description of the measures being used in EQUIPS to highlight how the study will use data not only to make empirical links between capacity, implementation quality, and outcomes, but also to form the basis of feedback that TA providers will use to help the BGC sites in the MPC+GTO group improve.

TA Monitoring Forms will be completed by TA providers to track the amount and content of all TA sessions. In the past, these data have been critical in facilitating TA supervision. Additionally, hours of TA spent on certain domains within GTO (e.g., evaluation) have been associated with increases in the performance of programs in those same domains.

Based on previous GTO research,42,43 we are using the Capacity Interview to assess BGC staff capacity to conduct high-quality teen pregnancy programming. Although programs consist of individual people with varying levels of abilities, capacity ratings are made at the program level since programs operate as a unit. The ratings are made using a structured interview with key program personnel at all 32 BGC sites. The ratings reflect how well each BGC site is carrying out the tasks tied to each of the ten steps of the GTO model, from "highly faithful" to "highly divergent" from ideal practice. The Capacity Interviews are conducted with BGC staff at both MPC+GTO (n = 16) and MPC-only (n = 16) sites annually during the first 4 years: at baseline (prior to GTO), at the midpoint of GTO, post GTO, and after a year of no GTO (for the MPC+GTO group) or after a year of GTO after not having it for the previous 2 years (for the MPC-only group).

Fidelity of MPC at all sites is being assessed along four dimensions: adherence, dosage, quality of delivery, and participant response,50-52 with the same measures used by Jemmott et al.28 supplemented with tools used in prior GTO work (i.e., attendance sheets). Adherence—Over the 2-year intervention period, trained local data collectors are visiting each BGC site and rating how closely BGC staff adhere to each MPC module as designed. Dosage—BGC staff are recording how many of the eight modules each youth receives. Quality of delivery—During site visits, local data collectors are rating BGC staff on their teaching style—communication skills, interactions with participants, enthusiasm, and pacing—using a standardized rating sheet. Participant response—During site visits, local data collectors are rating student participation.

Youth participants in both conditions (30 youth × 32 sites = 960) are completing surveys before and immediately following MPC (pre and post), and at 6-month follow-up, assessing sexual health outcomes using the same measures as Jemmott et al.28 Previous efficacy trials28 of MPC improved condom use at 6-month follow-up and decreased the frequency of sexual intercourse in certain subgroups. Trials featuring community-based implementation of MPC either found few outcomes34 or similar impacts on condom use, but not on frequency of sex.20 EQUIPS aims to narrow this gap in outcomes between community-based implementation of MPC and the original efficacy trials of MPC. Several sets of outcomes are on the Youth Survey, including sexual activity, STIs, condom use, and pregnancy. In addition, the Youth Survey asks about attitudes and beliefs about sexual activity, condom use, and pregnancy, as well as knowledge about HIV, STIs, condom use, and pregnancy.

Hypotheses and data analyses

It is hypothesized that capacity among BGCs in the MPC+GTO group will improve more over time than in the MPC-only group. Similarly, it is hypothesized that fidelity of MPC at sites in the MPC+GTO group will be higher than at sites in the MPC-only group. Finally, it is hypothesized that teen sexual health outcomes will improve more in MPC+GTO programs than in MPC-only programs. While each will be analyzed separately, the most important analyses will attempt to link all three sources of data to truly link capacity, fidelity, and outcomes.
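The kind of multilevel repeated-measures analysis planned here could be sketched roughly as follows. This is an illustrative model fit on simulated data, not the study's actual analysis code: the variable names (`gto`, `time`, `fidelity`, `site`), the simulated effect sizes, and the use of Python's statsmodels are all assumptions for the sketch.

```python
# Hedged sketch: a cluster-RCT style multilevel model with youth
# observations nested in sites, on simulated (not real) data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for site in range(32):                      # 32 BGC sites
    gto = int(site < 16)                    # 16 MPC+GTO vs. 16 MPC-only
    site_effect = rng.normal(0, 0.5)        # random site intercept
    fidelity = rng.uniform(0.5, 1.0)        # toy site-level fidelity score
    for youth in range(30):                 # ~30 youth per site
        for time in (0, 1, 2):              # pre, post, 6-month follow-up
            outcome = (0.2 * time + 0.3 * gto * time
                       + site_effect + rng.normal(0, 1))
            rows.append(dict(site=site, gto=gto, time=time,
                             fidelity=fidelity, outcome=outcome))
df = pd.DataFrame(rows)

# Repeated-measures mixed model: does the outcome trajectory differ
# by condition (the time x group term), allowing site-level intercepts?
m1 = smf.mixedlm("outcome ~ time * gto", df, groups="site").fit()
print(m1.params["time:gto"])                # condition-by-time effect

# Extended model adding program fidelity as a site-level predictor,
# in the spirit of linking fidelity scores to youth outcomes.
m2 = smf.mixedlm("outcome ~ time * gto + fidelity", df, groups="site").fit()
```

In practice the linking analyses would use the observed fidelity and capacity measures described above in place of the simulated `fidelity` column.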
For example, while some analyses of the youth outcomes will be multilevel repeated-measures models that test whether outcomes change as functions of time and group assignment, additional analyses will include terms for the fidelity scores of the programs the youth attended and/or the capacity of those programs as additional factors in the model.

Discussion

Improving the uptake of EBPs has become a national priority,53 and translational research's focus and methods have the potential to provide answers on how to do so. There are several areas within translational research in which the EQUIPS study has the potential to make a significant contribution. First, much of translational research focuses on medical problems that involve physicians, nurses, or allied health professionals. Much less translational research has been conducted on prevention programs delivered by community-based organizations, which typically have less capacity than medical settings. However, as argued by Glasgow et al.,8 conducting research in low-capacity settings is an important frontier for translational research. We believe that with EQUIPS we have an opportunity to conduct T3 translational research in community-based settings that are low capacity, a condition typical of many such settings.

Additionally, translational studies that target implementation often evaluate only the impact on implementation (e.g., Baskerville et al.4). In EQUIPS, we will be able to test in a single study the impact of GTO's capacity building all the way to the ultimate individual-level outcomes. This combines translational research (how community-based settings can implement EBPs well) with effectiveness research (do EBPs brought to diverse settings continue to be effective), which can be stronger than conducting separate effectiveness and translational trials.8 If successful, it will have been shown that improving capacity is an appropriate target for intervention to improve implementation of EBPs in community-based settings.

Translational research does not always show empirically how a given implementation strategy impacts implementation quality (whether via capacity or some other hypothesized mediator) and, in turn, outcomes. To do this, the capacity (or other mediator), fidelity, and outcomes of both the intervention and control groups must be measured, which is not always the case within translational research involving the prevention of youth risk behaviors.5,6 EQUIPS is attempting to test empirically whether using an implementation support intervention (i.e., GTO) designed to facilitate the uptake of a certain EBP leads to increased capacity, which in turn leads to high-quality implementation of that EBP, and ultimately to improved outcomes compared to typical practice of the same EBP. Thus, the capacity, fidelity, and outcomes of both study groups are being assessed.

There has been a call for more translational research that measures sustainability of EBPs. A recent review found most sustainability research to date to be retrospective, naturalistic, and lacking in rigor.54 By removing the GTO intervention from the first 16 BGC sites while continuing to track capacity and fidelity for an additional year, EQUIPS will have the opportunity to assess the degree to which the MPC program was sustained.

Finally, EQUIPS's large scale (32 sites, N = 960 youth) allows for a test of establishing an implementation support infrastructure on a scale closer to the kinds of EBP roll-outs now being conducted. For example, the 15 state-level Campaigns to Prevent Teen and Unplanned Pregnancy attempt to serve many community sites. The Office of Adolescent Health has established a multi-million dollar program to support scores of EBPs in teen pregnancy. A capacity-building intervention like GTO could be a model for these large-scale roll-outs of EBPs in teen pregnancy and other domains, potentially impacting community practices and outcomes on a broad scale.

Conclusion

The EQUIPS study design is novel in that it: assesses GTO's impact on capacity, implementation quality, and outcomes simultaneously; measures all three variables in both the intervention and control conditions; will assess sustainability by continuing to measure capacity and fidelity a year after the GTO support has been removed; and will operate on a large scale similar to many national initiatives. Many studies have not been able to incorporate all these elements, and thus the EQUIPS study could serve as a model for translational research that could increase the successful implementation of EBPs in a wide array of domains.

Acknowledgments

This project is supported by a grant from the National Institutes of Health, National Institute of Child Health and Human Development, entitled Enhancing Quality Interventions Promoting Healthy Sexuality (R01HD069427-01).

References

1. Bero LA, Grilli R, Grimshaw JM, Harvey E, Oxman AD, Thomson MA. Closing the gap between research and practice: an overview of systematic reviews of interventions to promote the implementation of research findings. The Cochrane Effective Practice and Organization of Care Review Group. BMJ. 1998; 317(7156): 465–468.
2. Grimshaw JM, Thomas RE, MacLennan G, Fraser C, Ramsay CR, Vale L, Whitty P, Eccles MP, Matowe L, Shirran L, et al. Effectiveness and efficiency of guideline dissemination and implementation strategies. Health Technol Assess. 2004; 8(6): iii–iv, 1–72.
3. Oxman AD, Thomson MA, Davis DA, Haynes RB. No magic bullets: a systematic review of 102 trials of interventions to improve professional practice. CMAJ. 1995; 153(10): 1423–1431.
4. Baskerville NB, Liddy C, Hogg W. Systematic review and meta-analysis of practice facilitation within primary care settings. Ann Fam Med. 2012; 10(1): 63–74.
5. Hawkins JD, Oesterle S, Brown EC, Arthur MW, Abbott RD, Fagan AA, Catalano RF. Results of a type 2 translational research trial to prevent adolescent drug use and delinquency: a test of Communities That Care. Arch Pediatr Adolesc Med. 2009; 163(9): 789–798.
6. Spoth R, Redmond C, Shin C, Greenberg M, Clair S, Feinberg M. Substance-use outcomes at 18 months past baseline: the PROSPER Community-University Partnership Trial. Am J Prev Med. 2007; 32(5): 395–402.
7. Khoury MJ, Gwinn M, Yoon PW, Dowling N, Moore CA, Bradley L. The continuum of translation research in genomic medicine: how can we accelerate the appropriate integration of human genome discoveries into health care and disease prevention? Genet Med. 2007; 9(10): 665–674.
8. Glasgow RE, Vinson C, Chambers D, Khoury MJ, Kaplan RM, Hunter C. National Institutes of Health approaches to dissemination and implementation science: current and future directions. Am J Public Health. 2012; 102(7): 1274–1281.
9. Hamilton B, Martin J, Ventura S. Births: Preliminary Data for 2009, Table 2. Hyattsville, MD: National Center for Health Statistics; 2010.
10. Kost K, Henshaw S. US Teenage Pregnancies, Births and Abortions, 2008: National Trends by Age, Race, Ethnicity. New York, NY: Guttmacher Institute; 2012.
11. Kaestle E, Halpern CT, Miller WC, Ford CA. Young age at first sexual intercourse and sexually transmitted infections in adolescents and young adults. Am J Epidemiol. 2005; 161: 774–780.
12. Kaye K, Chadwick L. The Lives of Teen Parents after Welfare Reform and the Role of TANF. Washington, DC: U.S. Department of Health and Human Services, Assistant Secretary for Planning and Evaluation; 2006.
13. Perper K, Peterson K, Manlove J. Diploma Attainment among Teen Mothers (Fact Sheet). Washington, DC: Child Trends; 2010.
14. Kirby D, Laris B, Rolleri L. The Characteristics of Effective Curriculum-Based Sex and HIV Education Programs for Adolescents. Scotts Valley, CA: ETR Associates; 2006.
15. Chervin DD, Philliber S, Brindis CD, Chadwick AE, Revels ML, Kamin SL, Wike RS, Kramer JS, Bartelli D, Schmidt CK, et al. Community capacity building in CDC's Community Coalition Partnership Programs for the prevention of teen pregnancy. J Adolesc Health. 2005; 37: S11–S19.
16. Kramer JS, Philliber S, Brindis CD, Kamin SL, Chadwick AE, Revels ML, Chervin DD, Driscoll A, Bartelli D, Wike RS, et al. Coalition models: lessons learned from the CDC's Community Coalition Partnership Programs for the prevention of teen pregnancy. J Adolesc Health. 2005; 37: S20–S30.
17. Green L. From research to 'best practices' in other settings and populations. Am J Health Behav. 2001; 25(3): 165–178.
18. Wandersman A, Florin P. Community interventions and effective prevention: bringing researchers/evaluators, funders and practitioners together for accountability. Am Psychol. 2003; 58(6/7): 441–448.
19. Backer T, David S, Soucy G, eds. Reviewing the Behavioral Science Knowledge Base on Technology Transfer. Washington, DC: National Institute on Drug Abuse; 1995.
20. Jemmott JB, Jemmott LS, Fong GT, Morales KH. Effectiveness of an HIV/STD risk-reduction intervention for adolescents when implemented by community-based organizations: a cluster randomized controlled trial. Am J Public Health. 2010; 100: 720–726. PMID: 20167903.
21. Rohrbach LA, D'Onofrio CN, Backer TE, Montgomery SB. Diffusion of school-based substance abuse prevention programs. Am Behav Sci. 1996; 39: 919–934.
22. Steckler AB, McLeroy KR, Goodman RM, McCormick L, Bird ST. Toward integrating qualitative and quantitative methods: an introduction. Health Educ Q. 1992; 19: 1–8.
23. Ennett ST, Ringwalt CL, Thorne J, Rohrbach LA, Vincus A, Simons-Rudolph A, Jones S. A comparison of current practice in school-based substance abuse prevention programs with meta-analysis findings. Prev Sci. 2003; 4: 1–14.
24. Rogers EM. Diffusion of Innovations. 4th edn. New York, NY: Free Press; 1995.
25. Lesesne CA, Lewis KM, Moore C, Fisher D, Green D, Wandersman A. Promoting Science-Based Approaches to Teen Pregnancy Prevention Using Getting To Outcomes (PSBA-GTO). Atlanta, GA: Centers for Disease Control and Prevention; 2007.
26. Lesesne CA, Lewis KM, White CP, White DC, Green DC, Duffy JL, Wandersman A. Promoting science-based approaches to teen pregnancy prevention: proactively engaging the three systems of the interactive systems framework. Am J Community Psychol. 2008; 41: 379–392.
27. Department of Health & Human Services (DHHS). Program announcement AA004: maternal, infant, and reproductive health: national and state coalition building, parts D and E. Fed Regist. 2005; 70(55): 14687–14696.
28. Jemmott JB, Jemmott LS, Fong GT. Abstinence and safer sex HIV risk-reduction interventions for African American adolescents: a randomized controlled trial. JAMA. 1998; 279: 1529–1536.
29. Greenberg MT, Domitrovich CE, Graczyk PA, Zins JE. The Study of Implementation in School-Based Preventive Interventions: Theory, Research, and Practice (Volume 3). Rockville, MD: Center for Mental Health Services, Substance Abuse and Mental Health Services Administration; 2005.
30. Bandura A. A social cognitive approach to the exercise of control over AIDS infection. In: DiClemente R, ed. Adolescents and AIDS: A Generation in Jeopardy. Newbury Park, CA: Sage Publications; 1992: 89–116.
31. Fishbein M, Ajzen I. Belief, Attitude, Intention, and Behavior: An Introduction to Theory and Research. Reading, MA: Addison-Wesley; 1975.
32. Ajzen I. The theory of planned behavior. Organ Behav Hum Decis Process. 1991; 50: 179–211.
33. Jemmott JB, Jemmott L, Spears H, Hewitt N, Cruz-Collins M. Self-efficacy, hedonistic expectancies, and condom-use intentions among inner-city Black adolescent women: a social cognitive approach to AIDS risk behavior. J Adolesc Health. 1992; 13: 512–519.
34. Borawski EA, Trapl ES, Adams-Tufts K, Hayman LL, Goodwin MA, Lovegreen LD. Taking Be Proud! Be Responsible! to the suburbs: a replication study. Perspect Sex Reprod Health. 2009; 41: 12–22.
35. Villarruel AM, Jemmott JB III, Jemmott LS. A randomized controlled trial testing an HIV prevention intervention for Latino adolescents. Arch Pediatr Adolesc Med. 2006; 160: 772–777.
36. Chinman M, Imm P, Wandersman A. Getting To Outcomes 2004: Promoting Accountability through Methods and Tools for Planning, Implementation, and Evaluation. Santa Monica, CA: RAND Corporation; 2004.
37. Imm P, Chinman M, Wandersman A, Rosenbloom D, Guckenburg S, Leis R. Preventing Underage Drinking: Using Getting To Outcomes with the SAMHSA Strategic Prevention Framework to Achieve Results. Santa Monica, CA: RAND Corporation; 2007.
38. Fisher D, Imm P, Chinman M, Wandersman A. Getting To Outcomes with Developmental Assets: Ten Steps to Measuring Success in Youth Programs and Communities. Minneapolis, MN: Search Institute; 2006.
39. Hannah G, McCarthy S, Chinman M. Getting To Outcomes in Services for Homeless Veterans: 10 Steps for Achieving Accountability. Philadelphia, PA: National Center on Homelessness Among Veterans; 2011.
40. Fishbein M, Ajzen I. Belief, Attitude, Intention, and Behavior: An Introduction to Theory and Research. Reading, MA: Addison-Wesley; 1975.
41. Durlak JA, DuPre EP. Implementation matters: a review of research on the influence of implementation on program outcomes and the factors affecting implementation. Am J Community Psychol. 2008; 41: 327–350.
42. Chinman M, Hunter S, Ebener P, Paddock S, Stillman L, Imm P, Wandersman A. The Getting To Outcomes demonstration and evaluation: an illustration of the prevention support system. Am J Community Psychol. 2008; 41: 206–224.
43. Chinman M, Tremain B, Imm P, Wandersman A. Strengthening prevention performance using technology: a formative evaluation of interactive Getting To Outcomes. Am J Orthopsychiatry. 2009; 79: 469–481. PMC2859836.
44. Stetler CB, Legro MW, Rycroft-Malone J, Bowman C, Curran G, Guihan M, Hagedorn H, Pineros S, Wallace C. Role of "external facilitation" in implementation of research findings: a qualitative evaluation of facilitation experiences in the Veterans Health Administration. Implement Sci. 2006; 1: 23.
45. Kelly JA, Somlai AM, DiFranceisco WJ, Otto-Salaj LL, McAuliffe TL, Hackl KL, Heckman T, Holtgrave D, Rompa D. Bridging the gap between the science and service of HIV prevention: transferring effective research-based HIV prevention interventions to community AIDS service providers. Am J Public Health. 2000; 90: 1082–1088.
46. Mitchell R, Stone-Wiggins B, Stevenson JF, Florin P. Cultivating capacity: outcomes of a statewide support system for prevention coalitions. J Prev Interv Community. 2004; 27: 67–87.
47. Stevenson JF, Florin P, Mills DS, Andrade M. Building evaluation capacity in human service organizations: a case study. Eval Program Plann. 2002; 25(3): 233–243.
48. Dusenbury L, Falco M. Eleven components of effective drug abuse prevention curricula. J Sch Health. 1995; 65(10): 420–425.
49. Basen-Engquist K, O'Hara-Tompkins N, Lovato CY, Lewis MJ, Parcel GS, Gingiss P. The effect of two types of teacher training on implementation of Smart Choices: a tobacco prevention curriculum. J Sch Health. 1994; 64(8): 334–339.
50. Mihalic S, Irwin K, Fagan K, Ballard D, Elliot D. Successful Program Implementation: Lessons from Blueprints. Washington, DC: NCJRS; 2004.
51. Mihalic S, Irwin K. Blueprints for violence prevention. Youth Violence Juv Justice. 2003; 1(4): 307–329.
52. Dusenbury L, Brannigan R, Hansen W, Walsh J, Falco M. Quality of implementation: developing measures crucial to understanding the diffusion of preventive interventions. Health Educ Res. 2005; 20: 308–313.
53. U.S. Department of Health and Human Services. Strategic Plan: Fiscal Years 2010–2015. Washington, DC: U.S. Department of Health and Human Services.
54. Stirman S, Kimberly J, Cook N, Calloway A, Castro F, Charns M. The sustainability of new programs and innovations: a review of the empirical literature and recommendations for future research. Implement Sci. 2012; 7: 17. doi: 10.1186/1748-5908-7-17.

WWW.CTSJOURNAL.COM

VOLUME 6 • ISSUE 3

