Journal of Evidence-Based Social Work
ISSN: 1543-3714 (Print) 1543-3722 (Online)
Journal homepage: http://www.tandfonline.com/loi/webs20

To cite this article: Sarah E. Bledsoe-Mansori, Jennifer I. Manuel, Jennifer L. Bellamy, Lin Fang, Erna Dinata & Edward J. Mullen (2013) Implementing Evidence-Based Practice: Practitioner Assessment of an Agency-Based Training Program, Journal of Evidence-Based Social Work, 10:2, 73-90, DOI: 10.1080/15433714.2011.581545

Published online: 14 Apr 2013.
Copyright © Taylor & Francis Group, LLC
Implementing Evidence-Based Practice: Practitioner Assessment of an Agency-Based Training Program

Sarah E. Bledsoe-Mansori School of Social Work, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina, USA
Jennifer I. Manuel School of Social Work, Virginia Commonwealth University, Richmond, Virginia, USA
Jennifer L. Bellamy School of Social Service Administration, University of Chicago, Chicago, Illinois, USA
Lin Fang Factor-Inwentash Faculty of Social Work, University of Toronto, Toronto, Canada
Erna Dinata School of Social Service Administration, University of Chicago, Chicago, Illinois, USA
Edward J. Mullen School of Social Work, Columbia University, New York, New York, USA
Responding to the call for evidence-based practice (EBP) in social work, the authors conducted a multiphase exploratory study to test the acceptability of a training-based collaborative agency–university partnership strategy supporting EBP. The Bringing Evidence for Social Work Training (BEST) study includes an agency training component consisting of 10 modules designed to support the implementation of EBP in social agencies. Qualitative data from post-training participant focus groups were analyzed in order to describe practitioner perceptions of the 10 training modules and trainer experiences of implementation. Based on the findings from this study, the authors suggest that the BEST training was generally acceptable to agency team members, but not sufficient to sustain the use of EBP in practice.

Keywords: Evidence-based practice, empirically supported interventions, agency–university partnerships, training

Supported in part by the Willma & Albert Musher Program at Columbia University, the National Institutes of Health (K12-HD001441; Dr. Bledsoe-Mansori), and the National Institute of Mental Health (T32 MH014623; Drs. Bledsoe-Mansori, Bellamy, & Manuel). All authors were members of the Bringing Evidence for Social Work Training (BEST) team and authored this article on behalf of the entire team.

Address correspondence to Sarah E. Bledsoe-Mansori, School of Social Work, University of North Carolina at Chapel Hill, 325 Pittsboro Street, CB# 3550, Chapel Hill, NC 27599-3550. E-mail: [email protected]
INTRODUCTION

The underuse of existing scientific knowledge by social workers and other health and human service professionals has been widely documented (Bhattacharyya, Reeves, & Zwarenstein, 2009; Fixsen, Naoom, Blase, Friedman, & Wallace, 2005; Hess & Mullen, 1995; Kirk, 1990; Kirk & Reid, 2002; Mullen, 2009; Mullen & Bacon, 2004; Panzano & Herman, 2005; Torrey & Gorman, 2005; Weissman & Sanderson, 2001). It has been suggested that discrepancies between research evidence and practice begin with education and training. Researchers have identified practitioners' lack of knowledge and training in evidence-based practice (EBP) and empirically supported interventions (ESI) as key barriers preventing regular use of EBP principles in routine social work practice (Bellamy, Bledsoe, & Traube, 2006; Bledsoe et al., 2007; Mullen, Shlonsky, Bledsoe, & Bellamy, 2005; New Freedom Commission on Mental Health, 2003; Weissman & Sanderson, 2001). This deficiency in EBP/ESI training was confirmed by two large national surveys of Council on Social Work Education (CSWE) accredited social work training programs. The first, examining faculty's views of EBP/ESI, found that social work educators disagreed about both the definition of EBP and the standards of evidence by which an intervention is judged to be empirically supported (Rubin & Parrish, 2007). A second survey of ESI offered in social work curricula found that social work training programs continued to focus on interventions that lacked empirical support and were unlikely to offer the training gold standard of combined didactic instruction and clinical supervision when ESI were offered (Bledsoe et al., 2007; Weissman et al., 2006).
The intersection of lack of knowledge and training in EBP and ESI compromises the quality of services and the generation of future knowledge for social work practice and translational research, presenting a critical challenge to bridging the gap between research and practice (Brekke, Ell, & Palinkas, 2007; Mullen, Bledsoe, & Bellamy, 2008; Mullen et al., 2005; Rubin & Parrish, 2007; Upshur & Tracy, 2004). Despite these gaps in knowledge and training, the demand for social work practitioners and educators to base practice and education on research knowledge, specifically to incorporate EBP/ESI into routine social work practice, persists, though few models exist to guide the dissemination and implementation of EBP and ESI in the profession (Bellamy et al., 2006c; Brekke et al., 2007; Mullen, Bellamy, Bledsoe, & Francois, 2007; Mullen et al., 2005). In response to this demand and in light of the paucity of research on the implementation of EBP in social agencies, we conducted a multiphase study, the Bringing Evidence for Social Work Training Study (BEST Study), to test the feasibility and acceptability of a training-based collaborative agency–university partnership strategy to support the implementation of EBP in three social service agencies. We have reported feasibility results elsewhere (Bellamy, Bledsoe, Mullen, Fang, & Manuel, 2008; Manuel, Mullen, Fang, Bellamy, & Bledsoe, 2009; Mullen et al., 2008). In this article we report the acceptability results and aim to answer the following research question: To what degree is the BEST EBP training model acceptable to agency-based practitioners when delivered in a brief, agency-based group training? This is accomplished through a thorough, module-by-module examination of qualitative data provided in post-training focus groups conducted with agency participants.

BACKGROUND AND SIGNIFICANCE

The BEST Study included an agency-based training component (Bellamy, Bledsoe, Manuel, Fang, & Mullen, 2006b; Musher Program, 2006).
This training is based on the seven-step EBP model for social work that Gibbs (2003) adapted from evidence-based medicine, which conceptualizes EBP as a professional model of practice (Sackett, Straus, Richardson, Rosenberg, & Haynes, 2000). Following this model, the BEST Training is designed to guide practitioners, in partnership with clients, through a reflexive process of identifying, evaluating, and applying research knowledge
to guide practice decision making and improve client outcomes (Musher Program, 2006; Gibbs, 2003). The BEST Training consists of 10 teaching modules based on steps one through five of the EBP model: motivation, question development, search for relevant research, research appraisal, and application. The teaching modules were designed to increase agency teams’ knowledge of EBP, improve EBP research skills, foster positive attitudes toward EBP, identify factors that promote the use of EBP in social service agencies, and provide agency-based social workers with the knowledge to implement EBP in practice (Musher Program, 2006).
METHODS

For a detailed report of the methodology of the BEST project, see Bellamy et al. (2008) and Manuel et al. (2009). Both qualitative and quantitative data were collected to analyze various aspects of this multiphase pilot feasibility project; however, in this study the authors focus on the post-training qualitative data related to practitioner acceptance of the BEST Training modules. A brief report of the methodology, including the methodology of the unique qualitative analysis conducted for the current study, follows.

Sample

Participants were recruited from a convenience sample of three New York City social service agencies (hereafter referred to as partner agencies). The three partner agencies differed in terms of program structure, location, and populations served (see Table 1). Following initial meetings with partner agency administrators and program supervisors, teams of 4 to 6 staff members from each agency were recruited as project participants (hereafter referred to as the agency team), forming three agency teams with a total of 16 voluntary participants. Table 2 presents the demographic and professional characteristics of the agency teams. All but one agency team member, who left the partner agency prior to completion of the project, participated in the training and post-training focus group. Agency team members were not compensated for their involvement in the study;
TABLE 1
Agency Characteristics

Agency A
  Type of agency: Community-based agency
  Number of clients served annually: 12,000
  Services provided at BEST site: Variety of child, adult, and family services
  Clients: Predominantly Latino immigrant families
  Participants: Program supervisors; upper-level administrators

Agency B
  Type of agency: Large, multi-location (25 sites), multi-program (85 programs) agency
  Number of clients served annually: 40,000
  Services provided at BEST site: Housing program
  Clients: Adults with dual diagnosis of chronic, severe mental illness and substance abuse or dependence
  Participants: Program supervisors; housing site supervisors

Agency C
  Type of agency: Large (320 staff members), community-based agency
  Number of clients served annually: 24,000
  Services provided at BEST site: Outpatient health and mental health services
  Clients: Primarily Asian Americans and Asian immigrants
  Participants: Supervisors; mental health therapists; case managers; student interns
TABLE 2
Sample Demographic and Professional Characteristics (N = 16)

                              Mean or n    SD or %
Age                           34.9         9.1
Female                        14           87.5%
Race
  Asian                       6            37.5%
  Black/African American      3            18.8%
  Caucasian/White             6            37.5%
  Hispanic/Latino             1            6.3%
Salary
  35K or less                 2            12.5%
  35K–50K                     9            56.3%
  50K or more                 4            25.0%
Education
  Bachelor's                  6            37.5%
  Master's                    8            50.0%
  PhD                         2            12.5%
Degree focus
  Social work                 13           81.3%
  Other                       3            18.7%
Social work license           8            50.0%
Years worked                  6.3          7.5
Hours worked per week         36.17        5.8
however, all project activities took place during regular working hours. Columbia University Institutional Review Board approved the project for human participant research.

Training

The BEST Training was designed as part of a cooperative agency–university partnership. Partner agencies were involved from the earliest stages of the study and training development. Prior to implementation of the BEST training, the research team conducted meetings with agency administrators and staff to incorporate their perspectives and preferences on how the project would proceed. Thus, agency teams and administrators guided and shaped many of the project details, including the decision to use a group training approach as opposed to implementing a single ESI. The EBP model described above was selected based on the preference of agencies to use a "bottom up" approach that engaged practitioners in the EBP process. This is in contrast to a "top down" approach, where an ESI is dictated by administrators or supervisors. The BEST training consisted of 10 modules targeting the first five steps of EBP (see Table 3). The training used a problem-based learning strategy to develop agency team competencies in EBP. During the planning stages, each agency team generated a target problem that was of interest to their agency and unique to the clients served by their agency. The training focused on these selected problems, rather than an abstract standardized practice question, as a strategy to increase acceptability. By using this strategy, agency teams had the opportunity to focus on problems that they identified as common, meaningful to their practice, or particularly challenging. Thus, through the course of the training, the agency teams transformed unique, practice-generated target problems into researchable questions while the research team served in a consultant role to provide technical support for the agency teams' efforts to locate, evaluate, and appraise existing
TABLE 3
Training Modules

1. EBP Introduction and Overview
2. Question Selection
3. Overview of Research Evidence
4. Search Tools
5. Search Demonstration
6. Troubleshooting the Search
7. Evaluating the Evidence
8. General Findings and Observations
9. Synthesizing Evidence Found
10. Action Plan
evidence. All training materials are publicly available through the BEST Project Website (http://www.columbia.edu/cu/musher/Website/Website/index.htm).

Data

Qualitative data on the training modules were drawn from post-training focus groups conducted separately with each of the three agency teams. Additional data were obtained from implementation process notes recorded by the research team members during the agency trainings. Agency team members provided demographic data on a self-administered, anonymous questionnaire that was completed prior to the first training module. Approximately 8–12 weeks after the final training session, semi-structured focus groups were conducted with each agency team. The research team developed the focus group questions based on research conducted in an earlier phase of the project (Bellamy et al., 2006c) as well as process notes gathered during training implementation. Each focus group was facilitated by two research team members with experience in qualitative research methods. Open-ended questions focused on participants' current knowledge and perceptions of EBP, the use of EBP in practice, perceived barriers and promoters related to EBP, experiences with the BEST training, and the agency's current and future plans to use EBP. All focus groups were audiotaped and transcribed verbatim. Interviewer notes, focused on emerging themes, were recorded for each focus group.

Qualitative Data Analysis

To enhance the rigor and relevance of the findings, the research team used the following strategies for analysis of the focus group data: triangulation of data and sources, multiple coders, and member checking. Triangulation was achieved by comparing multiple forms of data, including audio recordings, interviewer notes, and verbatim transcriptions. Following transcription, focus group data were analyzed using NVivo 7.0 (QSR International, 2006). These data were coded according to both pre-identified and emergent themes.
Following qualitative methods described by Krueger and Casey (2000), the research team based the pre-identified themes on the 10 BEST training modules. Coders were instructed to identify statements related to each of the 10 unique modules and to code each statement as positive or negative. Two research team members coded transcripts independently, and coding classification was then cross-checked by a research assistant. Interrater reliability was assessed by computing the ratio of matching coding decisions between coders. To resolve ambiguities and coding discrepancies, research team members reviewed focus group transcripts until consensus was reached. Member checking was accomplished by presenting the preliminary findings to agency team members and administrators and eliciting their feedback.
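The agreement ratio used here for interrater reliability can be sketched as a simple percent-agreement calculation. This is a minimal illustration only; the function name and the example codes are hypothetical and not drawn from the BEST study data.

```python
def percent_agreement(coder_a, coder_b):
    """Ratio of coding decisions on which two independent coders agree.

    coder_a, coder_b: equal-length lists of codes assigned to the same
    sequence of focus-group statements (e.g., "positive"/"negative").
    """
    if len(coder_a) != len(coder_b):
        raise ValueError("coders must rate the same set of statements")
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

# Hypothetical codes for six focus-group statements
a = ["positive", "positive", "negative", "positive", "negative", "positive"]
b = ["positive", "negative", "negative", "positive", "negative", "positive"]
print(percent_agreement(a, b))  # 5 of 6 decisions match
```

Percent agreement is a deliberately simple index; chance-corrected statistics such as Cohen's kappa are a common alternative when codes are unevenly distributed.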
RESULTS
BEST Training Module 1: EBP Introduction and Overview

Training Module 1 addresses the first step of the EBP model: becoming motivated to apply EBP (Gibbs, 2003). In this module, EBP is defined as an approach to practice that involves posing specific questions, searching for best available evidence, evaluating the evidence, and taking action based on the evidence (Bellamy et al., 2006c). This module provides participants with a brief history of the origin of EBP in medicine and its expansion into human services in the 1990s. In addition, the first module outlines major motivations to use EBP, including professional goals and ethics, effective and efficient integration of research into practice, and funding and policy demands including accountability. An explanation of best available evidence as research evidence alongside practitioner experience, client values and expectations, agency mission and values, and professional ethics is included. Finally, the module outlines the seven steps of EBP and encourages practitioners to engage in continual evaluation of evidence and assumptions related to practice.

From the trainers' perspective, this module was useful in creating an opportunity to engage practitioners in a conversation regarding EBP, answer questions about EBP, acknowledge concerns and reservations about EBP, and address misconceptions and misinformation team members had regarding EBP and ESI. Some common misconceptions that were identified included: (a) a narrow or incomplete perception of EBP as simply conducting research or using an ESI in practice, and (b) the view that EBP sacrifices practitioner experience and client circumstance in favor of reductionistic scientific methods and evidence. Agency teams' post-training responses to Module 1 were positive; one participant stated that the history of EBP was one of the most useful things she learned in the BEST Training.
Another participant felt that her agency team's understanding of EBP had changed, stating, "I think we have a fuller view of EBP." Other positive responses noted that the training helped the agency teams to understand that EBP is a complement to good practice. One participant responded, "I feel like now, you have a chance to be very intuitive and thoughtful about your selection—what evidence you might choose and you know it could work." Similarly, another respondent said:

EBP has gone from being … narrowly defined and dictating what you do to the client. It isn't only using empirical or quantitative research, it can also use qualitative or clinical studies … it's about, seeing if what we're doing in practice has been evaluated so that we can improve the quality of our work … and the change is coming from us.
BEST Training Module 2: Question Selection

Module 2 focuses on question selection, the process of converting a relevant practice problem into a well-formulated question that can be answered using existing evidence (Gibbs, 2003). In this module, trainers helped teams clarify and transform a practice-based need into a researchable question. This module teaches practitioners to formulate Client-Oriented Practical Evidence-Search questions (COPES; Gibbs, 2003) that are used to guide the next steps in the EBP process. Participants are provided with a worksheet containing COPES examples and an outline of the five question types and four elements of a well-formulated question as defined by Gibbs (2003). By the end of Module 2, consensus was obtained regarding a COPES question that would serve as the focus for the agency team during the remaining modules. Trainers reported that using a question identified from current agency practice enhanced the agency teams' motivation to learn EBP and stimulated discussions about prioritizing practice problems. The value of this strategy was evidenced in the comment of one participant who said that identifying a practice relevant question increased her motivation to search out empirical evidence:
"That's the first step for me, having a burning issue that I want to explore." However, trainers found it challenging to clearly communicate the unique construction of a COPES question, particularly the notion of practice alternatives. The agency teams sometimes found it difficult to identify true alternatives (e.g., choosing between alternative interventions, or between an intervention and no intervention). Participants did, however, find the emphasis on the decision-making nature of the process helpful. Each of the agency teams remarked on positive elements of Module 2. One participant reported, "I like the question formulation part, to break it down. That helped to organize my thinking … in the past, without formulating a question, I didn't know what I was searching for." Several participants reported that, although formulating an answerable question required taking more time to engage in the process systematically, doing so streamlined their efforts to identify research. One participant stated:

When you start to research, it can become so overwhelming and this is a way to narrow it and narrow it so that you get what you want, so that it's more practical. It's a little bit more time consuming and taxing but on the other hand, it's a time saver, more effective.
Although overall reactions to Module 2 were positive, some participants worried that time constraints and their lack of experience in formulating COPES questions on their own would be a barrier to future use of EBP in their agency. Whereas a few participants thought the question selection process was tedious, most agreed that the process was necessary and helpful.

BEST Training Module 3: Overview of Research Evidence

Module 3 provides an overview of research evidence that focuses on understanding the impact of research design and identifying empirical evidence. This module addresses step three in EBP: efficiently finding the best evidence with which to answer practice questions (Gibbs, 2003). Discussions focused on evaluating the quality of research evidence by examining differences between systematic reviews, meta-analyses, program evaluations, case studies, and intervention outcome studies. Trainers found Module 3 useful for introducing discussions about research design as one way to evaluate the quality of evidence, understanding the evidence (if any) practitioners had used in the past to guide practice decisions, and identifying the most useful evidence to answer each team's unique question. This module was more technical than many others, covering concepts such as meta-analyses, systematic reviews, and individual outcome studies, with considerable emphasis on research methods. Trainers reported it was more difficult to maintain participant engagement in Module 3. Perhaps because of the relatively technical nature of the content, few participants commented on this module. One helpful aspect noted by participants was the explanation of terms such as meta-analysis and of the impact of research design on the quality of empirical evidence.
However, one participant expressed the concerns of her team regarding the integration of empirical or research-based knowledge with theoretical work:

There are certainly more theoretical readings that aren't particularly related to empirical demonstrations of different techniques … I know they have a place in this process … I didn't emerge with a lot of clarity about where theoretical readings fit in to the process, but sometimes I think it would be useful for us to dedicate some time to think about theory.
Although the training includes a discussion of different types of evidence that might be used in EBP, there is no explicit treatment of how theory relates to, or should be incorporated into, the EBP process.
BEST Training Module 4: Search Tools

Module 4 expands on EBP step three by describing and demonstrating tools to enhance the evidence search process. In this module, trainers provide an overview of electronic search engines, including free versus fee-based search engines, and tips on beginning a search by first looking for systematic reviews, meta-analyses, or other existing syntheses of empirical evidence. The trainers illustrated the successful use of search terms using examples related to each agency team's unique research question. Participants were introduced to the use of strategic search terms, including Methodology Orienting Locators for Evidence Search (MOLES; Gibbs, 2003), which are systematically designed filters that increase the efficiency of electronic database searches. Participants were provided with a search planning worksheet (Gibbs, 2003) to identify key search terms. Other resources supporting the evidence search process included a list of Web resources and a clinical utility guide noting the accessibility (free or fee-based) and substantive area of Web resources on practice guidelines, search databases, and EBP knowledge. In addition, participants were provided with an EBP tool summary ranking resources by research design (e.g., systematic reviews). For example, the Campbell Collaboration Database is a resource included under systematic reviews. The hands-on nature of this module was particularly helpful for trainers in guiding practitioners through the search process, especially those participants with less computer experience and little or no experience with search engines. Whereas some participants were unaware of many free search engines (e.g., Education Resources Information Center [ERIC]), others were aware of the resources but had not used them to address practice-based questions.
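To illustrate how key search terms and MOLES-style methodology filters combine into a single database query, the sketch below builds a Boolean search string. The specific terms and the query format are hypothetical illustrations, not drawn from the study's actual searches, and real databases vary in the syntax they accept.

```python
# Hypothetical COPES-derived term groups for a housing-program question.
client_terms = ["dual diagnosis", "co-occurring disorders"]
intervention_terms = ["supported housing", "assertive community treatment"]
# MOLES-style methodology filters (truncation with * catches word variants)
moles = ["random*", "control*", "evaluat*"]

def or_group(terms):
    """Join synonyms with OR inside parentheses, quoting multi-word phrases."""
    return "(" + " OR ".join(f'"{t}"' if " " in t else t for t in terms) + ")"

# OR within a concept expands the search; AND across concepts narrows it.
query = " AND ".join(or_group(g) for g in (client_terms, intervention_terms, moles))
print(query)
```

Adding a synonym to any group widens the result set, while adding another AND-joined group narrows it, which is the expand/narrow trade-off the training tools address.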
Despite the guidance of the search tools, including information on the use of MOLES and Boolean operators to conduct efficient searches, trainers found it a challenge to communicate the process involved in expanding and narrowing search results. Participants said that the inclusion of resources such as Web-based search engines in Module 4 was especially helpful, particularly the tool giving an overview of search engines that distinguished between fee-based and free resources. Participants also noted the utility of learning to identify key words based on the COPES question. Participants' concerns regarding the use of EBP tools and the electronic search process fell generally into two categories: lack of experience and lack of access. Some participants noted the introduction to the tools was insufficient for them to feel comfortable relying on these skills in practice and indicated they needed more time using and practicing with the tools. Other participants noted that budget constraints would likely mean their agencies would not have access to the fee-based resources and were concerned that this would limit their access to the best available evidence to guide practice decisions.

BEST Training Module 5: Search Demonstration

Module 5 is conducted in a computer lab and also addresses step three of the EBP process. In this module, a search demonstration is performed that includes elements such as accessing databases; starting a search using keywords; keeping track of searched terms using a search tracking worksheet (Search History Form); and using abstracts to identify relevant evidence. Participants are instructed to follow the search demonstration on computers individually or in pairs. Trainers provide technical support to the agency teams as they begin searching for evidence to answer their unique COPES question using the key search terms identified in Module 4. The shared experience of the actual search process allowed trainers to problem solve as difficulties arose.
Equally important, the agency teams’ hands-on experience seemed to demystify some of the concepts and applications that were more difficult to describe didactically in earlier training modules. Although some participants were experienced searchers, many said they needed
assistance to begin the search, and did not have the tools to perform a search on their own prior to the training. Positive comments related to BEST Training Module 5 focused primarily on the search tracking form and the hands-on experience of conducting a search with technical support. One participant remarked that the handouts demonstrating how to organize a search were the most helpful tools and indicated that increasing the availability of such support tools would likely increase practitioners’ willingness to use EBP. Nearly all participants found the hands-on aspect of Module 5 to be especially helpful in providing concrete experience in using the tools and in reinforcing the information from previous modules. In addition, several participants suggested expanding the training by adding additional hands-on computer sessions. One team specifically requested an additional session focused on this module. Although most reactions to Module 5 were positive, the primary concern voiced by participants was related to limited access to search engines and full-text articles, a barrier that was brought to life for the agency teams when they began to conduct their own searches. One participant remarked, “I think the major issue probably is access to the actual literature. If you’re not affiliated with a university then you don’t have access. That’s the only real difficult part.” However, another participant identified additional concerns related to whether practitioners would follow the search steps thoroughly, “I don’t think people are going to take the time to search for a list of sites to search and to look for different key words.” This participant worried that practitioners might instead use the most easily available evidence rather than investing more time to identify and evaluate the best available evidence.
BEST Training Module 6: Troubleshooting the Search

Module 6 continues with step three of the EBP process, troubleshooting the hands-on search conducted in the previous module. This review focuses on the amount of evidence identified, where the evidence was found, what aspects of the search were easy or difficult, and addresses general questions and issues related to the search process. The structure of this module also allows participants to review a variety of sources of evidence available via the Internet, from academic-based search engines to non-academic, credible websites, such as EBP dissemination groups and federal agencies. Participants' comments on Module 6 focused primarily on the challenges and strategies of the search process rather than the module content. Participants noted that demonstration and search tips, including clarification on the use of Boolean terms, were helpful. Several participants identified potential problem-solving strategies to gain access to fee-based search engines and full-text articles, including access through public libraries and suggestions that universities and/or professional organizations, such as the National Association of Social Workers, might partner with agencies to provide practitioners with access to fee-based search engines and full-text journal articles. One participant noted continued difficulty in applying the full search process and using search operators and questioned whether the full search process could be implemented in practice. Some participants raised concerns about the applicability of the empirical evidence they identified in their search to the agency's unique practice question and their client population. Most participants were concerned about the limited research available that precisely fit their agencies' question, clients, and practice setting.
For example, one participant commented, “To base practice on the limited evidence that we were able to access for this question, I mean we were struggling.” Another participant voiced her concerns about applying the evidence she was able to retrieve because the population in the studies differed in significant ways from her clients: “When I come to articles on certain populations … I’m not confident enough to apply the evidence to my clients when there aren’t experts in the interventions I have identified available for consultation.”
BEST Training Module 7: Evaluating the Evidence

Module 7 moves to step four of the EBP process: critically appraising the evidence for validity and usefulness (Gibbs, 2003). The module provides a review of the different types of evidence and employs an example of a quality review. This module underscores the value of existing syntheses of evidence (i.e., systematic reviews). Handouts included in this module are aimed at helping practitioners to evaluate the identified empirical evidence, such as six “quick tips” to identify high-quality research (developed by the research team) and an evidence-based practice glossary (largely drawn from the Bandolier EBM Glossary, 2004). Trainers walk participants through an evaluation of the empirical evidence identified in Module 5 using a series of worksheets created by Gibbs (2003) for critically appraising treatment and prevention studies, meta-analyses, and assessment and risk evaluations. The worksheets provide practitioners with statistical formulas (i.e., effect size, control event rate, experimental event rate, relative risk reduction, absolute risk reduction, number needed to treat, and confidence intervals for the number needed to treat), intended to enable practitioners to compare the strength of evidence across studies. The module concludes with a discussion of the refinement of the search plan based on the empirical evidence found to date. From the trainers’ perspective, this module was the most challenging to facilitate. The trainers struggled to help participants to understand and calculate the statistics they were expected to use to judge the quality of the evidence across studies. In particular, the trainers had difficulty explaining complex research-related concepts in a limited amount of time to an audience with varying levels of research training.
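The statistics named above can be made concrete with a small worked example. The sketch below is our own illustration using hypothetical study numbers (40 of 100 control clients versus 25 of 100 treated clients experiencing a negative outcome); it is not one of the Gibbs (2003) worksheets, and the normal-approximation confidence interval shown is one common textbook choice among several.

```python
# Illustrative calculation of the Module 7 appraisal statistics from a
# hypothetical two-arm study: 40/100 control vs. 25/100 treated clients
# experience the negative outcome (e.g., relapse).
import math

def event_rates(control_events, n_control, treated_events, n_treated):
    cer = control_events / n_control   # control event rate
    eer = treated_events / n_treated   # experimental event rate
    arr = cer - eer                    # absolute risk reduction
    rrr = arr / cer                    # relative risk reduction
    nnt = 1 / arr                      # number needed to treat
    # 95% CI for the ARR via the normal approximation, then inverted
    # to obtain an interval for the NNT.
    se = math.sqrt(cer * (1 - cer) / n_control + eer * (1 - eer) / n_treated)
    arr_lo, arr_hi = arr - 1.96 * se, arr + 1.96 * se
    nnt_ci = (1 / arr_hi, 1 / arr_lo)  # smaller ARR bound -> larger NNT
    return {"CER": cer, "EER": eer, "ARR": arr, "RRR": rrr,
            "NNT": nnt, "NNT_95CI": nnt_ci}

stats = event_rates(40, 100, 25, 100)
# CER = 0.40, EER = 0.25, ARR = 0.15, RRR = 0.375, NNT ~ 6.7:
# roughly seven clients must receive the intervention to prevent
# one additional negative outcome relative to the control condition.
```

In this hypothetical case the wide NNT confidence interval (roughly 4 to 46) illustrates why practitioners are asked to look beyond point estimates when comparing the strength of evidence across studies.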
In addition, the lengthy worksheets appeared to overwhelm participants, most of whom expressed a preference for a group discussion of the quality of the evidence facilitated with the assistance of trainers. Despite the challenges, participants identified positive aspects of Module 7. Some agency team members agreed that learning to evaluate the evidence in a systematic way was a valuable skill, and they appreciated being exposed to the concepts involved in evaluation. One participant remarked, “We’ve had a chance to try out looking at the articles and evaluating each of them, deciding whether they’re relevant or not—that was something new.” Agency teams were less optimistic about their ability to engage in systematic evaluation because they doubted their abilities to use the complex worksheets on their own. The majority of remarks related to Module 7 focused on concerns regarding the agency teams’ ability and/or desire to systematically and rigorously evaluate the evidence using worksheets. In general, participants preferred a modified approach that fit better with current practice:

I think the tools were wonderful, but extremely rigorous, and I think that we would probably do it a little more impressionistically. I’m skeptical that we would all be using the checklists and evaluation tools. I think it’s beyond what we would probably jump into. What I liked about evaluating the evidence is doing it as a group. If there were a way to evaluate the relevance through that kind of narrative discussion with some simple guidelines, that would feel almost in sync with how we approach issues and challenges in our department.
This participant’s perspective illustrates the reluctance of most participants to use statistical methods to evaluate empirical evidence. Another participant described a general discomfort with statistics and with using the statistics in Module 7 to interpret empirical evidence:

I think it’s really important if you actually were able to wrap your head around the math. Because it goes back to … is this study actually important enough for us to rely on its conclusion? I still don’t know which study’s data is really that clean or the way they went about it or if the methods are really that good. I can’t see myself calculating a statistic.
Nearly all participants remarked that additional training and practice would be necessary before they would feel confident in using the study rating forms. Even though the quick tips form developed by the research team, included in Module 7, was intended only as an initial review step to determine whether full evaluation was needed, more participants remarked on the usefulness of the quick tips tool than on the detailed evaluation worksheets. Overall, participants noted that they would need a great deal of additional training and support if they were going to incorporate rigorous evaluation of empirical evidence, a critical component of the EBP process, into routine practice. A typical participant comment was, “I can find research articles but I don’t have confidence in evaluating the quality of the findings.” Other participants also noted that evaluating the quality of evidence was a challenging task and did not feel confident in their ability to interpret the findings or evaluate the evidence they found. Using statistical methods to evaluate the quality of empirical evidence presented a particular problem for several participants. One participant said Module 7 was “complicated” and that she would “zone out” during the worksheet discussions about how to calculate and interpret statistics such as effect size. Several participants suggested the need to develop a simplified process for practitioners to use in evaluating the quality of empirical evidence. Participants also expressed concerns about using evidence derived from populations that were very different from the agency’s clients, and questioned whether sufficient research is available that is relevant for the agencies’ clients. One participant commented on the concerns of many:

Sometimes I have difficulty evaluating the evidence because a lot of research is done with populations that are different from the clients I work with. I have to think, can I really generalize the techniques and would this be applicable to the specific population I’m working with? A lot of times I find the empirical evidence seems really good but I wonder if it’s helpful or good for my clients. This evaluation would be challenging to me.
BEST Training Module 8: General Findings/Observations

Module 8 continues the focus on step four of the EBP process, the evaluation of the evidence. In this module, the agency team reviews general findings from their evidence search and their evaluations of the quality of the evidence. This review includes discussions of whether the available evidence addresses their practice question, and the applicability of the main findings of their search and evaluation. Further attention is given to the fit of the evidence with the agency’s client population and practice setting. The trainers found this module useful in addressing participants’ concerns regarding the applicability of the evidence to agency practice issues and populations. Because many of the participants had difficulty evaluating the quality of the evidence, Module 8 was useful in reinforcing the utility of systematic reviews and meta-analytic studies in which a great deal of the work of collecting and assessing the quality of the evidence has already been performed. However, the agency teams identified few, if any, existing syntheses that addressed their particular practice questions. Some participants remarked that EBP gave them a systematic means to identify and evaluate evidence for use in making practice decisions, which improved their ability to feel successful when using research evidence. Another participant noted this module was useful because it allowed for the incorporation of practice experience into the process, “I really think the good research can incorporate the practice perspective.” However, concerns remained about the fit of the research evidence to the practice setting. One participant’s comment summed up the concerns of many about the fit of the identified evidence with the agency practice question and the needs of the diverse clients she serves, “I just want to say, I did get a sinking feeling when we were actually evaluating the articles we all found. I wondered, is this it?
Is this the best available evidence?” Participants reported feeling excited by the potential of the process, but disappointed by the actual
payoff at the end of the project given the scarcity of satisfying empirical evidence they were able to locate.
BEST Training Module 9: Synthesizing Evidence Found

Module 9 maintains a focus on step four of the EBP process and is explicitly devoted to assisting agency teams to synthesize the empirical evidence identified and evaluated in Modules 3 through 8. A research synthesis worksheet is provided to help practitioners organize the evidence and compose a summary statement regarding their evaluation of the evidence. Agency team members then use the evidence summary to create a recommended plan of action. The trainers found that Module 9 was useful in supporting agency teams in developing recommendations and a plan of action based on the EBP process. Practitioners raised many issues related to the implementation of new interventions, assessment tools, and other practice strategies that were identified in the search process. This questioning also raised concerns about how agency staff would access manuals, training, and supervision to ensure successful implementation and application of ESI identified through the EBP process. Participants expressed both excitement and concern around the identification of new interventions and potential implementation barriers. The focus group discussions contained few comments related specifically to Module 9. Many participants appreciated the critical approach to making recommendations about moving forward based on the identified evidence. One participant stated that, through her experience with the training, she could see where “contributions need to be made” to address questions that are important to social work but have not yet been answered by research. Most participants recognized the gap between research and practice as well as the challenges in addressing that gap. One participant acknowledged the importance of the EBP process while speaking realistically about the challenges of translating research into practice:

We did decide that there were some approaches or interventions we identified that were useful.
There is still a whole piece missing and that’s how we would practice and how we would actually implement the evidence-based recommendations in practice and that’s not in the literature we identified.
Other participants reported that they found the process of summarizing findings and recommending a plan of action to be validating. One participant stated, “When previous research supports your ideas about practice, it is affirming and it helps you solidify your perspective and that’s good.”
BEST Training Module 10: Action Plan

Module 10 focuses on assisting agency teams to create an implementation plan for the remaining three steps in the EBP process: applying the results of the evidence appraisal to policy/practice (step 5); evaluating performance (step 6); and teaching others to do the same (step 7) (Gibbs, 2003). Creating an implementation plan is accomplished by discussing the findings of Modules 1–9 and outlining the agency’s next steps in the EBP process. In this module, the agency team reviews the evidence summary statement as a basis for applying evidence to practice or policy. This process could result in a number of outcomes, including a letter to the administration or agency board, a training session to share findings with others at the agency, training practitioners in an identified ESI, or the development of new practice tools such as a resource manual. The agency team develops a plan that outlines the steps they will take to apply their findings from Modules 1 through 9, which team member will carry out specific steps, and when the steps will be carried out. Finally, Module 10 facilitates discussion regarding the team’s plan to continue using EBP and how they plan to implement EBP in routine agency practice.
Discussion of Module 10 in the post-training focus groups elicited a reexamination of how the participants and agency teams might use pieces of the training rather than specific feedback related to the usefulness of Module 10. The comments of many participants were summarized in the remarks of one participant who commented on the importance of the BEST Training:
Training in EBP is important so clinicians can read the research—especially for newer clinicians who still need help applying it to the situation. I believe we have the responsibility and the agency has the responsibility to provide EBP training because I think EBP is really a challenging process.
Each of the three agency teams developed a unique plan to further implement EBP. One team proposed a requirement for students in field placements at the agency to provide evidence from a literature review during clinical case conferences. This addressed barriers of access to published empirical literature, as students would have access to fee-based databases through the university, as well as limitations on staff time that participants felt would prohibit the agency from searching and evaluating evidence:

I think that will be good. My experience is when people get the articles and when they read the articles and they are excited about the findings they try things, so I think that’s how it’s going to work.
Participants remarked on the need to incorporate EBP into their current practices and engage in the EBP process as a group. One agency team member said, “We’re definitely bringing EBP into our weekly casework meeting.” Another agency team suggested dividing the EBP research tasks among individuals and working as a group. A participant described the anticipated use of this approach:

We would probably incorporate EBP as a group process, we probably would divide it into different tasks, smaller tasks, shared among people. We need to incorporate it into our daily practice. Our regular team discussion will focus on EBP and promote EBP as part of our agency structure.
Although other participants expressed a desire to use the skills from the training, they also expressed concerns about their ability to carry out the EBP steps on their own and identified several potential barriers: lack of technical support, time constraints of already overextended staff, and the need for additional training and experience with EBP, ESI, and research methods. Several participants commented on the need for an EBP technical support team and continued support in the process of searching for, accessing, and interpreting evidence. A participant reinforced many of these barriers, including the need for additional training and ongoing expert consultation for EBP and specific identified ESI:

I tried to use an intervention from an article, but how can I do it alone? Is there anything that I have been missing or potential problems with my patients that I don’t realize or know? I worry I’m lost or even I’ve made some mistake because I haven’t been trained in this intervention and I am not sure if I am using it correctly. I have no one to ask.
Finally, one participant summed up the concerns of many who worried about having the time to implement EBP, “The only drawback is time—not having enough time prevents you from doing it. But, it could never be a drawback to using it. EBP is valuable.” Conversely, some participants did not anticipate using EBP as a routine part of practice, but expected to use EBP for isolated, particularly difficult practice problems: “In practice as problems come up, and puzzles arise. I have a plan to use EBP. I’m just waiting for the time.” Although participants identified barriers to continuing to train staff in EBP and incorporate EBP into practice, most acknowledged the benefits of the EBP approach. In response to high staff turnover as a barrier to continuing to use EBP in the agency, one participant responded:
This program will be here and people will come and go, and this community needs good clinicians so no matter how long staff stay here; if you go through the process, you become a good clinician. So I believe we need to continue to use EBP. Even if staff go somewhere else, the providers will be well trained and that will benefit the community and clients.
In addition, participants regarded teaching other agency practitioners the process of EBP as a staff benefit: “EBP training could be an incentive that we can offer our agency staff. We can set up small groups to help teach people.”
DISCUSSION

Overall, this study indicates that the BEST training was generally acceptable to agency team members, though there are challenges that should be addressed to improve the acceptability of the training. These challenges are largely indicated by practitioners’ statements highlighting their lack of confidence in their ability or desire to carry out the tasks of certain modules. Identified challenges range from difficulties in selecting alternative interventions (Module 2) and integrating the empirical evidence with theoretical frameworks (Module 3) to more technical challenges in conducting an electronic search (Modules 4 and 5), evaluating the applicability and quality of evidence for agencies’ unique practice questions and client populations (Modules 6, 7, and 8), and using statistical methods to interpret the identified evidence (Module 7). While the BEST Training is likely a useful first step for agencies seeking to implement EBP, it does not appear to be independently sufficient to launch and sustain the use of EBP in agency practice. One of the most challenging aspects of the BEST training was participants’ lack of technical skills and the level of support needed by agency teams to evaluate the quality of identified empirical evidence. This finding suggests that long-term partnerships between agencies and universities may be one vehicle by which to facilitate the sustained implementation of EBP in social work practice. It corresponds with Fixsen and colleagues’ (2005) recommendation that EBP training be combined with ongoing consultation and supervision that extend well beyond the training period. Learning new skills and knowledge should be considered an ongoing process, one that requires practicing skills and applying the knowledge in real-world contexts, as well as continued troubleshooting with consultants or trainers (Iowa Consortium for Substance Abuse Research and Evaluation, 2003).
Little research is available to inform how enduring partnerships might be structured and funded, whether or not these partnerships would be acceptable to practitioners and agencies, the costs associated with implementing and maintaining EBP in social service agencies, or the optimal mechanisms by which to reinforce the continuous use of EBP in agency settings. Findings from this study also suggested the need for complementary training and supervision in ESI in addition to EBP training. Participants pointed out the need for initial training and ongoing supervision in applying the ESIs identified in the EBP process as well as training in practice technologies, such as assessment tools. However, some ESIs (e.g., standardized instruments and manualized treatments) require specialized training. An attempt to implement these interventions in practice in the absence of the necessary formal training would be counter to the goals of the EBP process, as well as the values and ethics of the social work profession, and could result in null or even detrimental results for clients. Accordingly, training practitioners in the process of EBP, whereby promising new approaches to practice are identified, without also providing for resources to train staff in the use of these new tools is likely to result in frustration for practitioners and limited progress toward improved services for clients. A further challenge was the mismatch between the evidence participants identified and the populations served by the agencies. Participants questioned whether sufficient, relevant research
was available to meet the needs of their clients. The limited availability of evidence relevant to diverse populations and agency contexts highlights a critical gap between research and practice. This mismatch may reflect a historical lack of communication and collaboration between researchers and practitioners in the design, implementation, and evaluation of research studies that take into consideration the real-world agency context (Bellamy et al., 2006c; McCracken & Marsh, 2008; Nutley, Walter, & Davies, 2009). Overall, participants found the BEST training useful and beneficial but stated that barriers, including time, funding, and access to resources, might prevent them from implementing the full EBP process. The participants’ feedback indicated that both changes in agency organizational culture (i.e., allowing practitioners dedicated time to engage in EBP and access training in ESI) and the provision of additional resources (e.g., access to full-text journal articles and fee-based search engines) are necessary for practitioners to routinely and comprehensively incorporate EBP in practice. These larger systemic barriers are frequently discussed in the EBP literature (Anderson, Cosby, Swan, Moore, & Broekhoven, 1999; Barratt, 2003; Fixsen et al., 2005; Mullen & Bacon, 2004). Although piecemeal adoption is not ideal, all three agency teams expressed a commitment to implementing pieces of the EBP process in a modified form that fit the context of their particular agency. This may signal their openness to the increased use of empirical evidence in making decisions related to practice, however piecemeal in nature. Practitioners’ limited knowledge and experience in searching for empirical evidence, together with limited understanding of the statistical tests reported in the empirical literature and suggested as a means to judge its quality, created a challenge for both the trainers and the agency teams of practitioners.
If social work practitioners are expected to use EBP in agency settings, prior training should stress the skills necessary to apply these techniques, including electronic searching skills and applied statistical knowledge. Social work researchers should continue existing efforts to translate empirical evidence into practical and accessible practice knowledge through partnership with agency-based social work practitioners as well as through changes in the presentation of empirical findings. Our target audience should include social workers in social agencies as well as other researchers. Findings from this study should be viewed in the context of several limitations. The research design and small sample size limit the study’s generalizability, and results should therefore be interpreted cautiously. Given the exploratory nature of this study, future research should expand upon this work using a larger sample of agencies and practitioners. Furthermore, a better understanding of whether the EBP process can be sustained over a longer period of time following a brief training is warranted.

IMPLICATIONS

The use of EBP and ESIs is critical for delivering ethical and effective social work practice. Yet, studies suggest that the use of EBP and ESI in social work is less than optimal (Bledsoe et al., 2007; Gibbs & Gambrill, 2002; Kirk & Rosenblatt, 1981; Mullen & Bacon, 2004; Rosen, 1994; Rubin & Parrish, 2007; Weissman et al., 2006). Agency–university partnerships represent a model for motivating and educating social work practitioners to increase adoption of EBP. Findings from the BEST study are consistent with prior literature (Nutley, Walter, & Davies, 2009; Proctor, 2004) indicating that widespread dissemination and implementation of EBP in practice might require a multilevel strategy focused on producing agency-based, practice-relevant research, improved organizational infrastructures, and relevant class and field education.
Specifically, university–agency partnerships are needed not only to enhance practitioners’ motivation to use EBP but also to provide the initial and ongoing training, technical assistance, and supervision needed for practitioners to successfully implement and use EBP efficiently in
practice. This also requires that agencies and agency administrators regard the EBP process as a necessary investment in their agency practice and programs, and that researchers regard their investment of time and resources in agency training and technical assistance as important to expanding EBP in practice, which will ultimately advance the field of social work in bridging the gap between research and practice. Researchers may also play a role in increasing practitioner use of empirical evidence by routinely including outcomes, such as effect size and number needed to treat, in study reports, thus providing practitioners with data to compare outcomes across studies. Additionally, future research should replicate EBP training efforts with a more representative sample of social agencies and begin to focus on developing and testing agency–university/researcher partnerships that will facilitate the sustained use of EBP in practice. The findings further indicate the need for schools of social work to provide adequate training and continuing education opportunities to ensure social work practitioners have the critical skills to support the use of EBP and ESI in agency practice. Recommendations along these lines include infusing the EBP process into social work curricula and students’ fieldwork, similar to the model used by the University of Toronto School of Social Work (Regehr, Stern, & Shlonsky, 2007). This routine infusion of EBP enables students to translate what they are learning in the classroom to the reality of their practice in the field. We also recommend an increased focus in introductory and advanced research courses on the basic research skills and statistical methods required to fully support EBP. In addition, schools of social work should offer course work and field supervision in ESI as part of a multilevel strategy to prepare future practitioners to function as evidence-based social workers in agency practice.
This study provides insight into some of the challenges and benefits of training agency-based social workers in using EBP. In order for the EBP process to become infused into agency practice, a long-term strategy is necessary to foster and strengthen community and university partnerships; to create mechanisms that mitigate barriers to implementing and using EBP in practice; and to ensure that schools of social work provide the necessary training and continuing education in research to carry out the EBP process.
REFERENCES

Anderson, M., Cosby, J., Swan, B., Moore, H., & Broekhoven, M. (1999). The use of research in local health service agencies. Social Science & Medicine, 49, 1007–1019. Bandolier. (2004, April). EBM glossary. Retrieved from http://www.jr2.ox.ac.uk/bandolier/glossary.html Barratt, M. (2003). Organizational support for evidence-based practice within child and family social work: A collaborative study. Child and Family Social Work, 8, 143–150. Bellamy, J., Bledsoe, S. E., Fang, L., Manuel, J., Coppolino, C., Crumpley, J., … Mullen, E. J. (2006a, January). Implementing evidence-based practice: From research to the front line. Poster presented at the 10th Annual Conference of the Society for Social Work Research, San Antonio, Texas. Bellamy, J. L., Bledsoe, S. E., Manuel, J., Fang, L., & Mullen, E. J. (2006b). BEST training on evidence-based practice. New York, NY: Willma & Albert Musher Program at Columbia University. Retrieved from http://www.columbia.edu/cu/musher/Website/Website/EBP_OnlineTraining.htm Bellamy, J. L., Bledsoe, S. E., Mullen, E. J., Fang, L., & Manuel, J. (2008). Learning from agency-university partnership for evidence based practice in social work: Participant voices from the BEST project. Journal of Social Work Education, 44, 55–75. Bellamy, J. L., Bledsoe, S. E., & Traube, D. (2006c). The current state of evidence-based practice in social work: A review of the literature and qualitative analysis of expert interviews. Journal of Evidence-Based Social Work, 3, 23–48. Bhattacharyya, O., Reeves, S., & Zwarenstein, M. (2009). What is implementation research?: Rationale, concepts, and practices. Research on Social Work Practice, 19, 491–502. Bledsoe, S. E., Weissman, M. M., Mullen, E. J., Ponniah, K., Gameroff, M., Verdeli, H., … Wickramaratne, P. (2007). Empirically supported psychotherapy in social work training programs: Does the definition of evidence matter? Research on Social Work Practice, 17, 449–455.
Brekke, J. S., Ell, K., & Palinkas, L. A. (2007). Translational science at the National Institute of Mental Health: Can social work take its rightful place? Research on Social Work Practice, 17, 123–133. Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation research: A synthesis of the literature (FMHI 231). Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network. Gibbs, L. E. (2003). Evidence-based practice for the helping professions: A practical guide to integrated multimedia. Pacific Grove, CA: Brooks/Cole. Gibbs, L. E., & Gambrill, E. (2002). Evidence-based practice: Counterarguments to objections. Research on Social Work Practice, 12, 452–476. Hess, M., & Mullen, E. J. (Eds.). (1995). Practitioner-research partnerships: Building knowledge from, in, and for practice. Washington, DC: NASW Press. Iowa Consortium for Substance Abuse Research and Evaluation. (2003). Evidence-based practices: An implementation guide for community-based substance abuse treatment agencies. Iowa City: University of Iowa. Kirk, S. A. (1990). Research utilization: The substructure of belief. In L. Videka-Sherman & W. J. Reid (Eds.), Advances in clinical social work research (pp. 233–250). Washington, DC: National Association of Social Workers. Kirk, S. A., & Reid, W. J. (2002). Science and social work: A critical appraisal. New York, NY: Columbia University Press. Kirk, S. A., & Rosenblatt, A. (1981). Research knowledge and orientation among social work students. In S. Briar, H. Weissman, & A. Rubin (Eds.), Research utilization in social work education (pp. 29–35). New York, NY: Council on Social Work Education. Krueger, R. A., & Casey, M. A. (2000). Focus groups: A practical guide for applied research (3rd ed.). Thousand Oaks, CA: Sage. Manuel, J. I., Mullen, E. J., Fang, L., Bellamy, J. L., & Bledsoe, S. E. (2009).
Preparing social work practitioners to use evidence-based practice: A comparison of experiences from an implementation project. Research on Social Work Practice, 19, 613–627. McCracken, S. G., & Marsh, J. (2008). Practitioner expertise in evidence-based practice decision making. Research on Social Work Practice, 18, 301–310. Mullen, E. J. (2009). Evidence-based policy & social work in healthcare. Social Work in Mental Health (special issue: Social Work and Mental Health, a Global Research and Practice Perspective), 7, 267–283. Mullen, E. J., & Bacon, W. (2004). A survey of practitioner adoption and implementation of practice guidelines and evidence-based treatments. In A. R. Roberts & K. Yeager (Eds.), Evidence-based practice manual: Research and outcome measures in health and human services. New York, NY: Oxford University Press. Mullen, E. J., Bellamy, J. L., Bledsoe, S. E., & Francois, J. J. (2007). Teaching evidence-based practice. Research on Social Work Practice, 17, 574–582. Mullen, E. J., Bledsoe, S. E., & Bellamy, J. L. (2008). Implementing evidence-based social work practice. Research on Social Work Practice, 18, 325–338. Mullen, E. J., Shlonsky, A., Bledsoe, S. E., & Bellamy, J. L. (2005). From concept to implementation: Challenges facing evidence-based social work. Evidence and Policy, 1, 61–84. Musher Program. (2006). Evidence-based practice & policy online resource training center. Columbia University: New York. Retrieved from http://www.columbia.edu/cu/musher/Website/Website/EBP_OnlineTraining.htm New Freedom Commission on Mental Health. (2003). Achieving the Promise: Transforming Mental Health Care in America. Final Report (No. DHHS Pub. No. SMA-03-3832). Rockville, MD: United States Department of Health and Human Services. Retrieved from http://www.mentalhealthcommission.gov/reports/FinalReport/downloads/downloads.html Nutley, S., Walter, I., & Davies, H. T. O. (2009). Promoting evidence-based practice. 
Research on Social Work Practice, 19, 552–559. QSR International. (2006). NVivo, Version 7. (Computer software). QSR International Pty Ltd. Panzano, P., & Herman, L. (2005). Developing and sustaining evidence-based systems of mental health services. In R. E. Drake, M. R. Merrens, & D. Lynde (Eds.), Evidence-based mental health practice (pp. 167–187). New York, NY: Norton. Proctor, E. K. (2004). Leverage points for the implementation of evidence-based practice. Brief Treatment and Crisis Intervention, 4, 227–242. Regehr, C., Stern, S., & Shlosky, A. (2007). Operationalizing evidence-based practice: The development of an institute for evidence-based social work. Research on Social Work Practice, 17, 408–416. Rosen, A. (1994). Knowledge use in direct practice. Social Service Review, 68, 561–577. Rubin, A., & Parrish, D. (2007). Views of evidence-based practice among faculty in master of social work programs: A national survey. Research on Social Work Practice, 17, 110–122. Sackett, D. L., Straus, S. E., Richardson, W. S., Rosenberg, W., & Haynes, R. B. (2000). Evidence-based medicine: How to practice and teach EBM (2nd ed.). New York, NY: Churchill Livingstone.
Torrey, W. C., & Gorman, P. G. (2005). Closing the gap between what services are and what they could be. In R. E. Drake, M. R. Merrens, & D. W. Lynde (Eds.), Evidence-based mental health services (pp. 167–188). New York, NY: Norton.

Upshur, R. E. G., & Tracy, C. S. (2004). Legitimacy, authority and hierarchy: Critical challenges for evidence-based medicine. Brief Treatment and Crisis Intervention, 4, 197–204.

Weissman, M. M., & Sanderson, W. C. (2001). Promises and problems in modern psychotherapy: The need for increased training in evidence-based treatments. In M. Hager (Ed.), Modern psychiatry: Challenges in educating health professionals to meet new needs (pp. 132–165). New York, NY: Josiah Macy Jr. Foundation.

Weissman, M., Verdeli, H., Gameroff, M., Bledsoe, S. E., Betts, K., Mufson, L., … Wickramaratne, P. (2006). A national survey of psychotherapy training programs in psychiatry, psychology, and social work. Archives of General Psychiatry, 63, 925–934.