Program Specification: A Precursor to Program Monitoring and Quality Improvement. A Case Study From Boysville of Michigan

Sue Ann Savas, M.S.W.
William M. Fleming, Ph.D.
Erika E. Bolig, Ph.D.
Abstract

As a result of new accreditation standards, diminishing resources, community concern with recidivism, and state agencies and foundations requiring more rigorous evaluations, program accountability is becoming a necessity for social services providers and contractors alike. In the subsequent discussion, Boysville of Michigan's program specification process is described; specifically, the ways in which the process has been useful for monitoring program operations, identifying quality improvement indicators, designing ongoing program evaluations, and developing consensus and continuity with respect to program theory and procedures. In addition, benefits, lessons learned, and implications for services delivery are discussed.
Within the last decade, accountability has become the focus for many agencies providing mental health and child welfare services. Increasingly, agencies are being required to demonstrate the client's need for treatment, document plans of care, measure client progress, and establish the cost-effectiveness of the services delivered. Frequently, these demands are being viewed through the lens of economic efficiency as managed behavioral health care becomes firmly established. Additionally, new accreditation standards,1 diminishing resources, community concern with recidivism,2 and state agencies and foundations requiring more rigorous evaluations have all led to this increased focus on accountability. As a result of these pressures, program accountability is becoming a necessity for service providers and contractors alike. Establishing and implementing processes to determine program integrity, effectiveness, and efficiency has been an operating principle at Boysville of Michigan since long before it was fashionable or mandatory.3 Boysville's ability to monitor program processes and outcomes rests on its quality improvement system, which begins with the specification of programs. In the subsequent discussion, the ways in which the program specification process has been useful for monitoring program operations, identifying quality improvement indicators, designing evaluations, and developing consensus and continuity with respect to program theory and procedures are presented as a child and family service agency case study.
Address correspondence to Sue Ann Savas, M.S.W., Program Evaluation Director, Boysville of Michigan, 8759 Clinton Macon Road, Clinton, MI 49236; e-mail: [email protected]. William M. Fleming, Ph.D., is an assistant professor, School of Education and Human Sciences, Family and Consumer Sciences, Berry College, Mount Berry, Georgia. Erika E. Bolig, Ph.D., is a postdoctoral research fellow, Boysville of Michigan, Clinton, Michigan.
These principles, concepts, and techniques are applicable to all service delivery systems and providers, especially for building evaluation capacity. More generally, agencies and practitioners providing services in a managed care environment may soon be required to engage in some form of program specification.
About the Agency

Boysville of Michigan, established in 1948, is Michigan's largest privately owned, nonprofit child welfare and family reunification agency. Programs are implemented throughout Michigan and northwestern Ohio. The agency's professional staff work directly with more than 700 boys and girls, and their families, on a daily basis at nearly 30 different sites. Clients are referred from the child welfare, juvenile justice, and mental health systems. The agency has over 17 years of experience collecting, reporting, and using systematically gathered client and program information to inform practice.4,5 The applications and processes that constitute today's quality improvement system have been in development since the inception of Boysville's practice-based evaluation effort and will continue to evolve as the interventions, program models, computer hardware, and information system technologies advance.6 The diversity of programs offered and the complexity of meeting ever-changing state and county requirements necessitated the implementation of a flexible and relevant quality improvement system.
Quality Improvement System Components

At Boysville, the program specification process is central to the multipurpose quality improvement system. The key components of the quality improvement system are presented first as a context for the more detailed program specification discussion. The program specification process sheds light on the program's theory for intervention and client change, whereas the quality improvement processes function to determine and maintain program integrity according to the specified theory. What occurs in the black box of intervention? Are these program services, activities, and interventions being implemented according to the program model? In addition to integrity, the quality improvement system includes processes that require routine evaluation of program effectiveness and efficiency.7 These activities focus on measuring client outcomes and the degree to which the plan of care was delivered in the least restrictive, most cost-effective, efficient manner. The quality improvement system includes mechanisms for disseminating single-case and aggregate reports useful for the refinement of service delivery procedures.8 Routine reviews of program operation indicators and the development of corrective action plans are also key system elements. For example, program performance objectives and actual results are reviewed formally on a quarterly basis by management teams to better understand areas of strength and areas that need improvement.

For agencies in the process of developing a quality improvement system, practice indicators and requirements for a quality assurance review will surface as a result of the program specification. Additionally, according to program staff and managers, the program specification process is one experience that has resulted in an increased "ownership" of their program and program outcomes. Through the application of the processes described above, the Boysville quality improvement system supports and facilitates the integration of accountability with service delivery.9
Origins of the Process

Limitations in the interpretation of youth outcome information led to the development of the Boysville program specification process beginning in late 1993. For a number of years, 3- and 12-month community adjustment outcomes for youth released from an out-of-home program were collected, analyzed, and reported. These outcomes included living at home, school enrollment or employment, and no new convictions. Managers found it difficult to interpret the findings to make
decisions regarding the revision, discontinuation, or maintenance of specific programs. For example, youth outcomes differed across residential programs that were operating with the same theories, intervention models, daily schedules, and staffing patterns. Why were youth from one residential program achieving a higher level of community adjustment compared with youth from another, but similar, program? Managers and agency evaluators realized that they had been operating on the assumption that the residential programs were delivering the same services in the same way to the same population. This was an assumption that required testing. The verification of the assumption began with a search for the documentation of the residential program model, including a description of the target population, planned services, anticipated client outcomes, program performance standards, the research supporting the model, and the program theory. This information was located in a number of documents (program eligibility requirements, procedure manual, contract, program description) but was not organized in a coherent and logical fashion. How the services were going to meet the identified client needs, and how those services would result in the expected outcomes, was unclear. The program specification process was drafted to address this agency need. The work of Wholey10,11 was drawn upon for guidance on how to extract and articulate program theory using a logic-modeling process. Developing a strong foundation in understanding the program theory was seen as, and continues to be, an important element before evaluation and quality improvement efforts are initiated.12,13 The program specification process was founded on the belief that program designs are best implemented when they are grounded in theory and/or when empirical evidence shows that the intervention approach is effective. Before the application of the program specification is presented, a brief description of the logic-modeling process is provided.
Logic Modeling

Program specification is a collaborative process whereby the various components of a program are laid out in a schematic way in order to make logical connections between the characteristics of the clients, the major components of the program and their associated activities, and the intended outcomes. Wholey10,11 and others have referred to this process as logic modeling. The Boysville logic-modeling process was built on the assumption that program staff must be collaboratively engaged in all program specification and evaluative processes if these activities are to be used effectively. Through a series of group interviews, the program manager, clinicians, and other program staff described their program. Boysville's Day Treatment Program was the first program developed with this program specification process. The model presented in Figure 1 is the third version of the program after a four-year implementation period; the process supports systematic and planful program modifications. Youth, family, and system conditions associated with the need for day treatment services are presented in column 1. The major components of the program are described in column 2, with the specific staff activities associated with each component listed in column 3. Columns 4, 5, and 6 outline the client outcomes at multiple points in time.
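For readers who maintain program specifications electronically, the column structure described above maps naturally onto a simple data structure. The sketch below is illustrative only and is not part of Boysville's information system; all class and field names, and the example entries, are hypothetical.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class ProgramComponent:
    """A major program component (column 2) and its associated staff activities (column 3)."""
    name: str
    activities: List[str] = field(default_factory=list)


@dataclass
class LogicModel:
    """Columns of a program logic model, from client conditions through long-range outcomes."""
    program_name: str
    client_conditions: List[str]        # column 1: youth, family, and system conditions
    components: List[ProgramComponent]  # columns 2 and 3: components and activities
    immediate_outcomes: List[str]       # column 4: at the end of the program
    intermediate_outcomes: List[str]    # column 5: 3 and 12 months after completion
    long_range_outcomes: List[str]      # column 6


# Hypothetical excerpt of a day treatment logic model
day_treatment = LogicModel(
    program_name="Day Treatment",
    client_conditions=["Youth assessed as medium-to-high risk need a community-based alternative to residential placement"],
    components=[ProgramComponent("In-Home (Phase II)", ["Family therapy sessions two times per month"])],
    immediate_outcomes=["80% of youth in the program at least 30 days complete their treatment goals"],
    intermediate_outcomes=["75% of successful completers do not recidivate within one year"],
    long_range_outcomes=["Youth remain in the community"],
)
```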
Model Developed in Four Steps

First, the specification of the program began with the formation of a team of six members. The team included clinicians, program staff, and the manager. The agency program evaluator facilitated the meetings. An individual not directly involved with the program, such as a manager from another program, is recommended as a facilitator. Boysville and other agencies using this process have discovered that facilitators who are not familiar with the services have greater latitude in asking clarifying questions and are more likely to refrain from assumptions; this has resulted in a more comprehensive and accurate program model.
Figure 1
Boysville of Michigan: Day Treatment Model. The logic model is laid out in six columns: (1) youth, family, and system conditions, including the program's boundaries (e.g., 1.6, the program was implemented to fill a gap in the service delivery system by offering in-home services as an alternative to out-of-home residential care); (2) program components (day treatment, Phase I, 4-8 months; ongoing tracking; in-home, Phase II, 4-8 months); (3) the staff activities associated with each component; (4) immediate outcomes at the end of day treatment, such as successful completion (4.1), educational gain (4.2), and families developing a network of positive formal and informal supports (4.5); (5) intermediate outcomes at 3 and 12 months after the end of day treatment, including avoiding recidivism to out-of-home placement (5.3); and (6) long-range outcomes.
Second, team members gathered all existing documentation about the program (i.e., contracts, budgets, license requirements, program descriptions, accreditation standards, recent profile reports, the previous logic model) and incorporated the documents into a resource notebook. Duplication of effort was minimized by building the logic model from a base of information available from historical and current program documentation. Additionally, this process of gathering the information served to familiarize all team members with the expected program standards and requirements. When the process was tried with scant documentation, participants were open to describing their preferred or "fantasy" program rather than the actual program they were required to provide.

In the third step, the facilitator began interviewing the team members in a focus group format. Open-ended questions were asked. The responses were used to complete each of the columns on the model. The "client and system conditions" column was completed first. Team members described who was eligible for service, client need, and the target and scope of intervention, and provided explanation as to why the program was added to the service delivery system. With this discussion, staff expressed their belief that intervening with the client's community would reduce the youth's risk of recidivism and sustain the effects of their treatment efforts. However, it was clear that resources were not in place to add "community" to the program. This boundary for intervening was explicitly stated in the model (see 1.6, Figure 1). The "outcomes" columns were completed next, before the program activities were described. This column completion sequence was found to be more effective than completing the columns in order from left to right. Team members were able to focus discussions on how the resolved needs of the clients were directly related to client change without introducing their own contribution to the process. Conversely, when the program model was designed from left to right, members spent considerable time justifying their services and were unable to untangle their service outcomes (in-home contacts, attendance at family sessions) from the client outcomes (improved family functioning). The outcomes should reflect a comprehensive list of those performance objectives required by the funder and those valued and mandated by the agency. For example, the immediate youth outcomes (e.g., program completion, return to a community-based education program, residence in a home setting, and educational gain) were drawn from the contractor (Michigan's state social service department). Based on the agency's value of family centeredness, team members identified family outcomes such as "families will develop a network of positive formal and informal supports" (see 4.5, Figure 1). For many program staff, this was the first opportunity to learn about the origins of each outcome and the expectations of the program. The program components and associated activities were described last. Team members summarized the services into three major program components (e.g., day treatment, ongoing tracking, in-home) and their more specific staff activities (e.g., family therapy sessions two times per month). As the columns were completed, drafts were sent to team members between subsequent meetings for review. This improved the efficiency of the meetings.

Finally, the detailed answers from the interview were organized into a written program narrative.
This narrative provided readers with a more detailed explanation and description of the various components of the model. Discussion and refinement of both the model and the narrative occurred at each meeting until consensus was reached. The resource notebooks were referenced when team members were in disagreement about issues such as a standard, the origins of an expectation, or a licensing requirement.
Application of the Program Specification Process

Next, applications of the program specification process for evaluation planning, monitoring of program operations, and meeting quality improvement requirements are described.
Evaluation Planning

After the logic model and narrative were completed, team members identified the information needed to answer three key evaluation questions: Who is being served? Are the services being delivered according to the program specification? Have the youth/families achieved the intended outcomes? At this point, required agency information needs associated with all of Boysville's long-term programs were integrated into the evaluation planning discussion. For example, youth and family members receiving services through the Day Treatment Program will be contacted by Boysville program evaluation staff at 3 and 12 months after program completion to gather information regarding their satisfaction with the Boysville program and their adjustment in the community. This data collection effort is mandatory for all of Boysville's long-term treatment programs.

During this evaluation planning discussion, team members requested measurement of almost all staff activities, explaining that all of their work was critical to the program and should be measured. However, Boysville co-workers have learned that data collection efforts fall short when the realities of their paperwork are juxtaposed with their daily work with clients. As a result, the program evaluator used the logic model to guide the group through a selection of key activities and outcomes. These decisions were based on external reporting requirements, internal evaluation needs, feasibility, and relevance to the program model. The data collection tools were selected based on their psychometric properties, utility, clinical relevance, programmatic necessity, intrusiveness, and feasibility. The final Day Treatment Evaluation Plan included one new measure developed specifically for the Day Treatment Program (Tracking Activity Sheet), a number of Boysville instruments (Intake Data Sheet, Youth Coping Inventory, Family Attachment and Changeability Index-8, Family Contact Sheet, Closing Data Sheet), and the standardized Woodcock-Johnson Achievement Battery-Revised to measure educational gain.
Monitoring Program Operations

Through the program specification process, the Day Treatment Program manager and program staff had identified the expected program activities and the program performance objectives. The logic model and narrative served as a template against which to compare the actual program performance with the program standards.10 After the data collection tools were selected, the data collection effort was initiated by program staff. Using the agency's client information system to verify, manage, and analyze the information, the program evaluator generated a Program Operations Report listing the program's performance indicators and results. A few of the indicators and results for the Day Treatment Program are presented in Table 1. The indicators are derived from multiple sources: the program theory, agency-based research, empirical support from the field, and agency values.

The first indicator describes the number and percentage of face-to-face direct contacts with the family held in the family home (75%, Table 1). Providing services to the family in the family home is an expectation that is drawn from the program theory and internal research.5 The Day Treatment Program, built on an ecological framework, draws from a systems approach and is grounded in the agency's practice principle of family centeredness. Additionally, internal research on the association between face-to-face family contacts and youth adjustment in the community one year after program completion supports this indicator as a measure of effective practice. The second indicator describes the number of youth leaving the facility (program) without permission (truancy incidents from facility = 3, Table 1). The number of truancy incidents is presented along with the number of youth returned to the program following a truancy incident (one youth returned). Each program has a truancy prevention and recovery plan. The program staff understand that truancies can occur and view them as a symptom of other issues with which the youth may be struggling. Staff are expected to locate and return the youth to the program. Program completion rate is the third
Table 1
Day Treatment Indicators and Results

Face-to-face direct contacts held in the home (number in home of number face to face): 75% (146 of 195)
Truancy incidents from facility: 3
Truanted youth returned to program: 1
Program completion rate: 91% (10 of 11)
Of those who completed the program, youth returned to parents, relatives, adoptive family, foster home, or supervised independent living: 100% (10 of 10)
indicator and describes the number and percentage of youth leaving the program after completing all or most of their treatment goals (91%, Table 1); this indicator describes the number of "successful program releases," or Boysville graduates. Of those youth who completed their goals, the last indicator presents the number and percentage who were returned to or maintained a home setting (i.e., parents, relatives, adoptive family, foster home, or supervised independent living).

In addition to the internal operations report presented in Table 1, a separate report was generated for the Day Treatment Program's contract renewal meeting describing the program's performance for the past nine months. The tracking activity section of this report is presented in Tables 2 and 3. For example, program staff track the youth's whereabouts after he or she leaves the supervised day program (Phase I) and returns home for the evening and on weekends. In Phase I, a minimum of one contact is expected per day, with half of these contacts expected to be face to face. During the in-home phase (Phase II), fewer contacts are expected as the youth begins to implement his or her community adjustment plan and the caretakers (parents) resume full responsibility for supervision. This section of the report indicates that the actual face-to-face tracking activity conducted by the program staff (Table 2) far exceeded the contractor's expectations, whereas the number of phone contacts in Phase II was less than promised (Table 3). The contractor and the manager used the logic model and the report to review the previous year's operations, negotiate new performance objectives, and confirm the program design for the upcoming year. The logic models and program operation reports have been well received by contractors, licensing consultants, and accreditation reviewers.
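As an illustration of how an operations report of this kind can be assembled, the short sketch below computes a few Table 1-style indicators from contact and release records. It is a minimal example, not Boysville's actual client information system; the record fields and category labels are assumptions.

```python
from typing import Dict, List


def percent(numerator: int, denominator: int) -> str:
    """Format an indicator as 'NN% (n of N)', guarding against an empty denominator."""
    if denominator == 0:
        return "n/a (0 of 0)"
    return f"{round(100 * numerator / denominator)}% ({numerator} of {denominator})"


def operations_report(contacts: List[Dict], releases: List[Dict]) -> Dict[str, str]:
    """Compute a few Day Treatment-style performance indicators (hypothetical record layout)."""
    face_to_face = [c for c in contacts if c["face_to_face"]]
    in_home = [c for c in face_to_face if c["location"] == "family_home"]
    completed = [r for r in releases if r["completed_goals"]]
    home_setting = [r for r in completed if r["placement"] in
                    ("parents", "relatives", "adoptive", "foster_home", "supervised_independent_living")]
    return {
        "Face-to-face direct contacts held in the home": percent(len(in_home), len(face_to_face)),
        "Program completion rate": percent(len(completed), len(releases)),
        "Completers returned to or maintained in a home setting": percent(len(home_setting), len(completed)),
    }
```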
Meeting Quality Improvement Requirements

Appropriateness of services and effectiveness are two important quality improvement domains that necessitate systematic review.15 The program specification process was a precursor to this effort. Routinely, Boysville staff monitor who is served in comparison with the target population and explore outcome differences among the subgroups served. Within the first year of the Day Treatment Program's implementation, a number of youth were terminated from the program due to unexcused absences and inappropriate behavior. This contributed to an unacceptably low program completion rate. In response, the manager referenced the logic model to clarify the target population and expressed concern about a possible mismatch between who was accepted into the program and the intended population. Specifically, was the program accepting youth with a higher risk for recidivism? Were the youth appropriate for an in-home, community-based program? Which youth were leaving the program early and in an unplanned way? After a brief analysis, the youth admitted to the program were determined to be appropriate. In the past four years, the program design has been altered systematically to better serve these youth. Major program changes included an increase and intensification of in-home family contacts and a graduated system of consequences and rewards for youth behavior. Today's program is presented in Figure 1.
Table 2
Tracking Activity by Phase: Day Treatment Phase

Contract standard: Minimum of one daily contact (12 youth * 270 days * 1 contact per day) = 3,240 contacts. Actual tracking: 4,078 contacts.
Contract standard: 50% of Phase I tracking contacts shall be face to face. Actual tracking: 3,395 contacts (83%) were face to face.

Table 3
Tracking Activity by Phase: In-Home Phase

Contract standard: Average of three to four face-to-face contacts per week (6 youth * 36 weeks * 4 contacts) = 864 contacts. Actual tracking: 2,696 face-to-face contacts.
Contract standard: Average of three to four phone contacts per week, or as needed (6 youth * 36 weeks * 4 contacts) = 864 contacts. Actual tracking: 359 phone contacts.
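The contract standards in Tables 2 and 3 are simple products of caseload, service period, and contact frequency, so the comparison of promised versus actual activity can be scripted. The following sketch reproduces that arithmetic; the function names are illustrative and not part of any Boysville tool.

```python
def contract_standard(youth: int, periods: int, contacts_per_period: int) -> int:
    """Promised contact volume, e.g., 12 youth * 270 days * 1 contact per day = 3,240."""
    return youth * periods * contacts_per_period


def compare(label: str, standard: int, actual: int) -> str:
    """Report actual tracking activity against the contract standard."""
    status = "meets or exceeds" if actual >= standard else "falls short of"
    return f"{label}: {actual:,} actual vs. {standard:,} promised ({status} the standard)"


# Figures drawn from Tables 2 and 3
print(compare("Phase I daily contacts", contract_standard(12, 270, 1), 4078))        # 3,240 promised
print(compare("Phase II face-to-face contacts", contract_standard(6, 36, 4), 2696))  # 864 promised
print(compare("Phase II phone contacts", contract_standard(6, 36, 4), 359))          # 864 promised
```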
In addition to these aggregate quality improvement activities, the logic models and narratives were used to identify indicators and standards for random single-case quality improvement reviews.

Developing Consensus and Continuity

Kumpfer and associates found that the logic model has been useful as a vehicle for developing consensus among direct services providers, program managers, and administrators.14 This finding was echoed at Boysville. After completing the program specification process, program staff and managers reported feeling clearer about what was expected of them in relationship to the overall program. With this process, managers and clinicians had reached a common understanding and knowledge base as to why the clients were in need of particular services, which services were expected, and what outcomes were anticipated.

As managers and program staff join and leave program teams, procedures and practices tend to change slightly. Over time, the program models drift away from the intended design. This program model drift is a common concern for Boysville and other agencies experiencing a high degree of staff turnover. To prevent this problem, completed program specifications were used to orient new staff to the program model. Additionally, program teams formally review their program models annually to reestablish a clear understanding of the program theory, target population, and agency and contractual service delivery standards, and to determine if the clients' needs or contractors' expectations have changed in the past year.
Implications for Behavioral Health Services

After specifying and implementing over 17 distinct program models at Boysville, a number of quality improvement implications have surfaced. First, Boysville is in a position to determine program integrity, effectiveness, and efficiency. This includes building the capacity to meet many accreditation, contract, and licensing requirements and to establish new managed care contracts. In addition, the logic models and narratives have been useful in redesigning Boysville's client information system. As part of the software requirements analysis, the program specifications clearly identified the client conditions, program services, procedures, and client outcomes to include in the agency's new computerized client information system (designed by Anasazi Software).
However, it is important to acknowledge that the processes described above do not guarantee that the services provided are of high quality or that the processes will be followed without routine monitoring. For example, the specifications have been helpful for evaluation planning and report design, but the timely and accurate completion of data collection tools is still a challenge that affects managers' ability to utilize the report findings for monitoring program operations and improving services. Additionally, the application of the program specification was designed to support the integration of accountability with service delivery. This integration is developmental. When designing a program specification for a currently implemented program, the facilitator should focus on first articulating the program theory rather than instilling accountability; challenging the staff about their services and outcomes is counterproductive at this early stage. Only after a service pattern or outcome trend has been established should the program staff be held accountable for their outcomes and the strategies used to improve those outcomes. In conclusion, Boysville's ongoing investment in program specification makes a significant contribution to service quality improvement and sets the stage for continuous program evaluation.
References

1. Council on Accreditation: 1997 Self-Study Manual for Behavioral Health Care Services and Community Support and Education Services. Volume I: Generic and Service Sections. New York: Council on Accreditation of Services for Families and Children, 1997.
2. Curtis LA: Lord, how dare we celebrate? Practical policy reform in delinquency prevention and youth investment. Future Choices: Toward a National Youth Policy 1992; 3(3):44-61.
3. Savas SA: How do we propose to help children and families? In: Pecora PJ, Seelig WR, Zirps FA, et al. (Eds.): Quality Improvement and Evaluation in Child and Family Services: Managing Into the Next Century. Washington, DC: Child Welfare League of America, 1996, pp. 37-52.
4. Grasso A, Epstein I: Management by measurement: Organizational dilemmas and opportunities. Administration in Social Work 1987; 11:89-100.
5. Savas SA, Epstein I, Grasso AJ: Client characteristics, family contacts, and treatment outcomes. Child and Youth Services 1993; 16:125-137.
6. Collins M, Epstein I, Barbarin O, et al.: Re-designing a clinical information system: A description of the process in a human service agency. Computers in Human Services 1996; 13:19-36.
7. Pecora PJ, Fraser MW, Nelson KE, et al.: Evaluating Family-Based Services. New York: Aldine de Gruyter, 1995.
8. Zirps FA, Cassafer DJ: Quality improvement in the agency: What does it take? In: Pecora PJ, Seelig WR, Zirps FA, et al. (Eds.): Quality Improvement and Evaluation in Child and Family Services: Managing Into the Next Century. Washington, DC: Child Welfare League of America, 1996, pp. 145-174.
9. Traglia JJ, Massinga R, Pecora PJ, et al.: Implementing an outcome-oriented approach to case planning and service improvement. In: Pecora PJ, Seelig WR, Zirps FA, et al. (Eds.): Quality Improvement and Evaluation in Child and Family Services: Managing Into the Next Century. Washington, DC: Child Welfare League of America, 1996, pp. 77-98.
10. Wholey JS: Evaluation and Effective Public Management. Boston: Little, Brown, 1983.
11. Wholey JS: Assessing the feasibility and likely usefulness of evaluation. In: Wholey JS, Hatry HP, Newcomer KE (Eds.): Handbook of Practical Program Evaluation. San Francisco: Jossey-Bass, 1994, pp. 15-39.
12. Weiss CH: Theory-Based Evaluation: Past, Present and Future. Paper presented at the annual meeting of the American Evaluation Association, Atlanta, GA, November 8, 1996.
13. Chen H, Rossi P: Using Theory to Improve Program and Policy Evaluations. New York: Greenwood, 1992.
14. Kumpfer KL, Shur GH, Ross JG, et al.: Measurement in Prevention: A Manual on Selecting and Using Instruments to Evaluate Prevention Programs. DHHS CSAP Technical Report-8. DHHS Pub. No. (SMA) 93-2041. Rockville, MD: Center for Substance Abuse Prevention, 1993.
15. An Introduction to Quality Improvement in Health Care. Oakbrook Terrace, IL: Joint Commission on Accreditation of Healthcare Organizations, 1991.