CHAPTER 8
Development and use of an indicator system for an addiction treatment centre
Udo Nabitz, Jan Walburg
Published as: Nabitz, U. & Walburg, J. (2002). Development and use of an indicator system for an addiction treatment centre. International Journal of Health Care Quality Assurance, /j, 49-5
Abstract

The objective of this project was to develop and implement an indicator system which delivers concise, specific and relevant information for the managers of treatment departments. The project was carried out in the Jellinek Centre, a treatment service for addiction in Amsterdam, the Netherlands. During a period of ten years the indicator system, called the Profile Package, was developed and implemented in all departments. The result of the project is a system and an infrastructure that produces a Profile Package for each team every quarter. Five critical indicators are defined and more than 40 additional indicators are used by the departments. On the basis of these indicators the performance of a team can be monitored. An annual review cycle and goal-finding procedure has been established, which stimulates result orientation. The trend information of the five critical indicators shows that the results of the departments are positive.
Introduction

There is a growing emphasis on the results and performance of health care organizations. The first pioneer in this field was Ernest Codman, who in 1920 established the term "The End Result Clinic" (Codman, 1996). He proposed that the results of a clinic should be monitored and analyzed in order to improve the work of the professionals and the processes of the organization as a whole. It took many years before the ideas of Codman were taken seriously. Since 1990 we find studies on performance indicators and improvement strategies of hospitals. Most of the publications discuss related conceptual and technical topics (Aveyard, 1997; Shaw, 1996). Some describe pilot projects of hospitals (Garnick, 1994; Turpin, 1996) or give examples of improving ambulatory care (Norman, 1996). The Maryland Indicator Project is the only project in which the idea of Codman is systematically followed through on a broader scale (Kazandjian, 1992). However, the publications on the Maryland Indicator Project are limited and mostly related to technical and organizational problems (Kazandjian, 1996). Also the experiences with the nationwide indicator systems of the Joint Commission, the IMSystem and ORYX, are not yet disseminated broadly (Joint Commission, 1996). Besides those large projects, a few experiments with indicator systems have been carried out by small organizations (Sorensen, 1987). Those projects focus on indicator systems for treatment departments (Colson, 1994; Es, 1995). One of those indicator projects is the Profile Package of the Jellinek Centre in Amsterdam, the Netherlands, which is described in this article.
Objective

An indicator system had to be developed and implemented that could deliver concise but relevant information to support result-oriented management of the departments of the addiction treatment centre. The management also requested that the reporting should be frequent and that the statistics should be easy to understand and based on the existing information systems.
Setting

The Jellinek Centre is a treatment service specialized for people with addiction problems. Every year about 5,000 patients are treated by a staff of 500 professionals. There are 15 services, ranging from prevention and counselling departments through outpatient and detoxification programmes to therapeutic communities. The annual budget of the centre is about 25 million Euro (Directorate Jellinek Centre, 1998).
The top management has, like many health care organizations, delegated much responsibility and authority to its clinical departments. This means that the managers of these departments are responsible for both results and resources. Responsibility and authority can only be brought into practice in a constructive way if relevant and objective information is available. Therefore an indicator system was needed.
Approach and results

Phases and project organization

The project was carried out in three phases, which are illustrated in Figure 1. In the first phase several experiments were started to test a prototype and the approach; we limited ourselves to three departments. In the second phase we developed an electronic data processing system and used it for the data processing of six departments, and in the third phase the tested indicator system was introduced in all departments of the centre.

Figure 1: Project phases. After a pre-phase, the first (experimental) phase with three departments ran from 1988 to 1992, the second (development) phase with six departments from 1992 to 1994, and the third (implementation) phase with fifteen departments from 1994 to 1999, followed by a post-phase.
A small team was in charge of the project and remained consistent over time. The members had work experience in the treatment centre in various positions, such as system management, research, administrative management and patient work. The team had the full support of both the top management and the management of the departments. The commitment of both top and department management and solid project planning proved to be very important for the success of the project. During some phases a data manager, an external consultant and a software firm supported the team.
Experimental phase

Information infrastructure and process model
In 1989 a study was conducted to specify the mission and to clarify the functions, processes and infrastructure of the treatment centre. One of the results of this study was a conceptual process model distinguishing between primary processes, support processes and steering processes. Another result was the architecture for the information systems and the databases (Verkooyen, 1993). This process model, in combination with the detailed architecture for the information systems and the databases, became the basis for the indicator system. The architecture for the information system gave stability through the different organizational changes and was used as a blueprint to set up a computer network and to introduce new patient registration, personnel and financial systems.

Specifying indicators for three departments
In order to specify indicators, interviews were held with the department managers to gain insight into their information needs. We limited the interviews to three departments which were well organized and had a substantial turnover of patients. The managers wanted to have productivity figures, statistics on the target groups, results and figures on personnel and costs. In other words, they wanted to see integrated information that included indicators on input, process, output and resources specific to their own department. Above all, they wanted limited, relevant, reliable and readable information with a standardized layout, delivered at predefined intervals. They were very clear that all the information had to be retrieved from the databases already available and that introducing a new information system and additional data collections would not be accepted. The needs of the managers were analyzed in the light of the existing information systems. The information systems and the databases could in principle answer the needs of the department management, but only a fraction of the required data was actually available in the databases. Therefore we started an improvement project for the data administration and data entry, which lasted several years. Although we knew that data would be incomplete, we defined an ideal set of 43 indicators to start with. At that point in time the databases only had reliable data available for about 20 indicators; for the rest of the indicators the data was missing.

Prototype
On the basis of the needs of the department management and the specified indicators we prepared a prototype of a report for one department. That prototype was a data sheet with 43 indicators. The indicators were grouped into categories such as admission, referral and target group (Table 1), and a standard timeframe of three months was introduced, which would meet the need of the managers to have frequent, but not too frequent, information available.
Table 1: List of 43 indicators defined in the experimental phase

Admission: total admission (1), short admission (2), long admission (3)
Referral (from): internal referral (4), external referral (5), no referral (6)
Target group: regional percentage (7), percentage women (8), percentage Dutch (9)
Substance abuse: alcohol (10), drugs (11), medicine (12), gambling (13), combination (14)
Treatment results: total discharge (15), regular discharge (16), number of dropout (17), disciplinary discharge (18), percentage regular discharge (19)
Short admissions: number of discharge (20), time in programme (21)
Long admissions: number of discharge (22), time in programme (23)
Referral (to): internal referral (24), external referral (25), no referral (26)
Productivity: bed occupied (27), percentage occupation (28), absentee days (29), percentage absentees (30)
Services: counselling (31), advice (32)
Personnel: number of staff (33), contract days (34), staff turnover (35), sick leave events (36), sick leave days (37), percentage sick leave (38)
Costs: medication (39), household material (40), other material expenses (41), personnel (42), total costs (43)
The prototype was discussed with the department managers, adjusted and called the Profile Package (Donker, 1984). The prototype was a two-dimensional sheet with the indicators on the vertical axis and five calendar quarters plus the cumulative annual results on the horizontal axis. This format was used to develop spreadsheets and to report to three departments on a quarterly basis, and each report was discussed with the managers (Nabitz, 1984). During the whole project of ten years we followed a consistent protocol: each quarter a Profile Package was delivered, presented and discussed. The discussions with the department managers were informative and helpful for the managers and the project team. Next to the quarterly discussion we introduced an annual review of the indicators. That means the indicators remained stable for one year but were redefined in the autumn for the next year. That gave stability and flexibility. There was also a report to the top management of the Jellinek Centre in an aggregated format. The agreement was made that during the experimental phase no sanctions or rewards from the top management would be given; the emphasis was on learning to work with an indicator system. The experiences of the first phase, such as the indicator set and the prototype, were presented during a conference about the Profile Package to which other Dutch addiction treatment services were invited (Nabitz, 1993).
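To make the format of the prototype concrete, the sketch below models such a data sheet in Python: one row per indicator, with a cell for each of the five calendar quarters and a cumulative annual column, and with missing data left visibly blank. This is an illustration only; the quarter labels, indicator rows, figures and field names are assumed, not taken from the Jellinek databases.

```python
from dataclasses import dataclass, field
from typing import Optional

# Columns of the prototype data sheet: five calendar quarters plus the
# cumulative annual result (labels are illustrative).
QUARTERS = ["1997-Q4", "1998-Q1", "1998-Q2", "1998-Q3", "1998-Q4"]

@dataclass
class IndicatorRow:
    number: int                                   # position in the list of 43 indicators
    name: str                                     # e.g. "Total admission"
    values: dict = field(default_factory=dict)    # quarter -> value; absent if missing
    cumulative: Optional[float] = None            # cumulative annual result

def render_row(row: IndicatorRow) -> str:
    """Render one line of the data sheet; missing data stays visibly blank."""
    cells = [f"{row.values.get(q, ''):>8}" for q in QUARTERS]
    cum = "" if row.cumulative is None else row.cumulative
    return f"{row.number:>3} {row.name:<28}" + "".join(cells) + f"{cum:>10}"

# Illustrative rows only; in the experimental phase roughly 20 of the 43
# indicators had reliable data, the rest were reported with gaps.
rows = [
    IndicatorRow(1, "Total admission", {"1998-Q1": 32, "1998-Q2": 33}, cumulative=126),
    IndicatorRow(20, "Dropout", {"1998-Q1": 5}, cumulative=20),
]
for r in rows:
    print(render_row(r))
```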
Lessons learned
We learned in the experimental phase that a stable technical infrastructure is necessary from the very beginning; during the whole project the infrastructure was the backbone of the indicator system. Next, we managed to keep the project small, did not get lost in technical, administrative and organizational problems, and spent time establishing a dialogue with the teams. The use of a prototype of the Profile Package with incomplete data helped to shape the expectations of the management and made clear how the final reporting would look.

Development phase

Profile Packages for six departments
After the work with the prototype of the Profile Package in three departments, we included six departments in the next phase. We realized that the spreadsheet application we had been using was not the right tool for reporting 43 indicators in five different timeframes for six departments. The data management became very difficult and time consuming, and the data integrity of the figures concerning each department at each point in time was not guaranteed. 1,290 data points (43 indicators × 5 timeframes × 6 departments) had to be handled, and mistakes were not acceptable. Therefore a database management system was introduced. The documentation, registration and data entry had improved during the experimental phase but still needed attention. Therefore we started with administrative feedback and control mechanisms to focus on the timeliness, reliability and validity of the data. New forms for the documentation and registration (Nabitz, Veldhuizen, & Warlee, 1993) were implemented and protocols for delivery and auditing were made. Control figures were used in order to check the steps of the data processing from the first registration of an event to the reporting for the teams.
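As a minimal illustration of such a control mechanism, the sketch below compares a control figure from the source registration with the count after each processing step, so that lost or duplicated records are flagged before the report goes out. The function, step names and figures are hypothetical; the article does not describe the actual control procedure in this detail.

```python
def check_control_figure(step_name: str, source_count: int, processed_count: int) -> bool:
    """Compare a control figure from the source registration with the count
    after a processing step; any difference signals lost or duplicated records."""
    if source_count != processed_count:
        print(f"CHECK FAILED at '{step_name}': source={source_count}, processed={processed_count}")
        return False
    print(f"OK at '{step_name}': {processed_count} records")
    return True

# Illustrative chain: registration forms -> data entry -> quarterly report.
registered_discharges = 30   # counted on the paper registration forms
entered_discharges = 30      # rows entered into the database
reported_discharges = 29     # total appearing in the draft report

check_control_figure("data entry", registered_discharges, entered_discharges)
check_control_figure("report total", entered_discharges, reported_discharges)
```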
Development of PROFIS

Through our work with the six departments the complexity of the data management grew. In order to handle the technical issues a software house built an information system to structure the interfaces, manage the data and calculate the indicators for the departments. The interfaces were, up to that point, manual interfaces, meaning that the data was delivered on paper or on floppy disk. Those interfaces had to be automated, as shown in Figure 2. Three functions were separated in the system. First, the raw data had to be checked and stored in a source database. Then the data had to be recalculated according to specific rules in order to arrive at indicator values; for example, the dropout percentage is the quotient of the number of dropouts and the number of discharges. Finally, the indicators with the data values of each timeframe and each department had to be stored in a database to be able to report fast. The
database management system was named PROFIS. The interface and logical structure are shown in Figure 2.

Defining statistics
While developing the PROFIS system we experimented with weights and risk adjustments of the statistics reported in the Profile Package. Our guiding principle was that the value of an indicator should be evident and not difficult to explain to the department managers and the professionals. We eventually decided to use simple statistics such as counts, e.g. the number of patients admitted in a period, the percentage of sick leave days, money spent, and the number of days in treatment. We also added the precise definitions of all indicators on the back of the Profile Package, such as:

Short admission: The number of patients admitted in the reporting period to the short treatment programme of the department.

Sick leave days: The number of days of all sick staff members in the reporting period, as defined by the Personnel Department.
Figure 2: Data interface and PROFIS. The interfaces deliver the raw data to a source database; rules recalculate the source data into indicator values; the indicators are stored per department and timeframe for reporting.
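The sketch below illustrates the three separated functions in simplified form: checked raw data go into a source store, rules recalculate the source data into indicator values (with the dropout percentage as the quotient of dropouts and discharges, as in the example above), and the resulting values are stored per department and quarter for fast reporting. The data model, rule names and figures are assumptions for illustration and do not reproduce the actual PROFIS implementation.

```python
# Simplified sketch of the three functions described for PROFIS:
# (1) store checked source data, (2) apply rules to derive indicator values,
# (3) keep the results per department and quarter for fast reporting.

source_db = {}      # (department, quarter) -> raw counts from the interfaces
indicator_db = {}   # (department, quarter) -> calculated indicator values

def load_source(department: str, quarter: str, raw: dict) -> None:
    """Basic check before storing: counts must be non-negative integers."""
    if any(not isinstance(v, int) or v < 0 for v in raw.values()):
        raise ValueError(f"Rejected source data for {department} {quarter}")
    source_db[(department, quarter)] = raw

# Each rule turns raw counts into one indicator value.
RULES = {
    "total_discharge": lambda d: d["regular_discharge"] + d["dropout"] + d["irregular_discharge"],
    "dropout_percentage": lambda d: round(
        100.0 * d["dropout"] / (d["regular_discharge"] + d["dropout"] + d["irregular_discharge"]), 1
    ),
}

def calculate_indicators(department: str, quarter: str) -> dict:
    raw = source_db[(department, quarter)]
    values = {name: rule(raw) for name, rule in RULES.items()}
    indicator_db[(department, quarter)] = values
    return values

# Illustrative figures only.
load_source("Department A", "1998-Q4",
            {"regular_discharge": 16, "dropout": 6, "irregular_discharge": 2})
print(calculate_indicators("Department A", "1998-Q4"))
# -> {'total_discharge': 24, 'dropout_percentage': 25.0}
```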
Communication with the department management
While working on these technical issues we continued producing the Profile Package each quarter. We discussed each delivered Profile Package with the department managers and collected their suggestions. During the talks with the management it also became clear that they wanted to introduce performance goals for each indicator and wanted to calculate prognoses for each indicator. We considered and discussed these suggestions, decided on them during the annual review meeting and developed a prototype for goals and prognoses. During the development phase the project became so complex that it exceeded the resources available to the treatment centre. The data management took more time than expected, the data processing appeared more complex, and the department managers became more demanding, mainly because they saw the importance and relevance of the information they were receiving on a regular basis. At the same time there were still some departments waiting for their own Profile Package. Therefore we applied for a grant from a national programme designed to stimulate medical information systems and received financial support for three years.

Lessons learned
In the phase of developing the Profile Package we learned never to deliver data to the department managers without comment. Communication and discussion about their results in an active and creative way was crucial. That took time and patience, but it was the only way to understand the needs of the managers and to motivate the staff to use indicators. We were only able to handle all the data of the departments by building a database management system which had interfaces with the other database systems, managed the integrity of the data and produced speedy output.

Implementation phase

Changed environment
A major change of the organizational structure of the addiction treatment services in Amsterdam took place in 1994. New departments were added to the centre and the departments were grouped into divisions. Therefore the position and function of the Profile Package in the new organization had to be redefined. In a policy document (Nabitz, Veldhuizen, & Warlee, 1993) the responsibility and authority of the department managers concerning the results of the Profile Package were specified and aligned with the new structure.

Profile Packages for fifteen departments
During the organizational changes the delivery of a Profile Package was possible because of the information system PROFIS which was flexible enough to redefine the
units and parameters to adapt the system to the new organizational structure. PROFIS also made it possible to increase the number of services from six to fifteen departments and to produce Profile Packages for the three divisions. The data processing was efficient and fast. With the original spreadsheet application we needed several weeks to refresh the data and to deliver output, but with PROFIS the total production for the departments and the divisions took two days. Auditing the data from the information systems remained a time-consuming task, but after many talks with the system administrators the data was delivered within 10 working days after the end of the reporting period. Eventually it was possible to deliver a Profile Package for each department in the third week after the reporting period.

Deviation profile and Goalfinder
During the implementation all department managers of the treatment centre became familiar with the reporting format and began to trust the information delivered to them. They started to feel comfortable with two years of data, broken down into quarters, in their hands. The managers opened discussions about the results in their teams and began to set goals. They started to discuss targets for their productivity, for their patient groups and for their financial results. Once the targets were formulated, we were able to calculate the deviation from the target for each indicator and thus show the positive or negative deviation in a figure which we called the Deviation Profile. With a glance at the Deviation Profile one could see how close a department was to its targets, as shown in Figure 3. In the beginning it was important that the top management did not impose targets but that the department managers first discussed the goals in their teams. Later the department managers used the information of their Profile Package for their quarterly report. In this way an effective bottom-up reporting system was established. Soon after the targets were introduced and the Deviation Profile became part of the Profile Package, there was a need to align the targets of the different teams. We developed a method which was called the Goalfinder. Once a year all managers were invited to a meeting in which they could compare and align their department targets. A computer-driven overhead projection was used, which made it possible to derive from the targets of the individual departments the goals of the divisions and of the whole treatment centre. In the Goalfinder meetings all levels of management - top, division and department managers - were represented. The meetings became, next to the budget and policy meetings, one of the most important annual meetings for the management.
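As a concrete illustration of the two instruments, the sketch below computes the deviation of a result from its target, clipped to the -150% to +150% scale of the Deviation Profile, and rolls department targets up to division and centre level in the spirit of the Goalfinder. The exact calculation rules of the Jellinek are not specified in the article; the department names and figures are illustrative.

```python
def deviation_percentage(actual: float, target: float, cap: float = 150.0) -> float:
    """Deviation of a result from its target, in percent, clipped to the
    +/-150% scale used in the Deviation Profile (Figure 3)."""
    if target == 0:
        return 0.0
    deviation = 100.0 * (actual - target) / target
    return max(-cap, min(cap, deviation))

# Goalfinder idea: department targets are aligned and rolled up to
# division and centre level. Names and figures are illustrative.
department_targets = {
    "Division Clinics": {"Detox": {"admissions": 120}, "Long stay": {"admissions": 45}},
    "Division Outpatient": {"Counselling": {"admissions": 300}},
}

def roll_up(targets: dict) -> dict:
    """Sum department targets per indicator into division and centre goals."""
    centre, divisions = {}, {}
    for division, departments in targets.items():
        div_total = {}
        for goals in departments.values():
            for indicator, value in goals.items():
                div_total[indicator] = div_total.get(indicator, 0) + value
                centre[indicator] = centre.get(indicator, 0) + value
        divisions[division] = div_total
    return {"divisions": divisions, "centre": centre}

print(deviation_percentage(actual=27, target=30))   # -> -10.0
print(roll_up(department_targets))
```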
Figure 3: Deviation Profile. The figure shows the deviation profile of the first 29 indicators of De Obrechtstraat, a treatment programme of the Jellinek, on a scale from -150% to +150% around the goal. The indicators are grouped into admissions: total admissions (1), long stay admissions (2), short stay admissions (3); referral from: internal service (4), external service (5), no referral (6); target group: regional percentage (7), percentage women (8), percentage non Dutch (9); substance abuse: alcohol (10), opioids (11), cocaine and stimulants (12), medication and pills (13), amphetamines (14), cannabis (15), gambling (16), not otherwise specified (17); treatment results: total discharge (18), regular discharge (19), dropout (20), irregular discharge (21), dropout percentage (22); long stay treatment results: discharge long stay (23), time in programme (24); short stay treatment results: discharge short stay (25), time in programme (26); referral to: internal service (27), external service (28), no referral (29). The lines are a combination of the year goal and the year prognosis. Five indicators (5, 11, 12, 13, 15) are positive, which signals that the year goal will be exceeded by more than 150%. One indicator (6) is negative, which means the year goal will not be achieved (below -150%). Most indicators meet the goal of the year; they vary around the goal line in the middle.

Merging information systems and reviewing the approach

Fundamental changes to the information systems were introduced in the Jellinek Centre in 1996. Four systems covering the primary care process of the clinics were
replaced by one integral information system. Although the change-over took much longer than expected, we could eventually reduce the number of interfaces from eight to four, which was a big advantage for the production of the Profile Package. After the successful migration to the new system and several years of experience with the indicator system we decided to review our approach and our work systematically. First, we invited colleagues from other organizations to check and audit our procedures and methods and asked for recommendations for improvements. These meetings were very constructive and brought forth many suggestions. One of these suggestions was to include data from non-automated databases, such as data from patient satisfaction surveys or data concerning the results of the clinical work. Colleagues also suggested further automating the data processing and moving away from paper output to an intranet site. Secondly, we developed a questionnaire and asked the 28 managers and staff members who received the Profile Package to evaluate it. The survey was anonymous in order to stimulate critical responses. Everybody responded and the answers gave good insight into the use and value of the Profile Package. The majority of the users were pleased with the indicator system and felt that the Profile Package was necessary for responsible and result-oriented management. The managers of the departments said that working with indicators is a good way to help them and the team manage facts and focus on targets. In the questionnaire we also asked the managers of the departments to list the critical indicators which should be used for all departments. The following five indicators were selected: admission rate, productivity, dropout, sick leave of the personnel and one financial parameter. Those five indicators (Table 2) were to become the standard for all departments and a basis for future benchmarking.
Table 2: Five critical indicators. Example of a long stay treatment department (IMC)

Indicator              2nd quarter 1998   Results 1997   Goals 1998   Prognoses 1998
Admissions                          19             96          120               85
Dropouts                             2             11           18               19
Bed occupancy                     1478           5973         5850             5932
Staff sick-leave (%)               12%             8%          10%              12%
Costs in guilders           219 054 fl     847 694 fl   827 667 fl       871 049 fl
Figure 4: Example of a Profile Package

Front page: Profile Package, Departments of the Jellinek Centre, programme De Obrechtstraat, 4th quarter 1998, with the name of the programme leader. The front page reports the five critical indicators - total admissions (1), dropout (2), occupancy (3), sick leave percentage (4) and total direct costs (5) - in four columns: 4th quarter 1998, results 1997, goals 1998 and prognosis 1998.

The results of the 5 critical indicators of the treatment programme are given on the first page of this Profile Package. On the datasheet, which comprises the next two pages, the trend information over 5 quarters concerning the 71 indicators of the treatment programme is presented. On the last page the definitions of the indicators are listed. The Profile Package has two additions: first the deviation profile and second a listing of the missing values.

Department Medical Informatics, The Jellinek Centre, October 1998

Datasheet: the 71 indicators are reported for the 4th quarter of 1997, the four quarters of 1998, the results of 1997, the goals for 1998 and the prognosis for 1998. The indicators are grouped into admissions, referral from, target group, substance abuse, treatment results, long stay treatment results, short stay treatment results, referral to, productivity, productivity telephone, personnel, costs, management indicators, client satisfaction and clinical improvements.

Some examples of the definitions of the indicators, which are listed on the fourth page of the Profile Package:
1) total admissions: The number of clients admitted for treatment in the reporting period.
9) percentage non Dutch: The percentage of clients whose place of birth is Suriname, the Antilles, Turkey or Morocco, or whose parents were born in those countries.
20) dropout: The number of clients who terminated the treatment earlier than planned.
42) total direct costs: The sum of all expenses, excluding projects, infrastructure and central overhead, which are charged to the programme by the financial department (KOSO 95 999).
50) information at intake: The percentage of clients who were much pleased or pleased with the information given about the treatment during the intake.
69) family: The percentage of clients who came with family problems and who state that there has been improvement on that ASI domain through the treatment.
Profile Package layout
The review of the approach resulted in an improved layout of the Profile Package. In Figure 4 the Profile Package of the inpatient treatment programme is shown, which is composed of four pages. On the front page are the department name, the name of the manager of the department, the reporting period and the results of the five critical indicators. The results are shown for the last quarter, together with the annual results of the previous year, the targets and the prognoses. On the front page there is also a short comment text. The following two pages of Figure 4 form the datasheet, with five columns of quarterly results and three columns with the results of the previous year, the targets and the prognoses. The datasheet is different for each department; several departments have added statistics on patient satisfaction and clinical effect measurements. On the back of the Profile Package all definitions of the indicators can be found, which makes the interpretation of the data easier. There is always an added A4 with the deviation profile, which is shown in Figure 3. The layout of the Profile Package has become a standard in the organization; many of the reports are aligned with the format and the layout of the Profile Package.

Example of improvements
The goal of the project was to deliver limited but relevant information to support result-oriented management. In Figure 5 the results on the five critical indicators of the detox department over five years are presented. We can see that the sick leave of the staff was reduced by almost 70%, that the dropout rate of the patients was reduced by 40% and that the costs were reduced by 30%. At the same time the number of admissions increased by 10% and the number of days that a bed was occupied increased slightly. The improvements are obvious, but for the teams the trend of the five critical indicators is more important. Figure 6 shows the quarters over five years on the x axis and the scale for the indicators on the y axis. We can see that the costs were reduced with some fluctuation. The biggest fluctuation is shown in the sick leave, which has a peak in the 1st quarter of 1994 and again in 1995. This type of trend information helps a team to relate their actions to results. It is beyond the scope of this article to analyze the trend information in detail, but we want to emphasize that the quarterly feedback can give insight into the effects of the actions and is an important motivator for improvements.

Figure 5: Example of improvements of the detox department after 5 years. The chart shows deterioration or improvement for sick-leave of staff, dropouts, total direct costs, admissions and bed occupancy on a scale from -30% to +90%.

Figure 6: Trends of 5 indicators over 5 years of the detox department. The x axis runs from the 3rd quarter of 1992 to the 3rd quarter of 1997; the lines show the quarterly costs in 1,000 fl, the quarterly sick leave in days, the number of quarterly admissions, the bed occupancy in percentage and the number of dropout clients.

Lessons learned

When a team is familiar with the indicators and the reporting system, it is a logical step to define goals for each indicator. It is our experience from the implementation phase that a goal-setting procedure like the Goalfinder is a challenge for a team and makes the reporting system more important. However, such a development has to
be carried out in a dialogue and the goals have to be derived from the facts of the previous period. Furthermore, we learned to focus on five critical indicators, which are consistent across all departments of the treatment centre and which stimulate the teams to compare, to benchmark and to search for best practice. The most important lesson we learned during this project was that the information feedback loop is necessary but quite difficult to establish.
Conclusion

The introduction of an indicator system is a long process in a constantly changing environment. When we started, we estimated that a period of three years would be sufficient, but it took us three times as long. The system which we developed in the addiction treatment centre, called the Profile Package, is focused on the departmental or team level. The project can be seen as one example of Codman's idea of an End Result Clinic. To our knowledge it is the first time that such an overall indicator system on the team level has been developed, implemented and used in an addiction treatment centre. The Profile Package covers five critical indicators - admission, productivity, dropout, sick leave and costs - and about forty department-specific indicators. It is delivered to the manager of the department one month after the end of the quarter. That means it is not too frequent, but frequent enough to support fact-based management. On the basis of the quarterly Profile Package the
department manager writes a summary, which is then consolidated into an overview report of all departments for the directorate. In addition, each department manager discusses the results with the team in order to take action when needed. This cycle of analysis, discussion, communication and reporting is by now well established and linked to the annual review and the Goalfinder meeting. The Profile Package is part of the infrastructure of the centre. We can see that the feedback of information helps teams to improve their efficiency and effectiveness. By now some teams combine patient satisfaction results with the results of the Profile Package, which supports clarity and transparency of their performance. The feedback of results seems to have an impact on the quality of care. We will follow up on these preliminary findings in the coming years. Several health care organizations in the Netherlands have learned from our
experiences and have started to introduce a Profile Package into their own organizations. Their experiences show that the approach can be transferred to other health care services. In our experience there are three critical success factors for an indicator project. First, we think that the combination of bottom-up and top-down approaches is very important: we listened carefully to the needs of the department managers and we had a strong commitment from the top of the organization. Second, we are convinced that the use of advanced information technology - databases and interfaces - to process the data for the specific indicators quickly and regularly is a crucial success factor. The third critical success factor is the continuity of the project team and a stable infrastructure: there was a consistent core team over the years, familiar with the organization and dedicated to the project. The next step for the indicator system is the integration of more clinical effect indicators for the departments. We want to combine those new indicators with a conceptual change of the Profile Package in the direction of the Balanced Scorecard or the result criteria of the EFQM Excellence Model. Another important topic for the coming years is the alignment of other reporting systems with the Profile Package and the dissemination of the information via the intranet of the treatment centre.
Acknowledgements The authors would like to thank the staff of the Jellinek Centre in Amsterdam, but above all Panos Nanitsos, database manager; Ad van der Tholen, Manager of the Clinical Division and Toos Verkooyen, Head of the Information and Computer Department. The software was developed by the Rorema Group. The project was funded by the programme "Transparant" of the Ministry of Health, Welfare and Sports.
Reference list

Aveyard, P. (1997). Monitoring the performance of general practices. Journal of Evaluation in Clinical Practice, 5, 275-281.
Codman, E. (1996). A Study in Hospital Efficiency. Oakbrook Terrace, Illinois: Joint Commission.
Colson, P. (1994). Indikatoren voor Ziekenhuizen [Indicators for Hospitals]. In Handboek Kwaliteit van Zorg [Handbook of Quality of Care]. Utrecht: De Tijdstroom.
Directorate Jellinek Centre (1998). Jaarverslag 1997. De Jellinek [Annual Report 1997. The Jellinek Centre]. Amsterdam: The Jellinek Centre.
Donker, M. C. (1984). Het Profielpakket [The Profile Package]. Utrecht: Nederlands centrum Geestelijke volksgezondheid (NcGv).
Es van, S. Q. (1995). Indicatoren in de gezondheidszorg [Indicators in health care]. Rotterdam: Erasmus University.
Garnick, D. W. et al. (1994). Measuring quality of care: fundamental information from administrative data sets. International Journal for Quality in Health Care, 6, 163-177.
Joint Commission (1996). Primer on Indicator Development and Application. Oakbrook Terrace: Joint Commission.
Kazandjian, V. A. et al. (1992). Maryland Indicator Project going interactive and international. Hospital Peer Review, 173.
Kazandjian, V. A. et al. (1996). Do performance indicators make a difference? The Joint Commission Journal on Quality Improvement, 22, 482-491.
Nabitz, U. (1993). Kennismaken met het Profielpakket [Documentation of the seminar: Introduction to the Profile Package]. Amsterdam: The Jellinek Centre.
Nabitz, U., Veldhuizen, W., & Warlee, T. (1993). Formulieren in een nieuw jasje [Improving registration processes]. Amsterdam.
Nabitz, U., Nanitsos, P., & Waarlee, M. (1984). Het Profielpakket [The Profile Package]. Amsterdam: The Jellinek Centre.
Norman, L. A. et al. (1996). Computer-assisted quality improvement in an ambulatory care setting. The Joint Commission Journal on Quality Improvement, 21, 116-131.
Shaw, C. D. (1996). Health care league tables in the United Kingdom. Journal of Quality in Clinical Practice, 17, 215-219.
Sorensen, J. E. et al. (1987). Managing mental health organizations with 25 key performance indicators. Evaluation and Programme Planning, 10, 239-247.
Turpin, R. S. et al. (1996). A model to assess the usefulness of performance indicators. International Journal for Quality in Health Care, 8, 321-329.
Verkooyen, T. (1993). Informatie Strategie Plan [Information Strategy Plan]. Amsterdam: The Jellinek Centre.