The Professional Activity Study of the Commission on Professional and Hospital Activities: A User's Perspective

Harold S. Luft

Health Services Research 18:2 (Summer 1983, Part II)

The Professional Activity Study (PAS) of the Commission on Professional and Hospital Activities (CPHA) is a valuable resource for researchers interested in examining case mix, the process of treatment, and certain clinical outcomes. While there are other sources of case abstract data, the PAS provides, by far, the largest number of hospitals and offers a reasonable geographic distribution. The regional diversity is important for two reasons. First, clinical practice patterns vary substantially across regions, so that analyses based on one area may not be appropriate for other areas. For example, PAS data on length of stay by region indicate that cutoffs for the 75th percentile in the East approximate the 90th percentile in the West. Second, as states begin to introduce various types of prospective reimbursement systems and payments based on schemes such as Diagnosis Related Groups (DRGs), actual or recorded practices may shift in response to such incentive changes. Data drawn from only one or two states may thus be subject to substantial biases.

One problem with data drawn from abstracting services has been that hospitals traditionally participated on a voluntary basis, and, because the service involved costs for both on-site abstracting and analysis, one might suspect that participants were a biased sample. Some states now require the use of PAS, thus eliminating that source of bias and enabling researchers to approximate more nearly population-based analyses. Furthermore, the substantial market penetration of PAS in other, nonmandatory states means that in many local market areas all the hospitals are PAS participants.

Address communications and requests for reprints to Harold S. Luft, Professor of Health Economics, Institute for Health Policy Studies, School of Medicine, University of California, San Francisco, 1326 Third Avenue, San Francisco, California 94143. This work was partially supported by a grant from the National Center for Health Services Research (HS-04329).


The long-term presence of PAS in abstracting provides yet another advantage. Records for specific hospitals may be examined over time to test for changes in case mix and utilization of selected services. In some instances, where hospitals use a unit record system, individual patients can even be traced over several years to examine readmissions.

A final, extremely important advantage of using PAS data is that the CPHA staff include many professionals who are willing and able to work with researchers in designing and undertaking investigations. This extends not only to the complex tasks of manipulating what are often millions of case abstracts, but also to developing ways to assure the researcher that the data are cleaned and checked. A cooperative and intelligent staff is even more important when one attempts special maneuvers to compensate for some of the limitations in the data. For example, if confidentiality issues preclude the release of hospital-specific data, CPHA staff can generate the appropriate tables [1]. More complex studies using regressions can be undertaken by merely having the staff generate a correlation matrix that can then be analyzed by the researcher [2] (a short illustrative sketch follows this paragraph). In yet another instance, a contractor was able to work with individual hospitals, local medical associations, and CPHA to link physician identifiers across hospitals [3]. (For confidentiality reasons, hospitals assign their own physician codes, which are known neither to other hospitals nor to CPHA.)
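
To make the correlation-matrix approach concrete, the sketch below shows how standardized regression coefficients and R-squared can be recovered from a correlation matrix alone, so that no patient-level abstracts need to leave CPHA. It is illustrative rather than drawn from the article: the variable names and correlation values are hypothetical, and the computation simply solves the normal equations in standardized form (beta = Rxx^-1 rxy).

```python
import numpy as np

# Illustrative sketch (not from the article): recovering standardized
# regression coefficients from a correlation matrix alone.  The variable
# names and the matrix below are hypothetical stand-ins for whatever the
# abstracting service's staff would actually supply.
variables = ["mortality", "surgical_volume", "teaching_status", "bed_size"]
R = np.array([
    [ 1.00, -0.30,  0.10,  0.05],   # mortality (outcome)
    [-0.30,  1.00,  0.25,  0.40],   # surgical volume
    [ 0.10,  0.25,  1.00,  0.35],   # teaching status
    [ 0.05,  0.40,  0.35,  1.00],   # bed size
])

# Partition the correlation matrix: the outcome occupies row/column 0.
r_xy = R[1:, 0]        # correlations of the predictors with the outcome
R_xx = R[1:, 1:]       # correlations among the predictors

# Standardized least-squares coefficients: beta = R_xx^{-1} r_xy
beta = np.linalg.solve(R_xx, r_xy)

# Coefficient of determination: R^2 = r_xy' beta
r_squared = float(r_xy @ beta)

for name, b in zip(variables[1:], beta):
    print(f"{name:>16}: standardized coefficient = {b:+.3f}")
print(f"R^2 = {r_squared:.3f}")
```

If the staff can also supply the means and standard deviations of the variables, the unstandardized coefficients and the intercept can be recovered in the same way, so the researcher obtains essentially the full regression output without ever handling the confidential case abstracts.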

In spite of its many advantages, there are still some important limitations to the PAS data. (In fairness to CPHA, I should underscore that these problems are primarily related to case abstract data, regardless of their source.) Various studies by the Institute of Medicine [4,5,6] have found substantial discrepancies in coding when hospital data have been reabstracted. Discrepancies of 35 to 43 percent were found, usually due to differences in the selection of the primary diagnosis [7]. Thus, the source of error is in the initial coding at the hospital.

While the coding discrepancies vary according to diagnosis [7], this may not cause the researcher much difficulty if the errors are randomly distributed across hospitals. However, one might suspect that such errors are not random. House staff in teaching hospitals probably list more diagnoses, even trivial ones, than physicians in community hospitals. Hospitals may differ consistently in the quality of their abstracting.

More importantly, as abstract data are increasingly used to determine payment, the spectre of "DRG creep" has arisen [8]. Simborg has shown that purposive reordering of primary and secondary diagnoses can increase reimbursement under a diagnosis related group payment system. (Similar effects can be obtained by selectively correcting errors in the initial coding.) Simborg's analysis was based on a hypothetical situation, but the recent proposal by HCFA to use case-mix adjusted indices for Medicare reimbursement has motivated hospitals to reexamine their previously submitted Medicare patient abstract data [9]. This suggests that reimbursement incentives may result in some intentional biasing of diagnosis coding.

Even though the PAS system includes substantial representation of hospitals from all regions and all hospitals from selected states, researchers should not assume that participating hospitals are a representative sample. As Mullner and Kobrinski [10] indicate, PAS hospitals are above average in size, and there is a higher-than-average market penetration in the North Central states. Furthermore, while complete coverage of all hospitals in selected states is an advantage for some purposes, it makes the overall results more sensitive to peculiar situations in those states.

While little is known about the importance and impact of these potential biases, this can be seen as a fruitful area for research. For instance, comparisons of case-mix patterns over time in states with and without DRG-related reimbursement could provide estimates of the real impact of "DRG creep" (a sketch of such a comparison follows this paragraph). The importance of the uneven PAS market share could be tested through comparisons with representative national data from the National Hospital Discharge Survey or Medicare files. In both cases, the studies should go beyond the identification of discrepancies to the estimation of how the discrepancies would bias various types of analysis.
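
A minimal sketch of the suggested comparison, assuming a flat file of discharge abstracts: it contrasts the change over time in a crude case-mix measure (here, the share of abstracts recording any complicating secondary diagnosis) between states with and without DRG-related reimbursement. The file name, column names, state grouping, and adoption year are hypothetical placeholders, not features of the PAS.

```python
import numpy as np
import pandas as pd

# Illustrative sketch only (not from the article): a crude difference-in-
# differences contrast of a case-mix measure between states with and
# without DRG-related reimbursement.
abstracts = pd.read_csv("pas_abstracts.csv")
# expected columns: state, year, has_secondary_complication (0/1)

drg_states = {"NJ"}                        # hypothetical mandate group
abstracts["group"] = np.where(abstracts["state"].isin(drg_states),
                              "DRG states", "other states")
abstracts["period"] = np.where(abstracts["year"] >= 1980, "post", "pre")

# Mean share of complication-coded abstracts in each group and period.
cell_means = (abstracts
              .groupby(["group", "period"])["has_secondary_complication"]
              .mean()
              .unstack("period"))

# Change over time within each group, and the difference between groups.
change = cell_means["post"] - cell_means["pre"]
did_estimate = change["DRG states"] - change["other states"]

print(cell_means)
print(f"Difference-in-differences estimate of 'DRG creep': {did_estimate:+.4f}")
```

The same scaffolding would serve the companion suggestion in the text: replacing the comparison group with National Hospital Discharge Survey or Medicare tabulations to gauge how the uneven PAS market share might distort national estimates.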

Other studies of data quality may be more difficult to design and carry out. For example, Mullner and Kobrinski [10] indicate that the patient care data are optional, yet we do not know how frequently they are available or how usable they would be to researchers. If such information were consistently coded, valid, and reliable, it could be of substantial value in certain types of analyses. A further step might be the development of a joint task force, involving representatives of the participating hospitals, CPHA, and researchers, to identify relatively simple and inexpensive changes that could be made prospectively; this would vastly increase the usefulness of the data from a research perspective. The PAS is currently a valuable resource; with some additional information and work, it could become a veritable gold mine.

REFERENCES

1. Luft HS, Bunker JP, Enthoven AC. Should operations be regionalized? An empirical study of the relation between surgical volume and mortality. New England Journal of Medicine (December 1979) 301:1264.
2. Luft HS. The relation between surgical volume and mortality: An exploration of causal factors and alternative models. Medical Care (September 1980) 18:940.
3. Development of data files on hospital utilization and community characteristics. Contract 233-78-3001 by SystemMetrics, Inc., Santa Barbara, to the National Center for Health Services Research.
4. Institute of Medicine. Reliability of Hospital Discharge Abstracts. (Report of a Study by the IOM.) Washington, DC: National Academy of Sciences, 1977.
5. Institute of Medicine. Reliability of Medicare Hospital Discharge Abstracts. (Report of a Study by the IOM.) Washington, DC: National Academy of Sciences, 1977.
6. Institute of Medicine. Reliability of National Hospital Discharge Survey Data. (Report of a Study by the IOM.) Washington, DC: National Academy of Sciences, 1977.
7. Demlo LK, Campbell PM, Brown SS. Reliability of information abstracted from patients' medical records. Medical Care (December 1978) 16(12):995-1005.
8. Simborg DW. DRG creep: A new hospital-acquired disease. New England Journal of Medicine (June 1981) 304:1602.
9. Health Policy Week (November 1982). Chevy Chase, MD: Key Communications, 11:3.
10. Mullner R, Kobrinski EJ. The Professional Activity Study of the Commission on Professional and Hospital Activities. Health Services Research (1983) 18(2):343-347.