Use of Computer Technology to Modify Objective Structured Clinical Examinations

Lavern J. Holyfield, D.D.S.; Kenneth A. Bolin, D.D.S., M.P.H.; Kathleen V. Rankin, D.D.S.; Jay D. Shulman, D.M.D., M.A., M.S.P.H.; Daniel L. Jones, D.D.S., Ph.D.; Becky DeSpain Eden, B.S.D.H., M.Ed.

Abstract: Objective structured clinical examinations (OSCEs) are multistationed clinical examinations that have been shown to be effective in testing students' ability to integrate the knowledge, skills, and attitudes acquired during their preclinical and clinical training and experiences. The original OSCE for the third-year Preventive Dentistry course at Baylor College of Dentistry was based on the traditional format consisting of four sections of twelve stations with a group of twelve students rotating through each of the sections simultaneously. This arrangement allowed for examination of one-half of the class; the other half of the class took the exam on an alternate date. To reduce the disruption caused by the students' moving from station to station and to allow for examination of the entire class in one setting, the traditional concept was modified using computer technology, and the twelve stations "moved" via a PowerPoint presentation while students remained stationary. Questions on both exams provided a means for testing data interpretation, diagnostic skills, and, to some extent, interpersonal skills. The overall atmosphere during the computer-based examination was less chaotic. Each student received identical instructions, explanations, and time allotments to respond to the information presented. The ratio of faculty to students required to monitor the exam was lower than that required for the traditional format. Additionally, since there was no need to allow time for student transition, the total time required to administer the exam was reduced. Thus, objective assessment of the entire class was accomplished using fewer faculty members and less class time and with less disruption for the students.

Dr. Holyfield is Assistant Professor; Dr. Bolin is Assistant Professor; Dr. Rankin is Professor; Dr. Shulman is Professor; Dr. Jones is Professor and Chair; and Prof. Eden is Associate Professor—all in the Department of Public Health Sciences, Baylor College of Dentistry, Texas A&M University System Health Science Center. Direct correspondence and requests for reprints to Dr. Lavern J. Holyfield, Baylor College of Dentistry, 3302 Gaston Avenue, Room 711, Dallas, TX 75246; 214-828-8485 phone; 214-874-4523 fax; [email protected].

Key words: objective structured clinical examination, clinical competence, computer technology

Submitted for publication 1/12/05; accepted 7/14/05

The prevention of oral disease goes beyond periodic examinations, oral prophylaxis, and the incorporation of fluoride therapy into patient care. Students must be competent in assessing oral health risks and making informed choices about risk management. They must also be skilled in communicating the necessary recommendations to the patient. These concepts form the foundation of the preclinical preventive curriculum. Clinical competence may be regarded as the mastery of a body of relevant knowledge and the acquisition of a range of relevant skills, including interpersonal, clinical, and technical components.1 To assess the clinical competence of D3 students in the area of clinical preventive dentistry, the Department of Public Health Sciences employed objective structured clinical examinations (OSCEs). The effectiveness of an OSCE in assessing clinical competence is well documented: OSCEs have the capacity to improve the validity and reliability of assessments of many aspects of clinical competence.2


During the fall 2003-04 semester, the first OSCE was administered using the traditional format similar to that described by Martin and Jolly3 and Langford et al.4 The examination design included twelve timed (five minutes each) stations. Two of the stations, designated as "rest stations," were void of exam materials, serving only as vehicles to facilitate examination of the maximum number of students per setting. Forty-three students (one-half of the third-year class) were examined, with an average of eleven students rotating through each of the four sections (an illustrative sketch of this rotation appears at the end of this section). The same exam was administered to the remaining students in the class on an alternate day.

The ten "assessment stations" contained information and materials required to evaluate data interpretation, diagnostic, and interpersonal skills. For example, as a means of assessing the students' ability to accurately diagnose oral disease, at one station, students were instructed to review radiographs and intraoral photographs, describe the condition, and indicate the recommended treatment protocol. Another station provided a comprehensive medical history, dental history, and dental charting. Based upon the findings, students were expected to expound on the risk for dental caries and other oral disease and to recommend preventive therapy and patient education consistent with their findings. Finally, in assessing interpersonal skills, staff members who were carefully trained to portray patients with specific oral health problems were present at one station. These trained patients are commonly known as standardized patients (SPs) as described by Johnson et al.5 because they provide a standard presentation of symptoms from student to student so that patient variability will not be a factor in assessment of performance. Students were expected to know which questions to ask the patient in order to complete portions of the oral disease risk assessment form. Responses would be scored by faculty at the end of the exam.

Prior to administration of the traditional OSCE, four faculty members worked together to set up the exam stations over a period of one and a half hours. Six faculty members and four staff members assisted during the exam. The overall process of conducting the traditional OSCE required more than three and a half hours. During the post-exam debriefing, participating faculty and staff concluded that the logistics of planning and implementing an OSCE were daunting and consumed all of their energy. These sentiments echoed those of our colleagues in the Department of Pediatric Dentistry at Baylor College of Dentistry (BCD), who observed in a 2002 article that the use of an OSCE-based testing format is time-consuming and labor-intensive and requires extensive resources.6 We also observed that administration of an OSCE can be chaotic, with students often getting confused as they moved from station to station even though arrows clearly indicated the path of movement.

However, this examination format also provides invaluable feedback that allows for counseling of students regarding their performance, identifies students who would most benefit from remedial teaching, and provides faculty with insights about students' comprehension or confusion regarding basic concepts.3 As a department with limited faculty resources, we recognized the value of OSCE-based evaluations. However, the challenges posed by the traditional OSCE format led us to modify the concept with the goal of reducing the faculty "cost" while maintaining many of the pedagogical benefits.
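For illustration only (this is not tooling the authors describe), the rotation logistics above can be modeled in a few lines of Python: twelve five-minute stations, with each group of roughly twelve students shifting one station per interval.

```python
# Illustrative sketch of the traditional OSCE rotation described above.
# Station count and per-station time come from the article; the code is hypothetical.

STATIONS = 12            # ten assessment stations plus two rest stations
MINUTES_PER_STATION = 5

def station_for(student: int, interval: int) -> int:
    """Station (0-11) occupied by a given student at a given rotation interval."""
    return (student + interval) % STATIONS

def rotation_minutes() -> int:
    """Time for every student to visit every station, ignoring transition time."""
    return STATIONS * MINUTES_PER_STATION

if __name__ == "__main__":
    print(f"One full rotation: {rotation_minutes()} minutes")  # 60 minutes
    # Student 0 starts at station 0; after three intervals they occupy station 3.
    print(station_for(student=0, interval=3))
```

Even this toy model makes the hidden cost visible: the 60-minute rotation excludes the transition time between stations, which is precisely what the computer-based format described below eliminates.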


Methods

To reduce the material and human resources required and to diminish the inherently chaotic nature of the OSCE, BCD Public Health Sciences (PHS) faculty investigated the feasibility of a computer-based OSCE format. The computer-based OSCE was implemented based on a model of test development proposed by Newble, who identified three steps necessary to establish the content validity of a competency assessment. Step one is to identify the problems or conditions that the candidate needs to be able to assess and manage competently. Step two is to define the tasks, within those problems or conditions, in which the candidate is expected to be competent. Step three is the construction of a blueprint or grid as a way of defining the sample of items to be included in the test.2

To implement this planning process, eight members of the clinical preventive faculty met on numerous occasions to establish the content of the OSCE. In planning for the computer-based exam, as with the traditional exam, each faculty member contributed one or more questions for consideration. Questions were discussed, edited, or deleted based upon input from the team. The content of the computer-based exam paralleled that of the traditional exam. The examination included questions that assessed the students' ability to predict the existence of and/or treatment for xerostomic conditions or to determine whether cavitated and non-cavitated lesions should be treated using sealants, minimally invasive restorations, traditional restorative procedures, or appropriate remineralization therapy. Proper use of the 5As for tobacco use intervention (Ask about tobacco use; Advise to quit; Assess oral tissues, patterns of tobacco use, and readiness to quit; Assist with educational materials and counseling; and Arrange for follow-up) was assessed. Likewise, the examination tested students' ability to utilize information regarding a patient's diet, oral hygiene conditions, and other environmental influences in order to assess the risk of subsequent oral disease. Based upon the relative degree of difficulty, a point value and time limit were assigned for each station.

All radiographs, mock patient records, models, and other graphic materials were transferred into electronic format and incorporated into PowerPoint images. As a result, the disruption that arose when students moved from station to station was reduced. Additionally, through the use of computer technology, it was possible to test the entire class simultaneously. The group of eighty-seven students was divided into color-coded sections that corresponded to the four sections of the 100-cubicle laboratory, in which individual computer monitors were available for each student. During the week prior to the exam, students were given a general overview of the information that would be covered. During the exam, five faculty and/or staff members were present: three monitors, one examiner, and one for technological support. Answer packets with a set of customized labels were distributed to the students, who were instructed to place a name label on each page of the packet. Instructions regarding the time for each question were given prior to projecting the images. PowerPoint images were advanced at designated time intervals presumed necessary for students to successfully assess and respond to the information. The time intervals ranged from three to seven minutes based on the perceived degree of difficulty. The total computer-based OSCE was completed in sixty-five minutes.
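As a rough sketch of Newble's blueprint step, the exam plan could be represented as a simple data structure. The station topics, point values, and minute counts below are hypothetical: the article gives only the three-to-seven-minute range and does not publish its grid.

```python
# Hypothetical blueprint for the computer-based OSCE (names and values assumed,
# not taken from the article): each question carries a point value and a slide
# display time scaled to its perceived difficulty.

from dataclasses import dataclass

@dataclass
class Station:
    topic: str      # content area drawn from the faculty-built blueprint
    points: int     # weight assigned by the faculty team (assumed values)
    minutes: int    # slide display time; the article reports a 3-7 minute range

blueprint = [
    Station("xerostomia diagnosis and management", 10, 5),
    Station("sealant vs. restoration vs. remineralization", 15, 7),
    Station("5As tobacco use intervention", 10, 4),
    Station("diet and oral hygiene risk assessment", 10, 3),
]

total_minutes = sum(s.minutes for s in blueprint)
total_points = sum(s.points for s in blueprint)
print(f"{len(blueprint)} stations, {total_points} points, {total_minutes} minutes")
```

Laying the exam out this way makes the two planning levers explicit: point values encode relative weight in the grade, while minute counts drive the PowerPoint advancement schedule and hence the total administration time.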

Results

Faculty impressions of the examination included the observation that the overall atmosphere during the computer-based OSCE was less chaotic than during the traditional format. It was also noted that no time was "wasted" at stations that required less than the five-minute limit imposed for each question during the previous exam. Additionally, any responses to student requests for clarification or other information were given to the entire group simultaneously, with each student receiving identical instructions and explanations. Other positive results included a reduction in the number of faculty required to monitor the computer-based exam as compared to the number required for the traditional format. Also significant was the reduction in the amount of time required to administer the computer-based exam, because there was no need to allow time for student transition.

Analysis of class scores yielded no significant differences for the students who participated in both the traditional exam and the computer-based exam (two-tailed t-test; α = .05). Similarly, in evaluating student performance on the same computer-based exam, scores for the current class were not significantly different from the scores of their predecessors (two-tailed t-test; α = .05; an illustrative sketch of this comparison appears at the end of this section).

To determine the students' perspective on the computer-based OSCE, a focus group discussion was conducted. Since the current third-year dental class had experienced the traditional OSCE format on numerous occasions during the previous year through at least two other courses and was the most recent recipient of the computer-based exam, it was deemed that its members would provide more reliable feedback than the fourth-year class. Approximately 10 percent of the class—consisting of an ethnically diverse group of students, high scorers, average scorers, class leaders, and other exemplary scholars—participated in the focus group. Discussion was controlled only to the extent that it was limited to comparison of the computer-based examination format with the traditional format; discussion of the content and perceived fairness of any particular exam was not allowed.

Based on the discussion, the consensus of the students was that the computer-based format was less stressful and chaotic. All but one preferred the varying time intervals per question to the set five minutes. Consistently, students indicated that the reduced overall time and the fact that the entire class was examined in a single setting were positive aspects of the exam, specifically in terms of equality of access to the exam information and uniformity of instructions.
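The article reports only the test type and significance level, not the analysis code or whether a paired or independent-samples test was used. A minimal sketch of such a comparison, assuming independent samples and using hypothetical score vectors, might look like this:

```python
# Illustrative sketch (not the authors' analysis): two-tailed t-test at
# alpha = .05 comparing class scores across exam formats.

from scipy import stats

traditional_scores = [78, 85, 90, 72, 88, 81, 79, 93]   # hypothetical data
computer_scores    = [80, 84, 91, 75, 86, 83, 77, 92]   # hypothetical data

t_stat, p_value = stats.ttest_ind(traditional_scores, computer_scores)
alpha = 0.05
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
if p_value < alpha:
    print("Scores differ significantly at the .05 level.")
else:
    print("No significant difference in scores.")
```

For students who took both exams, a paired test (scipy's `stats.ttest_rel`) would be the more natural choice; the article does not say which variant was applied.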

Discussion

Dental institutions across the United States face a shortage of faculty, and it is predicted that these shortages will grow in the future. Despite the shortage, it is imperative that quality instruction and training be provided for dental students. The modification of the traditional OSCE to a computer-based format appears to be an effective, innovative means of doing so. The ease of facilitating the computer-based OSCE format was a welcome improvement, as were the decrease in the time required to set up and administer the OSCE, the decrease in personnel, and the ability to simultaneously provide consistent instructions and information to the entire class. The students appeared to be more relaxed during the computer-based exam than during the multistationed exam administered during the fall semester.

One of the major shortcomings of the computer-based format, however, is the inability to use standardized patients, rendering us unable to assess the students' interpersonal skills. Likewise, during the administration of the same computer-based exam to the subsequent third-year class, it was noted that the students seemed rushed to complete the questions for which a three-minute time limit was allotted. Modifications in future exams will include adjustment of time intervals for some of those questions and the addition of two three-minute intervals, at the midpoint and the end of the exam, to allow students sufficient time to review and edit responses.

Also under consideration is the development of a web-based OSCE that will allow students to complete the exam in individual settings; however, the total length of time allotted per exam, the dates of availability, and certain other parameters will need to be established prior to implementation of this format (an illustrative sketch of such parameters appears at the end of this section). This will become even more feasible for future classes at BCD since, from this academic year (2005) onward, all first-year students will be issued standard notebook computers—a growing trend among U.S. dental schools.

To more effectively assess interpersonal skills in future versions of this exam, video vignettes will be added in which standardized patients provide medical and dental history information. To demonstrate their ability to communicate effectively with patients and to obtain the complete, accurate information upon which to base their clinical assessments, students will be required to listen and then either provide a set of questions that they would ask each patient or answer questions based on the video.
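Purely as an illustration of the parameters mentioned above (every name and value here is hypothetical; the article describes no implemented system), a web-based OSCE configuration might capture:

```python
# Hypothetical parameter set for the web-based OSCE under consideration.
# None of these values appears in the article; they only illustrate the
# decisions the authors say must be made before implementation.

from datetime import date

web_osce_config = {
    "total_minutes": 65,                  # mirrors the proctored version's length
    "available_from": date(2005, 10, 3),  # hypothetical availability window
    "available_until": date(2005, 10, 7),
    "attempts_allowed": 1,
    "randomize_question_order": False,    # preserve uniform presentation across students
}
```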

Conclusion

The computer-based OSCE format was less time-consuming and labor-intensive than the traditional format, requiring fewer human and material resources. Development of the computer-based format also provided a means of calibration among the preventive faculty through discussion of the questions and materials used in the exam. While the computer-based format allowed only a limited means of assessing interpersonal skills, these skills are evaluated through other means: essential clinical experiences in preventive dentistry for each student include the presentation of a total of five patient cases in a clinical setting. Other aspects of clinical competence, however, were assessed satisfactorily. Additionally, the ability to examine the entire class in one setting, with less class time and less disruption, was a significant benefit. More importantly, the computer-based examination, like the traditional format, provided invaluable information regarding the aspects of clinical competence that require reinforcement in the effort to prepare students for clinical practice.

REFERENCES
1. Newble D. Assessing clinical competence at the undergraduate level. Med Educ 1992;26(6):504-11.
2. Newble D. Techniques for measuring clinical competence: objective structured clinical examinations. Med Educ 2004;38:199-203.
3. Martin I, Jolly B. Predictive validity and estimated cut score of an objective structured clinical examination (OSCE) used as an assessment of clinical skills at the end of the first clinical year. Med Educ 2002;36:418-25.
4. Langford N, Landray M, Martin U, Kendall M, Ferner R. Testing the practical aspects of therapeutics by objective structured clinical examination. J Clin Pharm Ther 2004;29:263-6.
5. Johnson J, Kopp K, Williams R. Standardized patients for the assessment of dental students' clinical skills. J Dent Educ 1990;54(6):331-3.
6. Zartman R, McWhorter A, Seale N, Boone W. Using OSCE-based evaluation: curricular impact over time. J Dent Educ 2002;66(12):1323-30.

