Medical Teacher, Vol. 25, No. 6, 2003, pp. 632–642
The use of handheld computers in scenario-based procedural assessments

R. KNEEBONE, D. NESTEL, J. RATNASOTHY, J. KIDD & A. DARZI

Department of Surgical Oncology and Technology, Imperial College London, UK; Centre for Medical and Health Service Education, Monash University, Australia; Department of Psychological Medicine, Imperial College London, UK
Med Teach Downloaded from informahealthcare.com by Univ Rovira I Virgili on 11/30/11 For personal use only.
SUMMARY This paper describes the authors’ experiences of using handheld computers within scenario-based formative assessments aimed at developing clinical procedural skills. Previous experiences of using paper forms in these assessments were problematic. Multiple paper forms were generated and data sets were sometimes incomplete. Forms adapted for use on handheld computers offer significant potential advantages over paper-based versions. These include streamlining the process of data collection, entry and retrieval, thereby reducing data loss and providing learners with immediate and cumulative feedback on their performance. All participants in this study found the Personal Digital Assistant (PDA) forms easy to use. Further adaptation, together with increased familiarity with PDA technology, will address users’ feedback by providing more space for free text and a larger visual field. Technical expertise is required for the development and delivery of PDA-based forms, but their potential for use in formative and summative assessments is considerable.
Introduction

Healthcare professionals are expected to carry out a range of clinical procedures such as inserting a urinary catheter or closing a wound in the skin. Performing such a procedure on a conscious patient requires a combination of technical and communication skills. Although both sets of skills are commonly taught, they are often taught separately. Learners are not given the opportunity to integrate the two skill sets until they are required to carry out procedures on real patients. This causes anxiety to students and has considerable potential for harm to patients. We have developed an innovative solution, using inanimate models linked to simulated patients to create realistic quasi-clinical scenarios (Kneebone et al., 2002; Nestel et al., 2003). Each performance is rated by an expert observer, who uses paper-based rating scales. Evaluation has identified several problems with using such paper-based forms during data collection for formative assessment. This paper explores the potential of using handheld computers as an alternative means of data collection and outlines significant issues we have encountered in the process of this exploration.
Background

Our integrated approach to learning invasive clinical procedures uses detailed scenarios to create an illusion of clinical reality. At the heart of our conceptual model is the interaction between learner and ‘patient’ as the procedure takes place (Figure 1). Each learner carries out a procedure on a simulated model ‘attached’ to a simulated patient (Figure 2). The process is observed in real time by two tutors, one with expertise in communication skills and the other in technical surgical skills. These tutors watch from an adjoining room, using ceiling-mounted video-recording equipment. At the end of each procedure, structured feedback from the simulated patient provides the ‘patient’s’ perspective, while the expert tutors give specific guidance on aspects of technical and communication skill. The procedure and the subsequent feedback are recorded, and the learner reviews the videotape immediately after the performance. This gives the learner an opportunity to reflect on feedback from the simulated patient and expert tutors, as well as to focus on the aspects of the procedure that mattered most to him or her. While the two expert tutors are observing the procedural scenario, each completes a rating form that combines binary checklist items with a series of Likert-type global scales (Appendices 1 and 2). Immediately after the procedure the simulated patient completes another form (Appendix 3). The learner, when reviewing the procedure, completes rating forms identical to the expert tutors’. Five sets of paper scoring sheets are therefore generated for each episode.

The need to change

Preliminary qualitative evaluation of this process, with two procedures carried out by 102 medical undergraduates, provided strong support for the effectiveness of the learning experience it provides (Kneebone et al., 2002). Observation and group interviews identified the data collection process as a possible area for improvement. Because verbal feedback after the procedure is focused and limited by time, our intention was to provide immediate comparisons between the ratings of experts and learners across the entire procedure, thereby encouraging informed self-criticism of a broad range of skills. The practical difficulty of simultaneously analysing and interpreting multiple data points from several paper rating forms prevented us from achieving this aim. Subsequent data transfer from paper to a computer database highlighted significant weaknesses in the data collection process, with several rating forms being incomplete.
Correspondence: Roger Kneebone, Department of Surgical Oncology and Technology, Imperial College London, 10th floor QEQM Wing, St Mary’s Hospital, Praed Street, London W2 1NY, UK. Tel: 020 7886 7930; email:
[email protected]
ISSN 0142–159X print/ISSN 1466–187X online/03/060632-11 © 2003 Taylor & Francis Ltd DOI: 10.1080/01421590310001605660
Handheld computers in assessment
Figure 1. Teaching–learning model for integrating procedural skills. [The figure shows the student moving through instruction, procedure, feedback and reflection (approximately 10, 5 and 15 minutes), observed and rated with PDAs by a communication skills tutor, a technical skills tutor and the simulated patient.]
Figure 2. Scenario-based assessment for gastrointestinal endoscopy.
R. Kneebone et al.
Moreover, the process of entering data from paper onto a computerized database is time-consuming, tedious and prone to error. We therefore decided to explore the use of handheld computers (Personal Digital Assistants, or PDAs) as an alternative to paper forms, building on work initially developed in Vienna (Schmidts, 2000).
Research questions
Our aim in this study was to investigate practical issues around the use of handheld computers in the context described above. We hypothesized that the use of PDAs might offer three specific advantages: (1) an acceptable computer-held alternative to paper rating scales for multidimensional skills assessment; (2) simplification of the data collection process, with a reduction in missing and corrupted data; (3) the ability to provide immediate comparisons between data from several observers and from different occasions, thereby increasing the potential for learning.
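The third hypothesis can be illustrated with a sketch of the kind of comparison we had in mind: an item-by-item gap between expert and learner ratings, with the largest disagreements surfaced first for discussion during feedback. The item names and scores below are invented for illustration, and Python is used purely as a convenient notation; it was not part of the PDA toolchain.

```python
# Sketch: compare an expert's and a learner's ratings item by item,
# flagging the biggest disagreements for the feedback session.
# All item names and scores are hypothetical.

expert = {"flow_of_procedure": 4, "respect_for_tissue": 3, "knot_tension": 2}
learner = {"flow_of_procedure": 4, "respect_for_tissue": 5, "knot_tension": 4}

def rating_gaps(expert_scores, learner_scores):
    """Return items sorted by how far the learner's self-rating exceeds
    (positive) or undershoots (negative) the expert's rating."""
    gaps = {item: learner_scores[item] - expert_scores[item]
            for item in expert_scores}
    return sorted(gaps.items(), key=lambda kv: abs(kv[1]), reverse=True)

for item, gap in rating_gaps(expert, learner):
    print(f"{item}: learner - expert = {gap:+d}")
```

A summary like this, generated immediately after each scenario, is what the paper forms made impractical to assemble by hand.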
Methods

Development of the PDA software

We developed an electronic version of our paper rating forms for use on a PDA. Although a wide range of PDA models is available, there are only two leading operating systems: the Palm Operating System (PalmOS) and Pocket PC. We chose PalmOS because of previous experience in writing programs for this operating system and because of the relatively low cost of individual units. The software used in this study was written using Pendragon Forms for Windows (Version 3.2), a high-level programming tool for designing ‘forms’ for use on a PDA. Each ‘form’ occupies the PDA’s entire screen, and the designer can create check boxes and pop-up lists within individual fields. This enables the user to enter data on the PDA’s touch-sensitive screen using a stylus and to navigate between the various fields within a form (Figure 3). Internal constraints can be imposed to ensure that all fields are completed before a form can be closed.

Three electronic forms were created, each based on the paper forms described above, covering communication skills, technical skills and simulated patient satisfaction. Each form allows navigation in two ways: a ‘record view’ mode (Figure 4a) and a ‘field view’ mode (Figure 4b). The record view lets the user see and edit all fields at once, whereas the field view displays each field in sequence; the user is free to alternate between the two views at will. Adapting the checklist items meant that some items from the paper-based forms had to be abbreviated to fit within these view modes.

After each rating episode, data are uploaded from each PDA to a Microsoft Access database on a laptop personal computer (PC) via a proprietary docking cradle (Figure 5). Once uploaded to the PC’s database, the rating data are automatically deleted from each PDA to ensure confidentiality.

Use of the PDAs
A preliminary study took place during a three-day course for nurse practitioners in minor surgery at St Mary’s Hospital, London in 2002 (Nestel et al., 2002). During the first two days of the course, workshop and small-group sessions covered technical and communication skills relating to minor surgery. On the third day all participants underwent scenario-based formative assessment as described above, being required to carry out ellipse excision of a simulated skin lesion and closure of the resulting wound with sutures. Before the procedures began, all participants (tutors, nurses and simulated patients) were issued with a handheld computer. A 30-minute interactive group session provided supervised training in the use of PDAs, using Palm emulation software to project a magnified image of the PDA rating form
Figure 3. Form designer.
Figure 4. (a) Record view and (b) field view.
Figure 5. Uploading the rating data.
onto a screen. All participants were introduced to the rating scales and taken through the rating process while practising with their individual handheld computer. One-to-one support was provided where necessary.
All ratings during the subsequent scenario sessions were carried out using PDAs. After each procedure, data were uploaded from the PDA to a laptop computer using the proprietary docking cradle as described above.
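The cycle just described — complete a form on the PDA, upload after each procedure, then clear the handheld — can be sketched in outline. This is a hypothetical illustration: the field names and the in-memory stand-ins for the PDA store and the Access database are invented, and the real system used Pendragon Forms with a docking cradle rather than anything like the code below.

```python
# Sketch of the PDA workflow: a record must be complete before the form
# can close; after upload to the central database, the PDA's copy is
# deleted to preserve confidentiality. All names are hypothetical.

REQUIRED_FIELDS = {"student_id", "assessor_initials", "overall_rating"}

def close_form(record):
    """Mimic the 'internal constraints': refuse to close an incomplete form."""
    missing = REQUIRED_FIELDS - {k for k, v in record.items() if v is not None}
    if missing:
        raise ValueError(f"form incomplete, missing: {sorted(missing)}")
    return record

def upload_and_wipe(pda_store, central_db):
    """Copy every closed record to the central database, then clear the PDA."""
    central_db.extend(close_form(r) for r in pda_store)
    pda_store.clear()  # nothing identifiable stays on the handheld

pda = [{"student_id": "S01", "assessor_initials": "RK", "overall_rating": 4}]
db = []
upload_and_wipe(pda, db)
print(len(db), len(pda))  # 1 record uploaded, PDA now empty
```

Refusing to close an incomplete form is what eliminated the missing-data problem seen with the paper versions; wiping after upload was our confidentiality safeguard.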
A second training course incorporated PDAs into scenario-based assessments for nurse practitioners learning gastrointestinal endoscopy skills at St Mary’s Hospital, London in 2002. In these scenarios, the task was to conduct a sigmoidoscopy on a virtual reality (VR) simulator placed alongside a simulated patient. Rating scales were similar to those described above, although in this scenario learners received feedback on technical skills in the form of quantitative data generated by the VR simulator rather than from an external observer.
Evaluation
Our aim was to test the three hypotheses outlined above, and to identify significant areas for improvement. Evaluation therefore consisted of the following components: (1) observation of the process by the research team; (2) moderated group interviews with all participants, using a semi-structured interview guide. Interviews were recorded, then thematically analysed using standard qualitative methods.
Results

Observations

During the scenarios, one member of the research team (JR) managed the process of using PDAs, ensuring that each user (learner, simulated patient and expert raters) had a uniquely identifiable handheld computer and that key equipment such as screen styluses was always available. Data from each PDA were uploaded onto the laptop PC after each scenario-based assessment. Observations confirmed the pivotal importance of having a dedicated team member to manage the data entry and uploading process. For both studies, technical problems with printers prevented us from generating an immediate printout of comparative data between learners and observers.
Group interviews

Nurses. All nurses on both courses participated in the group interviews (n = 14). After brief training all participants found the PDAs easy to use, even though few had any previous experience of handheld computers. Most nurses thought that the 30-minute preparation was adequate and that learning in a group was appropriate. The combination of verbal instruction with the interactive projection of the PDA’s image onto a screen proved helpful. Although some nurses reported difficulties with using the PDAs, such as navigating through screens and the PDA automatically turning off after periods of inactivity, these problems were easily remedied with brief practice and explanation. Most thought that handheld computers added an enjoyable dimension to the scenario-based assessments and were much more interesting than paper forms. All nurses appreciated the opportunity of piloting new technology and felt valued as a consequence. When asked for negative responses, some nurses expressed high levels of anxiety about the whole process, in spite of the fact that the scenario-based assessments were designed to be wholly formative. These nurses suggested that introducing the PDAs on the preceding day would have been preferable, avoiding an additional challenge immediately before their assessment.

Tutors. Of the seven tutors in the study, six reported previous experience of using handheld computers. All found the PDAs easy to use, although two expressed a preference for the paper version of the rating form because an A4 page presents the entire form in a single visual field. Other comments related to the absence of space for free-text comments on the electronic form; tutors reported that this absence changed the way they managed their observations. In addition, paper forms allow marginal notes and the jotting down of videotape time frames relating to significant scenario segments for later review. The current PDA version of the rating form does not have this capacity.

Simulated patients. Four simulated patients participated in the study; none had previous experience of using PDAs and all expressed anxiety about doing so. However, all found the group training session effective in enabling them to use the PDAs. Some reported difficulties with the small font size, the lack of contrast on the PDA screen and manipulation of the stylus. With guided practice, appropriate lighting and adequate time, these issues were resolved.

Discussion

This evaluative study set out to identify both strengths and weaknesses of using new technology for learning procedural clinical skills. The overall response from tutors and learners was positive. All participants found the PDAs relatively easy and enjoyable to use, requiring only a brief introductory session. The PDAs also provide alternatives in the presentation of data, so users can select their preferred mode.
In this study, the PDAs substantially reduced the use of paper and minimized the possibility of incomplete data sets. Transfer of data to a central database is almost instantaneous, and the data are retrievable within moments, enabling comparisons between expert and learner ratings together with those of the simulated patients. Data management is significantly more efficient and reliable than with the traditional paper-based method. We experienced no difficulties in this process, although it is possible that data could be lost irretrievably as a result of technical problems; transferring data after each procedure minimizes this risk.

This study has also highlighted several difficulties in the use of the PDAs. First, creating computer-based forms was considerably more demanding than creating paper forms: both the initial design and subsequent adjustment required specialized expertise. Second, the screen size of the PDAs limits the amount of information that can be presented in one visual field. It may simply be that raters need time to adjust to new ways of working, and providing a reference copy of the paper rating form may be sufficient to accommodate this limitation. Third, effective management of the assessment process is crucial. In our pilot studies, 12 PDAs were in use simultaneously during the scenario sessions. Each PDA was
numbered to ensure that it was in the right place at the right time and being used by the right person. It proved essential to have a dedicated member of our research team to manage data capture, coordinate the uploading of information and troubleshoot where necessary. This level of support may not be needed in future, as we become more familiar with the process of recording and storing data, and simple strategies such as colour-coding the PDAs will minimize confusion and streamline the process. Fourth, we had intended to provide each learner with a computer-generated summary sheet, allowing immediate comparisons between different raters’ assessments. Unfortunately, formatting this feedback proved too difficult at the time of these studies, largely because of problems accessing printers within our scenario suite. Future studies will evaluate the impact of these integrated summary sheets on learning.

Further developments

This study suggests that handheld computers have considerable scope for development as a formative assessment tool. From the learner’s perspective, electronic programmes can provide increasing levels of information as the user scrolls through each menu. Learners could select the skills on which they would like more information and delve to the degree of detail that meets their needs. Moreover, if learners have the opportunity for repeated assessments, being able to retrieve feedback from previous assessments could help them set their own learning objectives. Given the importance of timing in maximizing the benefits of feedback, the PDA offers significant advantages.

From the rater’s perspective, new versions of our PDA rating forms will include space for the entry of free-text comments. We acknowledge that users may need time to learn the specific skills of entering text via a touch-sensitive PDA screen.
However, as these skills become more common across other electronic devices, this will be less problematic. Handheld computers may also prove invaluable in summative assessment, especially in examinations involving large numbers of participants. In our study the efficiency with which data could be recorded and stored was highly impressive, and the scope for rapid processing and analysis offers learners the opportunity to receive their results almost instantly. The electronic database also enables results to be disseminated by email.

Conclusions

From the data in this study, the answers to our research questions are as follows: (1) PDAs offer an attractive alternative to paper-based rating forms. (2) PDAs can simplify the process of data collection and reduce the likelihood of data loss and corruption. (3) Technical difficulties prevented us from demonstrating the use of immediate comparative feedback data for
teaching purposes; however, the insights we have gained from this project have convinced us that PDA technology has significant potential in this area.

In summary, the data presented in this paper provide strong preliminary support for the concept of using PDAs to collect assessment data during quasi-clinical scenarios with simulated patients. Although all the assessments in this study were formative, PDAs have clear potential for summative assessment as well. Dedicated technical expertise remains essential for both the development and delivery of this innovation.
Practice points

- Paper-based rating forms can be adapted for use on handheld computers and offer significant advantages to learners.
- Technical expertise is required for the development and adaptation of PDA forms and during the process of data collection.
Acknowledgements The authors would like to thank Dr Michael Schmidts for contributing to the early development of this project. Thanks are also offered to Imperial College Centre for Educational Development for funding the research through a Teaching Development Grant 2002.
Notes on contributors

ROGER KNEEBONE, PhD FRCS FRCSEd MRCGP, is Senior Lecturer in Surgical Education at Imperial College London.
DEBRA NESTEL, PhD, is a Senior Lecturer in Medical Education at Monash University and was formerly at Imperial College London.
JOEL RATNASOTHY, BSc MB ChB, is a Surgical Research Fellow at Imperial College London.
JANE KIDD, PhD, is Senior Lecturer in Communication at Imperial College London.
ARA DARZI, MD FRCS FRCSI FRACS, is Professor and Head of the Department of Surgical Oncology and Technology at Imperial College London.
References

KNEEBONE, R.L., KIDD, J., NESTEL, D., ASVALL, S., PARASKEVA, P. & DARZI, A. (2002) An innovative model for teaching and learning clinical procedures, Medical Education, 36(7), pp. 628–634.
NESTEL, D., KNEEBONE, R.L. & KIDD, J. (2003) Teaching and learning about skills in minor surgery—an innovative course for nurses, Journal of Clinical Nursing, 12(2), pp. 291–296.
SCHMIDTS, M.B. (2000) OSCE logistics—handheld computers replace checklists and provide automated feedback, Medical Education, 34, pp. 957–958.
Appendix 1: Rating scales for technical skills in wound closure
Student’s name____________________ Assessor’s initials_______
Please circle the appropriate figure in the box for each item.
Technical skills (score each item: Not done = 0; Done incorrectly = 0; Done correctly = 1)

1. Washes hands
2. Prepares trolley
3. Puts on gloves
4. Cleans the wound
5. Infiltrates with local anaesthetic
6. Chooses suitable suture and needle
7. Mounts needle on needle holder
8. Majority of knots are square (reef)
9. Appropriate knot tension
10. Appropriate suture spacing
11. Applies dressing
12. Procedure completed within allotted time
Please circle the appropriate figure in the box for each item.

Knowledge of specific procedure: 1 = Knowledge clearly deficient; 2 = Serious gaps; 3 = Knows the important steps; 4 = Satisfactory overall level; 5 = Completely familiar with the procedure

Demonstrated aseptic technique: 1 = Never; 2 = Seldom; 3 = Sometimes; 4 = Usually; 5 = Always

Flow of procedure: 1 = Frequently stops, uncertain of next move; 2 = Hesitant; 3 = Shows reasonable progression; 4 = Shows occasional uncertainty only; 5 = Well planned, with effortless progression

Respect for tissue: 1 = Frequently uses unnecessary force; 2 = Occasionally uses force; 3 = Generally careful but occasional roughness; 4 = Usually; 5 = Shows great gentleness and sensitivity
Time and motion: 1 = Halting movements; 2 = Occasional unnecessary moves; 3 = Movements usually satisfactory; 4 = Movements well coordinated; 5 = Fluid movement without apparent effort

Overall rating of technical performance: 1 = Very poor; 2 = Poor; 3 = Competent; 4 = Good; 5 = Excellent
Comments:
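For readers adapting forms like these for electronic use, the two scale types above — binary checklist items (not done or done incorrectly score 0, done correctly scores 1) and five-point global scales — can be represented and summarised along these lines. This is a hypothetical sketch, not the actual Pendragon Forms implementation; the item lists are abridged and the function names invented.

```python
# Sketch: summarise Appendix 1's two scale types. Checklist items score 1
# only when done correctly; global scales are taken as given on 1-5.

def summarise(checklist_marks, global_marks):
    """checklist_marks: item -> 'not done' | 'incorrect' | 'correct';
    global_marks: item -> int in 1..5.
    Returns (checklist total, mean of global scales)."""
    total = sum(1 for v in checklist_marks.values() if v == "correct")
    mean = sum(global_marks.values()) / len(global_marks)
    return total, mean

marks = {"Washes hands": "correct", "Prepares trolley": "not done",
         "Applies dressing": "correct"}        # abridged item list
scales = {"Knowledge of specific procedure": 4, "Flow of procedure": 3}
print(summarise(marks, scales))  # (2, 3.5)
```

Collapsing both "not done" and "done incorrectly" to 0 mirrors the paper form exactly, although an electronic version could preserve the distinction for richer feedback.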
Appendix 2: Paper rating form for communication skills

A: Please circle the appropriate figure in the box for each item (Not done = 0; Done poorly = 0; Done well = 1).

1. Greeting
2. States full name
3. States role
4. Checks patient’s comfort
5. States purpose of procedure
6. Assesses patient’s understanding
7. Establishes consent
8. Asks if patient has questions
9. Asks if patient has worries
10. Explains procedure appropriately
11. States what has been done
12. States plan of action
13. Checks patient’s understanding
14. Asks if patient has questions
B: Please circle the appropriate figure in the box for each item.

1. Appropriate use of non-verbal communication: 1 = Never; 2 = Seldom; 3 = Sometimes; 4 = Often; 5 = Always
2. Responds to patient’s verbal cues: 1 = Never; 2 = Seldom; 3 = Sometimes; 4 = Often; 5 = Always
3. Responds to patient’s non-verbal cues: 1 = Never; 2 = Seldom; 3 = Sometimes; 4 = Often; 5 = Always
4. Appropriate use of silence: 1 = Never; 2 = Seldom; 3 = Sometimes; 4 = Often; 5 = Always
5. Uses unexplained jargon: 1 = Always; 2 = Often; 3 = Sometimes; 4 = Seldom; 5 = Never
6. Makes empathic statements: 1 = Never; 2 = Seldom; 3 = Sometimes; 4 = Often; 5 = Always
7. Shows warmth: 1 = Never; 2 = Seldom; 3 = Sometimes; 4 = Often; 5 = Always
C: Please circle the appropriate figure in the box for each item.

Perception of student’s anxiety: 1 = Extremely; 2 = Very; 3 = Moderately; 4 = Slightly; 5 = Not at all

Overall rating of patient-centred communication skills: 1 = Very poor; 2 = Poor; 3 = Good; 4 = Very good; 5 = Excellent
Comments:
Appendix 3: Paper rating for simulated patient Please complete the questions below after each procedure. The information will be used by the student to develop his/her professional skills. For each question, circle the number which best reflects your view.
1. Rate the student’s ability to communicate with you during the following phases of the interview.
(a) Introduction: 1 = Very poor; 2 = Poor; 3 = Good; 4 = Very good; 5 = Excellent
(b) Carrying out the procedure: 1 = Very poor; 2 = Poor; 3 = Good; 4 = Very good; 5 = Excellent
(c) Closure: 1 = Very poor; 2 = Poor; 3 = Good; 4 = Very good; 5 = Excellent
2. To what extent do you think the student understood your worries and concerns? 1 = Not at all; 2 = Slightly; 3 = Moderately; 4 = Largely; 5 = Completely
3. Overall, how well do you think the student demonstrated:
(a) Empathy: 1 = Very poor; 2 = Poor; 3 = Good; 4 = Very good; 5 = Excellent
(b) Warmth: 1 = Very poor; 2 = Poor; 3 = Good; 4 = Very good; 5 = Excellent
4. How anxious do you think the student was throughout the procedure? 1 = Extremely; 2 = Very; 3 = Moderately; 4 = Slightly; 5 = Not at all
5. Overall, how satisfied were you with the interview? 1 = Not at all; 2 = Slightly; 3 = Moderately; 4 = Largely; 5 = Completely
Please make any additional comments.