Anaesthesia 2015, 70, 1401–1411
doi:10.1111/anae.13266
Original Article

Design and validation of the Regional Anaesthesia Procedural Skills Assessment Tool*

A. Chuan,1 P. L. Graham,2 D. M. Wong,3 M. J. Barrington,4,5 D. B. Auyong,6 A. J. D. Cameron,7 Y. C. Lim,8 L. Pope,9 B. Germanoska,10 K. Forrest11 and C. F. Royse12

1 Consultant Anaesthetist, 9 Research Nurse, 10 Research Assistant, Liverpool Hospital, Sydney, New South Wales, Australia
2 Senior Lecturer, 11 Professor, Macquarie University, Sydney, New South Wales, Australia
3 Consultant Anaesthetist, 4 Senior Staff Specialist, St Vincent's Hospital, Melbourne, Victoria, Australia
5 Associate Professor, 12 Professor, University of Melbourne, Melbourne, Victoria, Australia
6 Consultant Anaesthetist, Virginia Mason Medical Centre, Seattle, Washington, USA
7 Consultant Anaesthetist, Middlemore Hospital, Auckland, New Zealand
8 Consultant Anaesthetist, Changi General Hospital, Singapore
Summary

The aim of this study was to create and evaluate the validity, reliability and feasibility of the Regional Anaesthesia Procedural Skills (RAPS) tool, designed for the assessment of all peripheral and neuraxial blocks using all nerve localisation techniques. The first phase was construction of a 25-item checklist by five regional anaesthesia experts using a Delphi process. This checklist was combined with a global rating scale to create the tool. In the second phase, initial validation by 10 independent anaesthetists using a test–retest methodology was successful (Cohen's kappa ≥ 0.70 for inter-rater agreement; no significant difference in scores from test to retest, paired t-test, p > 0.12). In the third phase, 70 clinical videos of trainees were scored by three blinded international assessors. The RAPS tool exhibited face validity (p ≤ 0.026), construct validity (p < 0.001), feasibility (mean time to score < 3.9 min) and overall reliability (intraclass correlation coefficient 0.80 (95% CI 0.67–0.88)). The Regional Anaesthesia Procedural Skills tool used in this study is a valid and reliable assessment tool to score the performance of trainees for regional anaesthesia.
Correspondence to: A. Chuan
Email: [email protected]
Accepted: 31 August 2015
*Presented in part at the Australian Society of Anaesthetists National Scientific Congress, Darwin, Australia, September 2015.
Introduction

There is an international trend towards competency-based anaesthesia curricula, with defined criteria describing the knowledge, skill sets and professional attributes of a specialist anaesthetist [1–6]. In the domain of regional anaesthesia, core competencies have been published by the American and European societies
of regional anaesthesia for ultrasound guidance [7], the Royal College of Anaesthetists [8] and the Australian and New Zealand College of Anaesthetists [9]. To assess these core competencies, the specialty requires validated and reliable assessment tools [10].

Combined checklist and global rating scale (GRS) assessment tools are commonly used to evaluate discrete anaesthesia procedures, and are supported by published evidence [11]. A checklist is a list of items used to score trainees, each typically graded with a dichotomous outcome. Items should demonstrate content validity, preferably by referencing published core competencies. By splitting a complex procedure (such as regional anaesthesia) into a series of component tasks, checklists also measure thoroughness of performance, and allow an assessor to provide specific, directed feedback on strengths and weaknesses, so that trainees can target their learning more easily. Global rating scales are graded using an ordinal Likert scale and are particularly useful for holistic assessment of non-technical skills, such as professionalism and communication, that are difficult to assess using checklists [11].

Combined checklist and GRS assessment tools have been used to measure performance of neurostimulation-guided interscalene blocks [12], combined ultrasound- and neurostimulation-guided supraclavicular blocks [13], ultrasound-guided axillary plexus blocks [14] and anatomical landmark-guided lumbar epidural blocks [15], or specifically for ultrasound-guided regional anaesthesia (UGRA) procedures [16]. The UGRA assessment tool has been evaluated for validity and reliability in a purely clinical setting [17] and in a mixed clinical and simulation centre setting [18].

There is no consensus on a validated method to assess how a trainee's technical skills mature during a regional anaesthesia rotation, or on a tool to allow comparisons between different institutions [19]. One possible reason is that the previously mentioned tools only assess individual blocks or UGRA procedures; no tool has been validated for assessment of all regional anaesthesia procedures currently taught to trainees. Another reason is that validation of assessment tools often involves anaesthetists from the same institution, which may limit applicability to other hospitals or other countries.
The objective of this study was to evaluate the psychometric properties of the Regional Anaesthesia Procedural Skills (RAPS) assessment tool. It was designed as a single tool to allow assessment of all neuraxial and peripheral blocks, single injection and continuous catheter techniques, irrespective of whether ultrasound guidance, neurostimulation, surface landmark or combined techniques are used for nerve localisation. Assessment of patient care after nerve blockade was also included. This study comprised three phases: construction of the RAPS tool; an initial validation phase using a test–retest methodology; and a clinical validation phase that evaluated face validity, construct validity, reliability and feasibility of the tool.
Methods

This study was approved by the relevant human research ethics committees. Before the study started, video recordings from St Vincent's, Melbourne were also used to evaluate other assessment tools [17, 20].

A panel of five anaesthetists was invited to construct a checklist using a Delphi process. All were known educators or supervisors of regional anaesthesia rotation programmes at teaching hospitals. The Delphi process involved an initial e-mail distribution of sample items to the panel, with feedback sought on item selection and wording. Results of discussion were compiled, and a new draft of the checklist was distributed for further feedback. Consensus was reached over a 3-month period. This iterative process established content validity of the checklist.

The instructions given to the panel were to design a checklist that assesses all nerve block procedures (neuraxial, peripheral, single injection, continuous catheter insertions) and all types of target location techniques (anatomical landmarks, UGRA, with or without nerve stimulation). Items were to be anchored on any applicable published guidelines and standards describing the knowledge, skill sets and professional attributes relevant to regional anaesthesia. Source documents included: the American Society of Regional Anesthesia and Pain Medicine and European Society of Regional Anaesthesia and Pain Therapy joint recommendations for education and training in ultrasound-guided regional anaesthesia [7]; regional anaesthesia fellowship training guidelines [21]; and the Australian and New Zealand College of Anaesthetists Professional Standards [22]. The latter standards are professional documents that 'provide guidance to the College's trainees and Fellows on standards of anaesthetic practice…
and are also referred to by government and other bodies' [22], and are thus considered minimum levels of performance expected of College trainees, Fellows, and all other care-givers providing anaesthesia services in both countries. These publicly available documents remove any ambiguity with regard to expected clinical behaviours during assessment of trainees.

Bould et al. suggest that a combined checklist and GRS is advantageous in providing comprehensive evaluation of a trainee [11]. A checklist allows assessment of the components of procedural skills, and complements a GRS that is better suited to assessment of non-technical behaviours. The final checklist derived from the Delphi process was paired with a GRS previously used for ultrasound-guided supraclavicular plexus blocks [13] to form the RAPS tool.

Ten anaesthetists with expertise in regional anaesthesia were asked to score three scripted videos of nerve block performance (test) and, after a one-month interval, to re-score the same videos (retest) using the RAPS tool. These anaesthetists were not involved in the construction phase. No formal training was provided before test and retest. Scripted videos simulating satisfactory and unsatisfactory performances were used to assess the discriminatory power of the RAPS tool. Scripted behaviours included examples of poor communication skills, breaks in aseptic technique, poor ergonomics, advancement of the needle despite inadequate ultrasound views, hesitant and clumsy transducer movements, and lack of patient monitoring during block performance. These unsatisfactory behaviours were interspersed with satisfactory behaviours. The three videos depicted thoracic epidural catheter insertion (Video 1), ultrasound-guided popliteal sciatic nerve single injection block (Video 2), and combined ultrasound- and nerve stimulator-guided axillary brachial plexus single injection block (Video 3). Actors performed the roles of patient, trainee and anaesthesia team.
Each video had two segments: a 'pre-procedure interview' testing cognitive domains and clinical knowledge of each block, and a 'procedural' segment showing the trainee performing the block and interacting with the anaesthesia team and patient. A video camera recorded the trainee performing the procedure. Ultrasound imagery was recorded directly from the
ultrasound machine and edited as a picture-in-picture insert. Questions asked of each trainee are included in Appendix 1. We accepted a priori confirmation of reliability if moderate agreement (defined as Cohen's kappa ≥ 0.70) was found during this test–retest stage. Scores below 0.70 would prompt redesign of the RAPS tool.

After written informed consent was obtained from both the patients and the trainees to be filmed, 70 clinical videos were recorded over a 12-month period in the operating rooms of Liverpool Hospital, Sydney and St Vincent's Hospital, Melbourne, Australia. Adult patients receiving a regional anaesthesia procedure as part of their anaesthetic management were included. Each clinical video was recorded, edited and formatted as described for the scripted videos. Filming of the 'pre-procedure interview' occurred without the patient present and before the trainee performed the regional anaesthesia procedure. Supervising anaesthetists and nursing staff were informed of the study protocol and advised not to comment, prepare equipment or intervene unless specifically requested by the trainee, but they were permitted to intervene if trainee performance compromised patient safety.

Three regional anaesthetists (authors DBA, AJDC and YL), external to the study institutions and not involved in the construction or initial validation phases, were recruited as 'assessors' after signing a confidentiality agreement to maintain the privacy of the patients and trainees filmed. Assessors were trained in scoring with the RAPS tool, in particular with reference to the publicly available source documents used to anchor the checklist items. Each assessor then received, by registered courier, a password-encrypted hard drive containing the 70 clinical videos. The sequence of videos was randomised, and assessors were asked to score each video off-line in the order presented. Assessors were blinded to trainee identity, experience and level of training.
RAPS scores and time taken to complete the assessment were electronically submitted. Clinical validation of the 70 videos involved evaluating the following five psychometric properties:
• Face validity: the ability of the RAPS tool to discriminate between trainees based on the difficulty of the regional anaesthesia procedure, with trainees expected to have higher scores when performing easier nerve blocks than during more difficult techniques.
• Construct validity: the ability of the RAPS tool to discriminate between trainees based on their level of experience in regional anaesthesia, with experienced trainees expected to perform better than their less experienced peers.
• Internal reliability: consistency of scoring by assessors for components measuring the same construct; that is, whether checklist and GRS scores were consistent with global pass or fail scores.
• External reliability: reproducibility of scoring between different assessors viewing the same trainee performance. Assessment tools used for certification or other similar high-stakes examinations require evidence of high reliability and feasibility.
• Feasibility: the time taken to score using the RAPS tool, as a reflection of ease of use in a clinical setting.
All checklist scores were converted to a percentage of the maximum possible score for that procedure. For example, a single injection peripheral nerve block using UGRA has a maximum possible score of 21, whereas a lumbar epidural catheter insertion using surface landmarks has a maximum possible score of 19. The seven-item GRS scores were summed and converted to a percentage of the maximum possible score of 35. The dichotomous pass or fail score was unchanged.

Using the above conversion, the 10 anaesthetists' checklist and GRS scores for the three scripted videos at test and retest were normalised. Cohen's kappa with 95% confidence intervals (CI) was calculated between test and retest for each video. Paired t-tests were used to evaluate normalised scores obtained at test and retest.

Face validity compared assessors' normalised scores with the difficulty of the nerve block procedure using analysis of variance (ANOVA), after nerve blocks were stratified into basic, intermediate and advanced procedures. Ultrasound-guided regional anaesthetic
blocks were categorised using Zuer's ultrasound experts' classification [23]. Construct validity was assessed by two methods: comparing experienced and inexperienced trainees' normalised scores using a two-sample t-test, and comparing the pass/fail scores of experienced and inexperienced trainees using the Pearson chi-square test. Experienced was defined as ≥ 31 previous blocks, and inexperienced as ≤ 30 blocks, at the time the trainee was recorded on video [17]. For internal reliability, scores were stratified into three cohorts of low, intermediate and high scores (defined as normalised checklist scores < 50%, 50–75% and > 75%; raw GRS scores < 15, 15–25 and > 25) and compared against the dichotomous pass/fail item using a Pearson chi-square test. External reliability was measured using the two-way, average-measures intraclass correlation coefficient (ICC) for absolute agreement, with 95% CI, for the checklist, GRS and dichotomous pass/fail components of the RAPS tool. Feasibility was measured as the total viewing time minus video length, giving the time taken by each assessor to score; differences between assessors were measured using one-way ANOVA. Statistical analysis was performed with SPSS Version 22 (SPSS Inc, Chicago, IL, USA). Statistical significance for all analyses was defined as p < 0.05.
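The score conversion and test–retest agreement statistics described above can be sketched in a few lines. This is an illustrative pure-Python re-implementation (the study itself used SPSS), and all scores shown are hypothetical:

```python
from collections import Counter

def normalise_checklist(item_scores):
    """Checklist score as a percentage of the maximum possible score for the
    procedure; 'NA' (not applicable) items are excluded from the maximum."""
    applicable = [s for s in item_scores if s != "NA"]
    return 100 * applicable.count("S") / len(applicable)

def normalise_grs(grs_items):
    """Seven GRS items, each scored 1-5, summed and expressed as a
    percentage of the maximum possible score of 35."""
    return 100 * sum(grs_items) / (5 * len(grs_items))

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa between two equal-length lists of categorical ratings,
    e.g. one rater's item scores at test and at retest."""
    n = len(ratings_a)
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
    # Chance agreement: product of the two marginal frequencies per category.
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical single injection UGRA block: 21 applicable items, 17 satisfactory.
checklist = ["S"] * 17 + ["U"] * 4 + ["NA"] * 4
print(round(normalise_checklist(checklist), 1))        # -> 81.0 (17/21)
print(round(normalise_grs([4, 3, 4, 3, 5, 4, 3]), 1))  # -> 74.3 (26/35)

# Hypothetical test-retest item ratings for one video.
test, retest = list("SSUSSUSSSU"), list("SSUSSUSSUU")
print(round(cohens_kappa(test, retest), 2))            # -> 0.78
```

A kappa at or above the a priori threshold of 0.70, as in this hypothetical example, would be taken as moderate agreement; lower values would have prompted redesign of the tool.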
Results

The final checklist contained 25 items: 15 applicable to all nerve blocks, one specific to catheter insertions, six for UGRA techniques and three for non-UGRA blocks. When combined with the existing GRS and a dichotomous pass/fail item, this resulted in the RAPS assessment tool described in Table 1.

In the initial validation phase, the test–retest kappa scores were 0.86 (Video 1, 95% CI 0.78–0.94), 0.78 (Video 2, 95% CI 0.68–0.88) and 0.70 (Video 3, 95% CI 0.57–0.83). Normalised checklist and GRS scores between test and retest are presented in Fig. 1. Scores from test to retest were not significantly different for either the checklist or the GRS in all three videos (paired t-test, all p ≥ 0.121).

Nerve block procedures and experience levels of trainees recorded in the 70 videos used for clinical
Table 1 The Regional Anaesthesia Procedural Skills (RAPS) Assessment Tool is composed of a 25-item checklist, a seven-item global rating scale and one item for overall pass or fail. The 'time taken to score' item was for collection of feasibility data and is not part of the RAPS tool. Professional Standards refer to the practice guidelines applicable to all Fellows and trainees of the Australian and New Zealand College of Anaesthetists [22].

Checklist items — each scored Satisfactory (S), Unsatisfactory (U) or Not Applicable (NA)

All blocks
1. Obtains informed consent as described by Professional Standards 26 and Professional Standards 03
2. Ergonomic positioning of patient, equipment, and proceduralist
3. Obtains intravenous access and applies monitoring as defined by Professional Standards 03
4. Appropriate combination of local anaesthesia, additives, or adjuvants
5. Chooses clinically appropriate needle
6. Sets up equipment properly, including ultrasound machine or neurostimulator
7. Skin asepsis and maintains sterility for that block as defined by Professional Standards 28
8. If providing procedural anxiolysis: maintains conscious sedation as defined by Professional Standards 09
9. Aspirates to check for blood/cerebrospinal fluid, uses incremental boluses, and re-aspirates between boluses
10. Checks for signs of systemic toxicity, intravascular injection
11. Checks for signs of potential intraneural injection

Catheter item
12. Correctly fixates and checks continuous infusion catheters/epidurals

Non-ultrasound-guided regional anaesthesia techniques
13. Locates correct surface anatomy/landmarks for block
14. Chooses appropriate needle insertion point and trajectory
15. Chooses correct current and motor endpoints during block if combined with neurostimulation

Ultrasound-guided regional anaesthesia techniques
16. Performs survey scan, identifies structures pertinent to procedure
17. Optimises nerve image by probe manipulation, nerve localisation techniques
18. Chooses appropriate needle insertion point, and trajectory to maintain in-plane ultrasound views
19. Demonstrates ability to locate needle tip in real time, throughout procedure
20. Recognises spread of local anaesthesia and adjusts needle positioning to optimise local anaesthesia distribution
21. Chooses correct current and motor endpoints during block if combined with neurostimulation

All blocks
22. Demonstrates knowledge of block onset and success by motor/sensory testing
23. Formulates and performs rescue block (as necessary)
24. Formulates perioperative analgesia plan
25. Formulates plan for care of blocked region, and postoperative follow up

Global rating items — each scored 1 to 5; descriptors anchor scores 1, 3 and 5

Preparation for procedure
1 = Did not organise equipment well; has to stop procedure frequently to prepare equipment
3 = Equipment generally organised; occasionally has to stop and prepare items
5 = All equipment neatly organised, prepared, and ready for use

Respect for tissue
1 = Frequently used unnecessary force on tissue or caused damage
3 = Careful handling of tissue but occasionally caused unintentional damage
5 = Consistently handled tissues appropriately with minimal damage

Time and motion
1 = Many unnecessary moves
3 = Efficient time/motion but some unnecessary moves
5 = Clear economy of movement and maximum efficiency

Instrument handling
1 = Repeatedly makes tentative or awkward moves with instruments
3 = Competent use of instruments but occasionally appeared stiff or awkward
5 = Fluid moves with instruments and no awkwardness

Flow of procedure
1 = Frequently stopped procedure and seemed unsure of next move
3 = Demonstrated some forward planning with reasonable progression of procedure
5 = Obviously planned course of procedure with effortless flow from one move to the next

Knowledge of procedure
1 = Deficient knowledge
3 = Knew all important steps of procedure
5 = Demonstrated familiarity with all aspects of procedure

Overall performance
1 = Very poor
3 = Competent
5 = Clearly superior

Overall, should the trainee pass or fail: Pass / Fail

Please record total time of viewing and scoring each video (minutes/seconds)
validation are presented in Table 2. Fifteen different regional anaesthesia procedures were recorded. Technically distinct approaches to each block were grouped (e.g. subcostal and posterior transversus abdominis plane blocks). All neuraxial blocks were surface landmark-based techniques. All other blocks were in-plane, ultrasound-guided procedures, except for the lumbar plexus block, which was a combined ultrasound-guided and neurostimulation technique. Lumbar neuraxial, transversus abdominis plane and fascia iliaca blocks were not part of the original Zuer's classification, and were categorised as basic blocks for this study. Plexus catheter insertions were also not originally classified, but for this study were included in the same complexity category as the single injection technique, as the perceived technical difficulty was similar for both procedures.

Face validity results are described in Table 3, and showed statistical significance for both the checklist (p = 0.016) and GRS (p = 0.026) normalised scores in discriminating between videos depicting three different levels of nerve block difficulty.

Table 4 describes construct validation. In the first method, the checklist (p = 0.027) and the GRS (p < 0.001) normalised scores were significantly different between inexperienced and experienced trainees. In the second method, the proportions of inexperienced and experienced trainees who received a pass or fail from assessors were significantly different (p < 0.001).
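The second construct-validity comparison can be reproduced by hand from the Table 4 counts. A pure-Python sketch of the Pearson chi-square statistic, without continuity correction (the study itself used SPSS):

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table,
    given as [[a, b], [c, d]] (no continuity correction)."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            # Expected count under independence of row and column factors.
            expected = row_totals[i] * col_totals[j] / total
            stat += (observed - expected) ** 2 / expected
    return stat

# Rows: inexperienced, experienced trainees; columns: pass, fail (Table 4).
print(round(chi_square_2x2([[41, 52], [84, 33]]), 1))  # -> 16.5
```

The statistic of about 16.5 on one degree of freedom is well above the critical value of 10.83 for p = 0.001, consistent with the reported p < 0.001.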
Table 5 describes testing for internal reliability, comparing the proportions of trainees scored as pass or fail against their checklist and GRS scores when divided into low, intermediate and high cohorts. There was a significant difference between these proportions (p < 0.001). External reliability between the three assessors was tested using the ICC for absolute agreement: 0.80 (95% CI 0.67–0.88) for the checklist, 0.80 (95% CI 0.69–0.87) for the GRS and 0.71 (95% CI 0.57–0.81) for the pass/fail dichotomous item.

Feasibility of the RAPS tool as used by the three assessors is reported in Table 6. Mean time taken to score with the RAPS tool differed significantly between the three assessors (p < 0.001), but mean scoring time was less than four minutes for each assessor, suggesting that the tool is feasible.
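The external reliability statistic above, the two-way, average-measures ICC for absolute agreement (ICC(A,k) in McGraw and Wong's terminology), can be sketched from its ANOVA decomposition. This is again a pure-Python illustration rather than the study's SPSS output, applied to hypothetical normalised scores:

```python
def icc_a_k(scores):
    """Two-way random-effects, average-measures ICC for absolute agreement.
    `scores` has one row per subject (video) and one column per rater."""
    n, k = len(scores), len(scores[0])
    grand = sum(map(sum, scores)) / (n * k)
    row_means = [sum(row) / k for row in scores]
    col_means = [sum(row[j] for row in scores) / n for j in range(k)]
    ss_total = sum((x - grand) ** 2 for row in scores for x in row)
    # Mean squares from the two-way ANOVA decomposition.
    ms_rows = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    ms_cols = n * sum((m - grand) ** 2 for m in col_means) / (k - 1)
    ss_error = ss_total - ms_rows * (n - 1) - ms_cols * (k - 1)
    ms_error = ss_error / ((n - 1) * (k - 1))
    return (ms_rows - ms_error) / (ms_rows + (ms_cols - ms_error) / n)

# Hypothetical normalised scores: five videos rated by three assessors.
scores = [[60, 62, 58], [75, 78, 74], [40, 45, 42], [85, 80, 83], [55, 58, 54]]
print(round(icc_a_k(scores), 2))  # -> 0.99
```

By the thresholds cited in the Discussion, an ICC above 0.80 would satisfy moderate-stakes assessment [26] and above 0.75 excellent reliability [27].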
Discussion

The Regional Anaesthesia Procedural Skills tool used in this study is a valid and reliable assessment tool to score trainees performing regional anaesthesia. A 25-item checklist was created using a Delphi process to produce a scoring tool anchored on core competencies, technical skills and professional attributes relevant to regional anaesthesia. Uniquely, this checklist was designed to be applicable to all types of regional anaesthesia procedures, whether using ultrasound, surface anatomical landmarks or neurostimulation, and includes assessment of post-block patient care. This was
complemented with an existing seven-item GRS [13] to score the professional and non-technical domains of performance, together forming the RAPS tool. In the initial validation phase, we demonstrated the reliability of the RAPS tool using a test–retest methodology. In the clinical validation phase, the RAPS tool fulfilled requirements for face validity, construct validity, reliability and feasibility as an assessment tool for evaluating trainees at the end of a regional anaesthesia rotation.

Figure 1 Regional Anaesthesia Procedural Skills assessment tool normalised scores at test–retest validation. Box and whiskers plot of checklist (top) and global rating scale (GRS) (bottom) scores for the three scripted videos: means and ranges of the scores of 10 anaesthetists, after conversion of raw scores to percentages of the maximum possible score for each video. There was no significant difference in mean scores from test to retest (p values 0.121–0.468).

Table 2 Characteristics of regional anaesthesia procedure and trainee experience levels of the 70 videos used for clinical validation.

Regional anaesthesia procedure | Inexperienced trainee (≤ 30 prior blocks) | Experienced trainee (≥ 31 prior blocks) | Total
Basic nerve blocks
Interscalene brachial plexus | 3 | 4 | 7
Axillary brachial plexus | 7 | 5 | 12
Terminal branches of brachial plexus | 1 | 1 | 2
Femoral nerve* | 5 | 3 | 8
Adductor canal/subsartorial saphenous nerve | 2 | 4 | 6
Superficial cervical plexus | 1 | 0 | 1
Neuraxial (spinal or lumbar epidural*) | 5 | 4 | 9
Transversus abdominis plane* | 1 | 2 | 3
Fascia iliaca (inguinal approach) | 0 | 1 | 1
Intermediate nerve blocks
Supraclavicular plexus | 1 | 3 | 4
Sciatic nerve* | 5 | 11 | 16
Advanced nerve blocks
Lumbar plexus | 0 | 1 | 1
Total | 31 | 39 | 70

*Denotes blocks for which catheters were inserted, but which were counted as single injection blocks. Inexperienced trainees were defined as having performed ≤ 30 prior blocks at the time of video recording, and experienced trainees as having performed ≥ 31 blocks.
The RAPS checklist has the advantage of assessment against specific criteria based on minimum learning outcomes, such as maintaining needle tip visibility under ultrasound. This allows identification of an individual trainee's relative proficiency in the component tasks that are necessary to demonstrate competency in regional anaesthesia. This has important consequences. It assists educators in providing structured feedback on strengths and weaknesses [24, 25]. Scarce training resources can be more efficiently targeted at weaknesses, training programmes can be tailored to individuals, and the checklist can be used to quantify the success of educational interventions.

One strength of this study was the methodology used to validate the RAPS tool. Content validity of the checklist was established through a Delphi process. A
Table 3 Face validation of the 70 clinical videos. Normalised percentage scores from three assessors scoring 70 videos, reported as mean (SD). Categorisation of nerve block difficulty is based on Zuer's ultrasound experts' regional anaesthesia statement [23].

Regional anaesthesia procedure difficulty level | Basic (n = 147) | Intermediate (n = 60) | Advanced (n = 3) | p value
Checklist | 67.6 (15.7)% | 67.4 (15.1)% | 41.4 (10.0)% | 0.016
GRS | 64.1 (19.0)% | 64.6 (16.7)% | 35.2 (12.6)% | 0.026

GRS, global rating scale.
Table 4 Construct validation of the 70 clinical videos. Values are mean (SD) normalised checklist and global rating scale (GRS) scores of trainees categorised as experienced or inexperienced, or number of returns (proportion of pass/fail within each trainee experience group). Inexperienced trainees had performed ≤ 30 prior blocks at the time of video recording, and experienced trainees ≥ 31 blocks.

Trainee experience | Inexperienced (n = 93) | Experienced (n = 117) | p value
Checklist | 64.7 (16.3)% | 69.5 (14.9)% | 0.027
GRS | 58.9 (18.8)% | 68.4 (17.1)% | < 0.001
Pass | 41 (44.1%) | 84 (71.8%) | < 0.001
Fail | 52 (55.9%) | 33 (28.2%) |
test–retest protocol was an opportunity to confirm initial validation of the checklist, providing assurance of reliability before committing to a clinical validation phase. Face validation was improved by recruiting anaesthetists from different institutions in all three phases of the study, and by including a mix of different regional anaesthesia procedures in the scripted and clinical videos. The RAPS tool was able to discriminate between blocks of different complexity, with lower scores for more difficult procedures, which is consistent with clinical experience. Individually, both the checklist and GRS exhibited similar reliability and validity.

Construct validity was demonstrated by the ability of the RAPS tool to discriminate between trainees of different expertise. As there is no gold standard for
defining expertise in regional anaesthesia, we used a surrogate measure, the trainee's volume of practice at the time of video recording, to quantify experience. In the clinical validation study of a UGRA-specific checklist by Wong et al. [17], trainees were divided into inexperienced and experienced practitioners using a cut-off of 30 previous blocks at the time of video recording. Using this same definition, this study showed that the RAPS checklist, GRS and overall pass or fail items were equally able to discriminate between trainees based on experience.

Internal reliability examines whether different parts of a tool assessing the same construct score similarly, whereas external reliability examines whether different assessors score the same trainee similarly. It has been argued that external reliability is the more important, as it contributes to fairness of assessment [11]. We were able to show good correlation between higher checklist and GRS scores and pass marks. With regard to external reliability, the ICC of 0.80 for both the checklist and GRS meets the threshold for moderate-stakes assessment, defined as an ICC > 0.80 [26], or represents excellent reliability using a definition of > 0.75 [27]. This is sufficient reliability for end-of-course examinations and for assessment at the conclusion of a regional anaesthesia rotation. The time taken to score using the RAPS tool was not excessive. This is important for acceptance of the RAPS tool as a workplace-based assessment tool in a busy clinical environment.

A potential limitation of this study is the absence of a large sample of advanced blocks as defined by Zuer's classification. Our videos covered 15 different regional anaesthesia blocks that were representative of current practice in the study institutions, but nearly all were categorised as basic and intermediate complexity blocks.
In the original description, the advanced techniques were lumbar plexus, deep cervical plexus and thoracic paravertebral blocks, to which we would add ultrasound-guided neuraxial techniques. The range of blocks in our study is likely to reflect the activity of similar centres worldwide, with the advanced blocks more commonly taught in specific centres.

Another possible limitation is the anchoring of checklist items 1 (informed consent), 3 (intravenous access and monitoring), 7 (skin asepsis and sterility)
Chuan et al. | The Regional Anaesthesia Procedural Skills Tool
Anaesthesia 2015, 70, 1401–1411
Table 5 Internal reliability of the 70 clinical videos. Proportions of pass/fail score given to trainees, compared with their normalised checklist and raw global rating scale (GRS) scores when divided into three cohorts. Reported as number of returns (proportion of cohort). The cohorts correspond to low (normalised checklist score < 50%, raw GRS score < 15), intermediate (normalised checklist score 50–75%, raw GRS score 15–25) and high (normalised checklist score > 75%, raw GRS score > 25). Number (proportion). * p < 0.001. Normalised checklist score (percentage of maximum score)
Pass Fail
< 50%
50–75%
> 75%
< 15
15–25
> 25
6 (22.2%) 21 (77.8%) *
70 (60.3%) 46 (39.7%)
56 (83.6%) 11 (16.4%)
3 (9.1%) 30 (90.9%) *
56 (56.6%) 43 (43.4%)
73 (93.6%) 5 (6.4%)
Table 6 Feasibility of the Regional Anaesthesia Procedural Skills assessment tool, as determined by time taken to score each of the 70 videos (min).
Assessor 1 Assessor 2 Assessor 3
Raw GRS scores (maximum score 35)
Range
Mean (SD)
0–8.25 0–12.50 0–8.00
1.59 (1.51) 3.93 (2.68) 2.96 (1.24)
and 8 (conscious sedation) on Professional Standards [22] promulgated by the Australian and New Zealand College of Anaesthetists. While these practice guidelines were written for the trainees and Fellows of the College, we believe equivalent principles exist in other countries. The advantage of using explicit standards is to remove ambiguity when scoring performance, and they were deliberately referenced during both checklist construction and scoring by assessors. We purposely chose two assessors from outside Australasia for the clinical validation phase to improve the generalisability of the RAPS tool to other countries. Previous research on education in regional anaesthesia has focussed on assessment tool validation, technology and use of partial task trainers and simulation. We suggest that future research should examine the utility of assessment tools to serially measure trainees as they progress through a regional anaesthesia rotation. Having objective, reproducible measurements before and after educational interventions allows evaluation of our training programmes. Structured feedback to a trainee after use of an assessment tool is also valuable, and should be considered as important as quantitative scoring of performance. © 2015 The Association of Anaesthetists of Great Britain and Ireland
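The internal reliability analysis in Table 5 divides trainees into low, intermediate and high cohorts by normalised checklist score and raw GRS score. A minimal sketch of those cohort definitions follows; the function names are our own, and assigning the boundary values (50%, 75%, 15, 25) to the intermediate cohort is an assumption implied by the table's ranges.

```python
# Illustrative sketch of the Table 5 cohort definitions. Function names are
# hypothetical; boundary values are assumed to fall in the intermediate cohort.

def normalised_checklist(raw, maximum):
    """Raw checklist score expressed as a percentage of the maximum score."""
    return 100.0 * raw / maximum

def checklist_cohort(pct):
    """Cohort from a normalised checklist score: < 50% low, 50-75% intermediate, > 75% high."""
    if pct < 50:
        return "low"
    if pct <= 75:
        return "intermediate"
    return "high"

def grs_cohort(raw_grs):
    """Cohort from a raw GRS score (maximum 35): < 15 low, 15-25 intermediate, > 25 high."""
    if raw_grs < 15:
        return "low"
    if raw_grs <= 25:
        return "intermediate"
    return "high"
```

For example, a trainee awarded 28 of 35 on the GRS falls into the high cohort, where 93.6% of scores in this study were a pass.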
In conclusion, there is evidence that the RAPS tool is a valid and reliable instrument for assessing regional anaesthesia performance. Unlike previous checklists that were specific to a single block type or nerve localisation technique, the RAPS checklist was designed as a universal assessment form for all types of regional anaesthesia procedures with all types of guidance. Scoring was stable between test and retest of scripted videos in a group of ten anaesthetists, despite their having no training in the use of the RAPS tool. During clinical validation, the RAPS tool was shown to have face validity, construct validity and feasibility. The RAPS tool may be used for trainee assessment during a regional anaesthesia rotation, with checklist items scoring specific technical skills and the complementary global rating scale scoring non-technical skills.
Acknowledgements
The authors thank our colleagues for their expert assistance in this study: Drs Peter Hebbard, Graham Hocking, Chris Mitchell, David M. Scott and Paul Soeding for the construction phase of the RAPS checklist; and Drs Malcolm Albany, Harmeet Aneja, Tung Bui, Phil Cowlishaw, Michael Ehrlich, Clement Fong, Andrew Lansdown, Daniel McGlone, Minh T. Tran and Chris K.B. Wong as the regional anaesthesia experts involved with the test–retest initial validation phase. The authors would also like to thank the consultants, trainees and nursing staff of the Department of Anaesthesia, Liverpool Hospital, Sydney, and the Department of Anaesthesia and Acute Pain Medicine, St Vincent's Hospital, Melbourne, for their participation in the clinical videos. AC received financial support from the National Health and Medical Research Council (APP1056280) and the Australian Society of Anaesthetists (PhD Support Grant) to assist this work. No conflicts of interest declared.
Appendix
Prepared questions asked in the three scripted videos. For the clinical videos, relevant information (for example, actual patient body weight) was substituted.
Pre-procedure interview segment
1 Please state the name of the block you are performing today, and the indication for this patient.
2 Assume I am the patient. Please take informed consent for this block procedure.
3 This patient is 70 kg. Please describe your local anaesthetic concentration, volume, and any additives for this block. Can you explain your choice?
4 Please describe what needle type, size and length you would use for this block.
5 Please describe how you would test the success of this block.
6 You test the block after 30 min and the block is inadequate for surgical anaesthesia. Describe what options you have if you cannot cancel the case and need to proceed to surgery.
7 Assume your block is successful. Please describe your postoperative analgesic plan for this patient, including what medications you would prescribe and any instructions you would provide for the patient and the nursing staff.
Procedural segment
8 For UGRA procedures: please perform a traceback scan that shows all the relevant sonoanatomy for this block.
9 Please verbally describe all important structures that are visible during this traceback scan, using an analogue clock description; for example, 'The femoral nerve is at the three o'clock position to the femoral artery.'
10 Show me the best possible image at the location where you will be performing this block.
11 Relative to this nerve, where do you intend to position your needle tip, using an analogue clock description?
12 What would be your ideal spread of local anaesthetic around this nerve, using an analogue clock description?
13 Show me where you intend to insert your block needle on the patient's skin.
14 If using combined UGRA and nerve stimulation: what current are you commencing at, and what is the lowest current you will accept as an endpoint?
References
1. Wilkinson JR, Crossley JG, Wragg A, Mills P, Cowan G, Wade W. Implementing workplace-based assessment across the medical specialties in the United Kingdom. Medical Education 2008; 42: 364–73.
2. Wragg A, Wade W, Fuller G, Cowan G, Mills P. Assessing the performance of specialist registrars. Clinical Medicine 2003; 3: 131–4.
3. Australian and New Zealand College of Anaesthetists. ANZCA Curriculum Framework. http://www.anzca.edu.au/training/2013-training-program/pdfs/ANZCA_CurriculumFramework_V1-0_Apr2010.pdf (accessed 01/03/2013).
4. Royal College of Physicians and Surgeons of Canada. CanMEDS 2005 Framework. http://www.royalcollege.ca/common/documents/canmeds/framework/the_7_canmeds_roles_e.pdf (accessed 01/08/2014).
5. The Royal College of Anaesthetists. Curriculum for a CCT in Anaesthetics. http://www.rcoa.ac.uk/system/files/TRG-CUCCT-ANAES2010_0.pdf (accessed 01/08/2014).
6. Van Gessel E, Mellin-Olsen J, Ostergaard HT, Niemi-Murola L. Postgraduate training in anaesthesiology, pain and intensive care: the new European competence-based guidelines. European Journal of Anaesthesiology 2012; 29: 165–8.
7. Sites BD, Chan VW, Neal JM, et al. The American Society of Regional Anesthesia and Pain Medicine and the European Society of Regional Anaesthesia and Pain Therapy joint committee recommendations for education and training in ultrasound-guided regional anesthesia. Regional Anesthesia and Pain Medicine 2010; 35: S74–80.
8. Association of Anaesthetists of Great Britain & Ireland, Royal College of Anaesthetists, Intensive Care Society. Ultrasound in Anaesthesia and Intensive Care – A Guide to Training. http://www.aagbi.org/sites/default/files/Ultrasound%20in%20Anaesthesia%20and%20Intensive%20Care%20-%20A%20Guide%20to%20Training.pdf (accessed 20/12/2012).
9. Australian and New Zealand College of Anaesthetists. The 2013 Training Program. http://www.anzca.edu.au/training/2013-training-program/pdfs/training-accreditation-handbook (accessed 17/04/2015).
10. Epstein RM. Assessment in medical education. New England Journal of Medicine 2007; 356: 387–96.
11. Bould MD, Crabtree NA, Naik VN. Assessment of procedural skills in anaesthesia. British Journal of Anaesthesia 2009; 103: 472–83.
12. Naik VN, Perlas A, Chandra DB, Chung DY, Chan VW. An assessment tool for brachial plexus regional anesthesia performance: establishing construct validity and reliability. Regional Anesthesia and Pain Medicine 2007; 32: 41–5.
13. Chin KJ, Tse C, Chan V, Tan JS, Lupu CM, Hayter M. Hand motion analysis using the Imperial College Surgical Assessment Device: validation of a novel and objective performance measure in ultrasound-guided peripheral nerve blockade. Regional Anesthesia and Pain Medicine 2011; 36: 213–9.
14. Sultan SF, Iohom G, Saunders J, Shorten G. A clinical assessment tool for ultrasound-guided axillary brachial plexus block. Acta Anaesthesiologica Scandinavica 2012; 56: 616–23.
15. Friedman Z, Katznelson R, Devito I, Siddiqui M, Chan V. Objective assessment of manual skills and proficiency in performing epidural anesthesia–video-assisted validation. Regional Anesthesia and Pain Medicine 2006; 31: 304–10.
16. Cheung JJ, Chen EW, Darani R, McCartney CJ, Dubrowski A, Awad IT. The creation of an objective assessment tool for ultrasound-guided regional anesthesia using the Delphi method. Regional Anesthesia and Pain Medicine 2012; 37: 329–33.
17. Wong DM, Watson MJ, Kluger R, et al. Evaluation of a task-specific checklist and global rating scale for ultrasound-guided regional anesthesia. Regional Anesthesia and Pain Medicine 2014; 39: 399–408.
18. Laurent DA, Niazi AU, Cunningham MS, et al. A valid and reliable assessment tool for remote simulation-based ultrasound-guided regional anesthesia. Regional Anesthesia and Pain Medicine 2014; 39: 496–501.
19. Nix CM, Margarido CB, Awad IT, et al. A scoping review of the evidence for teaching ultrasound-guided regional anesthesia. Regional Anesthesia and Pain Medicine 2013; 38: 471–80.
20. Watson MJ, Wong DM, Kluger R, et al. Psychometric evaluation of a direct observation of procedural skills assessment tool for ultrasound-guided regional anaesthesia. Anaesthesia 2014; 69: 604–12.
21. The Regional Anesthesiology Acute Pain Medicine Fellowship Directors Group. Guidelines for Fellowship Training in Regional Anesthesiology and Acute Pain Medicine: Second Edition, 2010. Regional Anesthesia and Pain Medicine 2011; 36: 282–8.
22. Australian and New Zealand College of Anaesthetists. Professional Standards. http://www.anzca.edu.au/resources/professional-documents (accessed 10/05/2015).
23. Marhofer P. Ultrasound Guidance in Regional Anaesthesia: Principles and Practical Implementation. Oxford, UK: Oxford University Press, 2010.
24. Dannefer EF. Beyond assessment of learning toward assessment for learning: educating tomorrow's physicians. Medical Teacher 2013; 35: 560–3.
25. Norcini JJ. Current perspectives in assessment: the assessment of performance at work. Medical Education 2005; 39: 880–9.
26. Downing SM. Reliability: on the reproducibility of assessment data. Medical Education 2004; 38: 1006–12.
27. Cicchetti DV. Guidelines, criteria, and rules of thumb for evaluating normed and standardized assessment instruments in psychology. Psychological Assessment 1994; 6: 284–90.