Nurse Education Today 39 (2016) 189–196


Nurse Education Today journal homepage: www.elsevier.com/nedt

Development of a competency framework for evidence-based practice in nursing

Kat Leung a,⁎, Lyndal Trevena b,1, Donna Waters c,2

a Sydney Medical School, Room 121B, Edward Ford Building (A27), University of Sydney, NSW 2006, Australia
b Sydney Medical School, Room 321B, Edward Ford Building (A27), University of Sydney, NSW 2006, Australia
c Sydney Nursing School, Room C5.10, 88 Mallett Street, Camperdown, NSW 2050, Australia


Summary

Background: The measurement of competence in evidence-based practice (EBP) remains challenging to many educators and academics due to the lack of explicit competency criteria. Much uncertainty exists about what specific EBP competencies nurses should meet and how these should be measured.

Objectives: The objectives of this study were to develop a competency framework for measuring evidence-based knowledge and skills in nursing and to elicit the views of health educators/researchers about elements within the framework.

Design: A descriptive survey design with questionnaire.

Participants and Settings: Between August and December 2013, forty-two health academics/educators, clinicians and researchers from the medical and nursing schools at the University of Sydney and the Nurse Teachers' Society in Australia were invited to comment on proposed elements for measuring evidence-based knowledge and skills.

Methods: The EBP competency framework was designed to measure nurses' knowledge and skills for using evidence in practice. Participants were invited to rate their agreement with the structure and relevance of the framework and to state their opinion about the measurement criteria for evidence-based nursing practice.

Results: Participant agreement on the structure and relevance of the framework was substantial (ICC: 0.80, 95% CI: 0.67–0.88, P < 0.0001). Qualitative analysis of two open-ended survey questions revealed three common themes in participants' opinions of the competency elements: (1) a useful EBP framework; (2) varying expectations of EBP competence; and (3) challenges to EBP implementation.

Conclusions: The findings of this study suggest that the EBP competency framework is of credible value for facilitating evidence-based practice education and research in nursing. However, there remains some uncertainty and disagreement about the levels of EBP competence required for nurses.
These challenges further indicate the need to set a reasonable competency benchmark with a broader group of stakeholders in nursing. © 2016 Elsevier Ltd. All rights reserved.

Article history: Accepted 26 January 2016

Keywords: Evidence-based practice; Competency criteria; Competence measurement; Competency framework; Competency measurement; Competency standards

⁎ Corresponding author. Tel.: +61 2 93513885 (office). E-mail addresses: [email protected] (K. Leung), [email protected] (L. Trevena), [email protected] (D. Waters).
1 Tel.: +61 2 93517788; fax: +61 2 93515049.
2 Tel.: +61 2 93510699 (office).

Introduction

Evidence-based practice (EBP) has been proposed for optimal patient care for more than three decades, yet competence in EBP knowledge and skills among nurse clinicians remains difficult to measure. A lack of explicit competency standards and a limited number of validated assessment tools add to the challenge (Leung et al., 2014). The Nursing and Midwifery Board of Australia regulates competency standards for practising nurses and midwives in Australia and requires all newly registered nurses to be competent in providing care within an evidence-based practice framework (ANMC, 2006). While currently under review, these standards have driven competence assessment for the past ten years and are far from specific in detailing attributes and characteristics for measuring EBP competence. Despite requirements for EBP competency frameworks in education, regulation and practice in nursing, this ambiguity remains part of an ongoing agenda in evidence-based nursing practice (Laibhen-Parkes, 2014).

Background

The Meaning of Competence and Competency

Nursing is viewed as a practice-based discipline, and clinical competence remains the focus of much nursing literature (Chapman, 1999; Raines, 2010). Competence is a combination of the complex

http://dx.doi.org/10.1016/j.nedt.2016.01.026 0260-6917/© 2016 Elsevier Ltd. All rights reserved.


attributes of knowledge, skills and attitudes, together with the ability to make professional judgements and to perform intelligently in specific situations. Competence cannot be directly observed from an individual's behaviour but is inferred from their performance (Biggs, 1994; Gonczi, 1994; Hager and Gonczi, 1996; Messick, 1984; Neufeld and Norman, 1985). Competency is described as an underlying characteristic of performance; it is multifaceted and difficult to measure (Carraccio et al., 2002; Cowan et al., 2005; Eraut, 1998; Laibhen-Parkes, 2014; Tilley, 2008). The fundamental meanings of competence and competency are similar in that 'multiple attributes' and 'performance' are frequently used inconsistently and interchangeably in the nursing literature (Cowan et al., 2005; Laibhen-Parkes, 2014). Broadly speaking, competence reflects a person's cognitive approach to a task, encompassing the multiple attributes of knowledge, skills and attitudes (ANMC, 2006; Carraccio et al., 2002; Frank et al., 2010a), whereas competency highlights a person's ability to perform those tasks within the defined context of professional practice (Laibhen-Parkes, 2014; Rebholz, 2006; Whelan, 2006). Although Laibhen-Parkes (2014) has recently defined EBP competence as "the ability to ask clinically relevant questions for the purposes of acquiring, appraising, applying, and assessing multiple sources of knowledge within the context of caring for a particular patient, group, or community" (p. 8), this working definition is conceptually a generic description of evidence-based practice, which has already been published elsewhere.

Competency-based Education

The practice of competency-based education (CBE) has emerged over more than three decades and is widely adopted by the medical and health disciplines. The pedagogical model on which competency assessment is based is goal-oriented and outcome-driven. Learners are expected to take responsibility for their learning and be able to meet all performance criteria specifically set for the task. The emphasis of CBE is to enhance effective integration of knowledge and skills, and it is concerned with whether or not an individual can perform at a minimally acceptable level of competence for accreditation or licensure purposes (Frank et al., 2010b; Hsieh and Hsu, 2013; Pijl-Zieber et al., 2014; Pimlott, 2011; Taber et al., 2010). Competency-based education allows students more flexibility in their learning, directed towards achieving course learning objectives rather than rote learning. Throughout the 1980s and 1990s, nursing education around the world shifted from 'hospital-based training' to a 'competency-based curriculum' (Bradshaw and Merriman, 2008; Windsor et al., 2012). CBE is viewed as a teaching-learning process that emphasises outcome achievement, in which an individual must know and be able to complete certain tasks (Harrison and Mitchell, 2006; Scott, 1982; Tanner, 2001). CBE involves the cognitive, affective and psychomotor aspects of an individual's performance.

Competency-based Assessment

The move towards CBE paved the way for the development of new ways of evaluating learning outcomes for both clinical and non-clinical subjects. Constructive strategies were required to clearly identify components of measurable criteria, with indicators for performance levels at which specific skill acquisition for a profession could be evaluated (Hackett, 2001; Reeves et al., 2009). Observable measures such as simulation assessment and the Objective Structured Clinical Examination are now commonly used for assessing clinical competence in the health disciplines (Baez, 2004; Damron-Rodriguez, 2008; Hyer et al., 2004; Ryan et al., 2007; Simpson et al., 2006). The relationship between competence, competency and CBE is summarised in Fig. 1.
Within the EBP paradigm in nursing, the measurement of competence remains challenging to many faculties and schools due to the lack of explicit competency criteria and reliable assessment tools. To date, only a few validated performance-based EBP assessment tools, such as the Fresno test (Ramos et al., 2003) and the Assessing Competency in Evidence Based Medicine (ACE) tool (Ilic et al., 2014), have been tested. However, neither is specific to measuring evidence-based nursing practice (Leung et al., 2014). In Australia, EBP is taught within all nursing pre-registration programmes in order to meet the national competency requirements for registration. Registered nurses are expected to understand the concept of EBP and to engage readily with research evidence in their practice. However, much uncertainty exists about what specific EBP competencies nurses should meet, and how these should be measured. Although the recent EBP competency guide developed by Melnyk et al. (2014) provides useful performance indicators for measuring 13 EBP competency elements through a seven-step process, there remains ambiguity about how achievement of skills within each step can be measured. For example, in competency elements 5–7, RNs are expected to participate in critical appraisal of evidence (step 3 of the 5-step EBP model) (Melnyk et al., 2014). The associated practical skills required to accomplish this step are not clearly stated. Melnyk's seven-step EBP implementation model begins with the cultivation of a spirit of clinical inquiry (Step 0), followed by the generic 5-step model and dissemination of practice outcomes (Step 6) (Melnyk et al., 2014). These extra steps may contribute to more successful EBP implementation; however, objective measurement of their outcomes, such as EBP culture (Step 0), is not always feasible. Therefore, an EBP competency framework with explicit measurement criteria is important both for facilitating the measurement of EBP competence through the generic 5-step EBP model and for guiding the development of a scenario-based evidence-based practice assessment tool for nursing.

Aim

The aim of this study was to develop a competency framework for guiding the development of an evidence-based nursing practice assessment tool.

Objectives

1. To develop an evidence-based practice competency framework to test EBP knowledge and skills in nursing.
2. To elicit the views of health educators/researchers about elements proposed for measuring evidence-based knowledge and skills within the framework.

Design

The purpose of this study was to solicit expert comment and agreement on a newly developed EBP competency framework through a questionnaire. A descriptive survey design was used (Jirojwong et al., 2014) to answer the research question below:

What are the views of health educators/academics about the framework developed for measuring evidence-based knowledge and skills in nurses?

Methods

Developing the EBP Competency Framework

The initial competency framework was drafted by the first author following a review of standards for evidence-based nursing practice within the Australian and international health literature. The framework was then iteratively reviewed and developed by the co-authors, who are experienced academics and researchers teaching EBP in the medical, public health and nursing professions. The EBP competency framework (Table 1) was designed to measure nurses' knowledge and skills for using evidence in practice. The framework was developed as a matrix of three columns and five rows,


[Figure 1 (diagram not reproduced) depicts competency-based education as a goal-oriented, outcome-driven teaching-learning process in which teachers and learners set learning goals and objectives and construct strategies for outcome measurement, linking effective education strategies to competency-based assessment.]

Fig. 1. The relationship between competence, competency and competency-based educational assessment.

mapping each step of the EBP model (Ask, Acquire, Appraise, Apply and Assess) (Straus et al., 2011) across the grid of competency elements. As the general assessment of competence has been framed as the application of knowledge and psychomotor skill expected for the competent practice of nursing (Laibhen-Parkes, 2014), it was our intention to imprint the attributes of knowledge and skill within the context of the 5-step EBP model in the second and third columns of the framework. For the purpose of this paper, we define 'knowledge' as an individual's cognitive understanding of the EBP concept. Descriptions of attributes in the second column (Table 1) relate to actions and/or behaviours associated with using evidence and are derived from the Australian Nursing and Midwifery Council (ANMC) competency standards (ANMC, 2006) and a publication by an Australian EBP scholar (Pearson et al., 2007). In the third column (Table 1), core EBP skills derived from the international literature offer useful performance indicators to capture an individual's ability to implement EBP through the 5-step model.

The Attributes of EBP Competence

The ANMC competency standards for Registered Nurses (RNs) provide a benchmark against which the performance of RNs is assessed for gaining initial nursing registration, or for the purpose of nursing practice appraisal. There are currently ten competency standards organised into four domains: professional practice; critical thinking and analysis; provision and coordination of care; and collaboration and therapeutic care. Within the second domain, a number of essential elements for research participation are listed, with the expectation that RNs practise within an evidence-based framework as part of their critical thinking and analytic skills (ANMC, 2006). The current competency elements are not contextualised within the EBP 5-step model, and the specific skills required for implementing EBP are either implicit or narrowly focused on research participation. Further, while some of the elements in the second column of Table 1 describe expectations for EBP knowledge


Table 1
Evidence-based practice competency framework for Registered Nurses/Midwives 2016. Bracketed numbers relate to references on pages 7 and 8.

Step 1: Ask
  Attributes of EBP competence:
    1.1 Identifies the relevance of research to improving individual/group health outcomes [1]
    1.2 Identifies problem(s)/issue(s) in nursing practice which is/are suitable for research [1]
  Attributes of EBP competency:
    1.3 Understands the PICO acronym [2-6]
    1.4 Converts a clinical scenario into an answerable question using PICO strategies [2-6]

Step 2: Acquire
  Attributes of EBP competence:
    2.1 Demonstrates analytic skills in accessing health information and research evidence [1]
  Attributes of EBP competency:
    2.2 Understands the difference between filtered (pre-appraised) and unfiltered (un-appraised) database resources; recognises the common databases being used, e.g. Medline, CINAHL [2-5,7]
    2.3 Relates study types to study designs [2-5,7]
    2.4 Possesses basic searching skills [2-6]
    2.5 Constructs search strategy plans for relevant databases [2-6]

Step 3: Appraise (if pre-appraised research evidence is used, this step may be skipped)
  Attributes of EBP competence:
    3.1 Demonstrates analytic skills in evaluating health information and research evidence [1]
    3.2 Undertakes critical analysis of evidence in considering its application to practice [1]
    3.3 Understands how knowledge/evidence is transferred through appraisal of evidence [8]
  Attributes of EBP competency:
    3.4 Recognises key research terminologies and commonly used statistical terms [3,4]
    3.5 Recognises common tests used for quantitative/qualitative analysis [3,4]
    3.6 Uses a relevant appraisal tool to evaluate the strength of evidence [3-5]
    3.7 Identifies the strength and applicability of evidence [6]:
        a. For quantitative evidence — discusses benefits and harms of choices in measurable numbers/effect size [2,3,5,9]
        b. For qualitative evidence — discusses the evidence's credibility [2-4] using the FAME (Feasibility, Appropriateness, Meaningfulness, and Effectiveness) scale [10,11] or another qualitative appraisal tool [5]

Step 4: Apply
  Attributes of EBP competence:
    4.1 Uses evidence to improve current practice [1]
    4.2 Recognises that nursing expertise varies with education, experience and context of practice [1]
    4.3 Participates in research and quality improvement activities [1]
    4.4 Changes practice via guidelines/protocols [8]
  Attributes of EBP competency:
    4.5 Summarises all applicable evidence with consideration of the patient's preference and other clinical and non-clinical contextual factors [2,4,6,12]
    4.6 Explains evidence and discusses options with the patient in lay language [2,12]
    4.7 Applies evidence to the clinical scenario [2]

Step 5: Assess (evaluate)
  Attributes of EBP competence:
    5.1 Reviews the outcome of nursing care [1]
    5.2 Participates in case review activities [1]
    5.3 Discusses implications of research with colleagues [1]
    5.4 Seeks feedback from various sources to improve quality of care [1]
    5.5 Participates in review of practice outcomes, standards and guidelines; review of policies, procedures and guidelines based on evidence [1]
    5.6 Recognises the need to evaluate the impact on outcome [8]
  Attributes of EBP competency:
    5.7 Assesses outcome of care through application of the first four steps of the EBP model [13]
    5.8 Identifies strategy for direct measures of care outcomes, e.g. derived from clinical documentation, case review, patient feedback [4,6,13]
    5.9 Reflects on own skills and seeks improvement [13]
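For educators prototyping assessment tools on top of this framework, the five-step matrix maps naturally onto a simple data structure. The sketch below is our own illustration, not part of the published framework: element texts are abbreviated paraphrases of Table 1, and the names `EBP_FRAMEWORK` and `checklist` are hypothetical.

```python
# Hypothetical sketch: the Table 1 matrix as a Python data structure.
# Element texts are abbreviated paraphrases; keys/names are invented here.
EBP_FRAMEWORK = {
    "Ask": {
        "competence": [
            "1.1 Identifies relevance of research to health outcomes",
            "1.2 Identifies practice problems suitable for research",
        ],
        "competency": [
            "1.3 Understands the PICO acronym",
            "1.4 Converts a clinical scenario into an answerable question",
        ],
    },
    "Acquire": {
        "competence": [
            "2.1 Demonstrates analytic skills in accessing evidence",
        ],
        "competency": [
            "2.2 Distinguishes filtered from unfiltered databases",
            "2.3 Relates study types to study designs",
            "2.4 Possesses basic searching skills",
            "2.5 Constructs search strategies for relevant databases",
        ],
    },
    "Appraise": {
        "competence": [
            "3.1 Demonstrates analytic skills in evaluating evidence",
            "3.2 Critically analyses evidence for application to practice",
            "3.3 Understands knowledge transfer through appraisal",
        ],
        "competency": [
            "3.4 Recognises key research and statistical terms",
            "3.5 Recognises common quantitative/qualitative tests",
            "3.6 Uses an appraisal tool to evaluate strength of evidence",
            "3.7 Identifies strength and applicability of evidence",
        ],
    },
    "Apply": {
        "competence": [
            "4.1 Uses evidence to improve current practice",
            "4.2 Recognises that nursing expertise varies with context",
            "4.3 Participates in research and quality improvement",
            "4.4 Changes practice via guidelines/protocols",
        ],
        "competency": [
            "4.5 Summarises applicable evidence with patient preferences",
            "4.6 Explains evidence to the patient in lay language",
            "4.7 Applies evidence to the clinical scenario",
        ],
    },
    "Assess": {
        "competence": [
            "5.1 Reviews the outcome of nursing care",
            "5.2 Participates in case review activities",
            "5.3 Discusses implications of research with colleagues",
            "5.4 Seeks feedback to improve quality of care",
            "5.5 Participates in review of outcomes and guidelines",
            "5.6 Recognises the need to evaluate impact on outcome",
        ],
        "competency": [
            "5.7 Assesses outcome of care via the first four EBP steps",
            "5.8 Identifies strategies for direct outcome measures",
            "5.9 Reflects on own skills and seeks improvement",
        ],
    },
}

def checklist(step):
    """Return all knowledge and skill elements for one EBP step."""
    row = EBP_FRAMEWORK[step]
    return row["competence"] + row["competency"]
```

A tool builder could, for instance, iterate over `checklist(step)` for each of the five steps to render per-step assessment items.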

(such as in 2.1 Demonstrates analytic skills in accessing health information and research evidence (Table 1)); others capture EBP competence as skills (as in 5.3 Discusses the implications of research with colleagues (Table 1)). As it was our intention to illustrate the current benchmark for measuring competence in evidence-based nursing practice in Australia, we included all elements attributed to the application of evidence-based knowledge and skills from the current Australian EBP competency standards for the registered nurse.

The Attributes of EBP Competency

International nursing and healthcare literature was sourced to describe elements for measuring EBP skills, as shown in the third column of Table 1. These performance indicators allow for a step-by-step criterion assessment of individuals' abilities to transform their EBP knowledge (cognitive understanding) into skills (behavioural performance) through the 5-step implementation model (Straus et al., 2011). RNs are, for example, expected to understand the PICO acronym (Patient/Population; Intervention; Comparison; Outcome) and be able to use it to formulate an answerable question as the first step of implementing EBP, 'Ask'. The second step of EBP implementation is to 'Acquire' evidence, which refers to the development of literature-searching skills using electronic databases and various sources, followed by the ability to 'Appraise' the rigour and trustworthiness of the quantitative and/or qualitative research method(s) contributing to the research evidence base. After appraising the strength and applicability of research evidence, the fourth step of EBP implementation, 'Apply', is where the applicable evidence is integrated with the bio-psycho-social context and values of patients and their families through shared healthcare decision-making. The final step of implementing EBP, 'Assess', involves evaluating the effects on patient care or assessing the impact of the EBP intervention. This important but often neglected step measures the effectiveness of the

implemented evidence as well as the process from 'Ask' to 'Apply' (Dicenso et al., 2011; Hoffmann et al., 2013; Melnyk et al., 2014; Straus et al., 2011; Tilson et al., 2011).

Survey Participants

Three groups of participants were invited to serve as an expert panel to comment on the proposed elements for measuring evidence-based knowledge and skills within the newly developed competency framework. A purpose-specific questionnaire was developed to record their views. The first group were nominated as 'EBP experts' and included medical and allied health academics and researchers, some of whom also taught epidemiology, research or EBP as part of their substantive roles; their capacity and knowledge of EBP was therefore assumed (Goodman, 1987). 'Nurse academics/educators' formed the second group, while the third group comprised 'experienced nurse clinicians'. Using three groups of participants with different roles in health and academia enabled us to solicit a broad perspective on EBP competence assessment. In addition to identifying areas of agreement and disagreement, it was possible to elicit what these groups perceived as appropriate levels of EBP preparation for nurses, and their opinions on core EBP knowledge and skill requirements within the climate of interprofessional collaboration aimed at improving person-centred care.

Data Collection and Ethical Considerations

All survey participants were recruited through the email network of the medical and nursing schools of a large research-focussed university and the Nurse Teachers' Society of New South Wales, Australia. The invitation email gave an introduction to the study and explained the participant's role as part of an expert panel, commenting on elements of the newly developed competency framework. Participants


expressing an interest in the project were given a choice to complete the survey through a face-to-face or telephone interview, or to follow written instructions for a package delivered to them by post. All participants received the same survey package, regardless of their chosen delivery method. The package included a one-page introduction to the study; instructions for completing the questionnaire; a participant information statement; a draft of the EBP competency framework; a questionnaire; and a reply-paid envelope if the participant opted to complete the survey in their own time. The survey could be completed anonymously and was open for three months (August to December 2013). A reminder email was sent four weeks after the survey package had been sent, if it had not been returned.

The questionnaire consisted of two sections: (a) a series of rating questions about the structure and relevance of the EBP framework using a 5-point Likert scale (1 = strongly disagree, 2 = disagree, 3 = neutral, 4 = agree, 5 = strongly agree), and (b) two dichotomous (yes/no) questions followed by open-ended contingency questions for collecting participants' opinions about the elements proposed for measuring EBP knowledge and skills. All participants were instructed to take into account that the competency framework encompasses a list of minimal requirements a registered nurse/midwife should meet at completion of their undergraduate nursing course. In the first section, participants were asked to rate their level of agreement with eight statements about the relevance and appropriateness of the EBP framework, for example, "Using the 5-step EBP model to build the framework was appropriate." The second question was about the relevance of competency elements: "The competency elements set for EBP knowledge were relevant to the 5-step model." The remaining six statements were built from background literature used to inform the structure of the framework.
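The scoring convention used for the Likert items (a participant counts as 'agreeing' with an item when rating it 4 or 5, as in the Table 2 footnote) can be sketched as follows; the function name and sample ratings are ours, for illustration only:

```python
from statistics import mean, stdev

def summarise_item(ratings):
    """Summarise one 5-point Likert item: mean, SD and percent agreement.

    'Agreement' follows the paper's convention: a rating of
    4 (agree) or 5 (strongly agree).
    """
    agree = sum(1 for r in ratings if r >= 4)
    return {
        "mean": mean(ratings),
        "sd": stdev(ratings),  # sample standard deviation
        "agreement_pct": 100.0 * agree / len(ratings),
    }

# Illustrative (invented) ratings for a single item:
summary = summarise_item([5, 4, 3, 4, 5])
```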
The study was approved by the Human Research Ethics Committee of the University of Sydney, Australia. All questionnaires were completed anonymously, identified only with a group name and number for analysis.

Statistical Analysis

Quantitative survey data were analysed using the Statistical Package for the Social Sciences (SPSS) Version 23. Descriptive statistics such as means, standard deviations and percentages were used to explore the data, and the intraclass correlation coefficient (ICC) was used to examine agreement between the three groups of participants.
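The paper reports an ICC without specifying the model or software steps; as one plausible reading, a two-way random-effects, single-measure ICC (Shrout and Fleiss's ICC(2,1)) can be computed from an items-by-raters matrix of ratings. The code below is a generic textbook sketch under that assumption, not the authors' actual analysis:

```python
import numpy as np

def icc_2_1(X):
    """Two-way random-effects, single-measure ICC (Shrout & Fleiss
    ICC(2,1)) for an n_targets x k_raters matrix of ratings."""
    X = np.asarray(X, dtype=float)
    n, k = X.shape
    grand = X.mean()
    row_means = X.mean(axis=1)
    col_means = X.mean(axis=0)
    # Mean squares from the two-way ANOVA decomposition
    ms_rows = k * np.sum((row_means - grand) ** 2) / (n - 1)
    ms_cols = n * np.sum((col_means - grand) ** 2) / (k - 1)
    ss_err = (np.sum((X - grand) ** 2)
              - k * np.sum((row_means - grand) ** 2)
              - n * np.sum((col_means - grand) ** 2))
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n
    )

# Hypothetical ratings: four items each rated by three rater groups
ratings = [[4, 5, 4], [2, 2, 3], [5, 4, 5], [3, 3, 3]]
icc = icc_2_1(ratings)
```

With identical ratings from every rater the function returns 1.0 (perfect agreement); other ICC variants (e.g. average-measures ICC(2,k)) use the same mean squares with a different denominator.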


Qualitative Analysis

Simple thematic analysis was used to collate and synthesise participants' opinions about the proposed knowledge and skills elements of the survey. Narrative comments were read through several times to gain an overall perspective, then categorised into key themes. Co-researchers independently identified themes from the raw comments; any incongruent themes were discussed before reaching final consensus on all themes. The themes generated from this analysis were used to triangulate the interpretation of the quantitative components.

Results

Of the 67 people invited to participate, 42 (63%) completed the questionnaire. The largest group of participants completing the questionnaire was 'nurse academics/educators' (n = 25, 60%), whose primary role is teaching in clinical settings or educational institutes across Australia. The 'EBP experts' group (n = 11, 26%) included academics and researchers from a range of medical, nursing and allied health disciplines working within an affiliated national research network associated with a Sydney medical school. The third group consisted of six (14%) 'experienced nurse clinicians', including four consultant nurses who were engaged with ongoing research in their specialty fields.

Participant agreement on the structure and relevance of the framework was substantial (ICC: 0.80, 95% CI: 0.67–0.88, P < 0.0001). Table 2 summarises the quantitative data from the questionnaire survey. As noted, there were no significant differences between the three groups on all items. Qualitative analysis of the two open-ended survey questions revealed three common themes in participants' opinions of the competency elements.

Theme 1: A Useful EBP Framework

The framework appears to be a useful guide for measuring EBP competence in nurses. Several participants indicated that the framework has credibility and utility because it is clearly linked to accepted curriculum and professional frameworks. This was consistent with results of the survey question relating to the validity of the framework for assessing the EBP competency standards required for nurses, with which 31 of 42 participants (74%) agreed (Table 2). Almost all participants (98%) agreed that the framework was a useful tool for measuring EBP competence in nurses, which constitutes excellent face validity for the framework. Examples of relevant participant comments are presented as follows:

'The elements are from the ANMC standards so are relevant known to nurses' [(Nurse educator/academic-id7)]

'Very well constructed framework. It will be a useful addition to nursing practice toolkit' [(Nurse educator/academic-id25)]

When participants were invited to comment on the knowledge and skills elements of the framework in survey questions 9 and 10, most (91%) agreed with the knowledge elements (Table 2), and more than half (67%) agreed with the skills elements. The remainder indicated that the elements were either too easy or too difficult, requiring further adjustment. These results were reflected in qualitative comments (below), with many suggesting that the flow of EBP skills in the framework seemed to fit with nursing practice, covering essential elements for EBP implementation.

'Broadly covers the necessary knowledge required' [(EBP expert-id11)]

'Easy, logical process/sequence. Good acronym 5A's' [(Nurse clinician-id12)]

'Could strengthen or broaden conceptual application of planning c/in [within] the framework of the nursing process' [(Nurse educator/academic-id5)]

'I found it — the language to be appropriate given the neophyte nature of most practising nurses with the concept' [(Nurse educator/academic-id24)]

Table 2
Participant agreement on the structure and relevance of the EBP competency framework. Items 1–8 were rated 1 = strongly disagree, 2 = disagree, 3 = neutral, 4 = agree, 5 = strongly agree.

| Item | All participants (n = 42): mean (SD); n⁎ (%) | Nurse educators (n = 25): n⁎ (%) | Clinicians (n = 6): n⁎ (%) | EBP experts (n = 11): n⁎ (%) |
|---|---|---|---|---|
| 1. Using the 5-step EBP model to build the framework was appropriate | 4.55 (0.50); 42 (100%) | 25 (100%) | 6 (100%) | 11 (100%) |
| 2. The competency elements set for "EBP knowledge" were relevant to the 5-step model | 4.26 (0.70); 38 (90.5%) | 23 (92%) | 6 (100%) | 9 (81.8%) |
| 3. The competency elements set for "EBP skills" were relevant to the 5-step model | 4.26 (0.70); 38 (90.5%) | 23 (92%) | 5 (83.3%) | 10 (90.9%) |
| 4. The documents and literature selected for building the framework were appropriate | 4.13 (0.69); 33 (82.5%) | 19 (76%) | 5 (83.3%) | 9 (81.8%) |
| 5. The structure of the competency framework was well established | 4.24 (0.79); 37 (88.1%) | 22 (88%) | 5 (83.3%) | 10 (90.9%) |
| 6. The framework was likely to bring out the EBP competency standard required for nurses | 3.98 (0.72); 31 (73.8%) | 19 (76%) | 4 (66.6%) | 8 (72.7%) |
| 7. The competency framework was pertinent to EBP education and research | 4.40 (0.67); 40 (95.2%) | 24 (96%) | 5 (83.3%) | 11 (100%) |
| 8. The framework was a useful guiding tool for measuring EBP competence of nurses | 4.36 (0.53); 41 (97.6%) | 24 (96%) | 6 (100%) | 11 (100%) |
| 9. Do you agree with the elements set for measuring EBP knowledge of the Registered Nurse (RN)? (1 = agree; 0 = disagree) | 0.90 (0.30); 38 (90.5%) | 22 (88%¥) | 5 (83.3%¥) | 11 (100%¥) |
| 10. Please look at the elements proposed for measuring EBP skills of the RN and comment on whether they are reasonably expected of a beginner RN (1 = no change; 0 = too easy/difficult) | 0.67 (0.48); 28 (66.7%) | 21 (84%¥) | 2 (33.3%¥) | 5 (45.5%¥) |

⁎ Participants who rated 4 = agree or 5 = strongly agree (items 1–8).
¥ Participants who rated 1 = agree for item 9, or 1 = no change for item 10.

Theme 2: Varying Expectations of EBP Competence

Participants had different expected levels of EBP competence for nurses. Although most participants (38 of 42, 91%) agreed with the proposed knowledge elements (Table 2), about one-third (33%) recommended revisions to the skills elements, with some also suggesting what they perceived as the appropriate standard for measuring EBP skills. The diversity of suggestions for these skills elements was in accordance with the observed disagreement between the three groups; for example, 21 of 25 (84%) of the 'nurse academics/educators' agreed that no change was required, whereas more than half of the 'EBP experts' (6 of 11, 55%) disagreed with the 'action verbs' used to describe the skills elements. Several participants from the 'EBP experts' group recommended re-wording of competency criteria, for example that 'recognises' common statistical tests (element 3.5, Table 1) be replaced with 'evaluates', which, using the competence acquisition of Bloom's taxonomy as a guide, indicates a higher level of expectation in appraising evidence (Anderson and Krathwohl, 2001). The following comments indicate participants' diverse expectations and uncertainties about levels of EBP skill across the health professions.

'There is a range of levels which is good…' [(Nurse educator/academic-id38)]

'Summary of all applicable evidence maybe a lot to expect; suggests maybe summary of key literature evidence depending on level of nurse…maybe as part of a broader multi-disciplinary team rather than identify a strategy alone; suggests as part of multidisciplinary team' [(EBP expert-id4)]

'The taxonomy is highly acceptable, though individual nurses' perception may differ because of their subjective interpretation as everybody views their world uniquely e.g. what one person may recognise, another may not even perceive. There needs to be some way to discern the subjective differentiation of the EBP tool's outcome.' [(Nurse educator/academic-id9)]

Theme 3: Challenges to EBP Implementation

Nursing experts perceived several challenges to EBP implementation. Although the competency framework was seen as useful for guiding the assessment of EBP, several participants also identified factors affecting the way nurses apply evidence in their practice. Nurses' limited autonomy was perceived as an obstacle to the successful implementation of EBP, with nurses (particularly novice clinicians) being required to adhere to guidelines or protocols mandated by their organisations or facilities. For example, one participant stated:

‘4.4. in my practice context (public hospital-acute) a first year RN would probably not have the opportunity to change practice via guidelines/protocols or to apply evidence to clinical scenario in a formal manner. Would have to consult with senior staff’ [(Nurse educator/Academic-id29)]

Discussion

The purpose of this study was to elicit the views of health educators and researchers about a newly developed framework for measuring evidence-based knowledge and skills in nurses. A descriptive questionnaire survey was used to establish the face validity of the framework and to gain feedback on its content. The framework appears to be a useful guide for measuring competence in evidence-based nursing practice in different clinical contexts. Using this competency framework, each knowledge and skill assessment criterion can be characterised with regard to the 5-step EBP implementation model.

Although there was some disparity among participants in the language used to describe the skills elements within the framework (e.g. ‘evaluate’ vs ‘recognise’), there is so far no consensus on which language style or taxonomy is preferable across health disciplines. This finding is consistent with the view of other authors (Laibhen-Parkes, 2014; Melnyk et al., 2014; Stevens, 2013). In nursing, subsequent revisions of Bloom's taxonomy (Anderson and Krathwohl, 2001) are commonly used to define the spectrum of competence acquisition, from ‘recognise’ (lowest ranking) through ‘evaluate’ to ‘create’ (highest ranking). Participants in the ‘EBP experts’ group appeared to have higher expectations of evidence appraisal skills than the other groups. Nevertheless, cognitive knowledge and psychomotor skills are only some of the factors that influence an individual's performance; attitudes, emotions, personality traits and environment also contribute to competence and performance (Khan and Ramachandran, 2012; ten Cate et al., 2010). The core concept of EBP is to apply optimal, research-based clinical decisions in the context of patient and family preferences. The inconsistent use of ‘action verbs’ in defining competency elements lacks practical meaning for many, and leads to trivial debates about whether words such as ‘understand’ are better than ‘recognise’ when there are many other important factors affecting successful EBP implementation.

While the ‘EBP expert’ group responding to this study suggested that a higher level of competence in critical appraisal skills was required for reviewing the quality of evidence (step 3 of the 5-step EBP model), this may be neither necessary nor reasonable for nurse clinicians. Interpreting statistics requires good mathematical skills, and many nurses do not feel comfortable doing so (Epstein et al., 2011; Gaudet et al., 2014). As good-quality pre-appraised evidence sources become more readily available, clinicians have more options for support with this particular step, which may greatly reduce nurses' knowledge gap in evaluating the rigour and applicability of research evidence.

The move to national registration of the health professions in Australia in 2010 highlights the fact that nurses are an integral part of a large inter-professional collaboration. Although this framework was originally developed to guide assessment of EBP competence in nurses and midwives, ‘evidence-based’ principles are now applied to the scope of practice of all clinical and non-clinical health professions and disciplines, including social care, criminology and health information services (Dawes et al., 2005b; Hoffmann et al., 2013). The framework is therefore potentially adaptable to the professional practice of other health professions and disciplines. This study has formed the basis for the development of an EBP competence assessment tool whose validity and reliability have since been tested with three groups of health professionals, including nurses.
While acknowledging the recently developed seven-step EBP competency guide for practising RNs (Melnyk et al., 2014), the EBP framework proposed in this study integrates the generic five steps of the EBP model (Straus et al., 2011) to explicitly describe the intellectual and physical skills required for implementing each step of EBP in different contexts of clinical practice. Finally, the comments of participants who raised issues about successful EBP implementation were unexpectedly similar between groups, and are largely consistent with other studies investigating barriers to implementing EBP. Evidence implementation has been extensively researched and continues to be an ongoing agenda in nursing practice.

Limitations

The limitations of this study are the small sample size and the use of purposive sampling through the academic/research network of only one university; this network did, however, extend across a number of Australian states. The EBP experience of panel members was not quantified, but was assumed through purposive sampling. This may have biased the findings if participants were themselves uncertain about the concepts and processes of EBP implementation. Although questionnaire surveys were used to collect participants' agreement and feedback, a strength of this study is the inclusion of participants representing academia and research across health and medicine, as well as experienced nurse clinicians. We believe that this EBP competency framework could be used as a reference guideline for other health disciplines. Future studies should solicit agreement and feedback from EBP academics and experts through a Delphi survey.

Conclusions

EBP competency standards can be used to succinctly establish expectations regarding the level of performance of health professional registrants.
The development of this competency framework offers a roadmap for EBP academics and educators to navigate essential criteria for measuring EBP competence. While initially developed for beginning registered nurses, the generic criteria within the framework are useful performance indicators that provide a consistent standard for implementing EBP across different clinical settings and health professions. The findings of this study suggest that this framework could be acceptable to a range of health professionals and educators, and may be of value to EBP education and research in nursing. However, there remains some uncertainty and disagreement about the levels of EBP competence required across the health professions. Much effort is still required to clarify the language used for measuring competence, taking into account clinicians' experience and the obstacles to EBP implementation within nursing. These challenges further underscore the need to set a reasonable EBP competency benchmark with a broader group of stakeholders in nursing.

Acknowledgements

We would like to express our sincere gratitude to all 42 participants from the University of Sydney and its affiliated researchers, and to the members of the Australian Nurse Teachers Society, for their precious time and effort in completing the questionnaire for this study.

References

Anderson, L.W., Krathwohl, D.R., 2001. A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom's Taxonomy of Educational Objectives. Complete ed. Longman, New York.
ANMC, 2006. National Competency Standards for the Registered Nurse. fourth ed. Australian Nursing and Midwifery Council, Australia.
Baez, A., 2004. Development of an Objective Structured Clinical Examination (OSCE) for practicing substance abuse intervention competencies: an application in social work education. J. Soc. Work. Pract. Addict. 5, 3–20.
Biggs, J., 1994. Learning outcomes: competence or expertise? Aust. N. Z. J. Vocat. Educ. Res. 2, 1–18.
Bradshaw, A., Merriman, C., 2008. Nursing competence 10 years on: fit for practice and purpose yet? J. Clin. Nurs. 17, 1263–1269.
Carraccio, C., Wolfsthal, S.D., Englander, R., Ferentz, K., Martin, C., 2002. Shifting paradigms: from Flexner to competencies. Acad. Med. 77, 361–367.
Chapman, H., 1999. Some important limitations of competency-based education with respect to nurse education: an Australian perspective. Nurse Educ. Today 19, 129–135.
Cowan, D., Norman, I., Coopamah, V., 2005. Competence in nursing practice: a controversial concept — a focused review of literature. Nurse Educ. Today 25, 355–362.
Damron-Rodriguez, J., 2008. Developing competence for nurses and social workers. Am. J. Nurs. 108, 40–46.
Dawes, M., Davis, P., Gray, A., Mant, J., Seers, K., Snowball, R., 2005a. Evidence-based Practice: A Primer for Health Care Professionals. second ed. Elsevier Limited, Churchill Livingstone.
Dawes, M., Summerskill, W., Glasziou, P., Cartabellotta, A., Martin, J., Hopayian, K., et al., 2005b. Sicily statement on evidence-based practice. BMC Med. Educ. 5 (1), 1–7.
Dicenso, A., Guyatt, G., Ciliska, D., 2011. Evidence-based Nursing: A Guide to Clinical Practice. Elsevier Mosby, St Louis.
Epstein, I., Mina, E., Gaudet, J., Singh, M., Gula, T., 2011. Teaching statistics to undergraduate nursing students: an integrative review to inform our pedagogy. Int. J. Nurs. Educ. Scholarsh. 8, 1–15.
Eraut, M., 1998. Concepts of competence. J. Interprof. Care 12, 127–139.
Frank, J.R., Mungroo, R., Ahmad, Y., Wang, M., De Rossi, S., Horsley, T., 2010a. Toward a definition of competency-based education in medicine: a systematic review of published definitions. Med. Teach. 32, 631–637.
Frank, J.R., Snell, L.S., Cate, O.T., Holmboe, E.S., Carraccio, C., Swing, S.R., Harris, P., Glasgow, N.J., Campbell, C., Dath, D., Harden, R.M., Iobst, W., Long, D.M., Mungroo, R., Richardson, D.L., Sherbino, J., Silver, I., Taber, S., Talbot, M., Harris, K.A., 2010b. Competency-based medical education: theory to practice. Med. Teach. 32, 638–645.
Gaudet, J., Singh, M.D., Epstein, I., Santa Mina, E., Gula, T., 2014. Learn the game but don't play it: nurses' perspectives on learning and applying statistics in practice. Nurse Educ. Today 34, 1080–1086.
Gonczi, A., 1994. Competency based assessment in the professions in Australia. Assess. Educ. Princ. Policy Pract. 1, 27–44.
Goodman, C.M., 1987. The Delphi technique: a critique. J. Adv. Nurs. 12, 729–734.
Hackett, S., 2001. Educating for competency and reflective practice: fostering a conjoint approach in education and training. J. Work. Learn. 13, 103–112.
Hager, P., Gonczi, A., 1996. What is competence? Med. Teach. 18, 15–18.
Hamer, S., Collinson, G., 2005. Achieving Evidence-based Practice: A Handbook for Practitioners. second ed. Elsevier Limited, Philadelphia.
Harrison, R., Mitchell, L., 2006. Using outcomes-based methodology for the education, training and assessment of competence of healthcare professionals. Med. Teach. 28, 165–170.
Hoffmann, T., Bennett, S., Del Mar, C., 2013. Evidence-based Practice Across the Health Professions. second ed. Churchill Livingstone, Elsevier, Chatswood, Australia.
Hsieh, S.-I., Hsu, L.-L., 2013. An outcome-based evaluation of nursing competency of baccalaureate senior nursing students in Taiwan. Nurse Educ. Today 33, 1536–1545.
Hyer, K., Skinner, J.H., Kane, R.L., Howe, J.L., Whitelaw, N., Wilson, N., Flaherty, E., Halstead, L., Fulmer, T., 2004. Using scripted video to assess interdisciplinary team effectiveness training outcomes. Gerontol. Geriatr. Educ. 24, 75–91.


Ilic, D., Nordin, R.B., Glasziou, P., Tilson, J.K., Villanueva, E., 2014. Development and validation of the ACE tool: assessing medical trainees' competency in evidence based medicine. BMC Med. Educ. 14, 114.
Jirojwong, S., Johnson, M., Welch, A., 2014. Research Methods in Nursing and Midwifery: Pathways to Evidence-based Practice. second ed. Oxford University Press, Victoria, Australia.
Khan, K., Ramachandran, S., 2012. Conceptual framework for performance assessment: competency, competence and performance in the context of assessments in healthcare — deciphering the terminology. Med. Teach. 34, 920–928.
Laibhen-Parkes, N., 2014. Evidence-based practice competence: a concept analysis. Int. J. Nurs. Knowl. 25, 173–182.
Leung, K., Trevena, L., Waters, D., 2014. Systematic review of instruments for measuring nurses' knowledge, skills and attitudes for evidence-based practice. J. Adv. Nurs. 70, 2181–2195.
Melnyk, B.M., Fineout-Overholt, E., 2011. Evidence-based Practice in Nursing & Healthcare: A Guide to Best Practice. second ed. Lippincott Williams & Wilkins, Philadelphia.
Melnyk, B.M., Gallagher-Ford, L., Long, L.E., Fineout-Overholt, E., 2014. The establishment of evidence-based practice competencies for practicing registered nurses and advanced practice nurses in real-world clinical settings: proficiencies to improve healthcare quality, reliability, patient outcomes, and costs. Worldviews Evid.-Based Nurs. 11 (1), 5–15.
Messick, S., 1984. The psychology of educational measurement. J. Educ. Meas. 21, 215–237.
Neufeld, V., Norman, G., 1985. Assessing Clinical Competence. Springer Publishing Co., New York.
NHMRC, 2000. How to Use the Evidence: Assessment and Application of Scientific Evidence. National Health and Medical Research Council, Canberra, Australia.
Pearson, A., 2002. Nursing takes the lead: redefining what counts as evidence in Australian health care. Reflect. Nurs. Leadersh. 28 (4), 18–21.
Pearson, A., 2004. Balancing the evidence: incorporating the synthesis of qualitative data into systematic reviews. JBI Reports 2 (2), 45–64.
Pearson, A., Wiechula, R., Court, A., Lockwood, C., 2007. A re-consideration of what constitutes "evidence" in the healthcare professions. Nurs. Sci. Q. 20 (1), 85–88.
Pijl-Zieber, E.M., Barton, S., Konkin, J., Awosoga, O., Caine, V., 2014. Competence and competency-based nursing education: finding our way through the issues. Nurse Educ. Today 34, 676–678.

Pimlott, N., 2011. Competency-based education. Can. Fam. Physician 57, 981.
Raines, D., 2010. Nursing practice competency of accelerated bachelor of science in nursing program students. J. Prof. Nurs. 26, 162–167.
Ramos, K.D., Schafer, S., Tracz, S.M., 2003. Validation of the Fresno test of competence in evidence based medicine. Br. Med. J. 326, 319.
Rebholz, M., 2006. A review of methods to assess competency. J. Nurses Staff Dev. 22, 241–245.
Reeves, S., Fox, A., Hodges, B., 2009. The competency movement in the health professions: ensuring consistent standards or reproducing conventional domains of practice? Adv. Health Sci. Educ. 14, 451–453.
Ryan, S., Stevenson, K., Hassell, A.B., 2007. Assessment of clinical nurse specialists in rheumatology using an OSCE. Musculoskeletal Care 5, 119–129.
Scott, B., 1982. Competency based learning: a literature review. Int. J. Nurs. Stud. 19, 119–124.
Simpson, D., Gehl, S., Helm, R., Kerwin, D., Drewniak, T., Bragg, D.S.A., Ziebert, M.M., Denson, S., Brown, D., Heffron, M.G., Mitchell, J., Harsch, H.H., Havas, N., Duthie, E., Denson, K., 2006. Objective Structured Video Examinations (OSVEs) for geriatrics education. Gerontol. Geriatr. Educ. 26, 7–24.
Stevens, K.R., 2013. The impact of evidence-based practice in nursing and the next big ideas. Online J. Issues Nurs. 18, 122–124.
Straus, S.E., Glasziou, P., Richardson, W.S., Haynes, R.B., 2011. Evidence-based Medicine: How to Practice and Teach It. fourth ed. Churchill Livingstone Elsevier, Edinburgh.
Taber, S., Frank, J.R., Harris, K.A., Glasgow, N.J., Iobst, W., Talbot, M., 2010. Identifying the policy implications of competency-based education. Med. Teach. 32, 687–691.
Tanner, C., 2001. Competency-based education: the new panacea? J. Nurs. Educ. 40, 387–388.
ten Cate, T.J.O., Snell, L., Carraccio, C., 2010. Medical competence: the interplay between individual ability and the health care environment. Med. Teach. 32, 669–675.
Tilley, D., 2008. Competency in nursing: a concept analysis. J. Contin. Educ. Nurs. 39, 58–64.
Tilson, J.K., Kaplan, S.L., Harris, J.L., Hutchinson, A., Ilic, D., Niederman, R., et al., 2011. Sicily statement on classification and development of evidence-based practice learning assessment tools. BMC Med. Educ. 11 (78), 1–10.
Whelan, L., 2006. Competency assessment of nursing staff. Orthop. Nurs. 25, 198–204.
Windsor, C., Douglas, C., Harvey, T., 2012. Nursing and competencies — a natural fit: the politics of skill/competency formation in nursing. Nurs. Inq. 19, 213–222.
