

Nursing students’ views of clinical competence assessment

Carmel Bradshaw, Maureen O’Connor, Geraldine Egan, Katie Tierney, Mary Pat Butler, Anne Fahy, Dympna Tuohy, Irene Cassidy, Bernie Quillinan, Mary C McNamara


In 2002, a four-year BSc nursing programme was introduced in Ireland, which resulted in undergraduate student nurses and programmes becoming fully integrated into universities and third-level institutes for the theoretical component of their programme. Criteria for programme approval included a stipulation that assessment of students’ clinical practice be undertaken using a competency assessment model (An Bord Altranais, 2000). The introduction of a competency assessment framework with the BSc programme heralded a significant change in nurse education, with assigned staff nurses assuming the role of preceptor and assessor of student nurses in practice.

An Bord Altranais (2000) expanded on the domains of competency, using performance criteria (Table 1), and recommended that these be further developed locally in the form of critical elements. A critical element is defined by An Bord Altranais (2003) as a set of single, discrete observable behaviours that are mandatory for the designated skill at the target level of practice. Students are expected to demonstrate competence in each critical element before the preceptor will deem them to have passed the competency.

A competency assessment tool covering 17 competencies to be completed over a four-year period was developed by the University of Limerick in collaboration with local health service providers. A minimum of two weeks’ clinical placement is required to complete each competency. Student assessment includes a preliminary, intermediate and final interview with a preceptor. For continuity, the preceptor usually carries out all three interviews.

The theoretical framework underpinning this assessment approach is the competency outcomes and performance assessment (COPA) model (Lenburg, 1999). This model provides an organising framework for the development of core competencies and competency outcome statements, referred to in the competency documentation as critical elements, which define the expected competence.
Carmel Bradshaw, Maureen O’Connor, Mary Pat Butler, Anne Fahy, Dympna Tuohy, Irene Cassidy and Bernie Quillinan are lecturers, all at the Nursing and Midwifery Department, University of Limerick. Geraldine Egan is Clinical Placement Coordinator, Nurse Practice Development Team; Mary C McNamara is Specialist Coordinator, Centre of Nurse & Midwifery Education, HSE West (LCNTR); and Katie Tierney is Clinical Educator, Theatre Department; all at the Midwestern Regional Hospital, Dooradoyle, Limerick. Accepted for publication: July 2012

British Journal of Nursing, 2012, Vol 21, No 15


This paper reports on some outcomes of a research study evaluating a new assessment framework of clinical competence used in undergraduate nursing programmes in the Mid-West region of Ireland. First, this paper presents both the strengths and weaknesses of the present model, as articulated by student nurses. Second, it generates a broader critical debate around the concept of competency assessment. The model of competence in question was developed by the Irish Nursing Board (An Bord Altranais) and then elaborated on by the University of Limerick in partnership with local health service providers in 2002. The methodology involved a triangulated approach, comprising a series of focus group interviews with students (n=13) and preceptors (n=16), followed by a survey of students (n=232) and preceptors (n=837). Findings from the student focus groups are reported here. Themes identified using Burnard’s (1991) framework for analysis are: preparation for competency assessment; competency documentation; supporting assessment in practice; organisational and resource factors; and the competency assessment structure and process. Results from this research have implications for the refinement and revision of the present competency assessment framework, for student and staff preparation, and for collaboration between stakeholders.

Key words: Clinical competency assessment ■ Assessment framework ■ Student perceptions ■ Student experience

A research study was undertaken to evaluate the structure, content and process of the locally developed competence assessment approach in the preregistration BSc general, mental health and intellectual disability nursing programmes in the Mid-West region of Ireland. The two-phase study elicited students’ and preceptors’ views on this new approach (Butler et al, 2009). This paper reports on the qualitative findings from the student focus groups.

Literature review

Developing a competence assessment tool
Given that competence in nursing is inextricably linked to safe and effective practice, it is vital to consider its meaning when considering the effectiveness of competency assessment models. Competence has been defined both as the ability to perform nursing tasks and as a psychological construct, concerned with evaluating students’ ability to integrate cognitive, affective and psychomotor skills when delivering nursing care (Girot,


British Journal of Nursing. Downloaded June 9, 2016. For personal use only. No other uses without permission. All rights reserved.

Table 1. Domains of competency (domains and performance criteria)

Professional ethical practice
■ Practises in accordance with legislation affecting nursing practice
■ Practises within the limits of own competence and takes measures to develop own competence

Holistic approaches to care and the integration of knowledge
■ Conducts a systematic holistic assessment of client needs based on nursing theory and evidence-based practice
■ Plans care in consultation with the client, taking into consideration the therapeutic regimes of the healthcare team
■ Implements planned nursing care/interventions to achieve the identified outcomes
■ Evaluates client progress towards expected outcomes and reviews plans in accordance with evaluation data and in consultation with the client

Interpersonal relationships
■ Establishes and maintains caring therapeutic interpersonal relationships with individuals/clients/groups/communities
■ Collaborates with all members of the healthcare team and documents relevant information

Organisation and management of care
■ Effectively manages the nursing care of clients/groups/communities
■ Delegates to other nurses activities commensurate with their competence and within their scope of professional practice
■ Facilitates the coordination of care

Personal and professional development
■ Acts to enhance the personal and professional development of self and others

Source: An Bord Altranais (2000)

1993). Competency is the knowledge, skills and professional values that underpin competence; it includes the observable performance behaviour referred to as critical elements in the competency documentation used in our organisation. The assessment of competence therefore involves not only observable behaviours but also unobservable attributes, such as the values and judgment ability that reflect an individual’s capability. According to Benner (1984), competence is apparent when a nurse develops the ability to cope with and manage the many eventualities in the real world of nursing. All these factors need to be taken into consideration when developing a competency assessment tool.

There are three main approaches to conceptualising competence: the behavioural approach; the generic approach; and a third approach, which defines competence as a holistic, integrated concept. The holistic approach is considered more valid than the others because it enables a person’s capacity to integrate knowledge, skills and attitudes in a variety of clinical situations to be assessed (Gonczi, 1994), and so is the approach taken in the documentation examined in this study.

Challenges of competency assessment
Some of the difficulties of competence assessment are highlighted by Dolan’s (2003) study, which suggested that students were overly focused on getting competencies signed off rather than appreciating a range of learning experiences. Students reported that the competency assessment system did not encourage fundamental skill development. Inconsistencies in interpreting competency statements and variations in the amount of supporting evidence required of students were common. Students also had insufficient evidence to show they were competent in some basic skills.

In Ireland, evaluation of competency assessment in clinical practice settings is at an early stage but has been reported on by Hanley and Higgins (2005), McCarthy and Murphy (2008) and O’Connor et al (2009). In the last of these studies, three universities collaboratively developed and evaluated a competency assessment tool for use in shared specialist practice areas (O’Connor et al, 2009). Results indicated that both students and preceptors were broadly satisfied with the structure and usability of the tool. However, they were less satisfied with the preparation received for using it. Preceptors said students were unfamiliar with the assessment format and what was expected of them. Students had difficulty arranging meetings with preceptors, and wanted more specific domains and learning outcomes to be introduced.

In summary, the literature suggests that a competency approach to assessment has the potential to enable the development of critical, analytical, problem-solving and decision-making skills (Burke and Harris, 2000). However, in Ireland, concerns have been identified about a lack of understanding of the competency assessment process (McCarthy and Murphy, 2008) and difficulty interpreting the language used in the competency assessment tool (Hanley and Higgins, 2005); in addition, the process is often regarded as time consuming, with insufficient preparation and support provided (O’Connor et al, 2009).
This study is significant in that it explores both preceptors’ and students’ perspectives on the content and application of a competency assessment tool used in the Mid-West region of Ireland (Butler et al, 2009). This paper reports on one part of the study: the experiences of student nurses using the competency assessment framework in practice.

Methodology

Research design
This study was undertaken in two phases, incorporating qualitative and quantitative methodologies. In the first phase, four focus group discussions were conducted to establish students’ and preceptors’ experience and views of the competency assessment process and the assessment document. Findings from the two student focus groups are reported here. The second phase consisted of a survey using questionnaires developed from the focus group data. Students (n=232) and preceptors (n=837) were surveyed, yielding 202 and 255 responses respectively. The findings of this phase are reported by Butler et al (2009). Ethical approval for the study was granted by the university and three hospital/healthcare research ethics committees.

Focus groups
Focus groups are considered an effective means of exploring a range of ideas and perspectives on new phenomena (Joyce, 2008), so were appropriate in the context of this study. In addition, focus groups can be valuable in developing and enhancing the reliability of questionnaires, particularly when the concept under discussion is new (Polit and Hungler, 1999; Parahoo, 2006). Data derived from the focus groups were used to generate the questionnaire for the second phase of the study.

To recruit students for the focus groups, posters advertising the study were displayed in the university. Information about the study and a consent form were distributed to students. There was a concern that students may have perceived pressure to participate in the study, as Ferguson et al (2006) outlined in their discussion on involving students in research. To address this, members of the research team distributed the information and consent forms in lectures other than their own and left the lecture theatres once they had provided the information.

Two focus group interviews with students (n=13) were conducted between April and June 2006. Students in the first and second years of the preregistration BSc nursing programmes were invited to participate, both groups having had experience of the competency assessment process. The focus groups were tape recorded with the students’ consent. Each session lasted 60–90 minutes and was facilitated by two members of the research team. To focus the discussion, students were asked about their experiences of being assessed using the competency documentation in relation to preparation, the assessment process and support mechanisms. The facilitators also took notes during the focus groups.

Data analysis
Verbatim transcripts of the focus group interviews were analysed using thematic content analysis as outlined by Burnard (1991). Four researchers each independently and critically reviewed all categories and higher-order categories derived from the student data. Subsequent discussions led to consensus on five themes, each with a number of subthemes emerging from the data.

Findings and discussion
The themes and subthemes identified are detailed in Table 2. The findings from the student focus groups include an emphasis on the need to prepare both students and preceptors for the competency assessment process. Inconsistencies in how preceptors carried out the competency assessment process were noted, and the significance of the support available was highlighted by the students. The competing demands of clinical practice and their impact on the competency assessment process were raised, as was the complexity of the language used in the documentation. A perception by the students of the competency assessment as a one-off assessment was also noted, with students saying there was an overemphasis on theory at the expense of developing clinical skills.

Preparation for competency assessment
Some shortfalls in relation to preparation for the competency assessment were noted. The timing of the initial workshop needs to be considered: students attended this workshop before their first clinical placement, and it may be difficult for students to engage in a meaningful way with something they have not yet experienced.


Table 2. Student focus group findings (themes and subthemes)

Preparation for competency assessment
■ Theoretical and practical preparation
■ Preceptor preparation

Competency documentation
■ Language used in the competency document
■ Layout and user friendliness of the competency document

Supporting assessment in practice
■ Support from clinical placement coordinators
■ Support from preceptors
■ Support from staff nurses and other students

Organisational and resource factors
■ Competing demands of ward work and assessing competencies
■ Availability of clinical placement coordinators

Competency assessment structure and process
■ Structure of the competency document
■ Grading competencies
■ Competency assessment process
■ Student-centred focus


One student described the initial workshop as:

‘Very concise … but [I] came out of it none the wiser.’

Another student noted that it was:

‘very squashed in. You got a lot of information and books and stuff. You get it all at once so it’s a bit daunting.’

It could be argued that students have enough to cope with on their first placement without the additional pressure of a summative assessment. One student suggested there should be:

‘… a trial run first, and come back on the second placement and complete competency.’

However, omitting a summative clinical assessment in the first placement may not be an option, given stipulations from An Bord Altranais (2000). These findings are supported by O’Connor et al (2009), in that preceptors considered students were unfamiliar with the assessment format and what was expected of them, despite formal preparation for it. A single workshop was not considered adequate preparation by the students. Given the increasing complexity of the competencies required as students progress through the programme, it would appear wise to provide refresher workshops.

Deficiencies in preceptor preparation were suggested by some students:

‘They don’t know themselves what they are meant to be getting from us, so they don’t know if we are right or wrong.’

This comment is echoed by Watson (1999) and, similarly, by Calman et al’s (2002) study, which indicated that assessors did not understand the documentation even though preparation had been provided. In response to this finding, competency workshops are now scheduled on an annual basis for each student cohort, and more preceptor sessions (which include competency workshops) and refresher sessions are being run.

Competency documentation, assessment structure and process
The complexity of the language used in the documentation was clearly off-putting for many students. Recurring themes of ‘off-putting phraseology’ and ‘confusing terminology’ were noted in the transcripts. The language used may reflect increasing academia in the nursing profession, but documentation used to underpin clinical assessment must reflect that nursing is a practice-based profession. Hanley and Higgins (2005) similarly found the language of the competency assessment tool used in their study vague and difficult to understand.

The layout of the documentation must also reflect the practice-based nature of the nursing profession. Students perceived the competency document as:

‘Not user friendly, and the layout could be better, [it’s] all bunched up together.’

Sufficient space must be given to identifying the individual student’s needs in practice (including clinical skills), rather than the provision of written supporting evidence, which was perceived by some as making the assessment academic in focus. There have been changes to the documentation in response to these findings, and the language used is under review.

Inconsistency among preceptors in relation to the process was also noted by the students in this study, a finding echoed in research by Dolan (2003) and Hanley and Higgins (2005). This lack of consistency was perceived to have an adverse effect on the students’ learning and experience:

‘Some of them [preceptors] expect loads off you and are really hard on you and others do not.’

The students’ perception of the competency assessment process as a one-off assessment is noteworthy, given that the assessment is designed to be a continuous assessment of practice. One student said:

‘I think it should be ongoing …. I don’t think it should be just one assessment.’

The process lends itself to continuous assessment, as the student and preceptor are expected to undertake and document initial, intermediate and final interviews before the competency is completed. Revised guidelines for the competency assessment process now promote the interviews as opportunities to identify learning needs and provide feedback to students. This should help clarify the continuous nature of the assessment.

It appeared from the findings that competencies were being considered in isolation from students’ overall practice. The number of competencies to be completed in a set time frame was also raised:

‘They can only fail you on your competencies … so, if you have them done … you can do nothing for the next three weeks.’

The student-centred nature of the model was seen as very positive. Students stressed that the model fosters student responsibility and promotes the concept of adult learning.
One student commented:

‘I think they are very good, because you can take control of your own learning experience with a competency.’

The strengths of the model were well summarised by another student:

‘Sometimes people think that they can only learn about competencies in the classroom …. I mean, learning does not stop …. You learn from the patient, you learn from everybody, and student nurses, your own peers as well.’

Supporting assessment in practice
This study indicates that there are many sources of support available throughout the competency assessment process, such as clinical placement coordinators, preceptors, staff nurses, senior students, and the policies, procedures and reading material in the clinical placement areas. Given the value placed on the support provided by the clinical placement coordinators in particular, a case can be made for their greater availability in the practice area.



One student’s comments indicated that the clinical placement coordinators’ support was equally valued by the preceptors:

‘My preceptor was not overly confident in saying this is how we are going to do it and we were waiting for the CPC [clinical placement coordinator] to come along and say which way was right.’

Students noted that staffing levels on clinical sites and the complexities of practice areas adversely affected the completion of their competencies:

‘I felt it was very awkward, asking a nurse to help me do my competencies, because they were so busy.’

Gleeson (2008) notes that the preceptor’s first priority is patient care, but they also have a responsibility to teach the student how to deliver care safely and competently within their scope of practice. Increased provision of, and access to, ongoing education and support for preceptors may assist in integrating and enhancing the assessment process in clinical practice.

Key points
■ A competency approach to assessment can enable the development of critical, analytical, problem-solving and decision-making skills
■ Competency assessment is a relatively new approach within nurse education in Ireland
■ Students were asked about their experiences of being assessed in relation to preparation, the assessment process and support mechanisms
■ Both students and preceptors need thorough preparation for assessments
■ Language used in the documentation should be straightforward to use in a busy clinical environment
■ Staffing levels in clinical areas and the complexities of practice areas adversely affected the completion of students’ competencies
■ The student-centred nature of the model was praised for fostering responsibility and promoting adult learning
■ A national competence assessment strategy that reflects the practice-based nature of nursing could improve consistency. Staff new to a hospital/unit would be familiar with the process if they had previously worked in Ireland

Conclusion and recommendations
Competency assessment is a relatively new approach within nurse education in Ireland and presents unique challenges to educators and practitioners alike. This paper has reported on and discussed students’ perspectives in relation to the competency assessment process in one geographical area in Ireland. This study has confirmed that the experiences of these students are similar to those reported in the international nursing literature. Some suggestions have been made in relation to addressing the students’ concerns, and some changes to the competency assessment process have been made locally in light of these findings. It is recommended that these changes be evaluated and their impact on the competency assessment process determined.

From a broader perspective, continued collaboration is recommended between all stakeholders to develop a more consistent approach to the competence assessment methods used in clinical practice that takes a holistic approach (Butler et al, 2009). This study would also support the development of a national competence assessment strategy, as recommended by previous authors (Norman et al, 2002; O’Connor et al, 2009), which reflects the practice-based nature of nursing, with documentation that is straightforward to use in a busy clinical environment. Such a development might help offset some of the challenges associated with the competency assessment process. It may enhance consistency in preceptors’ approach to the process and ensure that staff who are new to a hospital/unit would be familiar with the documentation if they had previously worked in Ireland.

The authors would like to thank the BSc nursing students who participated in this research. No conflict of interest has been noted by the authors.

References
An Bord Altranais (2000) Requirements and Standards for Nurse Registration Education Programmes. 2nd edn. An Bord Altranais, Dublin
An Bord Altranais (2003) Competence Assessment Tool for Nurses Educated and Trained Overseas in non-EU Countries. An Bord Altranais, Dublin


Benner P (1984) From Novice to Expert: Excellence and Power in Clinical Nursing Practice. Addison-Wesley, Menlo Park, California
Burke LM, Harris D (2000) Education purchasers’ views of nursing as an all graduate profession. Nurse Educ Today 20(8): 620–8
Burnard P (1991) A method of analysing interview transcripts in qualitative research. Nurse Educ Today 11(6): 461–6
Butler MP, Fahy A, Cassidy I et al (2009) An Evaluation of Clinical Competence Assessment in BSc Nursing Registration Education Programmes. University of Limerick, Ireland
Calman L, Watson R, Norman I, Redfern S, Murrells T (2002) Assessing practice of student nurses: methods, preparation of assessors and student views. J Adv Nurs 38(5): 516–23
Dolan G (2003) Assessing student nurse clinical competency: will we ever get it right? J Clin Nurs 12(1): 132–41
Ferguson LM, Myrick F, Yonge O (2006) Ethically involving students in faculty research. Nurse Educ Today 26(8): 705–11
Girot EA (1993) Assessment of competence in clinical practice—a review of the literature. Nurse Educ Today 13(2): 83–90
Gleeson M (2008) Preceptorship: facilitating student nurse education in the Republic of Ireland. Br J Nurs 17(6): 376–80
Gonczi A (1994) Competency based assessment in the professions in Australia. Assessment in Education: Principles, Policy & Practice 1(1): 27–44
Hanley E, Higgins A (2005) Assessment of practice in intensive care: students’ perceptions of a clinical competence assessment tool. Intensive Crit Care Nurs 21(5): 276–83
Joyce P (2008) Focus groups. In: Watson R, McKenna H, Cowman S, Keady J, eds. Nursing Research Designs and Methods. Churchill Livingstone Elsevier, Edinburgh
Lenburg C (1999) The framework, concepts and methods of the competency outcomes and performance assessment (COPA) model. The Online Journal of Issues in Nursing 2(4): ANAMarketplace/ANAPeriodicals/OJIN/TableofContents/Volume41999/No2Sep1999/COPAModel.html (accessed 2 August 2012)
McCarthy B, Murphy S (2008) Assessing undergraduate nursing students in clinical practice: do preceptors use assessment strategies? Nurse Educ Today 28(3): 301–13
Norman IJ, Watson R, Murrells T, Calman L, Redfern S (2002) The validity and reliability of methods to assess the competence to practise of pre-registration nursing and midwifery students. Int J Nurs Stud 39(2): 133–45
O’Connor T, Fealy GM, Kelly M, McGuinness AM, Timmins F (2009) An evaluation of a collaborative approach to the assessment of competence among nursing students of three universities in Ireland. Nurse Educ Today 29(5): 493–9
Parahoo K (2006) Nursing Research: Principles, Process and Issues. 2nd edn. Palgrave, Basingstoke
Polit DF, Hungler BP (1999) Nursing Research: Principles and Methods. 6th edn. Lippincott, Philadelphia
Watson NA (1999) Mentoring today—the students’ views. An investigative case study of pre-registration nursing students’ experiences and perceptions of mentoring in one theory/practice module of the Common Foundation Programme on a Project 2000 course. J Adv Nurs 29(1): 254–62


