Developing and evaluating an interactive information skills tutorial1

Maria J. Grant & Alison J. Brettle, Salford Centre for Nursing, Midwifery and Collaborative Research, Institute for Health and Social Care Research, University of Salford, Salford, UK

Abstract

Objective: To develop and evaluate a web-based interactive information skills tutorial integrated into the curriculum; to determine whether the tutorial was acceptable to students; and to explore the use of a skills assessment tool in identifying whether the tutorial improved skills.

Methods: The development of a tutorial on OVID MEDLINE to teach transferable information skills, and a small cohort study to evaluate students' views on the tutorial and its effects on information skills.

Results: Thirteen objective assessments were usable. There was a statistically significant improvement in mean final assessment scores compared with mean pre-training scores, F(2,14) = 11.493, P = 0.001. Eleven (85%) students had improved their overall information skills. The improvement in overall searching skills was enhanced by referral to the tutorial.

Conclusions: The tutorial was successfully developed and integrated into a Masters programme curriculum. In this setting, it appears to reinforce active learning, and was well received by students, who developed core generic searching skills and demonstrated improved information skills in the short and longer term. Students could use the tutorial for revision and study at a time and place of their choosing. Further evaluation is required to assess the impact of using the tutorial with large groups of students, and as a stand-alone teaching medium.

Correspondence: Maria J. Grant, Salford Centre for Nursing, Midwifery and Collaborative Research, Institute for Health and Social Care Research, University of Salford, Allerton Building, Salford M6 6PU, UK. E-mail: [email protected]

1 Based on a presentation given at the Umbrella 2003 Conference: Information in Action, UMIST, Manchester, 4th July 2003.

© Health Libraries Group 2006 Health Information and Libraries Journal, 23, pp.79–86

Introduction

Effective information skills (IS) are essential for healthcare students if they are to complete their courses successfully and become qualified evidence-based practitioners. Health librarians are well placed to deliver IS education and there are many examples, within the library literature, of papers describing the development and introduction of such courses. However, whilst there is evidence to demonstrate that students find these courses useful, there is limited evidence to demonstrate that teaching students searching improves their skills and ability to find information for patient care.1,2 Problems facing librarians delivering IS education, and shared by the authors, include: teaching large student cohorts; a wide range of resources and skills to be taught; choice of teaching methods; and limited time allocated to teaching IS within the curriculum. At the time the project was conceived, there was little integration of IS education into the curriculum within the Faculty. Informal remarks from previous students also highlighted that, whilst they were confident about their new skills on completing training sessions, this confidence disappeared when they sought to apply these skills independently at a later date.

This paper describes a project to develop and evaluate a web-based interactive tutorial that aimed to overcome some of the problems outlined above. The tutorial was developed as part of the Evidence-Based Practice (EBP) module of the MRes (Masters in Research) in Health and Social Care, University of Salford in the UK, and was evaluated in February 2002 when the module first became available.

Objectives

The objectives were: to develop and evaluate an interactive web-based tutorial to teach IS to health students; to determine whether the tutorial was acceptable to the students; and to explore the use of a skills assessment tool in identifying whether, as anticipated, the tutorial improved students' IS.

Literature review

The vast number of papers published on IS was acknowledged in a recent UK report of current IS or literacy training practices.3 It concluded that successful IS programmes should adopt a collaborative approach involving library, computing and academic staff, and should be integrated into the curriculum rather than taught as a separate entity. The literature highlights various approaches to integrating IS into the curriculum and how librarians collaborate with academic staff. However, it is frequently descriptive and rarely evaluates whether skills have changed following training. For example, a hard-copy workbook was developed for student nurses in Newcastle in the late 1980s.4 This aimed to provide a structured and supportive framework to enable nurses to acquire and develop knowledge and skills in information retrieval, analysis and evaluation. Results showed that staff and students had a strong preference for this type of learning and the package was considered a success. Burrows et al.5 describe a strategy for integrating IS into the curriculum for medical students over a 2-year period.
Library and computer skills were taught and the strategy was considered effective in enabling students to use IS as a problem-solving technique. Schilling et al.6 describe how library staff were involved in the planning of a problem-based learning (PBL) course by serving on medical school curriculum committees. During course delivery, library staff were assigned to each PBL group and guided students in the use of information resources as questions arose. Francis et al.7 describe a multi-level library instruction programme for undergraduate and graduate nursing students at the University of Florida that is tailored to the needs of all incoming students. Fox et al.8 describe a programme at the University of Colorado that was integrated into two required modules designed to foster information-searching skills. It is taught by the clinical librarian, and assignments and exams are used to assess whether students have mastered the techniques. Although all these programmes are integrated into the curriculum, many are focused around the resources of particular libraries rather than teaching generic skills. An exception is Dorsch et al.,9 who taught transferable skills to medical students in a 10-week PBL critical appraisal skills course co-facilitated by the Library and the Department of Medicine. It was well received by students. None of these studies relate to web-based tutorials, nor do they address issues relating to lack of time for teaching or practising skills at a later date.

A number of web-based tutorials are described in the literature. Librarians in Indiana10 describe a tiered approach to building student research skills year by year. The faculty librarian collaborated on the planning and delivery of the web-based training, and students responded positively to the tutorials. However, assessment of the learning outcomes had not been completed, providing no information on whether the tutorials improved IS. The INHALE project11 aimed to develop web-based IS packages, integrate them into the Virtual Learning Environment (VLE) of the University of Huddersfield, and test their impact on skills. Only subjective assessments were made.
Rosenberg et al.12 evaluated changes in students' skills objectively post-training, and showed improvements, but not in relation to web-based learning. Project Apple in the West Midlands, UK,13 evaluated users' views of a web-based IS training package, but did not test their skills. As a delivery method, students liked the flexibility of studying via a website and learning at a time that suited them.


In a review of the literature on evaluating web-based IS packages, Bracke et al.14 report that studies have used student and Faculty feedback as evaluation mechanisms and shown them to be positive. However, they state that this type of evaluation is inadequate in indicating whether students can apply what they have learned. This concurs with Brettle,1 who argues that more objective methods are needed to demonstrate the effectiveness of training. Bracke et al.14 highlight a number of non-health-related studies that have used pre- and post-skills tests and show little difference between tutorials delivered via the Web and via workbooks. They also suggest 'the validity and ongoing effectiveness of web-based instruction remains an area to be explored'. This may be partly answered by an ongoing systematic review of effective methods for teaching information literacy skills to undergraduate students.15 Initial analysis suggests that web-based packages may be as effective as traditional methods of teaching IS.

Methods

An IS tutorial was developed and piloted. A small cohort study using pre- and post-session testing was used to evaluate the impact of the tutorial on skills and the users' subjective views of the tutorial.

Developing the tutorial

Before developing the tutorial, the authors evaluated existing web-based products16–18 using a set of evaluation criteria (Fig. 1), but all were found lacking. It was therefore decided to develop a MEDLINE tutorial, a resource selected for its relevance to a wide range of prospective students, utilizing OVID software that was suitable for demonstrating key skills such as thesaurus searching and combining terms. The tutorial was mounted on the Web to enable students to visit the tutorial at a time and place to suit them.

The tutorial was developed by dividing the authors' traditional theory session into manageable chunks, actively incorporating questions that normally arose in the sessions into the text, and ensuring clear, user-friendly language was used. These were translated into linked web pages scripted in hypertext mark-up language (HTML). The tutorial begins with the rationale for a literature search and how a database works, then seven search steps covering: clarifying a search question, breaking down the question, MeSH, free-text searching, Boolean operators, refining the search and final tips. At certain points in the tutorial, students are guided to open OVID MEDLINE in another browser window to enable them to complete tasks. Multiple-choice quizzes with feedback at the end of each step allow students to check their learning. A uniform format and a standard example are used throughout to demonstrate how to 'build' a strategy. Once the example has been completed, students are guided through a search of their own topic area.

Piloting the tutorial

The tutorial was piloted on seven health professionals undertaking a focused research course. Students received a theory session and used the tutorial for the practical session. Students were asked to complete a formative feedback questionnaire. Responses were positive but indicated the need for a longer session for both theory and practice. Skills were tested pre- and post-training using a skills assessment tool.12 The pilot enabled the authors to test the tool and ensure scores were applied consistently. Practical issues such as broken HTML links and problems obtaining test results were also highlighted and corrected.

Using the tutorial on the evidence-based practice module

Figure 1 Authors' criteria for an interactive tutorial

The 12-week module provides a general introduction to EBP including IS, research design and critical appraisal. Two sessions (weeks 2 and 3) were allocated for IS development. Each session was facilitated by an information specialist and support tutor, and lasted approximately 3 h.
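The strategy-'building' approach the tutorial and sessions teach (OR-ing synonym lines within a concept, then AND-ing the concept sets together) can be sketched in miniature. The code below is purely illustrative: the toy record set and helper names are the authors of this sketch's assumptions, not the tutorial's actual implementation or OVID's behaviour.

```python
# A toy "database": record IDs mapped to the terms each record contains.
records = {
    1: {"smoking", "cessation", "nursing"},
    2: {"smoking", "rehabilitation"},
    3: {"tobacco", "cessation", "nursing"},
    4: {"stroke", "rehabilitation"},
}

def search(term):
    """One search 'line': the set of record IDs containing the term."""
    return {rid for rid, terms in records.items() if term in terms}

def OR(*sets):
    """Combine synonym lines within a concept (union)."""
    out = set()
    for s in sets:
        out |= s
    return out

def AND(*sets):
    """Combine the concept sets (intersection)."""
    out = sets[0].copy()
    for s in sets[1:]:
        out &= s
    return out

# Concept 1: smoking (free-text term OR an alternative term)
concept_smoking = OR(search("smoking"), search("tobacco"))
# Concept 2: cessation
concept_cessation = search("cessation")
# Combine concepts, then refine with a third term
result = AND(concept_smoking, concept_cessation, search("nursing"))
print(sorted(result))
```

OR-ing whole search lines, rather than typing a stream of terms, is the usage the assessment tool later rewards, because it shows the searcher understands how the sets combine.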


A multi-professional mix of M-level and PhD students (nurses, occupational therapists and physiotherapists) and staff attended the course. The first session began with a lecture introducing students (n = 21) to IS theory (formulating a search question; selecting search terms; building up a search strategy; limiting searches), including an online demonstration. This was complemented with small group discussion and feedback on alternative information sources, and guided hands-on practice using the tutorial (http://www.fhsc.salford.ac.uk/hcprdu/litsearching.htm). Students were asked to complete 'between-session' exercises to encourage continued contemplation of the issues raised. These formed the initial group work of the second session.

The second session, attended by 13 students, sought to consolidate learning through small group work to address areas of confusion or ambiguity. Students were also given the opportunity to complete or revisit the tutorial, to repeat the exercises on alternative databases (possibly using a different search interface, e.g. SilverPlatter/WebSpirs/Dialog), and/or undertake their own searches in connection with the module assignment. Students could access the tutorial at any time between the first session and the completion of their end-of-module assignment. Additionally, they could request feedback on their searches and obtain general advice on the development of searches prepared as part of their course assessment.

Evaluation

The sessions were objectively and subjectively evaluated to determine the effects of the tutorial on students' IS and their views of the tutorial.

Objective evaluation

To gain a baseline of skill levels, students were asked to undertake a literature search at the beginning of the first session in one of two areas: the effectiveness of nursing interventions for smoking cessation, or the effectiveness of rehabilitation after stroke.
These subjects were identified by the authors as being of potential interest to a varied group of practitioners and required a range of search skills to retrieve relevant literature. Students were asked to copy the most useful reference from the search, and their search strategy, onto a floppy disk and return it to the authors. Students completed a further search at the end of the second session to assess the short-term impact of the teaching sessions. Longer-term impact was assessed via the end-of-module assignment, which involved undertaking a systematic literature search on a topic of choice, describing the literature search process and providing search strategies, then selecting and critically appraising two papers. The IS component of the assignment was worth 30% and was marked by one of the authors (MJG).

Search results were scored using an assessment tool (see online Appendix 1; modified from Rosenberg et al.12). The tool comprised a skills checklist covering features such as Boolean operators, use of MeSH/indexing terms, application of limits, and whether a manageable and relevant number of references were retrieved. The tool is scored on a scale of 1–16, with a point for each feature used. The tool was modified from the original—designed for use with SilverPlatter software—by removing some items and adjusting the scoring to relate to the sessions' learning objectives. The pilot had highlighted some issues relating to scoring when using the tool with OVID rather than SilverPlatter. In OVID MEDLINE, users are prompted to use MeSH subject headings so, in theory, students were able to gain points without understanding the technique. This was overcome by allocating scores only for appropriate use of techniques, e.g. if the OR Boolean operator had been used to combine individual lines of a search rather than a stream of terms; the latter form of 'OR' usage suggests that the user has simply followed OVID prompts.

Subjective evaluation

A questionnaire, previously piloted on a wide range of similar sessions, was administered at the end of the second session (see online Appendix 2). It explored the students' views of the sessions, together with their perception of levels of learning and how their knowledge, skills and confidence in searching had changed. This was recorded on a five-point Likert-type scale. Students were also asked to list three likes and dislikes about the tutorial.

Results

Thirteen objective assessments were usable (that is, students completed and returned the pre-session evaluation, including their search strategy and their search results, and at least one of either the post-session search or the course assignment). The following results are based on these 13 assessments.

All 13 students had enhanced their core IS techniques (Table 1). Prior to the sessions, seven students (54%) demonstrated an understanding of one basic search technique, e.g. MeSH searching or use of Boolean operators. Two students (15%) had grasped both of these basic search techniques. A further two (15%) demonstrated an ability to develop a systematic approach to searches, including the use of more advanced search techniques, e.g. text-word searching and appropriate use of the limit function. Following the two sessions, the number of students who could use basic search techniques correctly and in a systematic way had tripled (n = 6; 46%). Following the assessment of the end-of-module assignment, this figure had increased to 54% (n = 7).

Table 1 Number of Masters in Research (MRes) students demonstrating core information skill techniques

                                                      Pre-training   Post-training   Post-assessment
MeSH or Boolean                                            7              2                1
MeSH and Boolean                                           2              3                —
MeSH and Boolean and systematic                            2              6                7
Strategy includes items other than MeSH and Boolean        2              0                2
Strategy/end-of-module assessment not submitted            0              2                3

The inferential statistical technique analysis of variance (ANOVA) assesses whether there are significant differences amongst treatment means. A one-way ANOVA for related samples was used to compare the mean pre-session, post-session and post-assignment scores (Table 2). A significant difference amongst the means was found, F(2,14) = 11.493, P = 0.001. Subsequent pair-wise comparisons showed that there was a significant difference between pre-training scores and post-training scores, F(1,10) = 5.106, P = 0.040, and between post-training scores and post-assessment scores, F(1,10) = 9.486, P = 0.008.
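The one-way ANOVA for related samples used here can be sketched in code. The study's individual scores are not published, so the data below are hypothetical; the function itself is a standard complete-case repeated-measures ANOVA, where F is the ratio of the condition mean square to the subject-by-condition error mean square.

```python
# Sketch of a one-way repeated-measures ANOVA (complete cases only).
# Data are hypothetical; the study's individual scores are not published.

def repeated_measures_anova(scores):
    """scores: one list per subject, one value per condition
    (e.g. pre-training, post-training, post-assessment).
    Returns (F, df_condition, df_error)."""
    n = len(scores)        # subjects
    k = len(scores[0])     # conditions
    grand = sum(sum(row) for row in scores) / (n * k)
    cond_means = [sum(row[j] for row in scores) / n for j in range(k)]
    subj_means = [sum(row) / k for row in scores]
    ss_cond = n * sum((m - grand) ** 2 for m in cond_means)
    ss_subj = k * sum((m - grand) ** 2 for m in subj_means)
    ss_total = sum((x - grand) ** 2 for row in scores for x in row)
    ss_error = ss_total - ss_cond - ss_subj   # subject x condition interaction
    df_cond, df_error = k - 1, (n - 1) * (k - 1)
    f = (ss_cond / df_cond) / (ss_error / df_error)
    return f, df_cond, df_error

# Hypothetical pre-, post- and assignment scores for three subjects
data = [[1, 2, 3],
        [2, 3, 5],
        [3, 4, 4]]
f, df1, df2 = repeated_measures_anova(data)
print(f, df1, df2)
```

Note that the reported df of (2,14) implies eight students with scores at all three time points; students missing a time point drop out of a complete-case analysis like this one.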

Six students requested help regarding their assignment search strategy and were referred to various sections of the tutorial. Differences in final assessment scores between those who were referred back to the tutorial and those who did not request feedback were analysed using an unpaired t-test. A one-tailed test was applied, as it was hypothesized that students requesting feedback would show improved scores. The scores of this cohort of students were significantly greater than those of students who did not request feedback, t = −2.107, d.f. = 8, P = 0.034 (Table 3).

Table 2 Pre-training, post-training and post-assessment scores

                         Mean    SD     n
Pre-training score       4.58    1.50   13
Post-training score      6.45    1.46   11
Post-assessment score    9.70    3.53   10

Table 3 Mean scores for students requesting feedback on search strategies post-training

                                                    Mean     SD     n
Feedback not requested                               7.25    2.87   4
Feedback requested and referred back to tutorial    11.33    3.08   6

Eight (62%) subjective questionnaires were returned, seven (88%) of which were wholly positive (agree or strongly agree). All respondents believed that the sessions were useful, well structured and interesting. They also considered that the support material was useful and relevant. Five students (63%) agreed or strongly agreed that the sessions improved their search skills, and four (50%) that their knowledge of IS had increased. Three students (38%) did not complete this item on their questionnaire. Seven students (88%) believed their confidence in searching was improved, five (63%) of whom strongly agreed with this statement. One student (13%) was undecided in each of these areas. Students appear to view their skill development more negatively than was demonstrated by the post-test.

In line with the original project aim, the subjective evaluation elicited a positive response, although these results should be interpreted with caution, given the small sample size. Students reported that the tutorial was 'easy to follow', enabled them 'to work at (their) own pace', and gave them 'the opportunity to test out new skills'. In contrast, some students requested 'more time', 'even more simpler step-by-step instructions' and a desire to 'relate (searches) to own research project'. These issues were similar to those raised during the piloting of the sessions, which had informed the structure of the teaching sessions. It was recognized that, in order to develop an open-source tutorial to foster transferable search skills, it was important to use meaningful but generic examples. Students could then apply these skills to alternative sources and individual search questions. The suggestion that simpler instructions were required contrasted with a comment that the tutorial 'couldn't be made more simpler', which intimates that the right balance has been achieved, whilst the request for more time perhaps highlights the importance of emphasizing the availability of the tutorial outside class times.

Discussion

The introduction of a theory and web-based approach to skills development, in contrast to a theory and hard-copy workbook approach, was achieved, and students engaged with the technology and materials. This finding echoes the comparison by Gutierrez et al.19 of student attitudes and satisfaction levels when using either printed or electronic workbooks. As in the Dewald study,20 the web-based approach expanded 'the student's options of time and place of instruction', enabling students to revisit IS theory and practical exercises at their convenience, reinforcing their learning through self-assessment quizzes. Students became increasingly proficient in using, and demonstrating the use of, core IS techniques throughout the study's lifetime.
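The unpaired t-test reported in the Results can be approximately re-derived from the summary statistics in Table 3 alone. The sketch below uses a pooled-variance two-sample t statistic; it is an illustration from the published means and SDs, not the authors' actual analysis of the raw scores.

```python
# Re-deriving the two-sample t statistic from Table 3 summary statistics.
from math import sqrt

def pooled_t(mean1, sd1, n1, mean2, sd2, n2):
    """Two-sample t statistic with pooled variance; df = n1 + n2 - 2."""
    df = n1 + n2 - 2
    sp2 = ((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / df
    t = (mean1 - mean2) / sqrt(sp2 * (1 / n1 + 1 / n2))
    return t, df

# Table 3: feedback not requested vs. requested and referred to the tutorial
t, df = pooled_t(7.25, 2.87, 4, 11.33, 3.08, 6)
print(round(t, 3), df)   # close to the reported t = -2.107, d.f. = 8
```

The small discrepancy from the reported value reflects rounding in the published means and SDs.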
Although the tutorial focused on OVID MEDLINE, evidence indicates its relevance to a range of professions.21 This, combined with the generic nature of the skills developed, means that this proficiency can be transferred to other resources, as demonstrated, with varying degrees of success, in the end-of-module assignment.

There are a number of limitations in the study that should be acknowledged. The number of students assessed in this study was small and, although the results were statistically significant for the group in question, the conclusions cannot be generalized. There were a number of reasons for the small numbers that could be addressed in further studies. Fewer than half of the EBP students (47%) were obliged to submit work for assessment, whilst attendance at the second session (n = 13; 62%) was low, in part because the session coincided with the half-term break of many of the local schools. Both factors contributed to the lower than expected levels of data collected. These results and experiences are similar to those of Rosenfeld et al.,22 who stated that searching skills 'improved somewhat' but that data collection was hampered for a variety of reasons. The lack of a control group for comparison potentially limits the conclusions to be drawn from the data collected.

However, as predicted, students appear to benefit from being able to access and use the resource at a time and place convenient to them. A significant improvement in test scores was observed from pre- to post-training, and further improvements were apparent from post-training to post-assessment. Thus, as anticipated both by the authors and by Rowntree,23 the timely and constructive feedback provided by the completion of self-assessment quizzes appears to have facilitated learning. Additionally, in line with Biggs'24 hypothesis, deeper and longer-term learning appears to have been achieved by providing students with actual search practice. The scores of students requesting feedback who were referred to the web tutorial were significantly greater than those of students who did not request feedback. This resonates with Abram et al.25 and Blanc et al.,26 who suggest that those students most at risk of low achievement are also least likely to request support. To ensure the availability of individual support, a maximum of six students per facilitator is specified when using this tutorial.
However, large-group facilitation is of key importance where student numbers and existing timetable congestion make small-group facilitation unfeasible. The piloting of the tutorial with larger groups is ripe for evaluation. This investigation sought to go beyond measuring short-term learning, with the evaluation of an end-of-module assignment at 4 months post-training. Whilst it may be feasible to use the tutorial as a stand-alone training package, as with the Bracke et al. study,14 this investigation did not seek to evaluate the tutorial's effectiveness without staff input. Findings indicate that the use of this tutorial to complement taught sessions, and in particular in conjunction with the availability of guided feedback, has a greater impact on IS development.

From a technical perspective, the tutorial enables self-assessment by users. However, the technology used to develop the tutorial (HTML web pages utilizing hypertext links) does not enable facilitators to monitor its use, nor the scores achieved through the quizzes. Since its development, technology has become available that could facilitate this type of remote monitoring, provided adaptations were made to the tutorial. This remains an area for further development and would also enable monitoring of the tutorial's contribution to the IS development of non-attendees at facilitated training sessions. Despite these limitations, it is believed that the tutorial's development was successful, and the evaluation approach is worthy of further investigation in larger studies.
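As a sketch of the kind of adaptation such monitoring would involve, quiz attempts could be written to a simple log that facilitators review. The file name, field names and helper functions below are hypothetical, not part of the tutorial as built:

```python
# Hypothetical sketch: logging quiz attempts so facilitators can monitor use.
import csv
import datetime
import os

LOG = "quiz_scores.csv"   # illustrative log file name

def record_score(student_id, step, score):
    """Append one quiz attempt to a CSV log for later review."""
    is_new = not os.path.exists(LOG)
    with open(LOG, "a", newline="") as f:
        writer = csv.writer(f)
        if is_new:
            writer.writerow(["timestamp", "student", "step", "score"])
        writer.writerow([datetime.datetime.now().isoformat(),
                         student_id, step, score])

def read_scores():
    """Return all logged attempts as dictionaries."""
    with open(LOG, newline="") as f:
        return list(csv.DictReader(f))

# Example: one student's scores on two tutorial steps
record_score("s01", "boolean-operators", 4)
record_score("s01", "mesh", 5)
rows = read_scores()
print(len(rows), rows[0]["step"])
```

In a live deployment the log would sit server-side rather than on the student's machine; the sketch only illustrates the data that would need capturing.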

Conclusion

The web-based tutorial proved to be a useful and effective tool for teaching IS within the EBP curriculum in this setting. Students developed core generic searching skills, and demonstrated improvements between pre-training and final assessment scores. The tutorial can be, and was, revisited following teaching sessions, thus fulfilling the original aim of allowing students to use it for revision and study at a time and place of their choosing. It was rated positively by students and appears to reinforce active learning, contributing to the improvement of students' IS in the short and longer term. Although a small-scale study with limitations, it nevertheless demonstrates that this methodology can be used to evaluate IS, and thus adds to the evidence base in this area. Repeating the study (or undertaking a similar study) using a larger sample size and a control group would provide stronger evidence relating to the effectiveness of IS education. Feedback within the University of Salford suggests that the tutorial is now widely used within other teaching programmes, often with large groups of students. Although it was originally designed for use in conjunction with a taught component, feedback also suggests it is being used as a stand-alone tutorial. Further evaluation is required to assess the impact of these changes in use.

Acknowledgements

The authors would like to thank the Teaching and Learning Quality Improvement Scheme (TLQIS) and the Health Care Practice R&D Unit (HCPRDU), University of Salford, UK for funding this study. We would also like to thank Professor Andrew Long for continuous support throughout the project and Dr Richard Stephens, Department of Psychology, University of Keele, UK for statistical advice. We also acknowledge the students from the MRes Health and Social Care, University of Salford and the HCPRDU Evaluation Programme who took part in the actual and pilot studies.

Supplementary material

The following supplementary material is available for this article online:

Appendix 1. Objective evaluation: modified Rosenberg assessment tool.
Appendix 2. Subjective evaluation.

This material is available as part of the online article from http://www.blackwell-synergy.com/doi/abs/10.1111/j.1471-1842.2006.00655.x (this link will take you to the article abstract). Please note: Blackwell Publishing are not responsible for the content or functionality of any supplementary materials supplied by the authors. Any queries (other than missing material) should be directed to the corresponding author for the article.

Key Messages

Implications for Practice

• A web-based tutorial facilitates continued access by students.
• A web-based tutorial is acceptable to students learning information skills.
• Provides a potential approach for evaluating the effectiveness of web-based tutorials.


Implications for Policy

• Web-based tutorials—which encourage active learning practices—are a useful and effective tool for teaching information skills.
• Web-based tutorials enable search practice which is convenient for students and facilitates deeper and longer-term learning.
• Providing students with constructive feedback may encourage independent learning.

References

1 Brettle, A. Information skills training: a systematic review of the literature. Health Information and Libraries Journal 2003, 20(Suppl. 1), 3–9.
2 Garg, A. & Turtle, K. M. Effectiveness of training health professionals in literature search skills using electronic health databases: a critical appraisal. Health Information and Libraries Journal 2003, 20, 33–41.
3 Joint Information System Committee, the Manchester Metropolitan University Library and the University of Leeds Library. The Big Blue: Information Skills for Students. Final report. 2002. Available from: http://www.leeds.ac.uk/bigblue/finalreportful.htm (accessed 27 April 2006).
4 O'Brien, D., Procter, S. & Walton, G. Towards a strategy for teaching information skills to student nurses. Nurse Education Today 1990, 10, 125–9.
5 Burrows, S. C. & Tylman, V. Evaluating medical student searches of MEDLINE for evidence-based information: process and application of results. Bulletin of the Medical Library Association 1999, 87, 471–6.
6 Schilling, K., Ginn, D. S. & Mickelson, P. Integration of information-seeking skills and activities into a problem-based curriculum. Bulletin of the Medical Library Association 1995, 83, 176–83.
7 Francis, B. W. & Fisher, C. C. Multilevel library instruction for emerging nursing roles. Bulletin of the Medical Library Association 1996, 83, 492–8.
8 Fox, L. M., Richter, J. M. & White, N. E. A multidimensional evaluation of a nursing information literacy program. Bulletin of the Medical Library Association 1996, 84, 182–90.
9 Dorsch, J. L., Frasca, M. A., Wilson, M. L. & Tomsic, M. L. A multidisciplinary approach to information and critical appraisal. Bulletin of the Medical Library Association 1990, 78, 38–44.
10 Dorner, J. L., Taylor, S. E. & Hodson-Carlton, K. Faculty-librarian collaboration for nursing information literacy: a tiered approach. Reference Services Review 2001, 29, 132–40.
11 University of Huddersfield. INHALE. Available from: http://inhale.hud.ac.uk/inhale/aims/index.html (accessed 27 April 2006).
12 Rosenberg, W. M., Deeks, J., Lusher, A., Snowball, R., Dooley, G. & Sackett, D. Improving searching skills and evidence retrieval. Journal of the Royal College of Physicians of London 1998, 32, 557–63.
13 Whittlestone, R. An open approach to CPD. IFMH Inform 2000, 10, 4–5.
14 Bracke, P. J. & Dickstein, R. Web tutorials and scalable instruction: testing the waters. Reference Services Review 2002, 30, 330–7.
15 Koufogiannakis, D. Effective methods for teaching information literacy skills to undergraduate students: what does the library literature research reveal? Oral presentation, 3rd International Conference of Evidence Based Librarianship, 15–18 October 2005, Brisbane, Australia.
16 NHS Executive South and West and the School of Health and Related Research. How to Find the Evidence—the Basics (Retrieving Evidence in South and West for Clinical Effectiveness—RES&WCE). September 1998. Available from: http://www.shef.ac.uk/scharr/reswce/reswce3.htm (accessed 27 April 2006).
17 West Midlands Regional Library Unit. Project Apple. 2000. Available from: http://www.wish-uk.org/train_dev/library/apple/apple.asp (accessed 27 April 2006).
18 University of Leicester. Database Training. 2005. Available from: http://www.le.ac.uk/li/sources/training/databasetrain.html#ind (accessed 2 March 2006).
19 Gutierrez, C. & Wang, J. A comparison of an electronic vs. print workbook for information literacy instruction. Journal of Academic Librarianship 2001, 27, 208–12.
20 Dewald, N. H. Transporting good library instruction practices into the web environment: an analysis of online tutorials. Journal of Academic Librarianship 1999, 25, 26–31.
21 Grant, M. J. Which database? Which interface? In: Booth, A. & Brice, A. (eds). Evidence Based Practice for Information Professionals: a Handbook. London: Facet Publishing, 2004: 251–6.
22 Rosenfeld, P., Salazar-Riera, N. & Vieira, D. Piloting an information literacy program for staff nurses: lessons learned. CIN: Computers, Informatics, Nursing 2002, 20, 236–41.
23 Rowntree, D. Assessing Students: How Shall We Know Them? London: Kogan Page, 1987.
24 Biggs, J. B. Teaching for Quality Learning at University. London: Open University Press, 1999.
25 Abrams, H. G. & Jernigan, L. P. Academic support and the success of high-risk college students. American Education Research Journal 1984, 21, 261–74. Cited in: Levin, M. E. & Levin, J. R. A critical examination of academic retention programs for at-risk minority college students. Journal of College Student Development 1991, 32, 323–34.
26 Blanc, R. A., Debuhr, L. E. & Martin, D. C. Breaking the attrition cycle: the effects of supplemental instruction on undergraduate performance and attrition. Journal of Higher Education 1983, 54, 80–90. Cited in: Levin, M. E. & Levin, J. R. A critical examination of academic retention programs for at-risk minority college students. Journal of College Student Development 1991, 32, 323–34.

Received 1 December 2005; Accepted 2 March 2006

© Health Libraries Group 2006 Health Information and Libraries Journal, 23, pp.79– 86

Interactive information skills tutorial, Maria J. Grant & Alison J. Brettle

Appendix 1
Objective evaluation: modified Rosenberg assessment tool

LITERATURE SEARCH EVALUATION
Assessment Sheet

Student Name:

(Each item is scored for both the Pre-theory and Post-theory searches.)

A: Free text
1. Use of free-text terms

B: Sensitive free text
2. Use of synonyms
3. Truncation
4. Wildcard
5. Use of Boolean operator 'or'

C: Thesaurus search
6. Use of MeSH terms
7. Use of specific terms from question
8. Use of explode
9. Use of Boolean operator 'and'

D: Limiters
10. Use of limiters
11. Search for review/RCT/meta-analysis or other EBP
12. Use of combination of Boolean operators
13. Use of other effective features (e.g. subheadings)
14. Systematic approach to search

Yield
15. Number of articles (manageable number, 50 or less)
16. Relevance score of best article (usually systematic review or other high quality evidence)

Total Search Score (score 1 for use of each item)

Comments:

Modified from Rosenberg et al. (1998). Improving searching skills and evidence retrieval. Journal of the Royal College of Physicians of London, 32(6): 557–63.
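The "Total Search Score" above is a simple checklist sum: each of the 16 items scores 1 point when the student's search demonstrates it, giving a maximum of 16. The following Python sketch is purely illustrative; the item labels and the example search are our shorthand, not data or code from the study.

```python
# Illustrative sketch of the Appendix 1 scoring scheme: one point per
# checklist item demonstrated, out of a possible 16. Labels are shorthand
# for the items in the modified Rosenberg assessment tool.
CHECKLIST = [
    "free-text terms", "synonyms", "truncation", "wildcard", "boolean or",
    "mesh terms", "specific terms from question", "explode", "boolean and",
    "limiters", "review/rct/meta-analysis filter", "combined boolean operators",
    "other effective features", "systematic approach",
    "manageable yield (50 or less)", "relevant best article",
]

def total_search_score(items_used):
    """Score 1 for each checklist item the search demonstrated."""
    used = set(items_used)
    return sum(1 for item in CHECKLIST if item in used)

# Hypothetical pre-training search showing only basic techniques:
pre_training = ["free-text terms", "boolean and", "limiters"]
print(total_search_score(pre_training))  # 3 out of a possible 16
```

Comparing such totals before and after training is what allows the kind of pre/post comparison the study reports.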


Appendix 2
Subjective evaluation

Finding information for research and evidence-based practice
Evaluation form

The session today was developed to help you improve your information skills. To help us decide whether we can make improvements to the course, we would be grateful if you could complete the following evaluation form. Many thanks.

Maria J. Grant, Alison Brettle, HCPRDU

Please tick to indicate to what extent you agree with the following statements.
(Response options: Strongly agree / Agree / Undecided / Disagree / Strongly disagree)

1. The session was useful/relevant to my needs
2. The session was interesting
3. The session was well structured
4. The session was pitched at the right level
5. The facilities provided were good
6. The support material was clear
7. The support material was useful/relevant
8. The session improved my knowledge of literature searching
9. The session improved my database searching skills
10. I feel more confident about my ability to carry out a literature search in the future
11. Overall the session was worthwhile

Please name three things you like about the web-based session:
1.
2.
3.

Please name three things you would change about the web-based session: 1. 2. 3.

Do you have any further comments about the web-based session?

Name:

Thank you for completing this questionnaire.

