Clinical Psychology and Psychotherapy
Clin. Psychol. Psychother. 10, 55–63 (2003)

Practitioner Report

Developing Basic Training Programmes: A Case Study Illustration Using the Delphi Method in Clinical Psychology

Louise F. Graham and Derek L. Milne*
Doctorate in Clinical Psychology, University of Newcastle upon Tyne, UK

The effectiveness of interventions in mental health depends on the competence of the therapists who deliver them and, in turn, on the training that they receive. For this training to be successful, programmes must periodically review and develop their objectives, content and methods. In the present article we summarize the main pressures on training programmes to undertake such review and development cycles, detail one review method (the Delphi Technique), and present the results of a Delphi survey of the stakeholders (N = 43) associated with a clinical psychology training programme in the UK. The results indicated that the Delphi was conducted satisfactorily and that it can play a valuable role in defining how a training programme can be improved, based on reaching consensus about developments amongst the programme's stakeholders. Implications are drawn regarding the need for multiple measures to complement the Delphi Technique as a way of developing training. Copyright © 2003 John Wiley & Sons, Ltd.

INTRODUCTION

The Need for Development

It is recognized that, in order to achieve the National Service Framework's standards in the NHS, there needs to be a skilled workforce. Therefore, an underpinning programme has been initiated to support workforce planning, education and training. The development of relevant competencies through basic (i.e. initial) professional training is part of this initiative (Department of Health, 1999). This emphasis is also to be found in Modernising Mental Health Services (Department of Health, 1998), where it is noted that staff have to be 'properly trained' if they are to deliver effective interventions. In addition, there is considerable evidence from a number of different sources that a review of basic academic programmes would be timely. For instance, current policy directives from the NHS

* Correspondence to: Dr Derek L. Milne, Doctorate in Clinical Psychology, Ridley Building, University of Newcastle upon Tyne, NE1 7RU, UK. E-mail: [email protected]

Published online in Wiley InterScience (www.interscience.wiley.com). DOI: 10.1002/cpp.353

and Department for Education suggest that any changes to the academic curriculum must:

(a) actively encourage the development of skills relevant to life-long learning and collaborative working;
(b) increase the use of IT in delivering the curriculum;
(c) continue to acknowledge theory–practice integration and evidence-based practice (EBP) as the linchpins of good practice in the NHS;
(d) increase the emphasis on discussion and evaluation of the evidence base in teaching; and
(e) continue to provide a flexible and adaptable academic programme, which is capable of responding to changing NHS needs and which facilitates the development of skills that are directly relevant to trainees' practice, post-qualification (i.e. team-working; collaborative research; communication and presentation skills; supervision; consultancy; and management).

There is also a strong drive to make training programmes more responsive to research findings, and to utilize empirically supported methodologies in the review process (Parry, 2000). In addition, programmes may have several local motives for developing their training. In our case these included the host University's commitment to performing well in both the UK's Research Assessment Exercise (RAE) and the teaching quality assessment exercises (Quality Assurance Agency, QAA); the host Faculty's stress on innovative learning methods (especially 'problem-based learning'); staff changes at the programme level; and pressure from our local professional colleagues to work towards an approach to teaching that emphasizes independent learning.

Participation in Change

Although it seems clear that changes to academic programmes are necessary, facilitating developments in organizations is known to be problematic, whether the system is educational (e.g. Christian, 1984) or otherwise (West & Farr, 1989). This is partly because a number of different groups have investments in the academic curriculum, and therefore must be involved in the decision-making process if development is to take place (Porras & Hoffer, 1986). Also, the decisions that need to be made are complex, as they require consideration of the strengths of the current academic curriculum, as

well as an examination of areas that could be further improved. Furthermore, any agreed actions may well require 'root and branch' changes to the academic curriculum (i.e. re-definition of the aims and objectives, and alterations to the content, methods and outcomes), and these must be endorsed by all of the stakeholders if change is to proceed satisfactorily.

Methods for Developing Courses

A number of key papers describe contemporary approaches to undertaking curriculum evaluation (e.g. Hoshmand & O'Byrne, 1996; Messer, Fishman, & McCrady, 1992), and outline current thinking with respect to designing curricula for psychology training programmes (e.g. Calhoun, Moras, Pilkonis, & Rehm, 1998; Peterson, 1997; Powell, Young, & Frosh, 1993). To illustrate, Peterson's (1997) description of the 'Reflective Educator' provides a broad model of practice for trainers, embracing self-appraisal in teachers and trainees, a willingness to accept weaknesses and improve on them, and the development of training through intrinsically motivating teaching and learning formats. Messer et al. (1992) used four principles of Organizational Development (OD) to guide their evaluation of the clinical psychology training programme at Rutgers University, including a focus on stakeholders' perceptions of the training programme, and a steering group.

Delphi Technique

One approach that could be used to formalize the action research approach that Messer et al. (1992) adopted is the Delphi Technique. This technique evolved from experimental research conducted by Dalkey and Helmer (1963) and their colleagues at the RAND Corporation in America. It was originally developed for forecasting technological developments, but has since been used in a range of different situations, including curricular evaluation and planning in higher education (Clayton, 1997). Linstone and Turoff (1975) described the Delphi Technique as a method for structuring group communication so that the individuals in the group can deal, as a whole, with a complex problem. According to Clayton (1997): 'The Delphi has great strength and utility. It collects and organizes judgements in a systematic fashion. It gains input, establishes priorities and builds consensus. It organizes


and helps to focus dissent, turning [it] . . . into a window of opportunity' (pp. 382–383). The Delphi Technique is also a way of dealing with complex problems for which no scientific evidence exists (Jones & Hunter, 1995). Furthermore, the conventional Delphi process is involving, valuing and iterative, and therefore embraces the principles of good quality action research (Hoshmand & O'Byrne, 1996) and sound OD (Messer et al., 1992).

Delphi literature relating to clinical psychology and curricular development reveals that the technique has been used successfully to:

(1) assess statements of professional competence in terms of priority in basic clinical psychology training (Green & Gledhill, 1993);
(2) identify research priorities in the NHS (Davidson, Merritt-Gray, Buchanan, & Noel, 1997);
(3) ascertain indicators of effective preparation for teaching adults (Houtz & Weinerman, 1997); and
(4) establish the views of UK trainers of clinical psychology with respect to developing directions and actions for the future (the Merton conference: Llewelyn & Kennedy, unpublished data).

This literature supports the view that the Delphi is a flexible process, as there are notable differences in methodology and procedure between the above studies. For example, Green and Gledhill (1993) used a two-round survey, Llewelyn and Kennedy used a three-round survey and a conference, and Houtz and Weinerman (1997) used four rounds of questions. Indeed, Clayton (1997) points out that the Delphi Technique can be used in a variety of different ways, including a postal survey conducted in rounds (a 'conventional' Delphi); a structured meeting (a 'conference' Delphi); and as part of a specific decision-making process (a 'policy' Delphi). The purpose of the present study is to present a detailed overview of the Delphi methodology, and to provide specific examples that demonstrate how it worked in our case.

METHOD

Design

The study's central question was 'how can we further improve the academic curriculum?'. In order to answer it, a longitudinal design was selected, in the form of a series of three self-report, ad hoc Delphi questionnaires, developed through the related Delphi group meetings.

Participants

Two samples of people were involved: the respondents to the questionnaires, and a 'Delphi group' who contributed to the development of the questionnaires and the interpretation of the subsequent data.

The Respondent Group

To recruit respondents, we sent invitations to all practising clinical psychologists associated with the training programme (including teachers and supervisors on the Course and Service managers), all current trainees, all members of the Course Executive and others. Prior to round 1, 58 people agreed to be respondents, and 43 of those completed all three Delphi questionnaires (a 74% response rate). Table 1 presents a breakdown of the respondents by their stakeholder group.

The Delphi Group

We set up a Delphi group by recruiting two current trainees, three clinical psychologists who teach and supervise on the programme, the Course Director (second author) and a prospective trainee (the first author and Assistant Psychologist to the project).

Procedure

Based on suggestions from the literature, we developed a timetable for the study, as summarized in Table 2. The procedure we followed in each round is summarized below.

Prior to round 1, the Delphi group met to define the problem and to develop the first questionnaire. Each member selected and rank-ordered the eight most important questions, and the Assistant then developed a questionnaire from the most popular questions. In round 1, questionnaire 1 was presented in booklet form, with full instructions. It took the form of eight broad statements or open questions, and each occupied a single side of A4.¹ Examples of the questions (no. in parentheses) are

¹ A copy of all three questionnaires is available on request to the second author.




Table 1. A breakdown of the respondents by stakeholder group (n = 43)

Stakeholder group                                    Number    Relevant information
Current trainees                                       10      From all 3 years on the programme
Recent graduates                                        3      Graduated in the last 2 years
Course tutors                                           2
Service managers                                        2
Professional bodies                                     1      An external representative of the Committee of
                                                               Training in Clinical Psychology (the accrediting body)
Clinical psychologists (teachers and supervisors)      25      Eight learning disabilities speciality; five child &
                                                               family; eight adult mental health; two older adults;
                                                               one physical health; one forensic psychology;
                                                               19 teach on the course*; 20 supervise trainees;
                                                               two Special Interest Group convenors; four other
                                                               stakeholders

* These figures do not total 25 as the psychologists occupied multiple roles.

(1) 'What do you think the aims and objectives of the course should be?' and (2) 'What are the primary qualities the Course should seek to develop in trainees?' Respondents were asked to provide full written responses to each statement or question. The Delphi group then met and content analysed the responses to questionnaire 1, leading to questionnaire 2.

In round 2, the aim of questionnaire 2 was to feed back responses to questionnaire 1 to the respondents and to encourage clarification of their views. This time the questionnaire comprised six core statements, reflecting the overriding themes extracted from the content analysis of responses to the first questionnaire. Lists of relevant items were presented underneath each statement. The 62 items took the form of precise examples of changes, or closed questions. Table 3 shows the six core statements and the number of related questionnaire items. Respondents were next asked to rate how strongly they agreed with each of the 62 items on a 5-point scale, ranging from 'strongly agree' to 'strongly disagree'. They were also invited to make additional comments. All 43 respondents again completed and returned the questionnaire within the 2-week deadline. The same Delphi group procedure was then followed to develop questionnaire 3.

In round 3 (the final round), statements and items on questionnaire 3 were broadly the same as those on questionnaire 2. However, two main changes had been made: three new items were added to questionnaire 3, and 11 items were altered to improve clarity. Also, each rating scale on questionnaire 3 was circled in black to indicate the overall central tendency in all respondents' opinions, and highlighted in yellow to indicate the individual participant's original response. Participants were invited to reconsider their initial responses in the light of this feedback, and also in relation to some additional qualitative information provided in a feedback document. All respondents again completed and returned the questionnaire within a 2-week deadline, and a report was prepared and circulated, together with a Delphi Satisfaction Questionnaire.
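The round 3 feedback step is, in effect, a small aggregation exercise over the round 2 ratings: for each item, find the group's most common reply and pair it with each respondent's own earlier answer. The authors did this by hand on paper questionnaires; the sketch below (Python, with hypothetical data structures) only illustrates the logic, and is not their implementation.

```python
from collections import Counter

# Hypothetical 5-point scale labels, for illustration only
SCALE = ["strongly disagree", "disagree", "neutral", "agree", "strongly agree"]

def round3_feedback(ratings_by_respondent):
    """Given {respondent: {item: rating}}, return per-item feedback:
    the group's modal rating (the central tendency circled in black on
    the paper form) plus each respondent's previous rating (the answer
    highlighted in yellow on their own copy)."""
    items = {item for r in ratings_by_respondent.values() for item in r}
    feedback = {}
    for item in sorted(items):
        counts = Counter(r[item] for r in ratings_by_respondent.values() if item in r)
        mode = counts.most_common(1)[0][0]  # group's most common reply
        feedback[item] = {
            "central_tendency": mode,
            "previous": {who: r.get(item) for who, r in ratings_by_respondent.items()},
        }
    return feedback
```

Note that a tie between two ratings would make `most_common(1)` pick one arbitrarily, so a real exercise would need an explicit tie-breaking rule.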

RESULTS

The study took 24 weeks to complete (see Table 2), after which we had amassed a wealth of detailed information from 43 stakeholders in the training programme. This information comprised quantitative data from the rating scales on questionnaires 2 and 3, and qualitative data from the comments made on all questionnaires. The five main findings are now noted, together with supportive detail.


Table 2. A week-by-week timetable for our Delphi study

Prior to round 1 (weeks 1–3):
- Problem definition session with Delphi group (2-h meeting)
- Results collated and interpreted by the researcher
- Questionnaire 1 (Q1) sent to Delphi group for review (pilot)
- Delphi group return Q1 with comments/amendments
- Results collated and interpreted by the researcher and final draft developed

Round 1 (weeks 4–9):
- Final draft of Q1 sent to respondent group (for completion within a 2-week deadline)
- Final draft of Q1 sent to Delphi group (for reference)
- 'Reminder' letters sent to all respondent group members
- Deadline for the return of Q1 (failed returns followed up with phone call)
- Delphi group meet to analyse results of Q1 and develop Q2 (half-day, inc. lunch)
- Results collated and interpreted by the researcher
- Q2 sent to Delphi group for review (pilot)
- Delphi group return Q2 with comments/amendments
- Results collated and interpreted by the researcher and final draft developed

Round 2 (weeks 10–15):
- Final draft of Q2 sent to respondent group (for completion within a 2-week deadline)
- Final draft of Q2 sent to Delphi group (for reference)
- 'Reminder' letters sent to all respondent group members
- Deadline for the return of Q2 (failed returns followed up with phone call)
- Delphi group meet to analyse results of Q2 and develop Q3 (2-h meeting)
- Results collated and interpreted by the researcher
- Q3 sent to Delphi group for review (pilot)
- Delphi group return Q3 with comments/amendments
- Results collated and interpreted by the researcher and final draft developed

Round 3 (weeks 16–24):
- Final draft of Q3 sent to respondent group (for completion within a 2-week deadline)
- Final draft of Q3 sent to Delphi group (for reference)
- 'Reminder' letters sent to all respondent group members
- Deadline for the return of Q3 (failed returns followed up with phone call)
- Delphi group meet to analyse results of Q3 and develop overall summary of findings (half-day, inc. lunch)
- Results collated and interpreted by the researcher
- Draft feedback sheet sent to Delphi group for review (pilot)
- Delphi group return feedback sheet with comments/amendments
- Results collated and interpreted by the researcher and final draft developed
- Final draft of feedback sheet sent to respondent group along with satisfaction questionnaire (2-week deadline for returning)
- Satisfaction data compiled and interpreted by researcher
- All data presented to steering group for discussion, re: next actions

The Content of Teaching and the Teaching Methods are Closely Linked (i.e. Making Changes to the Content Requires that Changes are also Made to the Methods Used to Deliver the Content)

As regards the content, stakeholders' responses to the questionnaires highlighted the following views: there is the task of deciding what needs to be covered in teaching sessions during the 3 years of academic teaching, and determining what could be covered elsewhere (e.g. during placements, or as part of CPD/post-qualification); ensuring that the

chosen content is relevant to trainees' future practice (this link should be clearly demonstrated), and that it is up-to-date and refers to the evidence base. The main issue with respect to the related methods was that respondents would like to see a range of methods employed to deliver this material to trainees. There was a consensus of opinion that the following methods would satisfy this concern: in-class discussion and evaluation of the evidence base; live and video modelling of skills; facilitated reflection; rehearsal of transferable skills; debates; role plays; and problem-based learning (i.e. assigning tasks to small groups who take responsibility and feedback; Huey, 2001). Such a range of teaching methods is commonly advocated as necessary to facilitate experiential learning (Goldstein, 1993; Kolb, 1984). In turn, it was recognized that the teachers need to be given a full induction and regular training in these methods, as well as support in using them.

Table 3. The six core statements and the number of related items on questionnaire 2

Core statement                                                           No. of items
1. We need to develop a more integrated Course                                 6
2. The teaching methods should be re-vitalized                                16
3. The assessment methods of the programme need to be changed                  8
4. Changes to the content of teaching are necessary                           15
5. Personal awareness and development needs to be emphasized
   and tackled thoroughly                                                      9
6. The academic learning environment needs attention                           8
Total number of items:                                                        62

The Methods of Assessment Should 'Drive the Curriculum'

The Delphi indicated that the main concern was that a range of appropriate and objective assessment methods should be used. One popular suggestion for improving current assessment methods was setting small-scale projects early on in the Course, during placements, so as to encourage NHS-based and collaborative research. The study also suggested that these assessment methods should require trainees to use knowledge and skills that are relevant to post-qualification practice. Suggestions included video-taped practicals, oral assessments and therapy logs.

Greater Integration is Necessary (i.e. Teachers Need to Work Together to Integrate Core Themes and Specialist Material, Relate Teaching to Placements, Make Links Across Specialized Material and Ensure that Material is as Relevant as Possible to the NHS)

A large majority of respondents agreed that some aspects of the academic curriculum might be better tackled on placement, rather than during teaching sessions, and vice versa. It was also recognized that, for this kind of integration to work, there needs to be good communication between the Course Executive and the teachers; the specialist teachers/Special Interest Groups; and between the teachers and trainees.

Personal Awareness and Development Needs Attention

The main issues and concerns with respect to personal awareness and development were dealing openly with sensitive and difficult issues, whilst still respecting everyone's right to privacy, and tackling the difficult question of how best to provide personal support for trainees (see general summary below).

Improving the Academic Learning Environment

A need was identified to set and monitor standards for teachers and trainees, but without impinging too much on teaching time. The Delphi also highlighted a need to deal with some negative group dynamics.

The Strengths of the Current Academic Curriculum

Having noted some weaknesses, it is appropriate to also record briefly some identified strengths. These were the quality and range of clinical supervision; effective teachers (including the use of up-to-date teaching material and enthusiasm about their topic); friendly staff; and the support for trainee development (e.g. good personal tutors).

Consensus

Overall, levels of agreement between the different stakeholder groups were high across all three rounds of the study. Quantitative data from round 3 that illustrate this level of consensus are summarized in Table 4. These data show overall consensus ranging from 66% (for changes to the assessment methods) to 85% (statement 6, improving the learning environment). Similarly, the central tendency (i.e. the most common reply) shows that 62 of a possible 65 items in questionnaire 3 achieved a consensus.

Satisfaction with the Delphi Exercise

In general, respondents were happy with the Delphi process (rated 6 out of a possible 8), the usefulness of the Delphi (also rated 6) and the way that the process enabled respondents to express their ideas fully (rated 7). No ratings fell below 5.



Table 4. A summary of the replies received to the third Delphi questionnaire, indicating the degree of consensus for each of the six core statements (see Table 3 for these statements)

Core statement                           Mean % who 'agreed'     Central tendency           Range (% agreeing or
                                         or 'strongly agreed'    (no. of items with mode    strongly agreeing
                                         with each statement     'agree'/'strongly agree')  with items)
1. Integrate the course more                    82.7             All six items*                 69–88
2. Revitalize the teaching methods              71.7             15 of the 16 items             43–93
3. Change the assessment methods                65.6             Eight of the nine items        42–81
4. Change the content of teaching               77.2             16 of the 17 items             43–100
5. Emphasize personal awareness                 82.7             All nine items                 58–98
6. Improve the learning environment             85.3             All eight items                74–93

* The number of questionnaire items differs from Table 3, as that table concerned questionnaire 2.
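The three figures reported for each core statement in Table 4 amount to a small computation over the item ratings. A minimal sketch, assuming one list of 5-point ratings per item (a hypothetical data layout; the authors do not describe how their analysis was carried out):

```python
from statistics import mean

AGREE = {"agree", "strongly agree"}

def consensus_summary(item_ratings):
    """item_ratings: {item: [rating, ...]} for one core statement.
    Returns the three Table 4 style figures: the mean percentage of
    respondents agreeing, the number of items whose modal reply was
    agreement, and the range of per-item agreement percentages."""
    pcts = []          # per-item percentage agreeing or strongly agreeing
    modal_agree = 0    # items whose most common reply was agreement
    for ratings in item_ratings.values():
        pct = 100 * sum(r in AGREE for r in ratings) / len(ratings)
        pcts.append(pct)
        mode = max(set(ratings), key=ratings.count)
        if mode in AGREE:
            modal_agree += 1
    return {
        "mean_pct_agree": mean(pcts),
        "modal_agree_items": modal_agree,
        "range": (min(pcts), max(pcts)),
    }
```

As with any modal statistic, ties between ratings would need an explicit tie-breaking rule in a real analysis.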

DISCUSSION

Main Outcomes

The main conclusions are that the Delphi Technique afforded a successful approach to ensuring professionals' participation in organizational change, helped to define clear ways to develop the training programme, and achieved a high degree of consensus amongst a range of stakeholders. For example, the results provided us with many constructive suggestions for course development in six broad areas (e.g. increasing the integration of the academic programme and revitalizing the teaching methods). The Delphi was also perceived to have been conducted in a very satisfactory manner. When fed back to the programme's stakeholders at an open meeting, the affirmative results (i.e. concerning what was already being done well) appeared to improve morale and to raise the generally positive perception of the programme (as also reported by Messer et al., 1992).

These outcomes lead us to recommend the Delphi Technique to others, although we need to recognize some attendant weaknesses. In particular, the literature warns us that it is important to employ other methods, especially those which afford additional quantitative data. Indeed, Jones and Hunter (1995) point out that consensus does not necessarily mean that the correct answer has been found. It is therefore important to relate the results of the Delphi to other, more observable, quantitative measures. For example, in the staff training literature there is widespread acceptance of the need to measure learner satisfaction, learning outcomes, and the impact of any such learning on service users (Goldstein, 1993). In keeping with this advice, our development effort has also included structured interviews with recent

graduates and their managers (addressing the perceived adequacy of the programme in preparing students for the NHS), and the use of a validated questionnaire to assess the teaching and learning environment (Fraser, Treagust, & Dennis, 1986). Allied to these data-collection efforts, it is important to maintain the ‘stakeholder–collaborative’ approach. This has been shown to facilitate innovations of the present kind (Greene, 1987). To this end, we convened an ‘open meeting’, to which all interested parties were invited and at which all the available information was presented. Working in small groups, the delegates at this meeting interpreted the report and suggested ways to introduce change. A project steering group has subsequently incorporated all these points into an action plan, which will be implemented by one of the standing committees within the programme.

Conclusions

It is concluded that the 'proper training' of NHS staff sought by the Department of Health (1998) depends upon the existence of appropriate quality assurance (QA) systems within basic training programmes. From our experience, the Delphi Technique affords a useful procedure in terms of some principles of QA. Firstly, the technique helps to engage and enthuse stakeholders, a necessary condition for change (West & Farr, 1989). Secondly, the Delphi helped us to define and reach a clear consensus on what needs to change: the 'problem statement'. Thirdly, the method generated constructive suggestions on how one might best develop a training programme to address such problems. A final principle of QA is the need for an evaluation of any development efforts. The Delphi

technique sets the scene for evaluation by specifying goals (development objectives), and could be repeated at a later date to gather views on the extent to which they have been achieved. Additional, more objective measures (e.g. of enhanced learning) would be desirable. Thus, although it is labour-intensive, the Delphi Technique appears to merit consideration as a QA device for developing effective training programmes.

ACKNOWLEDGEMENTS We are indebted to the 43 respondents for taking the trouble to complete the questionnaires; to Mandi Sherlock-Storey (Occupational Psychologist) for supporting us through the process; to the Delphi Group for many hours of enthusiastic debate and guidance (i.e. Christina Blackwell, Veronica Gore, Stewart Grant, Liz McManus, Genevieve Quayle and Richard Thwaites); and to Barbara Mellors for help in preparing the manuscript.

REFERENCES

Calhoun, K.S., Moras, K., Pilkonis, P.A., & Rehm, L.P. (1998). Empirically supported treatments: Implications for training. Journal of Consulting & Clinical Psychology, 66, 151–162.

Christian, W.P. (1984). A case study in the programming and maintenance of institutional change. Journal of Organisational Behaviour Management, 5, 99–153.

Clayton, M.J. (1997). Delphi: A technique to harness expert opinion for critical decision-making tasks in education. Educational Psychology, 17, 373–386.

Dalkey, N., & Helmer, O. (1963). An experimental application of the Delphi method to the use of experts. Management Science, 9, 458–467.

Davidson, P., Merritt-Gray, M., Buchanan, J., & Noel, J. (1997). Voices from practice: Mental health nurses identify research priorities. Archives of Psychiatric Nursing, 11, 340–345.

Department of Health (1998). Modernising mental health services: Safe, sound & supportive. London: Department of Health.

Department of Health (1999). National service framework for mental health: Modern standards & service models. London: Department of Health.

Fraser, B.J., Treagust, D.F., & Dennis, N.C. (1986). Development of an instrument for assessing classroom psychosocial environment at universities and colleges. Studies in Higher Education, 11, 43–54.

Goldstein, I.L. (1993). Training in organisations. Pacific Grove, CA: Brooks/Cole.

Green, D., & Gledhill, K. (1993). What ought a qualified clinical psychologist be able to do? Consulting the oracle. Clinical Psychology Forum, October 1993, 7–11.

Greene, J.C. (1987). Stakeholder participation in evaluation design: Is it worth the effort? Evaluation & Program Planning, 10, 379–394.

Hoshmand, L.T., & O'Byrne, K. (1996). Reconsidering action research as a guiding metaphor for professional psychology. Journal of Community Psychology, 24, 185–200.

Houtz, J.C., & Weinerman, I.K. (1997). Teachers' perceptions of effective preparation to teach. Psychological Reports, 80, 966–969.

Huey, D. (2001). The potential utility of problem-based learning in the education of clinical psychologists and others. Education for Health, 14, 11–19.

Jones, J., & Hunter, D. (1995). Consensus methods for medical and health services research. British Medical Journal, 311, 376–380.

Kolb, D.A. (1984). Experiential learning. Englewood Cliffs, NJ: Prentice-Hall.

Linstone, H.A., & Turoff, M. (Eds) (1975). The Delphi method: Techniques and applications. London: Addison-Wesley.

Messer, S.B., Fishman, D.B., & McCrady, B.S. (1992). Evaluation-based planning of a professional psychology training program. Evaluation and Program Planning, 15, 257–267.

Parry, G. (2000). Evidence-based psychotherapy: An overview. In N. Rowland & S. Goss (Eds), Evidence-based counselling & psychological therapies. London: Routledge.

Peterson, D.R. (1997). Educating professional psychologists. Washington, DC: American Psychological Association.

Porras, J.I., & Hoffer, S.J. (1986). Common behaviour changes in successful organisation development efforts. Journal of Applied Behavioural Science, 22, 477–494.

Powell, G., Young, R., & Frosh, S. (1993). Curriculum in clinical psychology. Leicester: BPS Books.

West, M.A., & Farr, J.L. (1989). Innovation at work: Psychological perspectives. Social Behaviour, 4, 15–30.

APPENDIX 1. THE QUESTIONS FROM QUESTIONNAIRE 1

Questionnaire 1 was presented in booklet form, with full instructions. The eight questions took the form of broad statements or open questions, and each occupied a single side of A4. The questions were:

(1) What do you think the aims and objectives of the Course should be?
(2) What are the primary qualities the Course should seek to develop in trainees?
(3) With reference to question 2, how can we ensure that the Course develops these qualities?
(4) What are the main personal, professional and organizational issues we should consider when developing a new curriculum?
(5) Can you suggest ways we might address the personal, professional and organizational issues you raised in answer to question 4?
(6) What are the strengths of the current Course and how can we continue to maintain them in a new curriculum?
(7) In addition to the points you have already made, are any other changes to the Course necessary, and why?
(8) Bearing in mind the aims of this study, are there any additional questions, concerns or issues you would like to raise?
