Evaluating FINESSE: a case study in group-based CAL
Rosa Michaelson, Christine Helliar, David Power, Donald Sinclair
Department of Accountancy and Business Finance, University of Dundee
Abstract
The FINESSE project (Finance Education in a Scalable Software Environment)1 addresses problems associated with the teaching of finance courses in the U.K. Higher Education sector by constructing a networked, computer-based portfolio management game. The FINESSE consortium consists of finance lecturers at the Universities of Dundee, Strathclyde, Glasgow, Aberdeen and Glasgow Caledonian University, and members of the Computer Sciences division at the University of St Andrews. Subject-specific resources were developed to exploit access to real-time stock-market data thereby allowing students to explore portfolio management strategies in a new and exciting way. The development of FINESSE has produced generic resources to support student communication for group-based work and to facilitate staff monitoring of such work. This paper describes the various approaches employed to evaluate FINESSE throughout the first two years of its use.
Keywords: applications in subject areas; co-operative/collaborative learning; distributed learning environments; evaluation methodologies; evaluation of CAL systems.
1 Introduction

This paper discusses the issues associated with the evaluation of computer-
aided learning (CAL) and how these issues inform the development of new forms of web-based learning environments. We describe the use of the FINESSE software in an undergraduate honours module and outline a variety of evaluation methods employed in the course of the project; the results from students' evaluation of FINESSE are discussed in some detail. In particular, section 2 reviews issues that have been highlighted in the evaluation of CAL, while section 3 provides an overview of the FINESSE project. Section 4 describes the mixture of evaluation methods used for the project. Results from a formal evaluation of student learning are presented in section 5 and conclusions and limitations of the work are given in section 6.
2 Issues in Evaluating CAL

Common themes emerge in critical reviews of CAL evaluation, such as
problems with objectivity2; the adoption of a positivist approach to assessing learning outcomes; a lack of thoroughness in the evaluation methodology (such as the use of small subject groups or short study times to undertake a complete evaluation and draw meaningful conclusions about the adoption of CAL); and debates over the relative merits of qualitative as opposed to quantitative methods of evaluation. Similar concerns have been expressed about the evaluation of CAL in accounting education. A recent review of research into the impact of computing on accounting education found that the researchers: "conform to positivist research frameworks, are to varying degrees unreflective, contain a large number of implicit value judgements, give insufficient detail about the context of the enquiry for the readers to be able either to generalise or interpret their findings, and that their validity is therefore questionable" (Salleh & Williams, 1999, p. 8).
Other important considerations to be taken into account when evaluating the results of educational change are classic observational biases. One of the best known in educational research is the 'Hawthorne Effect' - the possibility that improvements in student learning arise because the change in teaching method is being observed, rather than from the change itself (Borg & Gall, 1989; Gillespie, 1993). This effect is mitigated to a certain extent by studying the change over several cohorts of students. Another example is the so-called 'John Henry' effect (where the control group out-performs previous cohorts because of competition with the student group pioneering the new method), which can bias control group outcomes (Borg & Gall, 1989, pp. 191-192). A further effect of changing the educational method - the excitement of the learner due to the introduction of new teaching methods, as opposed to the response to the new teaching method itself - is of particular relevance to investigations where enthusiastic proponents of communication and information technology (C & IT) are being scrutinised. These types of educational research effects, and the associated costs of thorough evaluation methods, have caused some to ask whether we should attempt to evaluate CAL at all (Jacobs, 1998; Shaw, 1998). Nevertheless, a variety of factors have ensured that CAL implementations and evaluations continue. These include (i) the growth in the use of computing in the teaching of subjects such as accounting (see Boyce, 1999, for an overview), which requires justification for the professional bodies that accredit courses; (ii) the funding bodies' need for evaluations of the CAL projects that they support; and (iii) the need to disseminate lecturers' experiences of trying to integrate CAL into teaching practice (which might mitigate the over-enthusiastic claims for cost savings that have accompanied each new technological advance).
Critical appraisals of attempts to evaluate CAL developments have informed the design and process of evaluation methodologies (Gunn, 1998; Mulholland, Au & White, 1998). As a result, a consideration of the project as a whole has become the focus of evaluation rather than specific learning outcomes. However, the literature provides very little guidance for evaluating complete projects. Prior studies have focused on a number of aspects of evaluation for different users of computer-based learning. For example, several authors have looked at the use of multimedia and courseware in terms of product quality (Barker, 1993; Tergan, 1998; Yildiz & Atkins, 1993)3. Others have analysed student use of CAL (Harvey, 1998), or examined the extent to which students believe that CAL is integrated within an undergraduate course (Stoner and Harvey, 2000). In contrast, Laurillard (1994) focuses on the student-teacher relationship and develops a model to describe the "workflow" between student and tutor when educational technologies involving CD-ROMs are employed4. What none of these studies shows is a broadly-based evaluation which addresses the views of several interested parties in a project over a period of time. The current paper attempts to fill this gap. As use of the world-wide web and the internet has grown, so interest in new types of computer-supported learning has emerged. Distributed Learning Environments (DLE) (Ehrmann, 1989), Virtual Learning Environments (VLE) (Britain & Liber, 1999) and Computer Supported Collaborative Learning (CSCL) variously describe computing systems which do the following: support group-based learning; allow students to be physically distant from each other and from the tutor; have 'real world' input; and place no restriction on the time when an individual uses the system. FINESSE is an example of a distributed-learning environment. There are as yet few evaluations of web-based learning environments on which to base the evaluation of
FINESSE. Though frameworks have been proposed for the pedagogical evaluation of web-based CAL, they are commonly used to decide which type of existing web package (WebCT or Lotus Notes, for example) to use. Work on the evaluation of VLEs has been conducted by Britain and Liber (1999) who compare a Conversational Framework5 and a Viable Systems Model (VSM) for assessing the use of 12 VLEs in UK Higher Education. The authors prefer the VSM model, which is based on Stafford Beer's cybernetic management process (described in Beer (1990) and Espejo and Harnden (1990)), although they do not address specific educational examples in their analysis. They simply produce a check-list of criteria for web-based groupware: resource negotiation, co-ordination, monitoring, individualisation, self-organisation and adaptation6. Critical reviews of evaluation frequently stress the need to address both the context and pedagogical aims when computer-based learning is assessed with reference to specific educational goals (see Laurillard, 1993; Jones et al., 1996; Jones et al., 1999). One of the main criteria for judging the success of a CAL project is the extent to which it is integrated into the teaching process (Boyce, 1999, p. 216). More recently, in the keynote address for Web-Based Learning Environments 2000, Dias de Figueiredo (2000) called for contexts of learning to be taken into account when considering web-based educational systems. Michaelson (1999) has proposed that Constructivist
ideas provide a useful set of criteria for assessing DLEs within an
educational context, especially when used for group-learning7. Selwyn (2000) proposes a wider notion of context which embraces the socio-political UK environment. An example of this type of wide-scale evaluation is a review of the use of TLTP8 materials in the UK (Haywood, Anderson, Day, Land,
MacLeod and Haywood, 1999). However, this wider notion of evaluation context is outwith the scope of this paper.
3 An Overview of the FINESSE Project

Accountancy staff at the University of Dundee offer a fourth year honours9
module called 'Security Analysis and Portfolio Management' (4PM) and have developed a web-based game that requires groups of students to manage a portfolio of equities with a notional value of £100m10. Each team can spend the £100m on a range of equities traded on the London stock market and the Alternative Investment Market (AIM). In addition, shares of investment trusts can be bought and sold, thereby facilitating a portfolio strategy that can incorporate non-U.K. securities. The game introduces a level of realism into the analysis by incorporating dividend income and capital gains, by including transaction costs and by using real-time share price data, subject to a 20-minute time-lag, that are updated on a continuous basis. The real-time financial data are provided by two sources: Datastream and UpData. Datastream supplies daily batched data whereas UpData provides 20-minute price changes. In addition to providing a realistic setting where students could manage a portfolio of securities, FINESSE had a number of other objectives. By building up a portfolio of securities students could explore theoretical issues that were discussed in the lectures for the course. It was also anticipated that group skills would be developed since the students managed their portfolios in teams of three or four, and it was an explicit objective for FINESSE to be able to assess this group activity. The game was also designed to allow students flexible access – enabling self-study – and to minimise staff time in administering the facility. The game is an integral part of the course – and the students' assessment is based on their group use of FINESSE, the portfolio strategies adopted, an individual report about the project, and a joint
presentation of the investment strategies that each group has employed during the academic year. This component contributed 10% of the final mark for the course for each student in 1999/2000 and 15% in 2000/2001. Once logged on to the FINESSE web-site, which is accessed via any browser, a user with a portfolio sees a menu with the following options: a listing of current security prices; information on sector histories and security price histories; a transaction screen where all the current open positions for the portfolio are shown; audit trails (that track transactions either by security or by date); two screens that summarise portfolio performance and calculate the profit/loss to date; and a screen that enables a group to see a summary of the performance of other groups in the class. FINESSE has the following types of user: a system administrator, a tutor and a student. A tutor can assign students to groups, name groups, link each group with a particular set of resources, and allocate a portfolio management facility and a group Notebook to each group. A tutor can monitor student performance using a variety of features. Further details concerning FINESSE are given in Power, Michaelson & Allison (1998) and Helliar, Michaelson, Power & Sinclair (2000). The development of FINESSE required a variety of management methods since the consortium involved staff drawn from four different sites and from across several subject disciplines. The group used video conferencing for regular meetings while agendas and minutes of the meetings were posted to a developers' web-site. A useful resource called a Notebook was produced for the team in which all members of the consortium could write messages as they used the web-site. Messages placed on the Notebook could be e-mailed to the group and these messages gave a historical record of the development process which was accessible by all team members. The
Notebook was linked to each page of the web-site. This developers' Notebook allowed for informal project control by all participants, and for direct responses to problems encountered by the finance lecturers as the prototype evolved. Further informal development required e-mail and phone-calls between the finance lecturers and the programming team. Software development at the two main sites also employed desk-top video conferences (using IP multi-cast channels over ATM links). The participants could see the same web pages and error messages in one window, and could talk to and see each other on video, thereby debugging the underlying code in an efficient and effective way. The software was developed by a variety of methods involving continual user-feedback, as in Rapid Application Development (Pressman, 1997, pp. 37-39). The Notebook was so successful that it was included in FINESSE and became the basis for members of a group to communicate with each other and with the tutor. In addition, it provided an on-line logbook for each group where they could detail their portfolio management strategies. This also provides evidence for the tutor to assess the group work, and helps the tutor to evaluate individual contributions to the project.
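To make the game mechanics described above more concrete, the short Python sketch below illustrates how a group's profit or loss might be computed from its cash balance, holdings, dividend income and lagged market prices, net of transaction costs. It is purely illustrative and is not the FINESSE implementation; the class names, the 0.5% commission rate and the single-lot bookkeeping are assumptions made for the example.

```python
# Illustrative sketch only (not FINESSE source code): a minimal model of a
# group portfolio with a notional £100m fund, transaction costs, dividend
# income and profit/loss measured against current (lagged) market prices.

from dataclasses import dataclass, field
from typing import Dict

STARTING_FUND = 100_000_000.0          # notional £100m per group

@dataclass
class Holding:
    quantity: int
    purchase_price: float              # pounds per share
    dividends_received: float = 0.0    # cumulative dividend income

@dataclass
class Portfolio:
    cash: float = STARTING_FUND
    holdings: Dict[str, Holding] = field(default_factory=dict)
    commission_rate: float = 0.005     # assumed 0.5% transaction cost

    def buy(self, ticker: str, quantity: int, price: float) -> None:
        """Buy shares at the quoted price, deducting a proportional commission."""
        cost = quantity * price
        fee = cost * self.commission_rate
        if cost + fee > self.cash:
            raise ValueError("insufficient cash for this trade")
        self.cash -= cost + fee
        self.holdings[ticker] = Holding(quantity, price)   # one lot per ticker, for brevity

    def profit_and_loss(self, current_prices: Dict[str, float]) -> float:
        """Capital gains plus dividend income, relative to the starting fund."""
        value = self.cash
        for ticker, holding in self.holdings.items():
            value += holding.quantity * current_prices[ticker] + holding.dividends_received
        return value - STARTING_FUND
```

For instance, with a hypothetical ticker, `p = Portfolio(); p.buy('ABC', 10_000, 4.50)` followed by `p.profit_and_loss({'ABC': 4.80})` would report the capital gain on the position net of the commission paid.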
4 The Evaluation Process for FINESSE

This paper analyses the evaluation aspects of FINESSE and focuses on three
main areas: the software, the educational content and the integration and use of the software in the teaching process. These areas were selected because (i) it was felt that they were the main focus of interest for those involved in the project including the grant body, the finance lecturers, the software experts and the students; and (ii) no one evaluation method would be sufficient for a large-scale CAL project with different stakeholder groups. Each of these groups had their own set of objectives for the project, and to try and evaluate FINESSE against these different objectives using a
single approach would have been difficult, if not impossible. Therefore a variety of evaluation methods were chosen; both qualitative and quantitative approaches were adopted for completeness11. First, the FINESSE software was evaluated via comparisons with other DLEs. These comparisons showed that at the time of the evaluation there was no existing software that would allow the integration of a web-based teaching environment with the database of information obtained from a real-life stock market source. Michaelson (1999) discusses the use of groupware such as Lotus Notes, and typical web-based DLEs such as WebCT. In addition, existing systems could not provide the kind of qualitative data available in FINESSE via the Notebook, or the predicted-based filters that can be set by the tutor to signal important events in student activity. Second, the software development was evaluated by the finance staff at the different Universities and by two experts who were specifically engaged for this purpose. The involvement of the staff began at the start when they outlined the requirements for the system and provided a specification of what was wanted from the game. In addition, development versions of the game were scrutinised by staff and feedback provided to the programmers. This developmental phase was an iterative process which lasted for over a year before the students were introduced to FINESSE as part of a course. The experts were then invited to examine the game: one had many years of experience in the development of CAL while the other had managed portfolios of over £1bn for nearly two decades in the investment-management industry. Both experts interacted with FINESSE over a period of many months, suggested improvements that might be made and indicated how further elements of realism might be injected into the system.
Third, members of the consortium made several presentations of the FINESSE software to an advisory body which had been established by the grant holders and a number of reports were written for the grant co-ordinators. Papers were presented at conferences, articles were written for the academic press and demonstrations of FINESSE were made to several interested parties. In each instance, the process of communicating about FINESSE generated feedback from interested experts, discussants and referees which led to further evaluations and improvements in the game. Fourth, formal feedback from students on course evaluation forms and staff-student committees provided insights into how the game could be improved. More importantly, however, informal comments to staff, through issues raised in the Notebook resource, also contributed to the development of FINESSE. Indeed, the first group of students to use the manual version of the game were introduced to FINESSE in a one-off lab at the end of the 4PM course when their views were sought on how the manual version of the game compared with the DLE version; they were strongly in favour of the new FINESSE system (Helliar et al., 1998). To summarise, the following issues informed the evaluation methods: the objectives of those who required the evaluation; the goals of those who carried out such evaluation; the appropriateness of the software design and implementation; how FINESSE met the educational goals of the accountancy and finance lecturers; the extent to which the integration of the portfolio management game into the educational process was successful (Helliar et al., 2000); the extent to which educational outcomes were achieved (Helliar et al., 2000); and various forms of formative and summative evaluation of student learning employed in the second year that FINESSE was used.
Since FINESSE has been used at Dundee for two academic sessions and use of FINESSE within a group-based project forms part of the student's final mark, one of the main criteria for success has already been met: that of integrating the software with the teaching process. However, it was decided that a formal evaluation of student views about the learning process with the use of FINESSE was needed. The remainder of the paper describes the questionnaire-based approach to this part of the evaluation.
5 The Formal Evaluation of FINESSE

5.1 Method

A two-questionnaire strategy was used to evaluate FINESSE in 1999/2000. A
pre-questionnaire was distributed at the start of the course before students were introduced to FINESSE and a post-questionnaire was employed at the end of the course once the game was finished. Each questionnaire contained a number of statements about which the students were asked their views; a 5-point Likert scale was employed where a 1 indicated "total disagreement" with the statement while a 5 suggested "complete agreement". The statements were grouped into four main areas, two of which were common to both questionnaires (Sections B and C). The first section (Section A) was different between the two questionnaires. In the pre-questionnaire, it ascertained students' views on working in groups, using computers and surfing the net to gauge whether these essential pre-requisites for the game were in place. In the second questionnaire, the initial section investigated whether students had enjoyed FINESSE, found it easy to use, benefited from its Notebook and e-mail features, and grasped theoretical concepts which had been discussed in lectures more quickly because of taking part in the game. The second section (Section B) sought
information on students' familiarity with certain topics that were covered in the course and employed in FINESSE. For example, the difference between dividend income and capital gains, the nature of investment trusts, familiarity with market indices and an understanding of the distinction between AIM stocks and Main Market shares were included. The third section (Section C) of the questionnaire focused on students' views about the benefits of group work and asked respondents if such work improved writing, presentation, communication and negotiation skills. It also investigated whether students believed that collaborating in groups would improve their marks in general. The fourth section (Section D) was only included in the post-questionnaire and asked the students for general comments on the game. Finally, both questionnaires were piloted on staff and reworked several times as a result of the feedback provided. However, it was decided not to test them on students because such a strategy would have reduced the already small sample size available. In addition, the questionnaires were completed anonymously to allow students to express frank and honest views without fear that any adverse opinions might affect their overall mark for the course. Despite these precautions, the questionnaires are subject to all the usual limitations of such research instruments as detailed in section 2 above.

5.2 Results

The pre-questionnaire was administered to the 4PM class in October 1999 and
the results of the analysis are shown in Table 1. The mean score and standard deviation are included as well as the p-value, which tests whether the average score is different from the neutral value of 3.000. In addition, non-parametric analysis was also undertaken; results are not shown here because the medians and associated Wilcoxon tests only differed marginally from the findings reported in this paper. A number of points emerge from an analysis of this table. First, the students appear to have the
necessary background to undertake the course; the average scores in Section A of the questionnaire indicate that respondents did not like to work alone, enjoyed using computers and did not see any difficulty with using the Web. In addition, they regularly read the financial press and believed that other course material would be useful in studying for the 4PM course. Second, students indicated that they were familiar with different share selection strategies, aware of transaction costs, knew the difference between dividend income and capital gains, understood what was meant by investment trusts and were aware of the stock market indices; in each instance, the mean score was greater than 3.000. However, with the exception of familiarity with transaction costs, the p-values were all greater than 0.050, indicating that none of these mean scores were significantly different from the neutral value of 3.000 on the Likert scale. Third, the responding students seemed positive about the potential of group work to improve their skills. The mean scores for improvements in presentation, communication and negotiation skills associated with group work were 3.650, 4.273 and 4.350 respectively and all the p-values were less than the critical value of 0.050. The average ratings for the other two statements in this section of the pre-questionnaire were close to 3.000, and the high standard deviations in responses meant that the null hypothesis that the average was equal to the neutral score of 3.000 could not be rejected at conventional levels of significance. The post-questionnaire was distributed to the class at the end of the course in April 2000 and Section A of Table 2 indicates that students benefited from using FINESSE. Statements about enjoying the portfolio game, finding FINESSE easy to use and improving a student's ability to work as part of a group all had high average scores that were significantly greater than 3.000; students therefore agreed with these
statements in the questionnaire. Features of the game such as the Notebook and the requirement to make a presentation, which had average scores of 4.000 and 4.391 respectively, were also seen as helpful. Surprisingly, face-to-face meetings had a higher average score than the e-mail facility in terms of helping students with group work. However, because students met each other every day in lectures and tutorials, a reliance on electronic communication may not have been vital. Also, there was a very high standard deviation of responses for both of these statements (1.234 and 1.238), indicating that there was a range of opinion on this issue among the students questioned. An analysis of Section B in the post-questionnaire reveals that student familiarity with financial concepts and financial terms had improved after the course and the game were finished. All of the statements in this section had average scores which were greater than the mean of 3.000 and 6 of the 8 statements' p-values were now less than the critical value of 0.050. With the exception of capital gains and AIM stock, there was general agreement that students were more familiar with the main aspects of portfolio management. Perceptions about the advantages of group work also remained strong according to the post-questionnaire results. In Section C of the questionnaire, students' agreement with statements that group work improved presentation, communication and negotiation skills remained strong. In addition, while the average score for the statement that group work improves writing skills decreased, and the score for the statement that group work improves an individual's mark for the course increased, the mean response for both was still not significantly different from 3.000. A direct comparison of the pre- and post-questionnaires is contained in Table 3. The average scores for the common questions in the two instruments are provided
and a test of the null hypothesis that these means are equal against the alternative hypothesis that the mean score in the post-questionnaire is larger is conducted. Three points emerge from a visual inspection of this table. First, for 10 of the 15 statements, the mean score increased between the pre- and post-questionnaires. Two of the biggest increases were associated with statements about familiarity with share selection strategies and the concept of peer assessment (0.787 and 0.830 respectively). Second, three of the increases in mean scores were significant at the 5% level. Familiarity with share selection strategies, awareness of issues involved in the management of a portfolio of equities and familiarity with the concept of peer assessment rose during the year by a significant amount. This result suggests that some of the main aims of the FINESSE portfolio game were achieved. A third point which emerges from the table is that the average score declined for four statements. These statements all related to views on group work, where students' beliefs about improvements in writing, presentation, communication and negotiation skills were less positive at the end of the test period. This finding is surprising since FINESSE was designed to promote group work skills. Nevertheless, it is worth pointing out that while the mean scores for these statements declined, they still remained high.

5.3 Qualitative Responses

Comments were invited in section D of the post-questionnaire on any features
of Finesse which the students found helpful and on how the computer-based system or game could be improved. The 19 students who responded to this section liked the ease with which the game could be used (8 out of 19 responses), the "ability to try strategies without risk", the buying and selling at up-to-date prices (2 responses) and the realism of FINESSE (5 out of 19 responses). One respondent argued that the game demonstrated "that it isn’t easy to make good profits every time you invest". Some
liked the practical aspects of FINESSE such as "putting theory into practise" and enabling them "to use the information that we have learnt during the past 4 years into practice". A small number of negative comments were received. For example, a few students found faults in the data due to incorrect share price values and inconsistencies in the data sources. Indeed, the largest number of negative responses related to problems encountered with the original data source which only changed prices on a daily basis. These lagged data changes enabled students to earn unrealistic profits by trading on information which was publicly available before the FINESSE price changed. When this problem was discovered early in the academic year, 20-minute updates from a different source (UpData) were substituted. Six students wanted to have access to international stock markets (an option not available to us). Responses to other questions concerning the management of group activities underline some of the findings above; students generally organised regular and frequent face-to-face meetings, though one commented that meetings were "Ad lib, usually through the notebook". Overall, the views in this section were supportive of FINESSE and indicated that the educational goals of the course were to a large extent achieved.
6 Conclusions

A portfolio management game (FINESSE) has been in use for two years and is
now fully integrated into a fourth year honours module which is taught at the University of Dundee. Different evaluation techniques have been investigated and philosophical issues highlighted with a view to informing the evaluation process for this game. A variety of evaluations, described in this paper, have been carried out during the lifetime of the project. In particular, informed evaluations by potential users and subject experts which took place during the development of the game are described. The paper then focuses on the results of a
questionnaire-based evaluation of student users of the game which took place between October 1999 and April 2000. The general results of the questionnaire analysis indicated that students believed that they had the necessary skills to embark on the game. More importantly, the findings in the post-questionnaire showed that participants' understanding of key issues in portfolio management had improved as a result of using FINESSE. While the students' views on group work indicated that this feature of the game was important, no significant changes in the mean scores for statements concerning group work occurred. Overall, the approach adopted in this paper provides a framework whereby other CAL projects with several identified stakeholders may be evaluated in the future. The paper also highlights the benefit of conducting evaluations over a period of time to identify changes in views that may have occurred. The use of a mix of evaluation methods, related to the needs of all those involved in the project, seems to address some of the issues raised by those critical of the evaluation process as described in section 2. The conclusions of this paper suggest that a self-reflective evaluation process in which implicit values are made explicit, wherever possible, is important when making major changes in teaching. To fully exploit the potential of new communication and information technologies, an understanding of the educational context is vital.
Bibliography

Barker, P. & King, T. (1993). Evaluating Interactive Multimedia Courseware – A Methodology. Computers and Education, 21 (4), 307-319.

Beer, S. (1990). The Heart of Enterprise. Chichester: Wiley.

Borg, W. R. & Gall, M. D. (1989). Educational Research: An Introduction. New York, London: Longman.

Boyce, G. (1999). Computer-assisted teaching and learning in accounting: pedagogy or product? Journal of Accounting Education, 17 (1999), 191-220.

Britain, S. & Liber, O. (1999). A Framework for Pedagogical Evaluation of Virtual Learning Environments. JTAP Report, October. <http://www.jtap.ac.uk/reports/htm/jtap-041.html>

Dias de Figueiredo, A. (2000). Web-based Learning - largely beyond content (Keynote address). In Proceedings of Web-Based Learning Environments 2000, Porto, June 5-6, 85-88.

Ehrmann, S. C. (1989). Improving a Distributed Learning Environment with Computers and Telecommunications. In R. Mason & A. Kaye (eds), Mindweave: Communication, Computers and Distance Education. Oxford and New York: Pergamon, 255-259. Now out-of-print and available at

Espejo, R. & Harnden, R. (1990). The Viable System Model. Chichester: Wiley.

Gillespie, R. (1993). Manufacturing Knowledge: A History of the Hawthorne Experiments. Cambridge: Cambridge University Press.

Gunn, C. (1998). Isolation or Integration? In J. Harvey (ed), LTDI Evaluation Cookbook. Institute for Computer-Based Learning, Heriot-Watt University.

Harvey, J. (ed) (1998). LTDI Evaluation Cookbook. Institute for Computer-Based Learning, Heriot-Watt University.

Haywood, J., Anderson, C., Day, K., Land, R., MacLeod, H. & Haywood, D. (1999). Use of TLTP Materials in UK Higher Education: a HEFCE-commissioned study. August 1995. Last referenced October 2000 at http://www.flp.ed.ac.uk/LTRG/TLTP.html

Helliar, C. V. & Michaelson, R. (1998). Evaluating Finesse: Experiences from the First Year. Presentation at the British Accounting Association Scottish Conference, September 1998, Stirling University.

Helliar, C. V., Michaelson, R., Power, D. M. & Sinclair, C. D. (2000). Using a Portfolio Management Game (FINESSE) to Teach Finance. Accounting Education, 9 (1), 37-51.

Jacobs, G. (1998). Evaluating courseware: some critical questions. Innovations in Education and Training International, 35 (1), 3-8.

Jones, A., Scanlon, E., Tosonoglu, S., Ross, S., Butcher, P., Murphy, P. & Greenberg, J. (1996). Evaluating CAL at the Open University: 15 years on. Computers and Education, 26 (1-3), 5-15.

Jones, A., Scanlon, E., Tosunoglu, S., Morris, E., Ross, S., Butcher, P. & Greenberg, J. (1999). Contexts for Evaluating Educational Software. Interacting with Computers, 11 (1999), 499-516.

Laurillard, D. (1993). Rethinking university teaching: a framework for the effective use of educational technology. London: Routledge.

Laurillard, D. (1994). Multimedia and the changing experience of the learner. In M. Ryan (ed), Proceedings of the Asia-Pacific IT in Training and Education Conference and Exhibition (APITITE 94), Brisbane, 1, 19-24.

Michaelson, R. (1999). Web-based Group Work. Proceedings of the 10th Annual CTI-AFM Conference, Brighton, August, 58-64.

Mulholland, C., Au, W. & White, B. (1998). Courseware evaluation methodologies – strengths, weaknesses and future directions. Presented at the 15th Australian Computers in Education Conference, Adelaide.

Power, D. M., Michaelson, R. & Allison, C. (1998). The FINESSE Portfolio Management Facility. Proceedings of the 9th Annual CTI-AFM Conference, York, October, 119-125.

Pressman, R. S. (1997). Software Engineering: A Practitioner's Approach (4th Ed). McGraw-Hill.

Salleh, A. & Williams, B. (1999). Some methodological issues facing research into the impact of computing on accounting education. Proceedings of the 10th Annual CTI-AFM Conference, Brighton, August, 4-9.

Selwyn, N. (2000). Researching computers and education - glimpses of the wider picture. Computers and Education, 34 (2000), 93-101.

Shaw, R. (1998). Why Evaluate? In J. Harvey (ed), LTDI Evaluation Cookbook. Institute for Computer-Based Learning, Heriot-Watt University.

Stoner, G. & Harvey, J. (2000). Integrating learning technology in a foundation level management accounting course: an e(in)volving evaluation. Paper presented at the British Accounting Association 2000 Scottish Regional Conference, Stirling.
Tergan, S. (1998). Checklists for the Evaluation of Educational Software: Critical Review and Prospects. Innovations in Education and Training International, 35 (1), 9-20.

Yildiz, R. & Atkins, M. (1993). Evaluating Multimedia Applications. Computers and Education, 21 (1/2), 133-139.
Table 1. A Summary of the Responses to the Pre-Questionnaire.

Statement                                                                  MEAN    S.D.    p-value

SECTION A
I like to work alone                                                       2.913   0.949   0.665
I like to use computers                                                    3.739   0.964   0.001
It is hard to use the web                                                  1.636   0.902   0.000
The 4PM course outline suggests that the course will be very theoretical  3.348   0.935   0.088
I feel that I have the background in finance to use Finesse               3.304   1.020   0.166
I regularly read the financial press                                       3.391   0.839   0.036
Other course material will not be useful in studying for 4PM              2.130   1.140   0.001

SECTION B
I am familiar with different share selection strategies                   3.143   1.195   0.590
I am familiar with the notion of transaction costs                        3.842   1.068   0.003
I am familiar with issues involved in managing a portfolio of equities    2.850   1.182   0.577
I am familiar with the concept of dividend income                         3.450   1.395   0.165
I am familiar with the notion of capital gains                            3.045   1.090   0.847
I have a good understanding of what investment trusts are                 3.300   1.174   0.267
I am familiar with the different stock market indices                     3.250   1.070   0.309
I have a good understanding of what AIM Stock are                         2.800   1.436   0.541
I am familiar with the concept of Peer Assessment                         2.579   1.071   0.104

SECTION C
Group work improves writing skills                                         3.048   1.203   0.858
Group work improves presentation skills                                    3.650   1.137   0.019
Group work improves communication skills                                   4.273   0.883   0.000
Group work improves negotiation skills                                     4.350   0.875   0.000
Group work improves an individual student's mark                           2.684   1.250   0.285

Note: This table summarises the results of a questionnaire before students commenced the FINESSE portfolio management game. MEAN is the average score on a 5-point Likert scale where a 1 denoted "complete disagreement" and a 5 denoted "complete agreement". S.D. denotes standard deviation and p-value is the p-value of a t-test of the null hypothesis that the population mean is 3 (the neutral value) against the two-sided alternative that the population mean is not equal to 3.
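For readers who wish to reproduce this style of analysis, the fragment below shows how the per-statement test described in the note (a two-sided one-sample t-test against the neutral value of 3, together with the non-parametric Wilcoxon check mentioned in section 5.2) can be run in Python with SciPy. The response values are invented for illustration; only the test procedure mirrors the one described in the text.

```python
# Sketch of the per-statement analysis reported in Tables 1 and 2:
# a one-sample t-test against the neutral Likert value of 3, plus a
# Wilcoxon signed-rank test as a non-parametric check.
# The responses below are hypothetical, not the FINESSE survey data.

import numpy as np
from scipy import stats

responses = np.array([4, 5, 3, 4, 2, 5, 4, 3, 4, 5])   # invented 5-point Likert scores

mean = responses.mean()
sd = responses.std(ddof=1)            # sample standard deviation, as in the tables

# Two-sided t-test of H0: population mean = 3 (the neutral value)
t_stat, p_value = stats.ttest_1samp(responses, popmean=3.0)

# Non-parametric check: Wilcoxon signed-rank test on deviations from 3
# (zero deviations are discarded by the default zero_method)
w_stat, w_p = stats.wilcoxon(responses - 3.0)

print(f"MEAN = {mean:.3f}, S.D. = {sd:.3f}, t-test p = {p_value:.3f}, Wilcoxon p = {w_p:.3f}")
```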
Table 2. A Summary of Responses to the Post-Questionnaire.

Statement                                                                  MEAN    S.D.    p-value

SECTION A
I enjoyed playing the portfolio game                                       4.304   0.974   0.000
Finesse is easy to use                                                     4.391   0.656   0.000
My ability to work with a group has improved                               3.636   1.049   0.010
The theoretical content of the 4PM course became easier to grasp when using Finesse   3.391   0.941   0.059
The Notebook was helpful for group work                                    4.000   0.905   0.000
E-mail was helpful for group work                                          3.391   1.234   0.142
Presentations were helpful for group work                                  4.391   0.839   0.000
Face-to-face meetings were helpful for group work                          3.478   1.238   0.077

SECTION B
I am familiar with different share selection strategies                    3.913   0.793   0.000
I am familiar with the notion of transaction costs                         4.217   0.795   0.000
I am familiar with issues involved in managing a portfolio of equities     3.522   0.790   0.004
I am familiar with the concept of dividend income                          3.696   0.635   0.000
I am familiar with the notion of capital gains                             3.348   0.885   0.073
I have a good understanding of what investment trusts are                  3.696   0.765   0.000
I am familiar with the different stock market indices                      3.609   0.722   0.001
I have a good understanding of what AIM Stock are                          3.478   1.238   0.077
I am familiar with the concept of Peer Assessment                          3.409   1.054   0.083

SECTION C
I am familiar with the concept of teamwork                                 4.217   0.902   0.000
Group work improves writing skills                                         3.000   1.155   1.000
Group work improves presentation skills                                    3.565   1.080   0.020
Group work improves communication skills                                   4.000   1.024   0.000
Group work improves negotiation skills                                     4.130   0.869   0.000
Group work improves an individual student's mark                           3.043   1.397   0.883

Note: This table summarises the results of a questionnaire which was administered after students completed the FINESSE portfolio management game. MEAN is the average score on a 5-point Likert scale where a 1 denoted "complete disagreement" and a 5 denoted "complete agreement". S.D. denotes standard deviation and p-value is the p-value of a t-test of the null hypothesis that the population mean is 3 (the neutral value) against the two-sided alternative that the population mean is not equal to 3.
Table 3. A Comparison of Responses to the Pre- and Post-Questionnaires.

Statement                                                                  Pre-MEAN   Post-MEAN   p-value

I am familiar with different share selection strategies                    3.143      3.913       0.009
I am familiar with the notion of transaction costs                         3.842      4.217       0.105
I am familiar with issues involved in managing a portfolio of equities     2.850      3.522       0.019
I am familiar with the concept of dividend income                          3.450      3.696       0.240
I am familiar with the notion of capital gains                             3.045      3.348       0.155
I have a good understanding of what investment trusts are                  3.300      3.696       0.105
I am familiar with the different stock market indices                      3.250      3.609       0.105
I have a good understanding of what AIM Stock are                          2.800      3.478       0.055
I am familiar with the concept of Peer Assessment                          2.579      3.409       0.009

Group work improves writing skills                                          3.048      3.000       0.450
Group work improves presentation skills                                     3.650      3.565       0.400
Group work improves communication skills                                    4.273      4.000       0.175
Group work improves negotiation skills                                      4.350      4.130       0.210
Group work improves an individual student's mark                            2.684      3.043       0.190

Note: The p-value is the p-value of a t-test of the null hypothesis that the population means are the same for each questionnaire against the one-sided alternative that the population means are larger in the post-questionnaire.
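Because the questionnaires were completed anonymously, the pre- and post-course responses presumably cannot be matched student-by-student, so the comparison is treated here as one between two independent samples. The sketch below, again in Python with SciPy and invented scores, shows the corresponding one-sided test; the alternative='greater' argument assumes a reasonably recent SciPy release (1.6 or later).

```python
# Sketch of the pre/post comparison behind Table 3: an independent-samples
# t-test of H0 (equal population means) against the one-sided alternative
# that the post-questionnaire mean is larger. The scores are hypothetical.

import numpy as np
from scipy import stats

pre = np.array([3, 2, 4, 3, 3, 2, 4, 3, 3, 2])    # invented pre-questionnaire scores
post = np.array([4, 3, 4, 4, 5, 3, 4, 4, 3, 4])   # invented post-questionnaire scores

# One-sided test: is the post-course mean significantly larger than the pre-course mean?
t_stat, p_value = stats.ttest_ind(post, pre, alternative='greater')

print(f"Pre-MEAN = {pre.mean():.3f}, Post-MEAN = {post.mean():.3f}, one-sided p = {p_value:.3f}")
```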
1. The FINESSE project was funded by SHEFC under the Use of the MANs Initiative II.

2. Issues such as the way in which a quasi-scientific approach hides the involvement of the evaluator in a positive outcome; questions such as who does the evaluation, what vested interests the evaluators have and how 'objective' an external evaluator can be, need to be addressed.

3. For example, in a review of evaluation methods for multimedia spanning the mid-1970s to the early 1990s, Yildiz & Atkins lament the lack of informed CAL design. They call for an "evaluation design centred on the characteristics of the courseware, the students, and the nature of the learning task rather than on the underlying platform".

4. Similarly, Jones, Scanlon, Tosonoglu, Ross, Butcher, Murphy et al. (1996) and Jones, Scanlon, Tosunoglu, Morris, Ross, Butcher et al. (1999) describe attempts to evaluate CAL at the Open University over a 15-year period.

5. This model draws on the work of Laurillard (1993, 1994), but major adaptation of this model is necessary if it is applied to group learning.

6. This method of evaluation seems unlikely to be of use for a particular teaching process given the assumption that students and tutor negotiate 'learning contracts'. It is also unlikely that the VSM approach is applicable to software evaluation as distinct from management systems such as companies or governments.

7. Group-learning is here taken to mean improving the student's educational experiences by means of collaboration and group work.

8. The Teaching and Learning Technology Programme ran from 1992 to 1998 and was funded in three phases. Over £11M was spent on the development of subject-specific CAL in the first two phases.

9. Scottish Universities traditionally award Honours Degrees after a four-year undergraduate programme. This course is an honours subject offered to final year students.

10. The portfolio management game is based on an earlier, manual, one that used Lotus spreadsheets. The manual version of the game had a number of serious problems, including the fact that the portfolio could only change once a month, no dividend income was recognised and securities could only be selected from a small range of stocks. This version of the game did not appear to promote group work since one individual usually took all portfolio investment decisions; because prices had to be manually updated each month, there was a high cost in staff time.

11. See Mulholland, Au & White (1998) and Jones et al. (1996) for a further discussion and justification of the methodological choice.