Journal of Accounting Education
J. of Acc. Ed. 24 (2006) 16–34
www.elsevier.com/locate/jaccedu

The effect of interactive on-line learning systems on student learning outcomes in accounting

Bradley N. Potter a,*, Carol G. Johnston b,1

a Department of Accounting and BIS, Faculty of Economics and Commerce, University of Melbourne, Australia
b Teaching and Learning Unit, Faculty of Economics and Commerce, University of Melbourne, Australia

Abstract

We examine the association between student use of a unique, interactive, on-line learning system known as MarlinaLS™ and the learning outcomes achieved by students in a major second-year undergraduate accounting subject over the period 2002–2003. Primarily, we explore the relationship between students' use of MarlinaLS™, an on-line system developed specifically to enhance reciprocal learning, and the examination performance of those students. Our results show that students' use of MarlinaLS™ is positively associated with their examination performance and also with the internal assessment result achieved. We also find that the extent of usage of the MarlinaLS™ system by students varies systematically based on a number of defined characteristics. The study enhances our understanding of the role of teaching strategies generally and, more specifically, the role of interactive on-line learning systems in improving student learning outcomes.
© 2006 Elsevier Ltd. All rights reserved.

Keywords: Accounting education; Learning outcomes; On-line learning

* Corresponding author. Tel.: +61 3 8344 4989; fax: +61 3 9349 2397. E-mail addresses: [email protected] (B.N. Potter), [email protected] (C.G. Johnston).
1 Tel.: +61 3 8344 9699; fax: +61 3 8344 3647.

0748-5751/$ - see front matter © 2006 Elsevier Ltd. All rights reserved. doi:10.1016/j.jaccedu.2006.04.003

1. Introduction

Significant attention has been given in recent years to measuring the impact of different teaching strategies on student learning outcomes (Dowling, Godfrey, & Gyles, 2003; Duff, 2004; Michlitsch & Sidle, 2002; Sawyer, Tomlinson, & Maples, 2000). An increase in

global competition in the tertiary education sector and a greater allocation of resources to the development of innovative teaching strategies and approaches have, in part, stimulated this focus (Freeman, 1996; Nunan, George, & McCausland, 2000). Accordingly, much has been written in recent years describing a range of teaching strategies and, more specifically, explaining the use of on-line resources to enhance the development and delivery of those strategies (see, for example, Cleaveland & Larkins, 2004; Lont, 1999; Stanley & Edwards, 2005; Watson, Apostolou, Hassell, & Webber, 2003).

Attempts to evaluate the different learning strategies have generally focussed on student perceptions of their learning, as well as their enjoyment and satisfaction in relation to the specific strategy adopted. While these aspects of evaluation are useful, students' views are not objective evidence of their learning (Alexander, 1999; Dowling et al., 2003). Accordingly, there are calls by employer groups, professional accreditation bodies, and others for the higher education sector to more fully explore the effect of teaching and learning strategies generally and, particularly, the use of on-line learning systems in enhancing student learning outcomes (see, for example, AACSB, 1996; Boyce, 1999; Michlitsch & Sidle, 2002).

The need to evaluate teaching strategies within higher education settings can be understood in the context of broader forces for education reform; specifically, a recent and gradual movement away from the information-transfer mode towards a more student-learning-centred focus. Higher education is being fundamentally reoriented to encourage students to participate more actively in their own learning by constructing their own knowledge and practice through the acquisition and application of new skills and concepts (AECC, 1990; Bigelow, Seltzer, Hall, & Garcia, 1999; Davis, 1996; Mundell & Pennarola, 1999).
Furthermore, there is now widespread recognition that teaching strategies which encourage reciprocal learning, whereby the lecturer builds into the subject avenues for student feedback to inform teaching practice, can enhance student learning outcomes (Adler & Milne, 1997).

In this study, we focus on the impact of a specific teaching strategy on the learning outcomes achieved by students in an intermediate managerial accounting subject at the University of Melbourne, Australia. We directly focus on the impact of an interactive, on-line learning system on the learning outcomes achieved by students. Briefly, our rationale for this focus is as follows. The findings of several prior studies suggest a link between active learning by students and enhancements in the learning outcomes achieved within specific educational settings (see, for example, Byrne, Flood, & Willis, 2002; Davidson, 2002; Dennis, 2003). The on-line learning system we examine is MarlinaLS™,2 which was specifically designed to encourage more active participation by students in the learning process. We suggest that, in light of the unique setup and design of MarlinaLS™, a relationship exists between student use of this system and the learning outcomes they achieve. The main results of the current study support the existence of this relationship.

Our study extends the findings of prior research in at least two important ways. First, we provide detailed insight into the impact of a specific teaching strategy on student learning outcomes. From the perspective of enhancing student learning outcomes, our results provide support for the strategy exemplified by MarlinaLS™. In this sense, we seek to

2 MarlinaLS™ is a commercially available on-line learning system developed by staff at the University of Melbourne. Several academic institutions internationally have adopted this system. For further details about the MarlinaLS™ system, refer to http://expotech.unimelb.edu.au/products/learning_management/.


build on prior work such as that by Davidson (2002), who found a clear relationship between the adoption of a deeper approach to learning and student performance on complex examination questions. Davidson (2002) concluded his study by calling on accounting instructors to develop and deliver new teaching and learning strategies which assist students to develop deeper approaches to learning and to apply those approaches in solving complex examination problems. We seek to build on the work of Davidson by investigating the effects of the on-line student learning system, MarlinaLS™, on student examination performance. We posit that consistent, active engagement by students in the subject through the use of MarlinaLS™ will significantly contribute to improved performance on examination questions, complexity of questions notwithstanding. Further, where more active participation by students occurs, there is a greater likelihood that a deeper approach to learning will be adopted (Adler & Milne, 1997; Davidson, 2002). Second, we contribute to the rapidly expanding literature which explores the nature and limits of on-line learning systems for enhancing learning outcomes by students within specific tertiary settings (see, for example, Cleaveland & Larkins, 2004).3

The remainder of the paper is structured as follows. The next section provides a discussion of the relevant literature and also explains the background to the study, both in terms of the context for the development of MarlinaLS™ as well as the capabilities and technical specifications of the system. In the section thereafter, the empirical model used in the study is developed and explained. This is followed by the presentation and discussion of the results as well as various additional specification tests. The final section contains concluding comments and suggestions for further research.

2. Prior literature and background

2.1. Prior literature

The accounting education context is changing rapidly and requires a constant rethinking of how best to present technical information and encourage student learning (Albrecht & Sack, 2000). As students become more technologically aware and adept, demands on academic teaching staff to utilise available technologies in the development and delivery of subject content have increased (Paisey & Paisey, 2005). The capacity to respond to these pressures will play a critical role in determining the relative success of higher education institutions in preparing their students for productive participation in the workforce of the future (AECC, 1990; Albrecht & Sack, 2000; Frederickson & Pratt, 1995; Novin & Pearson, 1989).

Accounting educators have come under pressure in recent years to reorient curricula away from the rules-based transmission of content and toward the development of teaching and learning approaches that encourage students to actively construct their own knowledge and practice using new skills and concepts (Albrecht & Sack, 2000; Davis, 1996; Mundell & Pennarola, 1999). In addressing such demands, various challenges, including resource constraints, persist (Cleaveland & Larkins, 2004; Frederickson & Pratt, 1995). The Department of Accounting and Business Information Systems at the University of Melbourne has

3 To date, evidence as to whether computer usage in accounting educational settings has a positive impact on learning outcomes has been somewhat mixed. For further discussion of this point, refer to Boyce (1999) and McDowall and Jackling (2004).


responded to this changing context through the design and set-up of the MarlinaLS™ system, which utilises on-line technology to promote active learning.

The time students actually spend in face-to-face sessions with an instructor is relatively small in the context of an entire week. A significant proportion of student learning potentially takes place away from teacher contact hours, in private study or in discussion with their peers. Accordingly, part of the notion of quality teaching encompasses the development of innovative approaches that empower students to take greater control of their own learning outside class time. Where these approaches encourage students to use independent study time to develop knowledge structures, deeper learning approaches can be adopted and there is a greater likelihood of enhanced learning outcomes (Boyce, 1999; Davidson, 2002; Fuller, 1998). As explained further in later sections of the study, the MarlinaLS™ system provides a structured environment for student learning outside class time.

Several studies have focussed on the differing approaches students may take to their learning, and a key distinction has been made between deep and surface approaches (Davidson, 2002; Duff, 1999; Marton & Säljö, 1976; Ramsden, 1992). Put simply, students who adopt a deep approach to learning seek to thoroughly understand the material and reflect upon the implications of its content, while students utilising a surface approach generally see learning only as a means to achieve a desired result (e.g., a pass grade) and will do the minimum necessary to achieve this. Numerous studies establish a significant link between student approaches to learning and improved learning outcomes (e.g., Byrne et al., 2002; Davidson, 2002; Entwistle & Ramsden, 1983; Ramsden, 1985; Schmeck, 1988; Trigwell & Prosser, 1991; Watkins, 1982).
Students who take an active role in their learning are more likely to develop a deep approach to their studies, thereby increasing the likelihood of enhanced learning outcomes (Adler & Milne, 1997; Becker & Dwyer, 1994; Chickering & Gamson, 1987). The link between students' active participation in learning and the enhancement of learning outcomes is a key premise on which the MarlinaLS™ project is founded.

The link between active learning and enhanced learning outcomes has also stimulated a range of studies which examine teaching and learning strategies designed to encourage greater student participation in their learning. For example, according to Mayo, Donelly, Nash, and Schwartz (1993), one way to encourage active learning is through the use of problem-based learning (PBL). Barrows and Tamblyn (1980) describe PBL as the learning that results when students are presented with a problem or issue and where opportunity exists for them to build or develop their understanding of the problem, ultimately working towards an appropriate solution (see also Milne & McConnell, 2001). As a result of engaging in the problem-solving process, students generally are required to gather necessary information, acquire an understanding of new concepts and principles, and develop a range of skills necessary to solve the problem.4 Within PBL approaches, the emphasis is placed not only on the process of acquiring knowledge, but on students taking responsibility for acquiring that knowledge (Milne & McConnell, 2001).

There is a range of strategies that may be classified as "problem-based" (Boud & Feletti, 1991; Hoffman & Ritchie, 1997; Milne & McConnell, 2001). We posit that elements of the approach taken within MarlinaLS™ resemble PBL in its simplest form, since

4 For further discussion of the implementation of PBL in accounting courses at tertiary level, refer to Milne and McConnell (2001, p. 63).


students are routinely confronted with practical-based problems via MarlinaLS™, which provides the stimulus for learning to occur. Education researchers have found PBL to be superior to traditional instruction methods for enhancing students' conceptual understanding, while also assisting their ability to engage in self-directed learning and enabling them to develop greater long-term information retention (see, for example, Gallagher, 1997; Milne & McConnell, 2001). Enabling students to identify and address their learning needs, and encouraging them to develop and reflect upon their understanding, are generally considered important elements of a successful PBL approach; both are incorporated into the setup and design of the MarlinaLS™ system.

Feedback on an individual's performance facilitates learning and should lead to an improvement in performance through increased motivation. However, improvements in performance will be attained only if the feedback is specific, timely, accurate, and realistic in terms of what is achievable, and where it is expressed in a way that encourages students to reflect on their performance (Boud, 1995; Brown & Pendlebury, 1992). Nevertheless, it is important that students do not rely solely on face-to-face feedback from tutors but evolve into effective, responsible, and autonomous learners through the development of self-assessment (self-evaluation) skills (Baume, 1994). One convenient means of providing such feedback is via on-line learning systems, which enable students to work through large amounts of material at their own pace and which work best when students have unlimited access to the system (Paisey & Paisey, 2005). The questions contained in MarlinaLS™ are designed to provide students with immediate feedback as they work through a range of questions of differing levels of complexity.
The foregoing discussion locates the MarlinaLS™ system within the contemporary education research literature relating to approaches to learning, on-line learning, fostering active learning, building structures to support student learning in 'out of class time', and using problem-based learning. The background to the development of MarlinaLS™ and the capabilities of the system are discussed next.

2.2. Background

2.2.1. The MarlinaLS™ project

At the University of Melbourne, the subject Cost Management (CM) is delivered to intermediate (second-year) undergraduate students and has a historical annual enrolment of approximately 600. Until 1999, CM was delivered via a teaching format comprising two one-hour lectures per week, one one-hour tutorial, and one optional one-hour workshop. The workshop and tutorials primarily comprised working through a number of paper-based exercises drawn from the textbook and other sources. Student attendance at workshops was not compulsory and was low. Attendance at tutorials was compulsory, and student attendance and participation within tutorials accounted for 10% of the student's total result for the semester.

During 1999, the content and delivery of CM was reviewed by the course team and three principal problems were identified. The first was a disappointingly low level of preparedness by students prior to attending tutorials and workshops. This lack of preparedness was, in turn, closely associated with reduced engagement by students with the subject material and a lower level of student–staff interaction during the face-to-face teaching sessions. Second, also largely due to the poor level of preparedness by students, tutorials were often reduced to one-way content delivery of


technical material. This, in turn, limited the ability of staff teaching the subject to create an interactive and engaging learning environment in which students actively participated in their own learning. The third problem identified by the course team was a lack of regular and timely feedback received by students in relation to their learning.

Subsequent to the review of CM described above, the content and delivery of the subject was redeveloped with the primary aim of encouraging students to participate more actively in their learning and, in doing so, to address the three problems identified above. The hour-long workshops were replaced by a series of on-line preparation exercises to support a revamped tutorial program. This on-line preparation was enabled through the setup and development of the MarlinaLS™ system, which comprises three modules.

The most significant is the first module, which consists primarily of short-answer, narrative-type questions and longer, interactive-style questions (ISQs). Prior to their tutorial, students were required to attempt pre-selected questions of both types, ranging from the simple to the complex. With the simpler questions, students receive the necessary information and at each step are provided with "hints" relative to the data they have entered. On completion, students are provided with a model against which to compare their answer. The more complex ISQs require students to address a realistic problem, typically set within an organizational context. To assist students to structure the problem-solving process and to develop an appropriate solution, they may be required, for example, to make a collection of relatively simple calculations or to define or explain particular terms or concepts. In the latter instance, the students may be required to incorporate in their answer a number of key words which act as "checkpoints".
Once students have attempted this task, they may seek "assessment" of their response, for which a suggested answer is provided. Having completed these calculations (for which an assessment may also be sought) or these definitions in the initial stages of the problem, students are then required to apply "what they know" in addressing subsequent parts of the question. Where the responses to the subsequent parts of the question are clearly incorrect, incomplete, or inappropriate, the student is provided a hint and returned to the initial screens, where they can revise and enhance the initial parts of their explanation. Once students successfully complete an ISQ, they have the opportunity to view a complete (suggested) solution. Students are then required to bring their completed work to face-to-face interactive tutorial sessions wherein further discussion takes place. During these sessions, students are encouraged to share their newfound knowledge with each other via presentation, discussion, and debate, at which time they are encouraged to reflect on their initial responses.

The second module within MarlinaLS™ comprises an on-line tutor. The on-line tutor is designed for students to ask specific questions which are answered by staff teaching the subject. Both questions and answers may be made available for other students to view and are arranged under pertinent topic headings in order to facilitate ease of revision and reference. In posting questions to the on-line tutor, students may choose to remain anonymous, and any response from staff may be sent to the individual as well as to the entire group. Accordingly, this facility provides students with the opportunity to get fast, on-line feedback to their questions in a non-threatening environment, thereby removing potential impediments to active learning which may otherwise hinder learning outcomes. Students may be less inclined to engage with the material if, for example, they feel there is no outlet through which they can ask questions and receive rapid feedback, or where they remain reluctant to directly approach staff for assistance.
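The keyword "checkpoint" assessment described for the ISQs was implemented within MarlinaLS™ itself (in ColdFusion); as an illustration only, the following sketch mimics that feedback loop in Python. The function name, hint text, and example keywords are invented, not taken from the actual system:

```python
def assess_checkpoints(answer: str, keywords: list[str]) -> tuple[bool, str]:
    """Check a narrative answer against required keyword 'checkpoints'.

    Mimics the feedback loop described in the text: a clearly incomplete
    answer earns a hint and a return to the earlier screens; a complete
    one unlocks the suggested solution.
    """
    missing = [kw for kw in keywords if kw.lower() not in answer.lower()]
    if missing:
        hint = (
            "Hint: your answer has not yet addressed "
            + ", ".join(missing)
            + ". Revise the earlier parts of your explanation and try again."
        )
        return False, hint
    return True, "Checkpoints met: the suggested solution is now available."


# Example: a cost-allocation ISQ requiring three key concepts.
required = ["overhead", "cost driver", "allocation base"]
ok, feedback = assess_checkpoints(
    "Overhead is assigned to products using a cost driver.", required
)
print(ok, feedback)  # False: "allocation base" has not been addressed
```

The return to "the initial screens" on failure, rather than simply marking the answer wrong, is the reciprocal-learning element the paper emphasises: the hint directs revision rather than grading.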


The third module in MarlinaLS™ contains a multi-choice, self-paced revision tool. This tool is designed to provide students with rapid feedback on their understanding. Within this tool, students initially receive information sufficient to attempt questions at their leisure. On attempting the questions, the students can request an assessment, which then provides immediate feedback, either indicating the initial response to be correct, or providing the student with the correct response and a supporting explanation of that response.

2.2.2. MarlinaLS™ – technical specifications

Technical development of the MarlinaLS™ system has taken place over several years and is ongoing. The system is developed in ColdFusion, SQL, and HTML, and runs on a dual-processor Windows NT/2000 server using an SQL database. Development of MarlinaLS™ version 1 began in early 2000 and initial operation commenced in Semester 2, 2000. The system operates, and can be accessed by students and staff, 24 hours per day, 7 days per week.

Since its initial development, several versions of MarlinaLS™ have ensued. In version 1, the focus was on the student side of the application; specifically, how students were going to see and interact with different types of questions. In this early version, academic staff could not directly develop questions themselves, but required the services of IT staff to do so. This was addressed in subsequent versions of the system, which also added facilities to allow tutors and other staff some capacity to identify what was being entered by students. Version 3 (2002) represented a significant departure from previous versions, making extensive use of DHTML while also offering further improvements in navigation and data retrieval. This version also implemented a wizard-based interface allowing staff to more easily create questions by following a sequential set of steps, and further enhanced the ability of staff to accurately identify and evaluate student usage of the system. In doing so, versions 3 and 4 enabled the collection of the data for the present study. The later versions also enabled the introduction of multi-media content such as sound and video clips within questions, while also providing the capability to insert hyperlinks allowing access to external websites and resources. These features added considerable authenticity to the organisational problems provided for students via MarlinaLS™.

3. Model development

3.1. Modelling student learning outcomes

To develop the model used in this study, we focus on the established positive association between active learning by students and enhanced learning outcomes. Specifically, we predict that greater usage of the MarlinaLS™ system, as a proxy for student interaction and engagement with the subject CM, will have positive implications for the learning outcomes achieved by students. We use student examination performance as our primary measure of student learning outcomes. Further, in keeping with prior research, we control for several other demographic-type factors which may also impact upon student learning outcomes. We therefore posit the following functional relationship:

Examination Performance = f(MarlinaLS™ usage, prior knowledge, other factors)

We now discuss each of the aspects of the above relationship.


3.2. Variable measurement

3.2.1. Examination performance

Prior studies reflect various approaches to measuring student learning outcomes. Consistent with the approach adopted by several of these prior studies, we measure learning outcomes based on assessment performance (see, for example, Dowling et al., 2003; Michlitsch & Sidle, 2002). We use the acronym TOTALEXAM to represent our dependent variable, which is the raw, numerical score achieved by CM students in the two-hour, closed-book exam held at the conclusion of the semester.

We believe TOTALEXAM is an appropriate measure of learning outcomes in this study for two reasons. First, the key members of the teaching team responsible for delivering the subject remained constant across the period examined. Perhaps as a consequence, the format of the examination was consistent for each semester comprising the period under study, in terms of both the types of questions asked and the subject content assessed. Second, the examination results used in the study were ascertained after the application of specific and detailed marking guides. While we recognise that such measures will not guarantee complete objectivity and consistency, we are confident that, taken together, these factors work to enhance the consistency and objectivity of our learning outcomes measure.

Nevertheless, we recognise that our measure of student learning outcomes is not without limitation. Specifically, the total mark achieved by students in the semester exam may be influenced by a number of individual, time-specific, contextual factors, and thus may represent a relatively noisy measure of learning outcomes (see, for example, Hartnett, Römcke, & Yap, 2004). We address this, in part, by conducting various sensitivity tests to validate our findings on the association between student usage of MarlinaLS™ and learning outcomes. These tests are presented and described in Section 5 of this paper.

3.2.2. MarlinaLS™ usage

Student usage of MarlinaLS™ during each semester is recorded at the system level and is based on the seconds elapsing from the time the student accesses (logs onto) the system to the time the student exits (logs off) the system. The acronym SYSTEMUSAGE is used to represent the seconds for which student access to MarlinaLS™ was recorded. For the period under examination, student usage of the MarlinaLS™ resource varied significantly.5 Accordingly, to reduce the impact of extreme usage by some students on the relationships examined, we take the logarithm of system usage by each student, represented by the variable LNUSAGE.

3.2.3. Prior knowledge

Consistent with earlier studies in this area, we identify student prior knowledge as an important determining factor influencing the learning outcomes examined (see also Alexander, Kulkowich, & Schulz, 1992; Auyeung & Sands, 1993; Dowling et al., 2003; Eskew & Faley, 1988; Rankin, Silvester, Vallely, & Wyatt, 2003; Rohde & Kavanagh, 1996). As a proxy for student prior knowledge, we use the official result recorded for each student in the prerequisite accounting subject which students must complete prior

5 The Jarque–Bera statistic of 247,946 is strongly significant, thus rejecting the hypothesis that SYSTEMUSAGE is normally distributed across the sample period.


to gaining entry into CM. This subject, Accounting 1B, is the second introductory accounting subject studied by students and is typically taken in year one of their degree. The content of Accounting 1B comprises both managerial accounting and financial accounting material. Accordingly, we predict that the student result in this subject will be positively and significantly related to the learning outcomes examined in the current study (see also Rankin et al., 2003). We control for this effect by including the variable PRIOR1B in our model.

3.2.4. Other factors

We also recognise that, for a number of reasons, student learning styles may differ. Previous studies have identified systematic differences in learning styles based on gender; however, the findings are not consistent with respect to the impact of gender on learning outcomes (Arbaugh, 2000; Barrett & Lally, 1999; Brazelton, 1998; Buckless, Lipe, & Ravenscroft, 1991; Doran, Boullion, & Smith, 1991; Lipe, 1989; Severiens & Dam, 1998). Nevertheless, taken together, these prior studies indicate that, within particular teaching and learning contexts, gender can be a determinant of learning outcomes. Accordingly, we include the variable FEM, which assumes the value of one if the student is female, zero otherwise.

Further, prior studies also predict that, ceteris paribus, learning styles will differ between "Australian" and "international" students (see, for example, Auyeung & Sands, 1993; Christopher & Debreceny, 1993; Hartnett et al., 2004). To enable exploration of these factors in the present study, we include a dichotomous variable for "international status". We base this variable on the component of the total student cohort which is classified as "international, fee paying" in the central university student database. We recognise that this is a relatively 'blunt' way of describing this cohort; however, we believe that the inclusion of this additional variable as indicated will capture the essence of any association which exists. The variable INTERNAT assumes the value of one for students listed as belonging to this category, zero otherwise. We also include in our model two interaction variables, FEM * LNUSAGE and INTERNAT * LNUSAGE, which allow us to examine specifically the relative impact of MarlinaLS™ usage within these student sub-groups. The final model is provided below:

TOTALEXAM = α + β1 LNUSAGE + β2 PRIOR1B + β3 FEM + β4 INTERNAT + β5 FEM * LNUSAGE + β6 INTERNAT * LNUSAGE + ε

Table 1 summarises the variables and their measurement bases.

4. Results

4.1. Descriptive statistics

The period covered by the study is 2002–2003, during which time CM was delivered four times and two versions of MarlinaLS™ were used. The first of these versions spanned semester 1, 2002 to semester 1, 2003 inclusive, while the program was revised and updated for the delivery of CM in semester 2, 2003.

The descriptive statistics for the pooled data are contained in Table 2. For the four semesters examined, the overall usable data set consists of 1116 students. Table 2 also


Table 1
Measurement of dependent and independent variables

Construct                  Variable      Measure
Dependent variable
Student learning outcome   TOTALEXAM     The recorded performance for each student in the semester final exam

Independent variables
MarlinaLS™ usage           SYSTEMUSAGE   The student usage (seconds) of the MarlinaLS™ system during the semester
                           LNUSAGE       Logarithm of SYSTEMUSAGE
Student prior knowledge    PRIOR1B       The recorded student result for the prerequisite accounting subject Accounting 1B
Gender                     FEM           Assumes the value of one if female, zero otherwise
International status       INTERNAT      Assumes the value of one if the student is listed on the university database as "international fee paying", zero otherwise

Table 2 Descriptive statistics – pooled data (Valid N – 1116)

TOTALEXAM PRIOR1B SYSTEMUSAGE MALE FEM NONINT INT

N

Minimum

Maximum

Mean

SD

1116 1116 1116 464 652 565 551

0.0 50.0 124.0

89.5 96.0 1,399,112.0

53.6 68.6 114,996.3

13.1 11.0 84,311.6

All variables are as described in Table 1.

reveals that the mean MarlinaLS™ usage for each student comprising the sample was 114,996.3 seconds (31.94 hours). Across a 13-week period, this equated to a mean weekly usage by students of 2 h and 27 min.6 The average examination result achieved by students was 53.6%. The usable data set comprises more female students (652) than male students (464) and more ''non-international'' students (565) compared to those listed as ''international fee paying'' (551).

4.2. Regression results

Table 3 presents the regression results for the pooled data spanning the years 2002–2003. The explanatory power of the model is relatively good (Adj. R2 = 27%). Further, as expected, student prior knowledge, as measured by the variable PRIOR1B, is a significant factor in determining total examination performance. However, of particular interest in this study, and also as predicted, LNUSAGE, as a proxy for student interaction and

6 During the period covered by this study, CM was delivered over a semester spanning 12 weeks. At the completion of the semester, students were provided with a one-week study/revision period prior to the exam, during which time they were able to access the MarlinaLS™ system. Data relating to system usage were collected over this 13-week period.
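The weekly-usage figure quoted above follows directly from unit arithmetic on the Table 2 mean; a minimal sketch checking the conversion (the variable names below are our own):

```python
mean_usage_seconds = 114_996.3        # mean SYSTEMUSAGE reported in Table 2
usage_weeks = 13                      # 12-week semester plus one revision week

mean_usage_hours = mean_usage_seconds / 3600
weekly_minutes = mean_usage_hours / usage_weeks * 60

print(round(mean_usage_hours, 2))     # -> 31.94 total hours over the semester
print(round(weekly_minutes))          # -> 147 minutes, i.e. about 2 h 27 min per week
```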


Table 3
OLS regression results (pooled data)

TOTALEXAM = α + β1 LNUSAGE + β2 PRIOR1B + β3 FEM + β4 INTERNAT + β5 FEM * LNUSAGE + β6 INTERNAT * LNUSAGE + ε

                     Coefficient   t statistic   Probability
(Constant)           31.73         3.09          0.0020
LNUSAGE              4.81          5.55**        0.0000
PRIOR1B              0.49          15.28**       0.0000
FEM                  0.94          0.68          0.4940
INTERNAT             4.78          3.25**        0.0012
FEM * LNUSAGE        1.51          1.40          0.1611
INTERNAT * LNUSAGE   3.08          0.28          0.7785

Adj. R2 = 0.27; Valid N = 1107
The table reports estimates from the equation for the pooled data set which spans the period from semester 1, 2002 to semester 2, 2003. All variables are as described in Table 1. The t statistics are calculated using Newey–West corrected standard errors. ** indicates significance at the 1% level (two tail).
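The pooled specification in Table 3 can be sketched in code. The snippet below is illustrative only: it simulates synthetic data (all values and coefficient magnitudes are our own assumptions, not the study's data) and recovers plain OLS point estimates via least squares; the Newey–West (HAC) corrected standard errors used in the paper would come from an econometrics package such as statsmodels.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1000

# Synthetic stand-ins for the study's regressors (illustrative values only).
lnusage  = rng.normal(11.0, 0.8, n)               # LNUSAGE: log of usage in seconds
prior1b  = rng.normal(68.6, 11.0, n)              # PRIOR1B: prerequisite result
fem      = (rng.random(n) < 0.58).astype(float)   # FEM dummy
internat = (rng.random(n) < 0.49).astype(float)   # INTERNAT dummy

# Design matrix: intercept, main effects, and the two interaction terms.
X = np.column_stack([np.ones(n), lnusage, prior1b, fem, internat,
                     fem * lnusage, internat * lnusage])

# Assumed 'true' coefficients used to simulate TOTALEXAM (not the paper's estimates).
beta_true = np.array([-30.0, 4.8, 0.5, 0.9, -4.8, 0.0, 0.0])
totalexam = X @ beta_true + rng.normal(0.0, 8.0, n)

# Plain OLS point estimates via least squares.
beta_hat, *_ = np.linalg.lstsq(X, totalexam, rcond=None)
```

With a design matrix built this way, the interaction columns let the usage slope differ for female and international students, which is exactly what the β5 and β6 terms in the model capture.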

engagement, is a significant and positive determinant of student examination performance. The results in Table 3 also reveal that ''international students'' performed, on average, worse in the exam for CM during the period, while gender was not a significant factor determining examination performance in CM during this time. The coefficients on the interaction variables reported in Table 3 were not statistically significant at conventional levels.

As noted earlier, this study covers four semesters in which two separate and distinct versions of MarlinaLS™ were used. Our expectation is that the functional relationship described above may differ across the two versions for three primary reasons, each of which relates to the gradual enhancement of the integration of the resource in the teaching of the subject. First, for the second of the versions used in the period examined (delivered in semester 2, 2003), the MarlinaLS™ content was refined significantly. The number of questions available to students was significantly reduced and the remaining questions were reviewed and refined to more clearly focus on the key concepts covered in the subject. Included in this refinement was an expansion in the quality and volume of feedback provided to students. Longer, more clearly focussed explanations were provided to further, and more directly, assist student learning outcomes. Second, the program was more closely integrated into the delivery of tutorials in CM by teaching staff. Tutorial programs were specifically tailored through the inclusion of more complex, discussion-based questions in face-to-face sessions. The more technical content, and the issues/questions which typically require shorter, narrative-style answers, were addressed almost entirely within the MarlinaLS™ program.

Finally, much of the above refinement of the MarlinaLS™ content and the enhanced integration of the resource into the teaching of the subject was undertaken by a key member of the teaching team who possessed significant expertise and experience in matters relating to educational design. Further support for the suggestion that the program was more comprehensively integrated into the delivery of CM is found in Table 4, which reveals


Table 4
Descriptive statistics by version

Panel A (version 1)
Variable      N     Minimum   Maximum        Mean        SD
TOTALEXAM     780   0.00      89.50          53.22       12.97
PRIOR1B       780   50.00     96.00          69.24       10.78
SYSTEMUSAGE   780   0.00      1,399,112.00   105,465.5   71,591.18
FEM           448
MALE          332
INT           386
NONINT        394
Valid N = 780

Panel B (version 2)
Variable      N     Minimum   Maximum        Mean        SD
TOTALEXAM     327   0.00      83.50          54.58       13.22
PRIOR1B       327   50.00     96.00          66.93       11.43
SYSTEMUSAGE   327   240.00    1,159,955.00   140,325.00  104,929.80
FEM           198
MALE          129
INT           160
NONINT        167
Valid N = 327

Variable measurement is as per the description provided in Table 1.

that average weekly usage of the program by students in semester 2, 2003 was approximately 3 h, relative to the lesser average weekly usage of 2 h 27 min for the earlier version.

Due to the possibly different functional forms that may hold across the two versions of MarlinaLS™ used during the sample period, we separately estimate the main regression for each version. The descriptive statistics and regression results for these tests are reported in Tables 4 and 5. There are several features of these results which are worthy of note. First, Table 4 shows that the student cohorts examined were of a similar demographic, with comparable percentages of females and international students. The results in Table 5 reflect the greater integration of the program, with an improvement in the explanatory power of the model across the two versions, with Adj. R2 increasing from 0.23 to 0.43. Further, as expected, the results reported in Table 5 indicate student prior knowledge is a positive and significant determinant of the dependent variable across both versions.

4.3. Additional testing

In order to enhance the robustness of our main findings, we conduct a number of additional tests. It is possible, for example, that the results discussed to date and reported in Tables 3–5 can be largely explained by sub-groups within the student cohort which exhibit different usage characteristics relative to the rest of the group. Further, several prior studies examine computer usage patterns demonstrated by various student cohorts within a range of educational settings (see McDowall & Jackling, 2004). Accordingly, we explore the impact of student prior knowledge, gender, and international status in determining the extent of student usage of MarlinaLS™. We report these regression results in Table 6. The results reported in Table 6 indicate that student prior knowledge and international status are significant determinants of MarlinaLS™ usage, while there was a weak


Table 5
OLS regression results – by version

TOTALEXAM = α + β1 LNUSAGE + β2 PRIOR1B + β3 FEM + β4 INTERNAT + β5 FEM * LNUSAGE + β6 INTERNAT * LNUSAGE + ε

Panel A (version 1)
                     Coefficient   t statistic   Probability
(Constant)           25.56         1.92          0.0558
LNUSAGE              4.61          4.12**        0.0000
PRIOR1B              0.43          10.57**       0.0000
FEM                  0.12          0.06          0.9514
INTERNAT             3.60          1.76          0.0788
FEM * LNUSAGE        1.22          0.71          0.4782
INTERNAT * LNUSAGE   2.04          1.25          0.2133
Adj. R2 = 0.23; Valid N = 780

Panel B (version 2)
                     Coefficient   t statistic   Probability
(Constant)           49.81         2.91          0.0038
LNUSAGE              5.33          3.61**        0.0000
PRIOR1B              0.66          16.26**       0.0000
FEM                  3.50          1.78          0.0766
INTERNAT             3.13          1.40          0.1629
FEM * LNUSAGE        2.65          1.73          0.0850
INTERNAT * LNUSAGE   4.49          0.30          0.7617
Adj. R2 = 0.43; Valid N = 327

The table reports estimates from the equation for each version of the MarlinaLS™ program. All variables are as described in Table 1. The t statistics are calculated using Newey–West corrected standard errors. ** indicates significance at the 1% level (two tail).

association between gender and MarlinaLS™ usage. In interpreting these results, two points are worth noting. First, to the extent that the student result in the prerequisite subject Accounting 1B may also proxy for other intangible attributes such as diligence, commitment and the perceptions of students of the usefulness of on-line learning systems, the significant coefficient on PRIOR1B is to be expected. Second, there is little indication in Tables 3–5 that either female or international students demonstrating greater usage of MarlinaLS™ performed significantly better, on average, on the semester exam than did students within these categories using the system to a lesser extent. Accordingly, we conclude that the primary results of the study do not appear to be driven by these sub-groups within the student cohort.

4.4. Sensitivity analysis

For a number of reasons, it is also possible that the main results reported in Tables 3–5 may not reflect implications of MarlinaLS™ usage for student learning, but may, instead, merely report the role of MarlinaLS™ in preparing students to answer specific types of


Table 6
Regression results – usage patterns

LNUSAGE = α + β1 PRIOR1B + β2 FEM + β3 INTERNAT + ε    (N = 1116)

             Coefficient   t statistic   Probability
(Constant)   10.89         59.90         0.0000
PRIOR1B      0.01          2.46*         0.0140
FEM          0.08          1.93          0.0534
INTERNAT     0.23          4.82**        0.0000
Adj. R2 = 0.03

The table reports estimates from the equation for the student usage of the MarlinaLS™ program. All variables are as described in Table 1. The t statistics are calculated using Newey–West corrected standard errors. * and ** indicate significance at the 5% and 1% levels (two tail), respectively.
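The Adj. R2 figures reported throughout Tables 3–6 follow the standard penalised goodness-of-fit formula, which adjusts R2 for the number of regressors. A small self-contained helper (our own illustrative implementation, not the authors' code):

```python
import numpy as np

def adjusted_r2(y, y_hat, k):
    """Adjusted R^2 for a fitted model with k regressors (excluding the intercept)."""
    y, y_hat = np.asarray(y, float), np.asarray(y_hat, float)
    n = y.size
    ss_res = np.sum((y - y_hat) ** 2)          # residual sum of squares
    ss_tot = np.sum((y - y.mean()) ** 2)       # total sum of squares
    r2 = 1.0 - ss_res / ss_tot
    return 1.0 - (1.0 - r2) * (n - 1) / (n - k - 1)

# Example: a near-perfect fit on four observations with one regressor.
print(round(adjusted_r2([1, 2, 3, 4], [1.1, 1.9, 3.2, 3.8], k=1), 2))  # -> 0.97
```

The penalty term (n − 1)/(n − k − 1) is why a low Adj. R2, such as the 0.03 in Table 6, can still accompany individually significant coefficients in a large sample.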

examination questions relative to other question types. Thus, even though the content of MarlinaLS™ comprised questions of various types, set at a range of difficulty levels, and exploring a range of subject material, we examine whether this system is more effective for preparing students to answer examination questions of a certain format and content. Specifically, we examine whether MarlinaLS™ is more effective in enabling students to hone their technical, calculative skills, relative to their ability to address certain narrative-type questions where a more conceptual understanding is required, such as in preparing contextual explanations or in performing critical and interpretive, case-based analysis.

With this in mind, in consultation with other key members of the teaching team, we reviewed the examination papers for the period covered by this study and identified those parts of each examination question which were considered to be more technical or calculation-based. Accordingly, we separately test, via additional regression analysis, the impact of student usage of MarlinaLS™ on calculation-based questions and on questions which were not considered to be calculation-based. The results of these tests reveal that student usage of MarlinaLS™ is, on average, significantly and positively associated with student performance across both calculation-based questions (t = 7.76, p < 0.001, two tailed) and those questions not requiring calculation (t = 6.50, p < 0.001, two tailed). We interpret this result as support for our main finding that MarlinaLS™ is an appropriate resource for enhancing student learning rather than merely a device which more narrowly prepares students for answering specific types of examination questions.

4.4.1. Alternative learning outcome measure

We further examine those factors which impact upon the assignment result achieved by students.
These results indicate that student prior knowledge (PRIOR1B) and LNUSAGE are also significant determinants of the within-semester assignment result achieved by students (t = 7.88, p < 0.001; and t = 4.44, p < 0.001, respectively). This further supports our main findings that MarlinaLS™ is effective for enhancing student learning outcomes.

Finally, we substitute as our learning outcome (dependent variable) measure the result achieved by students in Accounting 1B. We regress this result on LNUSAGE, and our variables for gender, international status and the interactions FEM * LNUSAGE and INTERNAT * LNUSAGE previously explained. Students enrolled in Accounting 1B did not have exposure to MarlinaLS™ at any time during their study. Accordingly, we did


not predict, and did not find, any association between LNUSAGE and the student result in Accounting 1B. We conduct this test as a further means of validating the key association examined in this study.

5. Conclusions and further research

The MarlinaLS™ system was developed with the primary objectives of enhancing the engagement and interaction by students within the specific setting and of stimulating enhanced learning outcomes. The current study seeks to isolate and evaluate the role of this system by examining the association between student usage of MarlinaLS™ and examination results achieved. In doing so, we control for the impact of ''presage'' factors identified in Biggs' (1999) 3P model, through the inclusion of variables which proxy for student prior knowledge, gender, and international status.

Our results indicate that MarlinaLS™ usage improves learning outcomes as measured by student examination performance. The more students used the MarlinaLS™ system, the more, on average, their learning outcomes improved. Greater usage of MarlinaLS™ was systematically related to improved student performance for both calculative and non-calculative examination questions, and also on the assignments completed during the semester. The improved results on both the calculation-based and non-calculation questions were to be expected in light of the design and setup of MarlinaLS™ and its role as a unique and valuable learning resource. The explanation for the improvement on the assignment may lie with the structuring of 'outside class time' work. The greater support and structure for this independent study provide opportunity for students to become more closely engaged with, and reflect upon, the subject material, thus becoming more motivated to improve results in general. The provision of timely and detailed feedback to students as they complete the MarlinaLS™ exercises may also explain improved learning outcomes.
This study is subject to several limitations that should be recognised. We measure learning outcomes that are associated with assessment tasks. There are other forms of learning outcomes which are not necessarily captured in formal assessment instruments (Carland, Carland, & Dye, 1994). Examples of these other learning outcomes include improved teamwork and collaboration, enhanced reflection, information retrieval knowledge and oral communication skills (Boyce, 1999). These valuable learning outcomes were not tested here. Further, the international status proxy included in our model is relatively blunt in the sense that it is based only on whether a student is identified as being international and fee-paying in the central university database. The faculty draws students from a wide range of cultures. Grouping students only as ''international'' may mask differences within particular cohorts; for example, students from China may perform differently from students from Malaysia, whereas students from both locations could well be included in the category ''international, fee-paying''.

Nevertheless, our study contributes to an enhanced understanding of the role of well designed and integrated on-line learning resources in utilising PBL approaches to stimulate enhanced learning outcomes. In this sense, we specifically shed light on the potential for such resources to allow educators to achieve identifiable and quantifiable improvements in learning outcome measures within a tertiary environment that has become more complex and challenging. Our findings are in accordance with previous studies which emphasise the need for careful planning, consideration and creativity prior


to implementing on-line learning systems in order for the benefits for learning outcomes to be forthcoming (see, for example, Boyce, 1999; McDowall & Jackling, 2004; Stanley & Edwards, 2005). Curriculum design should take into account the work that students do outside class time. On-line learning technologies can allow us to encompass this aspect of student learning in a manner not previously possible with large student numbers and paper-and-pencil learning environments. MarlinaLS™ is a major step forward in supporting accounting students in their learning outside the regular classroom.

This study identifies several opportunities for further research, and we specifically mention four opportunities here. First, as briefly discussed earlier in this study, researchers have previously established a link between active learning by students, deep approaches to learning, and enhanced learning outcomes. In this study, we focus only on the link between active learning and enhanced learning outcomes and we do not specifically test for the existence of deep learning. Accordingly, we reiterate the earlier calls of previous authors such as Davidson (2002) who encourage more studies that stimulate the adoption of deeper approaches to learning by students. We are particularly encouraging of future studies that focus on the role of on-line technology in encouraging deep learning.

Second, several prior studies also associate active learning with greater internal commitment to, and involvement with, the subject material, and greater levels of student motivation and enjoyment of the learning experience. One possible outcome of this situation is that students engaging in active learning may be more likely to pursue further study, or perhaps even a future career, in the particular discipline area.
Thus, to the extent that interactive on-line learning systems such as MarlinaLS™ can facilitate active learning, the usage of such systems might improve the attitudes of some students toward their course of study, thus impacting upon the student choice of major/specialisation (Kulik, Kulik, & Cohen, 1980). The choices by students of majors or specialisations within particular subject disciplines have identifiable economic consequences for the institutions and for the academic departments involved. Accordingly, opportunity exists to build on the work of Jackling (2001) and others by exploring the role of on-line learning systems in the choice of major and of future career.

Third, if approaches to supporting student learning outside face-to-face teaching time, such as MarlinaLS™, are effective in engaging students, it may be the case that student progression rates are improved. Further research is needed to judge the effect of on-line learning on progression rates. Intuitively, students who are engaged more actively in their study are less likely to drop out of their courses.

Finally, we encourage further work to explore the impact of specific teaching strategies on student learning outcomes in a range of contexts. Where these studies embrace different approaches to measuring key variables, such as learning outcomes and student prior knowledge in various settings, the contribution to the existing published literature in this area will be enriched. In such instances, our understanding of the factors which impact upon student learning outcomes and the role of specific teaching strategies for enhancing student learning will increase.

Acknowledgements

We are grateful to Daniel Quin for assisting us with the data collection for this study. We also acknowledge the helpful comments of Carlin Dowling, Robert Dixon, Colin


Ferguson, Axel Schulz, participants at research seminars held at the University of Melbourne and at Deakin University, as well as the editor, Jim Rebele, on earlier versions of this paper.

References

Accounting Education Change Commission (AECC). (1990). Objectives of education for accountants: Position statement no. 1. Issues in Accounting Education, 5, 307–312.
Adler, R. W., & Milne, M. J. (1997). Improving the quality of accounting students' learning through action-oriented learning tasks. Accounting Education, 6, 191–215.
Albrecht, W. S., & Sack, R. J. (2000). Accounting education: Charting the course through a perilous future. Sarasota, FL: American Accounting Association.
Alexander, S. (1999). An evaluation of innovative projects involving communication and information technology in higher education. Higher Education Research & Development, 18, 173–183.
Alexander, P., Kulkowich, J., & Schulz, S. (1992). How subject matter knowledge affects recall and interest. Paper delivered at the XXV International Congress of Psychology, Brussels.
American Assembly of Collegiate Schools of Business (AACSB). (1996). A report of the AACSB Faculty Leadership Task Force. St. Louis, MO: AACSB.
Arbaugh, J. B. (2000). An exploratory study of the effects of gender on student learning and class participation in an Internet-based MBA course. Management Learning, 31, 503–519.
Auyeung, P. K., & Sands, J. (1993). An evaluation of secondary school studies as predictors of performance for accounting majors. Australian Educational Researcher, 20, 51–61.
Barrett, E., & Lally, V. (1999). Gender differences in an on-line learning environment. Journal of Computer Assisted Learning, 15, 48–60.
Barrows, H. S., & Tamblyn, R. M. (1980). Problem-based learning: An approach to medical education. New York: Springer Publishing Co.
Baume, D. (1994). Developing learner autonomy. SEDA paper 84. ISBN 0-946815-73-9.
Becker, D. A., & Dwyer, M. N. (1994). Using hypermedia to provide learner control. Journal of Educational Multimedia and Hypermedia, 3, 155–172.
Bigelow, J., Seltzer, J., Hall, J., & Garcia, J. (1999). Management skills in action: four teaching models. Journal of Management Education, 23, 355–376.
Biggs, J. (1999). Teaching for quality learning at university: what the student does. Society for Research into Higher Education. Buckingham: Open University Press.
Boud, D. (1995). Enhancing learning through self assessment. London: Kogan Page.
Boud, D., & Feletti, G. (1991). The challenge of problem based learning. London: Kogan Page.
Boyce, G. (1999). Computer-assisted teaching and learning in accounting: pedagogy or product? Journal of Accounting Education, 17, 191–220.
Brazelton, J. K. (1998). Implications for women in accounting: some preliminary evidence regarding gender communication. Issues in Accounting Education, 13, 509–530.
Brown, G., & Pendlebury, M. (1992). Assessing active learning: effective learning and teaching in higher education. Module 11. Sheffield: CVCP.
Buckless, F. A., Lipe, M. G., & Ravenscroft, S. P. (1991). Do gender effects on accounting course performance persist after controlling for general academic aptitude? Issues in Accounting Education, 6, 248–261.
Byrne, M., Flood, B., & Willis, P. (2002). The relationship between learning approaches and learning outcomes: a study of Irish accounting students. Accounting Education, 9, 403–406.
Carland, J. W., Carland, J. C., & Dye, J. L. (1994). Accounting education: a cooperative learning strategy. Accounting Education, 3, 223–236.
Chickering, A. W., & Gamson, Z. F. (1987). Seven principles for good practice in undergraduate education (Special insert). The Wingspread Journal, 9, 3–7.
Christopher, T., & Debreceny, R. (1993). The impact of English as a second language on the performance of accounting students. Accounting Research Journal, 6, 3–7.
Cleaveland, C., & Larkins, E. R. (2004). Web-based practice and feedback improve tax students' written communication skills. Journal of Accounting Education, 22, 211–228.
Davidson, R. A. (2002). Relationship of study approach and exam performance. Journal of Accounting Education, 20, 29–44.


Davis, J. (1996). Better teaching, more learning: strategies for success in post-secondary settings. Washington, DC: American Council on Education.
Dennis, I. (2003). OK in practice – and theory. The experience of using an extended case study in auditing education: a teaching note. Accounting Education, 12, 415–426.
Doran, B. M., Boullion, M. L., & Smith, C. G. (1991). Determinants of student performance in accounting principles I and II. Issues in Accounting Education, 6, 74–84.
Dowling, C., Godfrey, J. M., & Gyles, N. (2003). Do hybrid flexible delivery teaching methods improve accounting students' learning outcomes? Accounting Education, 12, 373–391.
Duff, A. (1999). Access policy and approaches to learning. Accounting Education, 8, 99–110.
Duff, A. (2004). The role of cognitive learning styles in accounting education: Developing learning competencies. Journal of Accounting Education, 22(1), 29–52.
Entwistle, N., & Ramsden, P. (1983). Understanding student learning. London: Croom Helm.
Eskew, R. K., & Faley, R. H. (1988). Some determinants of student performance in the first college-level financial accounting course. The Accounting Review, LXIII, 137–147.
Frederickson, J. R., & Pratt, J. (1995). A model of the accounting education process. Issues in Accounting Education, 10, 229–246.
Freeman, M. (1996). The role of the internet in teaching large undergraduate classes. Innovations in teaching and learning discussion paper no. 2, University of Technology, Sydney.
Fuller, R. (1998). Encouraging active learning at university. HERDSA News, 20(3), 1–5.
Gallagher, S. (1997). Problem-based learning: Where did it come from, what does it do, and where is it going? Journal for the Education of the Gifted, 20, 332–362.
Hartnett, N., Römcke, J., & Yap, C. (2004). Student performance in tertiary-level accounting: an international student focus. Accounting and Finance, 44, 163–185.
Hoffman, B., & Ritchie, D. (1997). Using multi-media to overcome the problems with problem-based learning. Instructional Science, 25, 97–115.
Jackling, B. (2001). Student perceptions of tertiary commerce studies: Influence on accounting as a major study and career choice. Published proceedings, ANZAM conference, Auckland.
Kulik, J. A., Kulik, C. C., & Cohen, P. A. (1980). Effectiveness of computer-based college teaching: a meta-analysis of findings. Review of Educational Research, 50, 525–544.
Lipe, M. G. (1989). Further evidence on the performance of female versus male accounting students. Issues in Accounting Education, 4, 144–152.
Lont, D. (1999). Using an intranet to facilitate student-centered learning. Journal of Accounting Education, 17(2–3), 293–320.
Marton, F., & Säljö, R. (1976). On qualitative differences in learning: outcomes and process. British Journal of Educational Psychology, 46, 4–11.
Mayo, P., Donelly, M., Nash, P., & Schwartz, R. (1993). Student perceptions of tutor effectiveness in problem based surgery clerkship. Teaching and Learning in Medicine, 5, 227–233.
McDowall, T., & Jackling, B. (2004). The role of computer-assisted learning packages in determining learning outcomes of accounting students. Unpublished working paper, Deakin University, Melbourne.
Michlitsch, J. F., & Sidle, M. W. (2002). Assessing student learning outcomes: a comparative study of the technologies used in business school disciplines. Journal of Education for Business, 77, 125–130.
Milne, M. J., & McConnell, P. J. (2001). Problem-based learning: a pedagogy for using case material in accounting education. Accounting Education, 10(1), 61–82.
Mundell, B., & Pennarola, F. (1999). Shifting paradigms in management education: what happens when we take groups seriously? Journal of Management Education, 23, 663–683.
Novin, A. M., & Pearson, M. A. (1989). Non-accounting knowledge qualifications for entry-level public accountants. Journal of Accounting Education, 2, 309–325.
Nunan, T., George, R., & McCausland, H. (2000). Rethinking the ways in which teaching and learning are supported: the flexible centre at the University of South Australia. Journal of Higher Education Policy and Management, 22, 85–98.
Paisey, C., & Paisey, N. J. (2005). Improving education through the use of action research. Journal of Accounting Education, 23, 1–19.
Ramsden, P. (1985). Student learning research: Retrospect and prospect. Higher Education Research and Development, 4, 51–69.
Ramsden, P. (1992). Learning to learn in higher education. London: Routledge.


Rankin, M., Silvester, M., Vallely, M., & Wyatt, A. (2003). An analysis of the implications of diversity for students' first level accounting performance. Accounting and Finance, 43, 365–393.
Rohde, F., & Kavanagh, M. (1996). Performance in 1st year university accounting: quantifying the advantage of secondary school accounting. Accounting and Finance, 36, 275–285.
Sawyer, A. J., Tomlinson, S. R., & Maples, A. J. (2000). Developing essential skills through case study scenarios. Journal of Accounting Education, 18(3), 257–282.
Schmeck, R. (1988). Learning strategies and learning styles. New York: Plenum.
Severiens, S., & Dam, G. T. (1998). A multi-level meta analysis of gender differences in learning orientations. British Journal of Educational Psychology, 68, 595–608.
Stanley, T., & Edwards, P. (2005). Interactive multimedia teaching in Accounting Information Systems (AIS) cycles: Student perceptions and views. Journal of Accounting Education, 23, 21–46.
Trigwell, K., & Prosser, M. (1991). Improving the quality of student learning: the influence of the learning context and student approaches to learning on learning outcomes. Higher Education, 22, 251–266.
Watkins, D. (1982). Identifying the study process dimensions of Australian university students. Australian Journal of Education, 26, 76–85.
Watson, S. F., Apostolou, B., Hassell, J. M., & Webber, S. A. (2003). Accounting education literature review 2000–2002. Journal of Accounting Education, 21, 267–325.
