Learning and Instruction 20 (2010) 72–83
www.elsevier.com/locate/learninstruc

Conceptions of and approaches to learning through online peer assessment

Yu-Fang Yang a,b, Chin-Chung Tsai c,*

a Applied Foreign Languages Department, Jen-Teh Junior College of Medicine, Nursing and Management, Miaoli 356, Taiwan
b Graduate Institute of Engineering, National Taiwan University of Science and Technology, Taipei 106, Taiwan
c Graduate School of Technological and Vocational Education, National Taiwan University of Science and Technology, #43, Sec. 4, Keelung Rd., Taipei 106, Taiwan

Received 21 June 2008; revised 17 October 2008; accepted 14 January 2009

Abstract

The present study investigated junior college students' conceptions of and approaches to learning via online peer assessment (PA) using a phenomenographic approach. Participants were 163 college students who were asked to accomplish a given learning task via an online PA system. Of the participants, 62 were interviewed after the activity. The interviews revealed hierarchically related and qualitatively different categories of conceptions of and approaches to learning via online PA. The main and achieved levels of conceptions of and approaches to learning were determined. The results showed that, within each level, conceptions emphasizing fragmented and cohesive learning tended to be associated with approaches focusing on surface and deep learning, respectively. In addition, students with cohesive learning conceptions and deep learning approaches were likely to make greater progress in the early stages of the online PA activity. Finally, approaches to learning via online PA were found to be less related to learning outcomes than conceptions of learning.
© 2009 Elsevier Ltd. All rights reserved.

Keywords: Peer assessment; Online learning; Conceptions of learning; Approaches to learning

1. Introduction

The use of peer assessment (PA) in higher education is not new, and the rapid development of computer technology has boosted its application in various educational settings. Researchers have argued for the potential benefits of implementing online PA (Barak & Rafaeli, 2004; Cho & Schunn, 2007; Davis, 2000; Sung, Chang, Chiou, & Hou, 2005; Tsai, Lin, & Yuan, 2002), yet have spent little time identifying how students interpret such a learning environment and what they actually do in it. The way in which one experiences a situation is vital to the kind of learning that takes place (Chan & Law, 2003; Chan & Sachs, 2001; Marton, Dall'Alba, & Beaty, 1993), particularly because conceptions or beliefs about learning have an effect on the learning process (Hofer & Pintrich, 1997; Zimmerman, 1990).

* Corresponding author. Tel.: +886 2 27376511; fax: +886 2 27376433.
E-mail address: [email protected] (C.-C. Tsai).
0959-4752/$ - see front matter © 2009 Elsevier Ltd. All rights reserved.
doi:10.1016/j.learninstruc.2009.01.003

Thus, the present study was conducted to explore students' conceptions of and approaches to learning through online PA. The role of conceptions of learning and approaches to learning in student performance on tasks implemented in the online PA learning environment was also examined.

1.1. Peer assessment

In a review of PA studies, Falchikov and Goldfinch (2000) noted that PA is becoming almost a global trend. In higher education, PA has been used extensively as an alternative assessment method. It is believed to increase student–student and student–teacher interactions as well as to improve students' appreciation of other students' ideas during the learning process (Butler & Hodge, 2001; LeMare & Rubin, 1987; McGourty, 2000; Sluijsmans, Dochy, & Moerkerke, 1999). The use of PA has been reported in a wide range of subject domains, with examples from psychology, geography, language, the social sciences, and engineering (Topping, Smith, Swanson, & Elliot, 2000).


Moreover, teachers agree that PA is useful because it helps students learn about the evaluation process (Zevenbergen, 2001). Lin, Liu, and Yuan (2001) proposed that online PA has more advantages than traditional paper-and-pencil PA, because it helps to solve the problems of increased teacher workload and class size (Davis, 2000), saves grading time (McGourty, 2000), and makes anonymity of authorship possible (Wen & Tsai, 2006). Moreover, it has been found that the anonymous environment created by online PA allows students to freely express their ideas and thoughts about other students' work without restrictions of time and location (McConnell, 2002; Rubin, 2002; Topping, 1998; Tsai et al., 2002; Tsai, Liu, Lin, & Yuan, 2001). Some studies have also shown that, instead of being uncomfortable when evaluating their peers, students can give honest and fair assessments in a PA learning environment (Freeman & McKenzie, 2002; Tsai et al., 2001). In this line of research, the study reported here was an attempt to implement online PA to help student learning in an English course.

1.2. Phenomenography

Phenomenography is a research method that investigates the qualitatively different ways in which people experience or think about something in the world around them (Marton, 1986). The focus of a phenomenographic analysis is on individual variations: variations in the perceptions or conceptions of the phenomenon experienced by the persons. In addition, it aims at a collective analysis of individual experiences (Åkerlind, 2005), that is, sorting the variations into specific categories called outcome spaces. Following this research methodology, Marton et al. (1993), Prosser and Trigwell (1999), and Säljö (1979) studied how students conceived of learning. In Säljö's (1979) study, for example, five qualitatively different conceptions were identified, namely "increasing knowledge", "memorization", "acquiring facts and procedures", "abstracting meaning", and "understanding reality." Within the field of student learning studies, those framed by a phenomenographic perspective have also attempted to explore two aspects of learning: what is learned and how learning takes place (Ellis & Calvo, 2006; Marton & Säljö, 1976a, 1976b; Prosser & Trigwell, 1999). The former is referred to as conceptions of learning (what students think they are learning) and the latter as approaches to learning (how students approach their learning).

1.3. Conceptions of learning and approaches to learning


Educational research is showing a growing interest in understanding students' conceptions of learning and approaches to learning in different learning subjects and situations, including engineering (Marshall, Summer, & Woolnough, 1999), science (Tsai, 2004; Tsai & Kuo, 2008), classroom discussions (Ellis & Calvo, 2006; Ellis, Goodyear, Calvo, & Prosser, 2008; Ellis, Goodyear, Prosser, & Ohara, 2006), the writing process (Scheuer, de la Cruz, Pozo, Huarte, & Sola, 2006), information searching (Edwards & Bruce, 2006), and distance education (Makoe, Richardson, & Price, 2008).

Conceptions of learning are concerned with what the learner thinks the objects and process of learning are (Benson & Lor, 1999). Phenomenographic research on conceptions of learning has identified hierarchical categories of conceptions (Benson & Lor, 1999; Prosser, Trigwell, & Taylor, 1994; Vermunt & Vermetten, 2004). Broadly speaking, conceptions of learning can be categorized as "fragmented" and "cohesive" (Marton et al., 1993). Fragmented conceptions reflect little or no understanding of the relationship between the learning environment and the student learning it is intended to promote. Cohesive conceptions, on the other hand, reflect a better understanding of the connectedness and dependency between the learning environment and student learning. In addition, there is empirical evidence suggesting that cohesive conceptions of learning lead to better personal learning outcomes (Edwards & Bruce, 2006; Ellis & Calvo, 2006; Scheuer et al., 2006). However, as Marshall et al. (1999) summarized in their study, although overall similarities in conceptions of learning have been identified in many studies, context-dependent variations within the conceptions may still occur. In other words, conceptions may take various forms within different cultural or educational contexts (Buehl & Alexander, 2001; Hofer, 2000; Tsai, 2006; Tsai & Kuo, 2008).

Approaches to learning have been extensively investigated in various subject domains and for different academic tasks, for example, approaches to problem solving in engineering (Marshall et al., 1999), approaches to solving an applied problem (Lonka, Joram, & Bryson, 1996), and approaches to learning through discussion (Ellis, Goodyear, Brillant, & Prosser, 2008). Approaches to learning are commonly distinguished as "surface" and "deep" approaches (Chin & Brown, 2000). Surface approaches show an orientation toward engagement with the learning environment in ways that promote reproduction, whereas deep approaches show an orientation toward real understanding. One consistent result from these studies is that surface approaches tend to be linked with performance of lower quality, while deep approaches tend to be linked with performance of higher quality (Ellis, Goodyear, Brillant, et al., 2008; Ellis, Goodyear, Calvo, et al., 2008; Lonka et al., 1996; Marshall et al., 1999).

It is generally found that conceptions of learning are related to approaches to learning (Dart et al., 2000; Lee, Johanson, & Tsai, 2008; Purdie, Hattie, & Douglas, 1996). For example, Purdie et al. (1996) found that a conception of learning as understanding was associated with extensive use of strategies. Prosser and Trigwell (1999) verified the associations between conceptions of and approaches to learning, and pointed out that cohesive conceptions complemented deep approaches to learning while fragmented conceptions complemented surface approaches. Similarly, Minasian-Batmanian, Lingard, and Prosser (2006) found that students with conceptions related to restructuring of existing knowledge (i.e., cohesive conceptions of learning) were more likely to adopt approaches associated with an orientation toward active learning (i.e., deep approaches).


Students' conceptions of learning have also been found to be strongly related to the learning process, and therefore to learning outcomes (Cano, 2005; Chin & Brown, 2000; Dart et al., 2000; Purdie et al., 1996; Tsai, 2004). Biggs (1993) proposed a framework known as the 3P model (presage, process, and product factors) for understanding student learning. In this model, not only interactions between the components but also a linear movement from presage to process to product is envisaged. It is important to note that along this linear path, conceptions of learning are listed among the presage variables and approaches to learning are termed process variables. That is to say, conceptions of learning influence approaches to learning (Dart et al., 2000), and approaches to learning in turn influence learning outcomes.

1.4. The present study – hypotheses

Since there is limited phenomenographic research on online PA, in the present study we opted for a phenomenographic analysis of individual variations in the conceptions of and approaches to learning via online PA. The common ideas present in individual variations were also identified in order to develop hierarchically related categories for a collective description. In addition, the relationships between conceptions of learning, approaches to learning, and learning outcomes via online PA were explored. Based on the literature overview, we formulated the following hypotheses:
(a) Students' conceptions of learning via online PA will be associated with their approaches to learning via online PA (Hypothesis 1).
(b) Students' conceptions of learning via online PA will be related to their learning outcomes (Hypothesis 2).
(c) Students' approaches to learning via online PA will be related to their learning outcomes (Hypothesis 3).
(d) Students' approaches to learning via online PA will show a stronger relationship to learning outcomes than their respective conceptions of learning (Hypothesis 4).

2. Method

2.1. Participants

The participants were 163 college students (49 males and 114 females) in Taiwan. Most of them were majors in nursing, medicine, or management. Their mean age was 17.34 years (SD = 1.69). They were all enrolled in an intermediate-level English course. They were asked to complete an assignment in English, namely, to design the study material for an English lesson on a topic of their choice. In this assignment, students first decided on a topic they were interested in and then developed an English lesson containing at least text, vocabulary, and exercise sections, to be used as study material in future English classes.

2.2. Online PA of the assignment in English

The online system and PA module utilized in the present study were similar to those used in previous research (Tsai & Liang, in press; Tseng & Tsai, 2007; Wen & Tsai, 2006). Based on their model, the PA activity in this study was designed to be conducted in three rounds. Students were first asked to submit their assignments via an online system for PA. They were then asked to give comments or suggestions on their peers' assignments, also via the online system. Each student's assignment was assessed by four or five peers; that is, each student evaluated four or five peers' work. Students were then asked to revise their own work after receiving their peers' comments and suggestions. All students assessed their peers' work three times and revised their own work twice. The whole PA activity was carried out anonymously over about two months.
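The paper does not specify how the online system allocated submissions to reviewers. Purely as an illustrative sketch (the function name assign_reviewers and the circular allocation scheme are assumptions made for illustration, not a description of the system actually used), one way to guarantee that every submission is reviewed by exactly k anonymous peers is:

import random

def assign_reviewers(student_ids, k=4, seed=0):
    # Randomly order the students, then let each student review the next
    # k students in the shuffled circle. A student never reviews his or
    # her own work, and every submission receives exactly k reviews.
    rng = random.Random(seed)
    ids = list(student_ids)
    rng.shuffle(ids)
    n = len(ids)
    assignments = {student: [] for student in ids}
    for i, reviewer in enumerate(ids):
        for offset in range(1, k + 1):
            assignments[reviewer].append(ids[(i + offset) % n])
    return assignments

# Example: ten students, each reviewing four peers.
for reviewer, authors in assign_reviewers([f"S{i:02d}" for i in range(1, 11)]).items():
    print(reviewer, "reviews", authors)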

2.3. Training session

Prior to the first assignment, all participants had a training session on assessment which proceeded through the following stages: (a) First, the instructor explained the evaluation dimensions (described below) in detail and gave examples. (b) Students received a sample work and commented on it according to the evaluation dimensions. (c) Some of the students were asked to report what remarks they had made on each dimension; the instructor asked these students why a particular suggestion had been given and explained why certain suggestions could be more appropriate than others. (d) Finally, the second and third stages were repeated with another sample work. Such a warm-up activity was expected to familiarize students with the evaluation dimensions before they assessed peers' work on the online PA system.

2.4. Evaluation and scoring method

For each PA round, students' assignments were qualitatively evaluated by their peers and quantitatively assessed by the instructor on three dimensions adopted from Tsai et al. (2002) and Tseng and Tsai (2007). These dimensions were feasibility, creativity, and knowledge. Since peer assessments have not always been found to be valid (Mathews, 1994; Mockford, 1994; Mowl & Pain, 1995), only the instructor's scores were taken as indicators of students' performance in this study. Similar to the procedure adopted in Tsai and Liang (in press) and Tseng and Tsai (2007), the instructor gave a score between 1 and 7 (in whole points) to each student's work on each dimension. However, these scores were not revealed during the PA process. A second rater, an English instructor with seven years of teaching experience, was also asked to evaluate twenty of the students' assignments using the same 1–7 scale for each dimension, in order to examine the reliability of the scoring. The mean correlation coefficient between the two raters on each assessment dimension was .74 (p < .001).
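As a minimal sketch of this reliability check (the scores below are invented placeholders rather than the study's data, and the computation simply assumes a Pearson correlation per dimension, averaged across the three dimensions):

from scipy.stats import pearsonr

# Hypothetical 1-7 scores given by the instructor and the second rater
# to the same twenty assignments, one list per evaluation dimension.
instructor = {
    "feasibility": [4, 5, 3, 6, 2, 5, 4, 7, 3, 5, 6, 4, 2, 5, 6, 3, 4, 5, 7, 4],
    "creativity":  [3, 4, 2, 6, 3, 5, 4, 6, 2, 5, 5, 4, 3, 4, 6, 3, 5, 4, 7, 3],
    "knowledge":   [5, 5, 3, 6, 2, 4, 4, 7, 3, 6, 6, 4, 2, 5, 5, 3, 4, 6, 7, 4],
}
second_rater = {
    "feasibility": [5, 5, 2, 6, 3, 4, 4, 6, 3, 5, 6, 5, 2, 4, 6, 3, 5, 5, 7, 4],
    "creativity":  [3, 5, 2, 5, 3, 5, 3, 6, 3, 5, 6, 4, 2, 4, 6, 4, 5, 4, 6, 3],
    "knowledge":   [5, 6, 3, 6, 2, 4, 5, 7, 3, 5, 6, 4, 3, 5, 5, 3, 4, 6, 6, 4],
}

coefficients = []
for dimension, scores in instructor.items():
    r, p = pearsonr(scores, second_rater[dimension])
    coefficients.append(r)
    print(f"{dimension}: r = {r:.2f}, p = {p:.4f}")
print(f"mean correlation across dimensions = {sum(coefficients) / len(coefficients):.2f}")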

2.4.1. Feasibility
Feasibility refers to the extent to which the students' work could be practically used in real English classrooms. For example, in the text part, student A was scored 1 because he/she selected an English text which was too lengthy (e.g., more than 600 words) and too difficult (e.g., more than 20 unknown vocabulary words) for his/her peer students to comprehend, while student B was scored 7 because he/she chose an English article with more appropriate written material (a text of 400–600 words with 15–20 unknown vocabulary words).

2.4.2. Creativity
Creativity refers to the extent to which the students' work was designed with originality or new ideas. For example, in the exercise part, student A was scored 1 because he/she simply copied some irrelevant grammar quizzes from the Internet or other textbooks, while student B was scored 7 because he/she designed a set of questions to test applicable knowledge related to the chosen text.

2.4.3. Knowledge
Knowledge refers to the extent to which the students' work was accompanied by useful knowledge or information to be learned. For example, in the vocabulary part, student A was scored 1 because he/she did not give sample sentences to acquaint the readers with word usage, while student B was scored 7 because he/she provided sufficient sample sentences to elaborate a word's meaning and usage.

2.5. Interviews

Students were informed that after the activity they would talk about their experience with the online PA, and they were specifically asked what they had done in the PA learning environment. As the interview data would be coded into different conception and approach categories, which would then be used for statistical analyses, a large number of interviews was necessary. From the 163 participating students, 80 were randomly selected and invited to the interviews. However, only 62 of the selected students completed the interviews, because it was the last week of the students' 18-week semester. The main questions used in the interviews were as follows:
1. Based on your experiences, what do you think online PA is?
2. What do you think is the purpose of online PA for learning?
3. Why do you think the teacher used online PA as a way of learning, and, if possible, why would you use online PA as a way of learning in the future?
4. What sort of things did you do to engage in online PA? What did you do during the online PA activity?
5. When you were undertaking an online PA task, what strategies did you use and why did you use those strategies?
Questions 1–3 investigated students' conceptions of learning via online PA. Questions 4 and 5 investigated students' approaches to learning via online PA. All interviews (about 15–20 min each) were audio-recorded and subsequently transcribed verbatim in Chinese.


2.6. Method of analysis

The responses to the five interview questions were analyzed by two researchers following a phenomenographic approach, a qualitative methodology characterized by collecting data in the form of interviews, analyzing the data by identifying qualitative relationships, and displaying the outcomes in ordered categories. Conceptions of learning via online PA were analyzed according to the overall responses to Questions 1–3. Approaches to learning via online PA were analyzed based on the overall responses to Questions 4 and 5. It was noted that in many cases students' responses overlapped among the various categories. To systematically assess the qualitative variation across the range of students' learning conceptions and approaches, the idea of using "main" and "achieved" levels to categorize each student's overall response was adopted. Thus, responses to the interview questions were labeled both at the "main level" and at the "achieved level" within both the conception and the approach types. Take, for example, responses to Questions 4 and 5 (for approaches): all students' responses to Questions 4 and 5 were read carefully to get an idea of the possible variations in the approaches employed. As the researchers read and discussed the responses, key ideas began to emerge in the interview data. The key ideas were agreed between the researchers and were grouped into logically and hierarchically related categories, which form the outcome space for approaches to online PA, with classification letters from A to E. All the responses were read again by one of the researchers to see whether they fell within the categories identified, both at the "main level" and at the "achieved level." The main level was determined based on the key idea that appeared with the highest frequency in the student's response; the achieved level was based on the key idea that appeared, even once, and represented the highest level in the hierarchy of related categories derived from all of the students' responses. For instance, one of the students responded to Questions 4 and 5 that: "I kept in mind every single deadline for submitting my English assignments. I'd fail the course if I missed any of the tasks. It's a very novel idea for me to hand in the homework via the Internet. Besides, during the activity, I usually could get correct answers or more suggestions from peers for improvement in my work. I also tried my best to give helpful advice on peers' work in return." In this case, the "main level" was decided based on the idea associated with "meeting course requirements (Category A)" due to its higher frequency (a total of three times) among the response sentences. The idea appeared repeatedly, for example, in the sentences "I kept in mind every single deadline for submitting my English assignments", "I'd fail the course if I missed any of the tasks", and "It's a very novel idea for me to hand in the homework via the Internet." On the other hand, the "achieved level" was based on the idea that reflected the aim of the online PA, namely "carefully evaluating the work of different individuals (Category D)."


This idea underlies the sentence "I also tried my best to give helpful advice on peers' work in return." Thus, this idea represents the highest level achieved in the approach to learning via online PA as expressed by the student, even though it was not mentioned frequently. (Full details about the hierarchical categories of students' responses belonging to the main and achieved levels are shown later.)

The "main level" and "achieved level" represent the initial classification of students' responses regarding their conceptions of and approaches to learning via online PA. Previous research on students' conceptions of learning found that they vary across a number of different categories (Lee et al., 2008; Marton et al., 1993). The characterization of conceptions (and approaches) as "main" and "achieved" provides an alternative way of analyzing students' conceptions of learning (and approaches to learning), because it allows the identification of predominant conceptions as well as of the highest level these conceptions represent. In essence, the distinction between the main and achieved levels indicates the zone of proximal development (Vygotsky, 1978) of each student.

After the initial classification of student-interview responses into main and achieved levels, the data of 20 interviews were chosen at random and were categorized by another researcher independently, using the same outcome space and coding criteria. The percentage of agreement was then calculated to determine the reliability of the coding. The percentages of agreement regarding the main conception, achieved conception, main approach, and achieved approach were 80%, 85%, 85%, and 95%, respectively, for the initial classifications. These percentages increased to 95%, 95%, 95%, and 100%, respectively, after consultation between the two coders. Finally, for the testing of the hypotheses, the Pearson chi-square test was performed to examine the associations between categories. Also, individual t-tests were carried out to examine differences in students' progress in the assignments between categories. Effect size (Cohen's d) was also computed to indicate the practical significance of the differences.
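As a small illustration of how such percentages of agreement can be obtained (the category labels below are hypothetical, not the study's actual codes):

def percent_agreement(coder1, coder2):
    # Proportion of interviews assigned the same category by both coders.
    if len(coder1) != len(coder2):
        raise ValueError("Both coders must rate the same interviews.")
    matches = sum(a == b for a, b in zip(coder1, coder2))
    return 100.0 * matches / len(coder1)

# Hypothetical main-conception codes (Categories A-F) for the 20 interviews.
coder1 = list("ABBCCADDEFBCCABEDDCF")
coder2 = list("ABBCCBDDEFBCCABEDACF")
print(f"agreement = {percent_agreement(coder1, coder2):.0f}%")  # 90% for these labels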

3. Results

3.1. Categories of conceptions of learning via online PA

Table 1 illustrates the qualitative variations in students' conceptions of learning via online PA. Analysis of the responses suggested an initial framework of six hierarchically related categories, namely categories in a ranked series that capture students' different conceptions of online PA. Each category is described using the words expressed by the students as they described their experiences. Also, following Prosser and Trigwell (1999), we characterized the conceptions as "fragmented" or "cohesive." Each category of responses is assigned a classification letter from A, which is the most fragmented conception, to F, which is the most cohesive conception. For example, Category A does not suggest any association between conceptions of online PA and the learning promoted by the learning environment; it only concerns technical practice in the activity. In contrast, Category F reveals more associations between conceptions of online PA and the learning promoted by the learning environment, and involves awareness of critical thinking. The categories in Table 1 are also accompanied by a descriptive phrase that depicts the key aspects of the category, and by sample quotations that best represent the meaning of the category.

3.2. Categories of approaches to learning via online PA

Table 2 presents the qualitative variations in approaches to learning via online PA. Students' responses regarding the approaches to online PA were structured into five hierarchically related categories. They were characterized in terms of surface and deep approaches (Ellis, Steed, & Applebee, 2006; Marton, 1983). In surface approaches, there is an orientation toward reproduction or memorization of knowledge, whereas in deep approaches, there is an orientation toward understanding, critical thinking, and active learning. As in Table 1, each group of responses is assigned a classification letter from A (the category reflecting the most surface approach) to E (the category reflecting the deepest approach). For example, Category A separates the online PA activity from the learning outcomes it was intended to promote. On the other hand, Category E shows engagement in activities that promote one of the key purposes or ideas underlying the use of the online PA activity. The categories in Table 2 are followed by a descriptive phrase that depicts the key aspects of the category and by sample quotations that best represent the meaning of the category.

3.3. Coding of categories and distributions of variation in conceptions and approaches

Table 3 shows the distribution of "main" and "achieved" levels in conceptions of and approaches to learning via online PA. As described earlier, the main level was determined based on the key idea which appeared with the highest frequency in a student's response, while the achieved level was determined based on the key idea which appeared, even once, at the highest level. Table 3 presents the cross-tabulation of responses classified in each of the categories of the conceptions of and approaches to learning via online PA (along with the type of conceptions and approaches they reflect) at the main and achieved levels. As can be seen, 63% (n = 39) of responses categorized as main conceptions were classified as fragmented, and 37% (n = 23) as cohesive. As for main approaches, 67% (n = 42) of responses were classified as surface and 33% (n = 20) as deep. On the other hand, 29% (n = 18) of responses belonging to the achieved conceptions were classified as fragmented and 71% (n = 44) as cohesive. As for achieved approaches, 39% (n = 24) of responses were classified as surface and 61% (n = 38) as deep. It should be noted that at the main level most students understood learning via online PA in a fragmented manner (63%, n = 39) and followed a surface approach (67%, n = 42).


Table 1
Categories of conceptions of learning via online PA.

Category A. Online PA is seen as a drill for some related computer skills.
Examples: "…I think it's a very good way to practice some basic computer skills, such as copying, pasting, uploading and downloading." "…It's not like paper-and-pencil work, all you have to do is typing, browsing and surfing the Internet."

Category B. Online PA is seen as a procedure of submitting assignments.
Examples: "…All I remember is that I have to pay attention to every deadline set for submitting my assignment." "…It's interesting to me. I haven't had the experience of handing in any homework via the Internet." "…An innovative way to collect students' homework via the Internet, I think it'll save a lot of time for the teacher."

Category C. Online PA is seen as a platform for learning new information.
Examples: "…When I am collecting useful materials on the Internet, I learn about new information at the same time." "…The topics and articles included in peers' assignments usually bring me new knowledge."

Category D. Online PA is seen as a channel to exchange ideas.
Examples: "…Some suggestions from the peer reviewers are quite useful while others are not. But they do help me think in another way. I usually explain my ideas to them in the following run, though I don't know who they are."

Category E. Online PA is seen as a way to understand ideas from different perspectives.
Examples: "…I think the teacher wants me to have a look at what my classmates are doing and thinking in their assignment." "…I try to figure out what kind of message is conveyed… In this way I get to know that not everyone shares the same point of view."

Category F. Online PA is seen as a training of critical thinking.
Examples: "It's kind of exciting… on the one hand, I can see how my peers comment on my work; on the other hand, I am responsible for criticizing others' work." "I feel like being a teacher. I have to make judgments on peers' work… it is a pressured job for me 'cause I have to find out strengths and weaknesses in peers' work before giving my suggestions."

Table 2
Categories of approaches to learning via online PA.

Category A. I engage in online PA to meet course requirements.
Examples: "…I try not to miss the deadline of every task in order to complete the whole assignment given by the teacher." "I am afraid that I would fail this course if I don't get involved in this online activity."

Category B. I engage in online PA to collect extra information outside physical classrooms.
Examples: "Compared to lectures, it's more interesting… I hate boring lectures in classrooms… I can learn some useful English vocabulary while collecting materials on the Internet." "…I like to learn in this way 'cause I always can find quite a lot of extra information about weight control in peers' work."

Category C. I engage in online PA to get solutions for correcting possible mistakes.
Examples: "Sometimes I have no idea whether I'm doing everything right. Once I uploaded my assignment, I started to look forward to peers' responses." "…That's one of the best ways to get correct answers… I could get more suggestions in this way."

Category D. I engage in online PA to carefully evaluate the work of different individuals.
Examples: "…I am responsible for giving opinions to my peers, just like my teachers… I have to tell who has made efforts in his or her assignment to give him/her a higher score." "…assessing peers' work is the most difficult job for me 'cause I need to spend lots of time figuring out what is good and what is not good in peers' work."

Category E. I engage in online PA to reflect on my work using peer comments.
Examples: "…I think that peers would understand me more than the teachers do… some of their ideas really inspire me a lot… it's a good way to think more about my own work." "…different suggestions come from different peers… I gather all the opinions to evaluate and improve my own work."


Table 3
The distribution of variation in conceptions of and approaches to learning (N = 62).

Type         Category   Main        Achieved
Conceptions of learning via online PA
Fragmented   A          7 (11%)     1 (2%)
             B          14 (23%)    7 (11%)
             C          18 (29%)    10 (16%)
Cohesive     D          9 (15%)     10 (16%)
             E          12 (19%)    27 (44%)
             F          2 (3%)      7 (11%)
             Total      62 (100%)   62 (100%)
Approaches to learning via online PA
Surface      A          15 (24%)    4 (7%)
             B          20 (32%)    12 (19%)
             C          7 (11%)     8 (13%)
Deep         D          17 (28%)    20 (32%)
             E          3 (5%)      18 (29%)
             Total      62 (100%)   62 (100%)

Interestingly, the situation was reversed at the achieved level, since most students participated in the activity having a cohesive conception (71%, n = 44) and a deep approach (61%, n = 38). While one might be disappointed by the fact that students tended to have fragmented learning conceptions and to adopt surface learning approaches at the main level, it is encouraging, at the same time, that at the achieved level students had mainly cohesive conceptions and deep approaches. This suggests that students have the potential for improving their learning. However, with the main level representing students' dominant ideas, the high percentages of fragmented learning conceptions and surface learning approaches deserve more attention among educators who are interested in online PA.

3.4. Association between the conceptions of and the approaches to learning via online PA

The Pearson chi-square test was performed to identify the association between conceptions of and approaches to learning via online PA at both the main level and the achieved level (see Table 4). It was found that at the main level there was a significant association between conceptions and approaches, χ²(1, N = 62) = 17.46, p < .001; namely, students with fragmented conceptions tended to adopt surface approaches (56%) while those with cohesive conceptions chose deep approaches (23%). A similar result was obtained at the achieved level, χ²(1, N = 62) = 31.62, p < .001: students with fragmented conceptions tended to adopt a surface approach (27%) and those with cohesive conceptions chose deep approaches (60%). Therefore, it can be concluded that conceptions and approaches at both the main and the achieved level are highly associated.

Table 4
Cross-tabulation of conceptions of learning and approaches to learning at the main level and achieved level (N = 62).

Conceptions/approaches   Fragmented   Cohesive
Main level
  Surface                35           8
  Deep                   5            14
Achieved level
  Surface                17           5
  Deep                   3            37
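The associations reported above can be verified directly from the counts in Table 4. The following sketch (an illustration only, not the authors' analysis script) reproduces the uncorrected Pearson chi-square statistics within rounding:

from scipy.stats import chi2_contingency

# Counts from Table 4: rows are surface and deep approaches,
# columns are fragmented and cohesive conceptions.
main_level = [[35, 8], [5, 14]]
achieved_level = [[17, 5], [3, 37]]

for label, table in (("main", main_level), ("achieved", achieved_level)):
    chi2, p, dof, expected = chi2_contingency(table, correction=False)
    print(f"{label} level: chi2({dof}, N = 62) = {chi2:.2f}, p = {p:.5f}")
# Prints approximately 17.47 and 31.62, in line with the values reported in the text.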

3.5. Learning outcomes of groups with different conceptions and approaches

Table 5 shows students' performance scores on the dimensions of feasibility, creativity, and knowledge in each of the three rounds. In Table 5, conceptions are classified as fragmented and cohesive and approaches as surface and deep, within the main and achieved levels. As this study used students' progress in the online PA learning environment as an indicator of their learning outcomes, it was important to examine students' initial performance carefully, to ensure that no extreme scores would be misinterpreted as progress or non-progress for the different conception or approach groups. A series of t-tests was performed on the first-round scores of each dimension between students of different conception or approach groups. All tests were conducted at the .05 level of significance. At the main level, there was no significant difference between students with fragmented conceptions and those with cohesive conceptions in the scores of feasibility, t(60) = .772, p = .097; of creativity, t(60) = 1.11, p = .136; or of knowledge, t(60) = .034, p = .348. At the achieved level, there was also no significant difference between students with fragmented conceptions and those with cohesive conceptions in the scores of feasibility, t(60) = 1.238, p = .517; of creativity, t(60) = 1.020, p = .601; or of knowledge, t(60) = 1.188, p = .412. Likewise, at the main level, there was no significant difference between students with surface approaches and those with deep approaches in the scores of feasibility, t(60) = .858, p = .095; of creativity, t(60) = .085, p = .471; or of knowledge, t(60) = .623, p = .098. At the achieved level of the different approaches, there was also no significant difference in the scores of feasibility, t(60) = .121, p = .101; of creativity, t(60) = .623, p = .229; or of knowledge, t(60) = .811, p = .154. Simply stated, whether at the main or the achieved level, in the beginning stage of the online peer assessment no significant differences were found between students with fragmented conceptions and those with cohesive conceptions, or between students with surface approaches and those with deep approaches, which indicates that students from different conception or approach groups started with a similar quality of performance in the online PA learning environment.

Table 5
Means (and SD) of performance scores in the three rounds as a function of level and type of conceptions of and approaches to learning via online PA.
[Table 5 reports the mean (SD) scores for feasibility, creativity, knowledge, and the overall score in the 1st, 2nd, and 3rd rounds, for the fragmented and cohesive conception groups and the surface and deep approach groups at the main and achieved levels, as well as for the total sample (N = 62).]

Tables 6 and 7 show students' progress, that is, their gain scores on the dimensions of feasibility, creativity, and knowledge. Students are categorized according to the type of conception and approach. Gain scores computed as the second-round scores minus the first-round scores were marked as "early"; the third-round scores minus the second-round scores were marked as "later"; and the third-round scores minus the first-round scores were marked as "total." The overall gain in the online PA learning environment was obtained by adding the total gains from the three dimensions. A series of t-tests was performed for each gain score within each of the three dimensions to assess whether the means of the different conception and approach groups were statistically different from each other.
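A minimal sketch of these computations is given below, assuming a standard pooled-variance t-test and a pooled-SD Cohen's d; the scores are invented placeholders rather than the study's data:

import numpy as np
from scipy import stats

def gain_scores(round1, round2, round3):
    # Early, later, and total gains for one dimension.
    r1, r2, r3 = (np.asarray(r, dtype=float) for r in (round1, round2, round3))
    return r2 - r1, r3 - r2, r3 - r1

def cohens_d(group1, group2):
    # Cohen's d based on the pooled standard deviation.
    g1, g2 = np.asarray(group1, dtype=float), np.asarray(group2, dtype=float)
    n1, n2 = len(g1), len(g2)
    pooled_var = ((n1 - 1) * g1.var(ddof=1) + (n2 - 1) * g2.var(ddof=1)) / (n1 + n2 - 2)
    return (g2.mean() - g1.mean()) / np.sqrt(pooled_var)

# Invented 1-7 feasibility scores for the two conception groups in each round.
rng = np.random.default_rng(1)
fragmented = [rng.integers(2, 7, size=39) for _ in range(3)]
cohesive = [rng.integers(3, 8, size=23) for _ in range(3)]

frag_early, _, _ = gain_scores(*fragmented)
coh_early, _, _ = gain_scores(*cohesive)

t, p = stats.ttest_ind(frag_early, coh_early)  # pooled-variance t-test, df = 39 + 23 - 2 = 60
print(f"t(60) = {t:.2f}, p = {p:.3f}, Cohen's d = {cohens_d(frag_early, coh_early):.2f}")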

In Table 6, the gain scores of the different conception groups toward online PA are presented. Students with cohesive conceptions at the "main" level were more likely to make progress than those with fragmented conceptions in the early stages of online PA on the dimensions of feasibility, t(60) = 4.02, p < .001, Cohen's d = 1.03; creativity, t(60) = 5.35, p < .001, Cohen's d = 1.4; and knowledge, t(60) = 3.93, p < .001, Cohen's d = 1.06. For example, in the dimension of feasibility, the mean gain score of students with fragmented conceptions was .10 (SD = .74) while that of students with cohesive conceptions was .95 (SD = .90). In the dimension of creativity, the mean gain score of students with fragmented conceptions was .40 (SD = .78) while that of students with cohesive conceptions was 1.55 (SD = .86). In the dimension of knowledge, the mean gain score of students with fragmented conceptions was .20 (SD = .82) while that of students with cohesive conceptions was 1.05 (SD = .79). The same pattern of results was found at the "achieved" level in the early stages on the dimensions of feasibility, t(60) = 3.91, p < .001, Cohen's d = 1.19; creativity, t(60) = 4.90, p < .001, Cohen's d = 1.41; and knowledge, t(60) = 3.77, p < .001, Cohen's d = 1.10.

In Table 7, the gain scores of the different approach groups in online PA are presented. Students with deep learning approaches at the main level were apt to make more progress than those with surface learning approaches in the early stages on the dimensions of feasibility, t(60) = 2.51, p < .05, Cohen's d = .68; creativity, t(60) = 3.63, p < .01, Cohen's d = .99; and knowledge, t(60) = 2.18, p < .05, Cohen's d = .61. For example, in the dimension of feasibility, the mean gain score of students with surface approaches was .21 (SD = .84) while that of students with deep approaches was .80 (SD = .89). In the dimension of creativity, the mean gain score of students with surface approaches was .52 (SD = .89) while that of students with deep approaches was 1.40 (SD = .88). In the dimension of knowledge, the mean gain score of students with surface approaches was .33 (SD = .90) while that of students with deep approaches was .85 (SD = .81). The same pattern of results at the "achieved" level could also be observed in the early stages on the dimensions of feasibility, t(60) = 4.81, p < .001, Cohen's d = 1.17; creativity, t(60) = 4.39, p < .001, Cohen's d = 1.15; and knowledge, t(60) = 3.46, p < .001, Cohen's d = .93.

Table 6
Means (and SD) of gain scores in the three dimensions of conceptions of learning via online PA as a function of level, type, and stage.
[Table 6 reports, for the fragmented and cohesive conception groups at the main and achieved levels, the mean (SD) early, later, and total gains on feasibility, creativity, and knowledge and the overall gains, together with the corresponding two-tailed t values and Cohen's d.]
*p < .05; **p < .01; ***p < .001. Cohen's d is the effect size of the practical significance between groups. The practical significance is large when Cohen's d is larger than .80, medium when it is between .50 and .80, small to medium when it is between .20 and .50, and small when it is smaller than .20.

Overall, the results from the above analyses showed that students with cohesive learning conceptions tended to benefit more in their learning outcomes than those with fragmented conceptions, especially in the early stage. Similarly, students with deep learning approaches were likely to improve more than those with surface learning approaches. However, when learning conceptions and learning approaches were compared with regard to their effect sizes, at both the "main" and "achieved" levels almost all of the effect size values in Table 6 (for conceptions) were larger than those in Table 7 (for approaches). For example, the effect size of the overall gain for conceptions of learning at the "main" level was 1.34 in the early stage, .33 in the later stage, and 1.28 in total (see Table 6), whereas that for approaches to learning at the "main" level was .86 in the early stage, .19 in the later stage, and .62 in total (see Table 7). It is, therefore, implied that the importance of learning approaches was not as dominant as that of learning conceptions.

4. Discussion

The present study investigated the learning experience of students in an online PA learning environment. The analysis of qualitative variations in conceptions of and approaches to learning via online PA provided a window for the researchers to investigate learning from the students' perspective. Moreover, the main and achieved levels of students' responses were identified, showing students' potential within the hierarchy of conceptions and approaches. Finally, the quantitative analysis compared the learning outcomes of the different conception and approach groups.

Table 7
Means (and SD) of gain scores in the three dimensions of approaches to learning via online PA as a function of level, type, and stage.
[Table 7 reports, for the surface and deep approach groups at the main and achieved levels, the mean (SD) early, later, and total gains on feasibility, creativity, and knowledge and the overall gains, together with the corresponding two-tailed t values and Cohen's d.]
*p < .05; **p < .01; ***p < .001. Cohen's d is the effect size of the practical significance between groups. The practical significance is large when Cohen's d is larger than .80, medium when it is between .50 and .80, small to medium when it is between .20 and .50, and small when it is smaller than .20.


In the present study, the conceptions of learning via online PA ranged from "a drill for some related computer skills" to "a training of critical thinking." The approaches to learning via online PA ranged from "meeting course requirements" to "reflecting on one's own work through peer comments." As regards Hypothesis 1, we found that strong associations exist between conceptions of and approaches to learning via online PA at both the main and achieved levels. The associations between conceptions and approaches were similar to those found in previous studies (Ellis & Calvo, 2006; Ellis, Goodyear, Calvo, et al., 2008; Ellis, Goodyear, et al., 2006; Ellis, Steed, et al., 2006; Lee et al., 2008); that is, surface approaches tended to be associated with fragmented conceptions of the learning environment whereas, on the contrary, deep approaches tended to be associated with cohesive conceptions. According to Ellis, Goodyear, et al. (2006), the interaction between conceptions and approaches could provide a possible way for teachers to promote students' learning at higher levels. Considering that students may have a zone of proximal development in their conceptions and approaches, as shown by the levels of their responses, it can be concluded that in an online PA learning environment instructors could help students develop cohesive conceptions of learning if they want to encourage them to adopt deeper approaches to learning.

The importance of conceptions of and approaches to learning for students' progress in the online PA activity was also investigated. Taken together, the findings of the present study are consistent with Hypotheses 2 and 3 as well as with the conclusion of previous research that learning conceptions and approaches have an impact on learning outcomes (Dart et al., 2000; Purdie et al., 1996; Tsai, 2004). Students with more mature conceptions, namely cohesive conceptions, made greater progress in their assignment than those with fragmented conceptions. Likewise, students with a deep approach tended to outperform those with a surface approach. These gains were observed mainly in the early stages of the online PA activity. However, in contrast to Hypothesis 4, conceptions of online PA had a stronger impact than approaches to online PA when their effect sizes were compared. In other words, approaches to learning seemed to be less influential than conceptions of learning on the learning outcomes in this study, which is similar to the finding of Law, Chan, and Sachs (2008). In their study, Law et al. (2008) reported that in a text comprehension task, children's beliefs about learning contributed to their performance more than their learning strategies. These findings seem to contradict the conclusion reached by previous research that learning outcomes are mainly affected by approaches, which, in their turn, are guided by learning conceptions (Chin & Brown, 2000).

It is also worth noting that the innovative use of "main" and "achieved" levels in coding students' conceptions and approaches was introduced in this study. For one thing, it helped to systematically assess the qualitative variation across the range of students' learning conceptions and approaches. For another, it helped to better categorize the interview data, because there are always situations in which students' ideas overlap among the categories in the materials to be analyzed.


The difference between "main" and "achieved" levels can also suggest the possible "zone of proximal development" for educators. Finally, the decision to analyze students' gain scores in "early" and "late" stages was helpful because it showed that, already from the beginning of the assignment, students' performance was differentiated depending on their conceptions of or approaches to learning. It is interesting that gains were significant at the early stage of learning rather than at the later stage. However, the online PA activity lasted only two months, which is a rather short period for detecting long-term differences in rates of progress between the different groups of students; therefore, it is our hope to conduct similar activities on a long-term basis in our future work. Besides, as Wen and Tsai (2006) pointed out, investigation of students' attitudes toward online PA might provide a further lens on the factors that affect learning in the context of online PA activity. Therefore, future research should also focus on the interplay between students' attitudes toward online PA, their conceptions of learning, their approaches to learning via online PA, and their subsequent learning outcomes. While the results of this study were significant, we are only at the beginning of understanding the complex relationships among students' conceptions of learning, approaches to learning, and their learning outcomes via online PA. Much more evidence is needed to support the findings of the present study.

Acknowledgments

Funding of this research work was supported by the National Science Council, Taiwan, under grant numbers 94-2511-S-009-003, 95-2511-S-011-002 and 96-2511-S-011-001.

References

Åkerlind, G. (2005). Variation and commonality in phenomenographic research methods. Higher Education Research & Development, 24, 321–334.
Barak, M., & Rafaeli, S. (2004). On-line question-posing and peer assessment as means for web-based knowledge sharing in learning. International Journal of Human-Computer Studies, 61, 84–103.
Benson, P., & Lor, W. (1999). Conceptions of language and language learning. System, 27, 459–472.
Biggs, J. B. (1993). What do inventories of students' learning processes really measure? A theoretical view and clarification. British Journal of Educational Psychology, 63, 3–19.
Buehl, M. M., & Alexander, P. A. (2001). Beliefs about academic knowledge. Educational Psychology Review, 13, 325–351.
Butler, S. A., & Hodge, S. R. (2001). Enhancing student trust through peer assessment in physical education. Physical Educator, 58, 30–41.
Cano, F. (2005). Consonance and dissonance in students' learning experience. Learning and Instruction, 15, 201–223.
Chan, C. K. K., & Law, Y. K. (2003). Metacognitive beliefs and strategies in reading comprehension. In C. McBride, & C. H. Chen (Eds.), Reading development in Chinese children (pp. 171–182). Westport, CT: Praeger.
Chan, C. K. K., & Sachs, J. (2001). Children's beliefs about learning and understanding of science texts. Contemporary Educational Psychology, 26, 192–210.
Chin, C., & Brown, D. E. (2000). Learning in science: a comparison of deep and surface approaches. Journal of Research in Science Teaching, 37, 109–138.


Cho, K., & Schunn, C. D. (2007). Scaffolded writing and rewriting in the discipline: a web-based reciprocal peer review system. Computers & Education, 48, 409–426.
Dart, B. C., Burnett, P. C., Purdie, N., Boulton-Lewis, G., Campbell, J., & Smith, D. (2000). Students' conceptions of learning, the classroom environment, and approaches to learning. Journal of Educational Research, 93, 262–270.
Davis, P. (2000). Computerized peer assessment. Innovations in Education and Teaching International, 37, 346–355.
Edwards, S. L., & Bruce, C. S. (2006). Panning for gold: understanding students' information searching experiences. In C. S. Bruce, G. Mohay, G. Smith, I. Stoodley, & R. Tweedale (Eds.), Transforming IT education: Promoting a culture of excellence (pp. 351–369). Santa Rosa, CA: Informing Science Press.
Ellis, R. A., & Calvo, R. (2006). Discontinuities in university student experiences of learning through discussions. British Journal of Educational Technology, 37, 55–68.
Ellis, R. A., Goodyear, P., Brillant, M., & Prosser, M. (2008). Student experiences of problem-based learning in pharmacy: conceptions of learning, approaches to learning and the integration of face-to-face and on-line activities. Advances in Health Sciences Education, 13, 675–692.
Ellis, R. A., Goodyear, P., Calvo, R. A., & Prosser, M. (2008). Engineering students' conceptions of and approaches to learning through discussions in face-to-face and online contexts. Learning and Instruction, 18, 267–282.
Ellis, R. A., Goodyear, P., Prosser, M., & Ohara, A. (2006). How and what university students learn through online and face-to-face discussion: conceptions, intentions and approaches. Journal of Computer Assisted Learning, 22, 244–256.
Ellis, R. A., Steed, A. F., & Applebee, A. C. (2006). Teacher conceptions of blended learning, blended teaching and associations with approaches to design. Australasian Journal of Educational Technology, 22(3), 312–335.
Falchikov, N., & Goldfinch, J. (2000). Student peer assessment in higher education: a meta-analysis comparing peer and teacher marks. Review of Educational Research, 70, 287–322.
Freeman, M., & McKenzie, J. (2002). SPARK, a confidential web-based template for self and peer assessment of student teamwork: benefits of evaluating across different subjects. British Journal of Educational Technology, 33, 551–569.
Hofer, B. K. (2000). Dimensionality and disciplinary differences in personal epistemology. Contemporary Educational Psychology, 25, 378–405.
Hofer, B. K., & Pintrich, P. R. (1997). The development of epistemological theories: beliefs about knowledge and knowing and their relation to learning. Review of Educational Research, 67, 88–140.
Law, Y.-K., Chan, C. K. K., & Sachs, J. (2008). Beliefs about learning, self-regulated strategies and text comprehension among Chinese children. British Journal of Educational Psychology, 78, 51–73.
Lee, M.-H., Johanson, R. E., & Tsai, C.-C. (2008). Exploring Taiwanese high school students' conceptions of and approaches to learning science through a structural equation modeling analysis. Science Education, 92, 191–220.
LeMare, L. J., & Rubin, K. H. (1987). Perspective taking and peer interaction: structural and developmental analysis. Child Development, 58, 306–315.
Lin, S. S. J., Liu, E. Z.-F., & Yuan, S.-M. (2001). Web-based peer assessment: feedback for students with various thinking-styles. Journal of Computer Assisted Learning, 17, 420–432.
Lonka, K., Joram, E., & Bryson, M. (1996). Conceptions of learning and knowledge: does training make a difference? Contemporary Educational Psychology, 21, 240–260.
Makoe, M., Richardson, J. T. E., & Price, L. (2008). Conceptions of learning in adult students embarking on distance education. Higher Education, 55, 303–320.
Marshall, D., Summer, M., & Woolnough, B. (1999). Students' conceptions of learning in an engineering context. Higher Education, 38, 291–309.
Marton, F. (1983). Beyond individual differences. Educational Psychology, 3, 289–303.
Marton, F. (1986). Phenomenography. A research approach investigating different understandings of reality. Journal of Thought, 21, 28–49.
Marton, F., Dall'Alba, G., & Beaty, E. (1993). Conceptions of learning. International Journal of Educational Research, 19, 277–299.
Marton, F., & Säljö, R. (1976a). On qualitative differences in learning. I. Outcome and process. British Journal of Educational Psychology, 46, 4–11.

Marton, F., & Säljö, R. (1976b). On qualitative differences in learning. II. Outcome as a function of the learner's conception of the task. British Journal of Educational Psychology, 46, 115–127.
Mathews, B. P. (1994). Assessing individual contributions. Experience of peer evaluation in major group projects. British Journal of Educational Technology, 25, 19–28.
McConnell, D. (2002). The experience of collaborative assessment in e-learning. Studies in Continuing Education, 24, 73–102.
McGourty, J. (2000). Using multisource feedback in the classroom: a computer-based approach. IEEE Transactions on Education, 43, 120–124.
Minasian-Batmanian, L. C., Lingard, J., & Prosser, M. (2006). Variation in student reflections on their conceptions of and approaches to learning biochemistry in a first-year health sciences' service subject. International Journal of Science Education, 28, 1887–1904.
Mockford, C. D. (1994). The use of peer group review in the assessment of project work in higher education. Mentoring and Tutoring, 2, 45–52.
Mowl, G., & Pain, R. (1995). Using self and peer assessment to improve students' essay writing. A case-study from geography. Innovations in Education and Training International, 32, 324–335.
Prosser, M., & Trigwell, K. (1999). Understanding learning and teaching: The experience in higher education. Buckingham, UK: Society for Research into Higher Education/Open University Press.
Prosser, M., Trigwell, K., & Taylor, P. (1994). A phenomenographic study of academics' conceptions of science learning and teaching. Learning and Instruction, 4, 217–231.
Purdie, N., Hattie, J., & Douglas, G. (1996). Student conceptions of learning and their use of self-regulated learning strategies: a cross-cultural comparison. Journal of Educational Psychology, 88, 87–100.
Rubin, L. (2002). "I just think maybe you could": peer critiquing through online conversations. Teaching English in the Two-Year College, 29, 382–392.
Säljö, R. (1979). Learning in the learner's perspective, 1: Some commonsense conceptions. Gothenburg, Sweden: Institute of Education, University of Gothenburg.
Scheuer, N., de la Cruz, M., Pozo, J. I., Huarte, M. F., & Sola, G. (2006). The mind is not a black box: children's ideas about the writing process. Learning and Instruction, 16, 72–85.
Sluijsmans, D., Dochy, F., & Moerkerke, G. (1999). Creating a learning environment by using self-, peer- and co-assessment. Learning Environments Research, 1, 293–319.
Sung, Y.-T., Chang, K.-E., Chiou, S.-K., & Hou, H.-T. (2005). The design and application of a web-based self- and peer-assessment system. Computers & Education, 45, 187–202.
Topping, K. (1998). Peer assessment between students in colleges and universities. Review of Educational Research, 68, 249–276.
Topping, K. J., Smith, E. F., Swanson, I., & Elliot, A. (2000). Formative peer assessment of academic writing between postgraduate students. Assessment and Evaluation in Higher Education, 25, 149–169.
Tsai, C.-C. (2004). Conceptions of learning science among high school students in Taiwan: a phenomenographic analysis. International Journal of Science Education, 26, 1733–1750.
Tsai, C.-C. (2006). Biological knowledge is more tentative than physics knowledge: Taiwan high school adolescents' views about the nature of biology and physics. Adolescence, 41, 691–703.
Tsai, C.-C., & Kuo, P. C. (2008). Cram school students' conceptions of learning and learning science in Taiwan. International Journal of Science Education, 30, 353–375.
Tsai, C.-C., & Liang, C.-C. The development of science activities via on-line peer assessment: the role of scientific epistemological views. Instructional Science, in press. doi:10.1007/s11251-007-9047-0.
Tsai, C.-C., Lin, S. S. J., & Yuan, S.-M. (2002). Developing science activities through a network peer assessment system. Computers & Education, 38, 241–252.
Tsai, C.-C., Liu, E. Z.-F., Lin, S. S. J., & Yuan, S.-M. (2001). A networked peer assessment system based on a Vee heuristic. Innovations in Education and Teaching International, 38, 220–230.

Tseng, S. C., & Tsai, C.-C. (2007). On-line peer assessment and the role of the peer feedback: a study of high school computer course. Computers & Education, 49, 1161–1174.
Vermunt, J. D., & Vermetten, Y. J. (2004). Patterns in student learning: relationships between learning strategies, conceptions of learning, and learning orientations. Educational Psychology Review, 16, 359–384.
Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Cambridge, MA: Harvard University Press.

Wen, M. L., & Tsai, C.-C. (2006). University students' perceptions of and attitudes toward (online) peer assessment. Higher Education, 51, 27–44.
Zevenbergen, R. (2001). Peer assessment of student constructed posters: assessment alternatives in pre-service mathematics education. Journal of Mathematics Teacher Education, 4, 95–113.
Zimmerman, B. J. (1990). Self-regulated learning and academic achievement: an overview. Educational Psychologist, 25, 3–17.
