INT. J. SCI. EDUC., 2000, VOL. 22, NO. 4, 385-398

The development of an instrument for assessing students' perceptions of teachers' knowledge

Hsiao-Lin Tuan, Huey-Por Chang and Kuo-Hua Wang, National Changhua University of Education, Taiwan, and David F. Treagust, Curtin University of Technology, Perth, Australia; e-mail: [email protected]

The purpose of this study was to describe the development and validation of an instrument on Student Perceptions of Teachers' Knowledge (SPOTK) in relation to their pedagogy. Features of teachers' knowledge from the research literature related to instruction, representation, subject matter knowledge, and knowledge of how to assess students' understanding were used to generate categories in the SPOTK. The result of a pilot study with 634 Taiwanese junior high school students showed high reliability of the scales and a good factor structure, and provided suggestions to delete weak items. In the main study, for which nine to ten items under each category were generated, making a total of 37 items in the SPOTK, the instrument was administered to 1879 Taiwanese and 1081 Australian junior high school students varying in grade, sex and ability level. Reliability and validity measures of the instrument were established based on Cronbach alpha and factor analysis. After the validation process, 28 items remained in the final instrument and reliabilities of the scales ranged from 0.82 to 0.97. Comment is made about the differences between Australian and Taiwanese students' responses, and suggestions are made for using the instrument in future research.

Introduction

The teacher's role in the classroom includes creating a learning environment that promotes students' cognitive and affective learning outcomes. Indeed, learning is influenced by many educational factors, including students' perceptions of the appropriateness of the learning environment (Fraser 1994, McRobbie and Fraser 1993); teaching and instructional styles; the examples provided; the teaching model used in the design of lessons; and the difficulty level of the academic tasks (Bull and Solity 1987). Many of the above factors are similar to research on the essential knowledge that teachers need to have (Shulman 1986, 1987). The research literature on teachers' knowledge, pedagogical content knowledge (PCK), and students' perceptions of teachers' knowledge is now reviewed.

Teachers' knowledge

Teachers' knowledge has been conceptualized by researchers in terms of beliefs (Clark and Peterson 1986), practical theories (Sanders and McCutcheon 1986), craft knowledge (Brown and McIntyre 1986), personal knowledge (Connelly and Clandinin 1984), and knowledge-in-action (Schön 1983).


Further, teachers' knowledge stored as teachers' cognition includes metaphors, practical knowledge, beliefs, images and events (Carter and Gonzalez 1993). Fenstermacher (1994) categorized knowledge into formal and practical knowledge, the former deriving from process-product studies on effective teaching, the latter being practical, personal, situated, local, relational, and tacit knowledge. Fenstermacher considered that both kinds of knowledge were important in understanding how teachers learn to teach. His claim also implies that both teachers' teaching performance and their thinking about teaching can be labelled as teachers' knowledge. Among the various kinds of teachers' knowledge, Shulman and his co-workers provided a substantial and essential framework for a knowledge base of teaching. This framework included content knowledge, general pedagogical knowledge, curriculum knowledge, pedagogical content knowledge, knowledge of learners and their characteristics, knowledge of educational context, and knowledge of educational ends, purposes and values (Shulman 1987, Wilson et al. 1987).

Pedagogical content knowledge

Research has mainly focused on identifying the nature and substance of pedagogical content knowledge, examining the difference between expert and novice teachers' PCK, and suggesting how to facilitate pre-service and in-service teachers' PCK (Tuan 1996). However, no studies on PCK appear to have examined students' perceptions of teachers' knowledge. In summarizing how different researchers capture the substance of PCK, Tuan (1996) identified a number of features. Grossman (1988), for example, addressed the importance of curriculum knowledge; Geddis (1993) believed that conceptual change teaching strategies were the essence of PCK; Tamir (1988) addressed the importance of assessment knowledge within PCK; and McDiarmid et al. (1989) advocated content representation. Reynolds' (1992) summary of the literature on PCK indicated that it consists of teaching methods, content organization, knowledge of students' content learning, content representation, and assessment knowledge. Cochran et al. (1993) used the term 'pedagogical content knowing' instead of pedagogical content knowledge, which implies that teachers' pedagogical content knowledge is an actively evolving knowledge of pedagogy, content, students, and context. Based on the above researchers' definitions, Tuan (1996) claimed that pedagogical content knowledge integrates seven domains of knowledge: pedagogical knowledge; representational knowledge; subject matter knowledge; curriculum knowledge; assessment knowledge; student knowledge; and context and social knowledge. To some extent, teachers' knowledge and the substance of pedagogical content knowledge cover similar domains of knowledge, but PCK has more relevance to content knowledge.

Students' perceptions of teachers' knowledge

Knight and Waxman (1991) advocated the importance of investigating students' perceptions of teachers' knowledge because such perceptions provide rich information for understanding students' cognition and classroom processes. These authors also pointed out that although students' perceptions might not be consistent with the reality generated by outside observers, they can present the range of reality for individual students and subgroups in the classroom. Using students' perceptions can enable researchers and teachers to appreciate the perceived instructional and environmental influences on students' thought processes.


Knight and Waxman identified three areas of students' perceptions of classroom processes, namely specific strategy instruction, generic teacher behaviours, and the classroom learning environment. Specific strategy instruction focused on teachers' direct instruction in specific cognitive strategies; generic teacher behaviours focused on the effective teaching behaviours that promote students' learning; and the classroom learning environment focused on the classroom atmosphere generated by teacher and student interactions. Research on students' perceptions of teachers' teaching includes effective teaching, perceptions of mastery learning, and co-operative learning (Turley 1994). Olson and Moore (1984) revealed that, from the students' perspective, a good teacher knows the subject well, explains things clearly, makes the subject interesting, gives regular feedback, gives extra help to students, has a good sense of humour, and is fair and consistent. According to Lloyd and Lloyd (1986), students expected teachers to provide a sense of how the constituent parts of a discipline fit together, to have strong content knowledge, and to be able to teach this content at the students' level of understanding. Similarly, Turley (1994) found that students' perceptions of effective teaching were a combination of method, context, student effort, and teacher commitment. Students thought effective teachers were those who knew their subject, showed evidence of thoughtful planning, used appropriate teaching methods, strategy repertoires and delivery skills, and gave adequate structure and direction. In brief, research on students' perceptions of teachers' teaching revealed that students expect teachers to have strong content knowledge, implying that they are able to perceive whether teachers' content knowledge is good or bad. Students also expected teachers to use effective instructional methods; in other words, they expected teachers to have good pedagogical content knowledge (Shulman 1986, 1987). However, previous research on learning environments has seldom addressed students' perceptions of teachers' knowledge.

Purpose of the research

The purpose of this research was to develop an instrument that could be used to identify and evaluate students' perceptions of teachers' knowledge. Students have encountered various learning environments during their time at school and are in a good position to form accurate impressions about classrooms (Fraser 1998). Consequently, it is worthwhile to develop an instrument addressing students' perceptions of teachers' knowledge that could help science teachers understand how their knowledge may be recognized through their teaching and how their teaching could be improved based on these students' perceptions.

Methodology

The conceptual framework of the SPOTK instrument has progressed through several development stages (Tuan et al. 1996, 1997). Trials of the previous categories showed that they only captured how teachers generated new knowledge on teaching and did not conceptualize the substance of the different kinds of teachers' knowledge. The final definition of the categories evolved to be 'Instructional Repertoire', 'Representational Repertoire', 'Subject Matter Knowledge', and 'Knowledge of Students' Understanding'.


`Instructional Repertoire’ (I R) refers to students’ perceptions of the extent to which the teacher selects from among an instructional repertoire. T his category was mainly aimed at soliciting students’ perceptions on whether or not their teachers’ teaching strategies could benefit their content learning. Examples of items in this category are: 1. My teacher’ s teaching methods keep me interested in science. 7. My teacher uses a variety of teaching approaches to teach different topics.

`Representational Repertoire’ (RR) refers to students’ perceptions of the extent to which the teacher uses a representational repertoire that challenges students’ previous concepts and which includes analogies, metaphors, examples, and explanations. Examples of items in this category are: 9. My teacher uses familiar examples to explain scientific concepts. 10. My teacher uses appropriate diagrams and graphs to explain science concepts.

`S ubject Matter Knowledge’ (S MK) refers to students’ perceptions of the extent to which the teacher demonstrates a comprehension of purposes, subject matter and ideas within the discipline. Examples of items in this category are: 17. My teacher knows how science theories or principles have been developed. 18. My teacher knows the answers to questions that we ask about science concepts.

`Knowledge of S tudents’ Understanding ’ (KS U) refers to students’ perceptions of the extent to which the teacher evaluates student understanding during interactive teaching and at the end of lessons and units. Examples of items in this category are: 23. My teacher’ s questions evaluate my understanding of a topic. 24. My teacher’ s assessment methods evaluate my understanding.

Once the conceptual framework for the instrument was established, several issues were taken into consideration. The items should be easy for eighth and ninth grade students in both Taiwan and Australia to comprehend; each category of items should be meaningful from the students' perspective; and the response format should have five alternatives: almost never, seldom, sometimes, often, and very often. Nine to ten items were generated under each of the four categories based on agreement among the four researchers, and the instrument was revised based on suggestions from 30 experienced science teachers. These items were translated into Chinese and translated back into English by a Chinese science educator outside the research team. The Australian researchers then checked the meaning of the back-translation in order to decide whether the Chinese translation or the English version should be revised. Back-translation occurred several times until neither the Taiwanese nor the Australian researchers had any questions about the meaning of the items. During the back-translation process, some items were elaborated in order to make them more meaningful for both Taiwanese and Australian students. The item 'My teacher uses reference books other than textbooks when teaching us science', for example, proved to be problematic because the Australian researchers considered that reference books were extra-curricular materials, whereas the Taiwanese researchers considered that these books comprised problem-solving exercises. Therefore, this item was deleted. In another example, the item 'My teacher uses tests that only assess what I have memorized, not what I understand' was changed to 'My teacher uses tests to check that I understand what I have learned'.


Finally, 37 items were established in the instrument: three categories (Instructional Repertoire, Subject Matter Knowledge and Knowledge of Students' Understanding) comprised nine items each, and the Representational Repertoire category comprised ten items.

Analysis of data

Six hundred and thirty-four Taiwanese junior high school students from fifteen classes in seven junior high schools were selected for the pilot study. The pilot study indicated that the SPOTK had appropriate reliability and validity results (Tuan et al. 1997), and some items from the original instrument were removed or revised. The revised instrument, consisting of nine items per category for a total of 36 items, was administered to 50 classes in Taiwan and 50 classes in Australia, comprising 1879 Taiwanese and 1081 Australian junior high school students, respectively, who varied in grade, sex and ability level. Based on data analysis of item-scale and item-total correlations, as well as factor analysis, eight additional items were deleted. All but one of the item-total correlation values for the IR category for both Taiwanese and Australian students were in the range 0.55 to 0.79. The lowest correlations were 0.36 and 0.47, respectively, for the item 'My teacher divides the class into groups for teaching activities', which was dropped in the final version of the instrument because its factor loading also was less than 0.4. Similar results applied to the item-scale correlations for the items in the RR category. The items 'My teacher explains science concepts using language that I can understand' and 'My teacher explains scientific concepts using videotapes' had item-scale correlations of 0.20 and 0.27, respectively, and factor loadings less than 0.4, so they were deleted from the final version of the instrument. Item-scale correlations for the SMK category were satisfactory for all except three items, 'My teacher explains science concepts clearly', 'My teacher cannot answer our questions about science concepts', and 'My teacher does not know the content (s)he is teaching', each of which had item-scale correlations less than 0.37. These three items were deleted from the final version of the instrument. In the KSU category, the items 'My teacher knows whether I understand what (s)he is teaching' and 'My teacher checks that I understand what is being taught' had item-scale correlations lower than 0.3 and factor loadings less than 0.4; these items were deleted from the final version of the instrument.
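The deletion decisions described above rest on two routine statistics: the corrected item-scale correlation (each item against the sum of the remaining items in its scale) and Cronbach's alpha for the scale. The paper does not show these calculations, so the following is only a minimal sketch of how they might be computed, assuming the responses for one scale sit in a pandas DataFrame; the DataFrame name and the usage comments are hypothetical, and the 0.30 threshold is the one quoted in the text.

```python
import pandas as pd

def cronbach_alpha(scale: pd.DataFrame) -> float:
    """Cronbach's alpha for one scale (rows = students, columns = items coded 1-5)."""
    k = scale.shape[1]
    item_variances = scale.var(axis=0, ddof=1).sum()
    total_variance = scale.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

def item_scale_correlations(scale: pd.DataFrame) -> pd.Series:
    """Corrected item-scale correlation: each item against the sum of the other items."""
    return pd.Series(
        {item: scale[item].corr(scale.drop(columns=item).sum(axis=1))
         for item in scale.columns}
    )

# Hypothetical usage for the Instructional Repertoire items:
# alpha = cronbach_alpha(ir_items)
# corrs = item_scale_correlations(ir_items)
# weak = corrs[corrs < 0.30]   # candidates for deletion, subject to the factor-loading check
```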

Results and discussion

Taiwanese and Australian students' responses on the 28 items of the instrument to measure students' perceptions of teachers' knowledge are shown for the four scales as descriptive statistics (table 1), reliabilities (table 2), and factor analysis (table 3).

Table 1. Descriptive statistics for the four categories of the instrument to measure students' perceptions of Australian and Taiwanese teachers' knowledge.

                                      Mean                                    Standard deviation
                                      Australia           Taiwan             Australia           Taiwan
                                      Class    Indiv.     Class    Indiv.    Class    Indiv.     Class    Indiv.
Scale                                 (N=50)   (N=1081)   (N=50)   (N=1879)  (N=50)   (N=1081)   (N=50)   (N=1879)
Instructional repertoire              24.68    29.29      26.26    26.40     3.48     6.96       3.99     7.63
Representational repertoire           23.54    23.60      25.03    25.16     2.59     5.61       2.72     6.15
Subject matter knowledge              23.52    23.63      23.51    23.59     2.19     4.54       1.79     4.63
Knowledge of student understanding    25.64    25.70      27.19    27.24     2.26     5.38       2.04     5.58

Table 2. Reliability for the four scales of the instrument to measure students' perceptions of teachers' knowledge in Taiwan and Australia.

                                               Alpha reliability                        ANOVA eta²
                                      No. of   Australia           Taiwan              Australia   Taiwan
                                      items    Indiv.    Class     Indiv.    Class     Class       Class
Scale                                          (N=1081)  (N=50)    (N=1879)  (N=50)    (N=50)      (N=50)
Instructional repertoire                8      0.91      0.97      0.89      0.97      0.17*       0.35*
Representational repertoire             7      0.87      0.94      0.88      0.96      0.16*       0.35*
Subject matter knowledge                6      0.86      0.95      0.82      0.94      0.19*       0.65*
Knowledge of student understanding      7      0.89      0.95      0.86      0.95      0.17*       0.33*

Note: *p < 0.001. The eta² statistic (the ratio of the between-class to the total sum of squares) represents the proportion of variance explained by class membership.
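Written out (the following formula is standard and is not given in the paper), the eta² statistic reported in table 2 for each scale is

$$
\eta^2 \;=\; \frac{SS_{\text{between}}}{SS_{\text{total}}}
       \;=\; \frac{\sum_{j} n_j\,(\bar{x}_j - \bar{x})^2}{\sum_{j}\sum_{i}\,(x_{ij} - \bar{x})^2},
$$

where $x_{ij}$ is the scale score of student $i$ in class $j$, $\bar{x}_j$ is the mean and $n_j$ the size of class $j$, and $\bar{x}$ is the grand mean; that is, the proportion of variance in scale scores attributable to class membership.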


Table 3. Factor analysis of the 28-item instrument based on data from 50 classes in Grades 8 and 9 in Australia and Taiwan.

(Factor loadings are reported separately for the Australian and Taiwanese samples under each of the four scales: Instructional Repertoire, Representational Repertoire, Subject Matter Knowledge and Knowledge of Student Understanding. Items 1-8 correspond to IR2-IR9, items 9-15 to RR1-RR7, items 16-21 to SMK1, SMK3 and SMK5-SMK8, and items 22-28 to KSU1-KSU4 and KSU6-KSU8. Loadings smaller than 0.4 are omitted.)

Descriptive statistics

In table 1, the total mean response per class for Australian and Taiwanese students for IR was 24.68 and 26.26, respectively, indicating that the events these items investigate occur between 'sometimes' and 'often' (3.09 and 3.28 per item, i.e. the scale totals divided by the eight IR items). In addition, individual Australian students ranked higher on this scale (29.29) than individual Taiwanese students (26.40). For RR, the mean response per class for Australian and Taiwanese students was 23.54 and 25.03, respectively, indicating that the events these items investigate occur between 'sometimes' and 'often' (3.36 and 3.57 per item). For SMK, the mean responses for Australian and Taiwanese students were very similar, at 23.52 and 23.51 respectively, indicating that the events these items investigate occur 'often' (3.92 and 3.92 per item). For KSU, the mean response per class for Australian and Taiwanese students was 25.64 and 27.19, respectively, indicating that the events these items investigate occur more toward 'often' (3.66 and 3.88 per item).
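As a concrete illustration of the two units of analysis used in tables 1 and 2 (the individual student and the class mean), the following sketch shows how the scale totals, class means, and per-item means quoted above could be derived; the DataFrame layout, the 'class' column, and the item column names are hypothetical and are not taken from the study's data files.

```python
import pandas as pd

# Hypothetical layout: one row per student, a class identifier,
# and the eight Instructional Repertoire items coded 1-5.
IR_ITEMS = ["IR2", "IR3", "IR4", "IR5", "IR6", "IR7", "IR8", "IR9"]

def scale_summary(responses: pd.DataFrame, items: list) -> dict:
    """Individual-unit mean, class-unit mean, and per-item mean for one scale."""
    totals = responses[items].sum(axis=1)                        # scale total per student
    class_means = totals.groupby(responses["class"]).mean()      # one mean per class
    return {
        "individual_mean": totals.mean(),                        # individual unit of analysis
        "class_mean": class_means.mean(),                        # class unit of analysis
        "per_item_mean": class_means.mean() / len(items),        # e.g. 24.68 / 8 = 3.09 for Australian IR
    }
```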


Among the above four categories of their teachers' instruction, both Australian and Taiwanese students' mean scores per item per class, from highest to lowest, were Subject Matter Knowledge, Knowledge of Student Understanding, Representational Repertoire and Instructional Repertoire. For the individual mean scores, however, Australian students ranked IR higher than RR, whereas Taiwanese individual students' mean scores followed the above pattern. The difference between these results is worthy of further investigation.

Reliabilities and item-scale correlations

The scales for each category for both the Taiwanese and the Australian student groups had high Cronbach alpha values for both individual and class units, ranging from 0.82 to 0.97 (see table 2), indicating that the scales were a reliable measure of the categories of teachers' knowledge being investigated. Further, these internal consistency values with the class mean as the unit of analysis were larger than those with the individual as the unit of analysis, a tendency that has been reported by Fraser (1994) and Fraser et al. (1996). The scales themselves proved to be consistent, with only a small number of items having item-scale correlations below 0.30. The analysis of variance values (eta²) in table 2 showed that each scale of the instrument differentiated between the perceptions of different individual students and between different classes in both Australia and Taiwan.

Validity of the instrument

The validity of the new instrument was confirmed in terms of its content and construct validity. Content validity refers to the extent to which the content of the items measures what is claimed to be measured (Anastasi 1988, Zeller 1994); in this case, content validity was ascertained by responses obtained during the process of developing the instrument from junior high school students and experienced science teachers. Additionally, researchers in the area of teachers' pedagogical content knowledge provided comment and critical feedback. According to Zeller (1994: 6572), construct validity 'focus[es] on the assessment of whether a particular measure relates to other measures consistent with a theoretically anticipated way', and this is usually done by factor analysis (Anastasi 1988). Based on the factor loadings in table 3, items in each category were deleted if they had low item-scale correlations in the reliability analysis and factor loadings less than 0.40.

Two items have a different factor loading pattern in table 3. Item 11, 'My teacher uses demonstrations to show science concepts', loads on both IR (0.44) and RR (0.48) for the Taiwanese students, and on both RR (0.49) and KSU (0.40) for the Australian students. Although this item loaded on two factors, for both groups of students the largest factor loading was on RR and, for this reason, the item was retained in this category. Interviews with Taiwanese students on the instrument showed that they tended to consider teachers' demonstrations to be teaching activities that stimulate their learning. However, based on the researchers' observations of Australian science teachers doing laboratory activities and conducting demonstrations in the same laboratory setting, the teachers asked many questions during the demonstrations; these students had the idea that demonstrations are used to check their understanding.


A second item with different factor loadings between Taiwanese and Australian students is No. 16, 'My teacher knows the content s/he is teaching'. For the Taiwanese students, this item has a 0.46 factor loading on the RR category but less than 0.40 on the SMK category, for which the Australian student group has a factor loading of 0.55. This item remains in the SMK category in the final version of the instrument, but further in-depth interviews with Taiwanese students will be conducted to examine their interpretations of this item.
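The construct-validity check above relies on an exploratory factor analysis of the 28 items for each national sample. The paper does not report the extraction or rotation method used, so the following is only a sketch of how comparable loadings might be obtained with the factor_analyzer package; the four-factor varimax-rotated solution, the DataFrame names and the 0.40 display threshold (taken from the note to table 3) are assumptions rather than the authors' procedure.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer  # pip install factor_analyzer

def item_loadings(responses: pd.DataFrame, n_factors: int = 4) -> pd.DataFrame:
    """Exploratory factor analysis of item responses; returns an item-by-factor loading table."""
    fa = FactorAnalyzer(n_factors=n_factors, rotation="varimax")
    fa.fit(responses)
    loadings = pd.DataFrame(fa.loadings_, index=responses.columns)
    return loadings.where(loadings.abs() >= 0.40)  # hide loadings below 0.40, as in table 3

# Hypothetical usage, one call per country sample:
# australian_loadings = item_loadings(australian_responses)
# taiwanese_loadings = item_loadings(taiwanese_responses)
```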

Conclusions

The data analysis indicated that the SPOTK instrument, which assesses students' perceptions of teachers' knowledge in relation to their pedagogy, has satisfactory validity and reliability measures. The uniqueness of the SPOTK is that it is specifically related to teachers' knowledge within each particular teaching and learning environment. This is important because research has shown that teachers' knowledge influences students' perceptions of the learning environment. Fraser and Tobin (1990) and Tobin (1996), for example, showed that teachers' metaphors and beliefs influenced how they taught and implemented the curriculum, and that their level of content knowledge influenced whether students were taught for factual retention or for understanding. Earlier, Tobin and Fraser (1987) investigated the teaching characteristics of exemplary science teachers. These effective teachers used management strategies to sustain students' engagement, used teaching strategies such as problem-solving activities, provided concrete examples for abstract concepts, asked questions to increase students' understanding, helped students to engage in both large and small group activities, and maintained favourable classroom learning environments. In a case study with one physical science teacher interested in improving her classroom learning environment (Tuan 1998), students' responses to a learning environment instrument (the CLES) validated by Huang et al. (1998) and to the SPOTK were highly correlated. In addition, Chiu and Chiang (1997) summarized the viewpoints of Wilson et al. (1987) and Cochran et al. (1991), noting that one feature of PCK is the teacher's ability to plan the teaching and learning environment. Consequently, the researchers believe that the SPOTK can be used to investigate the relation between students' perceptions of teachers' knowledge and their perceptions of the learning environment.

The instrument can help teachers identify students' perceptions of their science teaching in terms of their Instructional Repertoire, Representational Repertoire, Subject Matter Knowledge, and Knowledge of Students' Understanding. By examining the results from administration of the instrument, researchers and teachers can identify those aspects of teaching that need to be improved in order to match students' needs and expectations.

At the outset, this study was intended to design an instrument to determine students' perceptions of the various kinds of teachers' knowledge that could be performed in classroom teaching. However, during the development of the instrument, only four distinct aspects of teachers' knowledge were identified as being viable: Instructional Repertoire, Representational Repertoire, Subject Matter Knowledge and Knowledge of Students' Understanding. Future research could examine other aspects of teachers' knowledge from students' perceptions, such as contextual knowledge, curriculum knowledge, and knowledge of students. Of interest also is whether researchers and students hold the same constructs of the various kinds of teachers' knowledge and, if they do not, how to identify students' own perceptions of the various kinds of teachers' knowledge. By resolving these emerging issues, it is likely that a better understanding of students' perceptions of science teachers' teaching performance will be achieved.


Our cross-national collaboration in developing the instrument indicated that both Australian and Taiwanese class mean scores per item were highest for teachers' Subject Matter Knowledge and lowest for teachers' Instructional Repertoire. However, in terms of individual mean scores per item, Australian students ranked Instructional Repertoire higher than Representational Repertoire, which was not the case for the Taiwanese individual mean scores. In addition, in table 3, items 11 and 16 have different factor loadings for Australian and Taiwanese students, indicating that there might be different ways in which Taiwanese and Australian students interpret items in the instrument. Therefore, we need to examine carefully the differences in students' perceptions between the two countries and analyse the cultural differences that influenced their interpretations. To establish the new instrument's usefulness, future research is needed to provide more specific analysis, through interviews, of the relationship between this instrument and students' responses to the items. Other research is needed to examine whether teachers with acknowledged weak knowledge of instruction, representation, subject matter, and students' understanding would score lower on the four scales of the SPOTK than teachers with strong knowledge in these four areas.

Acknowledgements

The authors are grateful to the National Science Council of the Republic of China for financial support (NSC 87-2511-S-018-012), to Professors Jong-Hsiang Yang and Barry Fraser for their leadership in organizing this international learning environment project, and to Jill M. Aldridge for her assistance with data analysis.

References

Anastasi, A. (1988) Psychological Testing (New York: Macmillan).
Brown, D. S. and McIntyre, D. (1986) How do teachers think about their craft? In M. B. Peretz, R. Bromme and R. Halkes (eds), Advances of Research on Teacher Thinking (Lisse: Swets and Zeitlinger), 36-45.
Bull, S. and Solity, J. (1987) Classroom Management (New York: Croom Helm).
Carter, K. and Gonzalez, L. (1993) Beginning teachers' knowledge of classroom events. Journal of Teacher Education, 44, 223-232.
Chiu, M. H. and Chiang, Y. T. (1997) A study of pedagogical content knowledge held by earth science teachers in junior high schools. Chinese Journal of Science Education, 5, 419-460.
Clark, C. and Peterson, P. (1986) Teachers' thought processes. In M. C. Wittrock (ed.), Handbook of Research on Teaching (New York: Macmillan), 255-296.
Cochran, K. F., DeRuiter, J. A. and King, R. A. (1993) Pedagogical content knowing: an integrative model for teacher preparation. Journal of Teacher Education, 44, 263-272.
Connelly, F. M. and Clandinin, D. J. (1984) Stories of experience and narrative inquiry. Educational Researcher, 19, 2-14.
Fenstermacher, G. D. (1994) The knower and the known: the nature of knowledge in research on teaching. In L. Darling-Hammond (ed.), Review of Research in Education, 20 (Washington, DC: American Educational Research Association), 3-56.
Fraser, B. J. (1994) Research on classroom and school climate. In D. L. Gabel (ed.), Handbook of Research on Science Teaching and Learning (New York: Macmillan), 493-541.


Fraser, B. J. (1998) Science learning environments: assessment, effects and determinants. In B. J. Fraser and K. Tobin (eds), International Handbook of Science Education (Dordrecht: Kluwer), 527-564.
Fraser, B. J., Fisher, D. L. and McRobbie, C. J. (1996) Development, validation and use of personal and class forms of a new classroom environment instrument. Paper presented at the annual meeting of the American Educational Research Association, New York.
Fraser, B. J. and Tobin, K. (1990) Combining qualitative and quantitative methods in classroom environment research. In B. J. Fraser and H. J. Walberg (eds), Educational Environment (Oxford: Pergamon Press), 271-292.
Geddis, A. N. (1993) Transforming subject-matter knowledge: the role of pedagogical content knowledge in learning to reflect on teaching. International Journal of Science Education, 15, 673-683.
Grossman, P. L. (1988) A study in contrast: sources of pedagogical content knowledge for secondary English teachers. Unpublished doctoral dissertation, Stanford University, Stanford.
Huang, I. T. C., Aldridge, J. M. and Fraser, B. (1998) A cross-national study of perceived classroom environments in Taiwan and Western Australia: combining quantitative and qualitative approaches. Chinese Journal of Science Education, 6, 343-362.
Knight, S. L. and Waxman, H. C. (1991) Students' cognition and classroom instruction. In H. C. Waxman and H. J. Walberg (eds), Effective Teaching: Current Research (Berkeley, CA: McCutchan), 239-255.
Lloyd, B. C. and Lloyd, R. C. (1986) Teaching/learning: the student viewpoint. Reading Horizons, 26, 266-269.
McDiarmid, G. W., Ball, D. L. and Anderson, C. W. (1989) Why staying one chapter ahead doesn't really work: subject-specific pedagogy. In M. C. Reynolds (ed.), Knowledge Base for the Beginning Teacher (Oxford: Pergamon), 185-192.
McRobbie, C. J. and Fraser, B. J. (1993) Associations between student outcomes and psychosocial science environment. Journal of Educational Research, 87, 78-85.
Olson, L. and Moore, M. (1984) Voices from the classroom: students and teachers speaking out on the quality of teaching in our schools. Oakland, CA: a report of the Students for Quality Teaching Project Center. (ERIC Document Reproduction Service No. ED 252 497).
Reynolds, A. (1992) What is competent beginning teaching? A review of the literature. Review of Educational Research, 62, 1-35.
Sanders, D. P. and McCutcheon, G. (1986) The development of practical theories of teaching. Journal of Curriculum Studies, 20, 167-169.
Schön, D. (1983) The Reflective Practitioner: How Professionals Think in Action (New York: Basic Books).
Shulman, L. S. (1986) Those who understand: knowledge growth in teaching. Educational Researcher, 15, 4-14.
Shulman, L. S. (1987) Knowledge and teaching: foundations of the new reform. Harvard Educational Review, 57, 1-22.
Tamir, P. (1988) Subject matter knowledge and related pedagogical knowledge in teacher education. Teaching and Teacher Education, 4, 99-110.
Tobin, K. G. (1996) Analytical and holistic approaches to research on teacher education. In D. F. Treagust, R. Duit and B. J. Fraser (eds), Improving Teaching and Learning in Science and Mathematics (New York: Teachers College Press), 175-189.
Tobin, K. and Fraser, B. J. (eds) (1987) Exemplary Practice in Science and Mathematics Education (Perth: Curtin University of Technology).
Tuan, H. L. (1996) Pedagogical content knowledge: a revelation for the reform of future science teacher education. Proceedings of the First Mathematics and Science Teaching and Teacher Education Conference, pp. 118-143.
Tuan, H. L. (1998) How a beginning physical science teacher improves her learning environment: a case study. Paper presented at the Science Education Conference of the Republic of China, Kaohsiung, Taiwan, R.O.C.


Tuan, H. L., Chang, H. P. and Wang, K. H. (1996) The development of an instrument on students' perceptions of teachers' pedagogical content knowledge. Paper presented at the Science Education Conference of the Republic of China, Taichung, Taiwan, R.O.C.
Tuan, H. L., Chang, H. P., Wang, K. H. and Treagust, D. F. (1997) The development of an instrument for assessing student perceptions of teachers' knowledge in Taiwan and Australia: a pilot study. Paper presented at the International Conference on Science, Mathematics and Technology Education, Hanoi, Vietnam.
Turley, S. (1994) 'The way teachers teach is, like, totally whacked': the student voice on classroom practice. Paper presented at the annual meeting of the American Educational Research Association, New Orleans, LA.
Wilson, S., Shulman, L. S. and Richert, A. (1987) '150 different ways of knowing': representations of knowledge in teaching. In J. Calderhead (ed.), Exploring Teacher Thinking (London: Cassell), 104-124.
Zeller, R. A. (1994) Validity. In T. Husen and T. N. Postlethwaite (eds), The International Encyclopedia of Education, Vol. 11 (Oxford: Pergamon Press), 6569-6576.

Appendix

Students' perceptions of teachers' knowledge

Directions for students: This questionnaire contains statements about practices which could take place in this class. You will be asked how often each practice takes place. There are no 'right' or 'wrong' answers. Your opinion is what is wanted. Think about how well each statement describes what this class is like for you. Draw a circle around:

1. if the practice takes place Almost Never
2. if the practice takes place Seldom
3. if the practice takes place Sometimes
4. if the practice takes place Often
5. if the practice takes place Almost Always

Be sure to give an answer for all questions. If you change your mind about an answer, just cross it out and circle another. Some statements in this questionnaire are fairly similar to other statements. Don't worry about this. Simply give your opinion about all statements.


Your Name _________________    Teacher's Name ___________________
School __________________    Grade _________
Male    Female
Science Class:    Physical Science    Biology

IR
1. My teacher's teaching methods keep me interested in science.
2. My teacher provides opportunities for me to express my point of view.
3. My teacher uses different teaching activities to promote my interest in learning.
4. My teacher uses appropriate models to help me understand science concepts.
5. My teacher uses interesting methods to teach science topics.
6. My teacher's teaching methods make me think hard.
7. My teacher uses a variety of teaching approaches to teach different topics.
8. My teacher shows us activities that I can use to continue my study of a topic.

RR
9. My teacher uses familiar examples to explain scientific concepts.
10. My teacher uses appropriate diagrams and graphs to explain science concepts.
11. My teacher uses demonstrations to show science concepts.
12. My teacher uses real objects to help me understand science concepts.
13. My teacher uses stories to explain science ideas.
14. My teacher uses analogies with which I am familiar to help me understand science concepts.
15. My teacher uses familiar events to describe scientific concepts.



SMK
16. My teacher knows the content (s)he is teaching.
17. My teacher knows how science theories or principles have been developed.
18. My teacher knows the answers to questions that we ask about science concepts.
19. My teacher knows how science is related to technology.
20. My teacher knows the history behind science discoveries.
21. My teacher explains the impact of science on society.

KSU
22. My teacher's tests evaluate my understanding of a topic.
23. My teacher's questions evaluate my understanding of a topic.
24. My teacher's assessment methods evaluate my understanding.
25. My teacher uses different approaches (questions, discussion, etc.) to find out whether I understand.
26. My teacher assesses the extent to which I understand the topic.
27. My teacher uses tests to check that I understand what I have learned.
28. My teacher's tests allow me to check my understanding of concepts.
