WEISHEN WU, HUEY-POR CHANG and CHORNG-JEE GUO
THE DEVELOPMENT OF AN INSTRUMENT FOR A TECHNOLOGY-INTEGRATED SCIENCE LEARNING ENVIRONMENT

Received: 22 February 2007; Accepted: 19 November 2007
ABSTRACT. This study developed, validated, and utilized the Technology Integrated Classroom Inventory (TICI) to examine technology-integrated science learning environments as perceived by secondary school students and teachers. TICI was developed by drawing on technology-oriented classroom climate instruments and considering the characteristics of the science classroom. More than 1,100 seventh through ninth grade science students validated the instrument, revealing eight scales: technological enrichment, inquiry learning, equity and friendliness, student cohesiveness, understanding and encouragement, competition and efficacy, audiovisual environment, and order, with alpha reliabilities ranging between 0.69 and 0.91 (0.93 for the entire questionnaire). In measuring actual and preferred learning environments, TICI results indicated that both students and teachers ranked equity and friendliness highest. The largest actual–preferred discrepancy was order for students and inquiry learning for teachers. TICI offers additional utility for technology-enriched science learning environments.

KEY WORDS: classroom climate, instrumentation, learning environment, science teaching, technology integration
International Journal of Science and Mathematics Education (2009) 7: 207–233. © National Science Council, Taiwan (2007)

INTRODUCTION

Classrooms are often regarded as a meaningful place wherein students construct their understanding of subject matter. During lessons, the classroom hosts various forms of communication and interaction that together characterize the learning environment. The dynamics of the classroom evolve from how all members experience the characteristics of this milieu, often referred to as the climate, culture, ambiance, or atmosphere in which teaching and learning take place (Fraser, 1986). In the classroom, the teacher's role as manager underscores the importance of planning, organizing, leading, and controlling the learning environment (Tobin & LaMaster, 1995). As an ecological system, any intervention in the classroom may change contextual variables, which in turn influence the learning environment as a whole. From the social-constructivist perspective, the teacher works to create an effective learning environment in which students are actively engaged in
knowledge construction through interactions among people and artifacts. Assessing the learning environment is a particularly valuable process for teachers who seek to maintain students' positive attitudes toward the class while justifying the changes they make to the setting. With the rapid diffusion of technology in schools, integrating technology into teaching has reshaped the ways in which teachers, students, curricula, and technologies interact, ultimately changing the classroom setting. In the science classroom, teachers have adopted various technologies to facilitate learning, ranging from collecting, measuring, and analyzing data to modeling, simulating, and visualizing concepts as well as promoting multiple representations and interactions of ideas (Cox, 2000; Newton & Rogers, 2001). Students are often attracted by the engaging presentation of technology-mediated course content, which provides additional opportunities to explore scientific principles and makes it easier for students to concentrate on their studies (Mistler-Jackson & Songer, 2000). As students respond to the technology-supported setting, their enthusiasm for learning science is gradually encouraged. As such, educators are keen to discover how technology integration actually affects science teaching and learning. One promising methodology involves investigating the learning environment (Zandvliet, 2002). The nature of the learning environment is judged based on students' perceptual consensus about the educational, psychological, social, and physical aspects of the environment (Dunn & Harris, 1998). Historically, science learning environment instruments paid relatively little attention to technological dimensions. To ensure that technology integration falls in line with the teacher's expectations, recognizing students' perceptions of the technology-enriched learning environment has become a critical issue.
Recently, several instruments have incorporated technological components in order to assess the well-appointed science classroom in which students have sufficient opportunities to access technology in learning (Aldridge, Dorman & Fraser, 2004; Maor, 2000; Newhouse, 2001). However, existing instruments may not be applicable to the general classroom, where equipment or scheduling constraints limit students' use of technology in learning. In addition, existing instruments supply insufficient measurements of technology integration, resulting in inadequate explanations of how technology influences the science learning environment. Therefore, the need exists to develop a new instrument specifically designed for the technology-integrated science learning environment that takes the conditions of the general classroom into account.
This study seeks to explore distinct aspects of the technology-integrated science learning environment, particularly in the general classroom, where technology is mainly controlled by the teacher and students have little opportunity to use it themselves. The purpose of the current study is to develop, validate, and use an instrument to examine the science learning environment enriched by technology. As such, the research questions focus on the development of such an instrument as well as the examination of the actual and preferred perceptions of students and teachers. As science teachers attempt to integrate technology into teaching, this study explores alternatives to the technology-enriched learning environments currently found in the science classroom.

LITERATURE REVIEW

The Science Classroom Learning Environment

Over the past several decades, research has established relationships between the classroom environment and student outcomes, evaluated educational programs, and identified determinants of learning environments (Fraser, 1994, 2002). In addition, learning environment research in the field of science education has grown vigorously, particularly in the areas of instrumentation and applications (Tobin & Fraser, 1998). A rich array of instruments has been developed for various types of science classes, such as the Learning Environment Inventory (LEI), Classroom Environment Scale (CES), My Class Inventory (MCI), Science Laboratory Environment Inventory (SLEI), Questionnaire on Teacher Interaction (QTI), What Is Happening In This Class? (WIHIC), and Constructivist Learning Environment Survey (CLES). Such historically important instruments have been widely used to assess primary and secondary students' social and psychological perceptions of their science classrooms. Many instruments originated in the West and have been translated into several Asian languages (Fraser, 2002).
Despite the existing instruments, research continues to explore new instruments in an effort to investigate every aspect of modern learning environments, such as web-based learning environments (Chang & Fisher, 2001), technology-rich learning environments (Khine, 2003), and distance learning environments (Walker & Fraser, 2005). Applications of learning environment instruments for science classrooms have focused primarily on four areas: (1) associations between students' cognitive, affective, and behavioral outcomes and the learning environment (Anderson, Hamilton & Hattie, 2004; Dorman, 2001; Roth, 1998); (2) influences of curriculum innovation and teachers' beliefs and knowledge on the learning environment (Chen, Taylor & Aldridge, 1998; Suárez, Pias, Membiela & Dapia, 1997; Wang, Tuan & Chang, 1998); (3) evaluation of students' actual and preferred environments compared to teachers' (Chang, Hsiao & Barufaldi, 2006; Maor & Fraser, 1996); and (4) cross-national comparisons of learning environments (Aldridge, Fraser, Taylor & Chen, 2000; Dorman & Adams, 2004; Fisher, Goh, Wong & Richards, 1997). Prior findings indicate that the learning environment in the science classroom evolves dynamically from characteristics related to students and teachers, curriculum reform, technological and pedagogical changes, school climate, and socio-cultural factors. A favorable science learning environment correlates significantly with student involvement, teacher support, and classroom order and organization (Fraser & Tobin, 1989). Indeed, students' perceptions of their learning environment are generally more realistic than teachers'. Thus, a learning environment assessment is useful in that it can identify gaps between teachers' expectations and students' perceptions to promote further improvements. Three general approaches exist for assessing learning environments: (1) classroom observations of explicit phenomena; (2) questionnaire surveys of actual and preferred forms; and (3) ethnographic data collection (Dorman, 2002). The first two approaches rely primarily on quantitative data collection methods. Research that adopts qualitative methodologies has the advantage of providing detailed insight into the learning environment (Tobin & Fraser, 1998). However, the time-consuming data collection process often delays feedback to the teacher; consequently, refinements cannot be made in a timely manner.
Impact of Technology Integration on the Science Learning Environment

The trend of technology integration in science classrooms has received great attention from educators in recent years. Among technology-supported teaching approaches, three pivotal features contribute to effective science learning: (1) high computational power allows for rich interactivity and direct feedback through which students share what they have learned with others; (2) networked computing enables students to access a large amount of learning resources and social interactions, encouraging them to take ownership of knowledge construction; and (3) high-quality audiovisual interfaces supply students with authentic learning facilities for modeling conceptual understandings
(Gerjets & Hesse, 2004). In essence, technological capabilities transform the ways in which students learn science by promoting efficient cognitive inquiry and social sharing. For science teachers, technology serves not only as a productivity tool, but also as a key to better learning outcomes (Mayer-Smith, Pedretti & Woodrow, 1998; Nolen, 2003; Songer, 1998). Learning in the science classroom emphasizes the acquisition of scientific understanding through inquiry, involving the identification of assumptions, use of critical and logical thinking, and consideration of alternative explanations (NRC, 1996). Teaching with technology invites students to engage in inquiry-based learning through activities such as multimedia hint giving, prompts for reflection, and connections to online discussions; moreover, technology empowers students to gather, organize, and display information in new ways (Linn & Hsi, 2000). Hence, an inquiry-based culture can be cultivated by incorporating innovative pedagogies with technology in the science classroom. Several instruments contain inquiry-related scales, such as Critical Voice in the CLES and Investigation in the CCEI and WIHIC. Previous research has indicated that teaching with technology integration is conducive to science learning in that students have greater opportunities to experience social, affective, and cognitive contingencies. The benefits of technology integration for science learning, including positive attitudes, motivation, and interest (Edelson, 2001; Lajoie, 1993) as well as conceptual development and reinforcement and higher-order skills such as classification and reasoning ability (Bell & Linn, 2000; Wu, Krajcik & Soloway, 2001), have been well documented. Although technology integration impacts science learning in a variety of ways, reaping the benefits requires that teachers deal with classroom management issues.
The emergence of student-centered learning environments driven by technology integration has led teachers to believe that such integration could affect classroom control and student discipline (Bowman, Newman & Masterson, 2001). The complexity of technology integration requires that rules and procedures be established to facilitate the smooth running of instructional activities in an effective learning environment (Lim, Pek & Chai, 2005).

Technology-Oriented Learning Environment Instruments

The development of instruments specifically aimed at assessing technology-rich learning environments has advanced learning environment research. The latest instruments involving the use of technology in the science classroom include the Constructivist Multimedia Learning Environment Survey (CMLES) (Maor, 2000), New Classroom Environment Instrument (NCEI) (Newhouse, 2001), and Technology-rich Outcomes-focused Learning Environment Inventory (TROFLEI) (Aldridge et al., 2004). These instruments provide different views of technology-related scales for use in the science learning environment. The CMLES was designed to assess the degree to which students and teachers perceive their classroom learning environment as inquiry-based and constructivist-oriented. CMLES scales include student negotiation, inquiry learning, reflective thinking, authenticity, and complexity, some of which were derived from the CLES (Taylor, Fraser & Fisher, 1997) and the Computer Classroom Environment Inventory (CCEI) (Maor & Fraser, 1996). The CMLES was validated in the context of learning with a multimedia program. Of the five scales, authenticity and complexity measure the extent to which students perceive the multimedia program as simulating an authentic learning environment and as providing multiple representations of the data, respectively. Evolving from the CES and the Classroom Interaction Patterns Questionnaire (CIPQ) (Woods, 1995), the NCEI was developed to assess the impact of portable computers on the classroom learning environment. The NCEI consists of 56 items in eight scales: involvement, affiliation, teacher support, group work, competition, order and organization, teacher control, and innovation. The innovation and group work scales measure the extent to which the teacher attempts to use new techniques and encourages students' creative thinking and collaborative learning. A study implementing the NCEI across various subjects found that computers were not indispensable to the learning environments except in the case of science classes. The TROFLEI, based on the WIHIC (Fraser, McRobbie & Fisher, 1996), focuses on technology and outcome dimensions of the learning environment.
Employing multitrait-multimethod (MTMM) modeling for validation, the TROFLEI comprises 80 items in 10 scales: student cohesiveness, involvement, teacher support, investigation, cooperation, task orientation, young adult ethos, equity, differentiation, and computer usage. Among these scales, young adult ethos and computer usage are additions to the WIHIC; the young adult ethos scale assesses the degree to which the teacher gives students responsibility and treats them as young adults, while the computer usage scale measures the extent to which students use computers to communicate with others as well as to access information. In contrast to the notable technological impacts on science learning, several measurements have not been considered in the existing technology-oriented learning environment instruments, such as interactivity, comprehensibility, and productivity. The exclusion of such distinct measurements obscures the importance and uniqueness of technology integration and might threaten the integrity of the learning environment assessment. Furthermore, the length of an instrument may limit its utility. An effective instrument for the technology-integrated learning environment should provide practitioners with valuable information that distinguishes technological benefits from other scales. Thus, the current study seeks to create an instrument that addresses these limitations of previous instruments.

METHOD

Instrumentation

The development of the instrument for the technology-integrated science learning environment started by gathering salient scales from previous technology-oriented instruments, including the CMLES, NCEI, and TROFLEI. Comparing these instruments identified the specific scales each used in relation to the technology-enriched science learning environment. For example, the NCEI's innovation scale emphasizes pedagogical use of portable computers to encourage students' creative thinking, the TROFLEI's computer usage scale measures which teaching and learning tasks are done with computer applications, and the CMLES highlights the multimedia impact on teaching and learning through its authenticity and complexity scales. The collective scales were subsequently categorized into the three dimensions suggested by Moos (1987), serving as the conceptual framework for the initial measurements (as shown in Table I). Items with distinctive meanings were extracted for use in the collective scales; when several scales had identical meanings, only one was retained. In addition to the measurements included in the technology-oriented instruments, several items related to technological impacts on science learning—namely, interactivity, comprehensibility, and productivity—were added to the item pool.
TABLE I
Collective scales from technology-oriented instruments

Schema* | CMLES | NCEI | TROFLEI | Collective Scales
Relationship | Student Negotiation | Affiliation; Involvement; Teacher Support | Student Cohesiveness; Involvement; Teacher Support | Cohesiveness; Involvement; Teacher Support
Personal Development | Inquiry Learning; Reflective Thinking | Group Work; Competition | Investigation; Cooperation; Task Orientation | Inquiry & Reflection; Cooperation & Competition; Self-regulation
System Maintenance and Change | Authenticity; Complexity | Teacher Control; Order & Organization; Innovation | Young Adult Ethos; Equity; Differentiation; Computer Usage | Equity & Controlling; Technological Impact

* Moos (1987)

Although the collective scales covered the socio-psychological aspects of learning environments, they lacked physical environment components. Zandvliet and Straker (2001) developed the Computerized Classroom Ergonomic Inventory (CCEI), an ergonomic inventory containing five scales: workspace environment, computer environment, visual environment, spatial environment, and air quality. The CCEI thus brought the physical computer environment into instrumentation, emphasizing the interactions between users and equipment in the computer laboratory. Further, the CCEI was tested for its association with the WIHIC and demonstrated significant correlations between certain physical environment factors and psychosocial environment variables (Zandvliet & Fraser, 2005). However, some CCEI scales are incompatible with the science classroom in which technological equipment is operated by the teacher rather than used by students during class. Because such resources are inaccessible to students in the general classroom, only the visual environment and spatial environment scales from the CCEI were incorporated into the items already compiled. Taken together, the item pool comprised 81 questions derived from the collective scales, focusing on the physical environment of the general classroom and prominent technological impacts on science learning.

Pretests

To ensure the instrument possessed acceptable validity, pretests were conducted in which teachers and students reviewed the item pool. First, three science teachers with extensive experience in technology integration were invited to review the item pool using a card-sorting method. As expert teachers, the three were skilled in inquiry-based teaching using the latest technologies (e.g., interactive simulations or web logs) in the high school science classroom. All initial items, printed on index cards, were randomly shuffled and presented to the three teachers. Each teacher was asked to pick out those items whose statements most closely matched his or her experiences and then sort them into the predetermined categories set by the researchers. Items were retained only if they were selected by all teachers during the card-sorting process.
As a result, 10% of the presented items were dropped, including "My learning materials differ from those used by other students," "I am responsible for my learning," "I use different assessment methods from other students," and "Technological equipment is easy for the teacher to use." Of the remaining items, 85% were placed in the same categories. The selected 52 items were subsequently administered to 76 eighth grade students in a public school to test their readability for secondary school students. As a result, a few wording issues were adjusted according to the students' responses.

Field Test

Participants included seventh through ninth grade science class students and teachers from two districts in Taiwan. Instead of students being divided into different levels of classes, tradition dictates that the entire class
attends most courses in the same classroom. Therefore, the targeted classes were chosen based on whether the science teacher had used technology in the classroom during the semester. Using lists obtained from school administrations in the two districts, science teachers who actually used technology in teaching were identified. Personal visits or telephone calls were then made to the targeted teachers in order to provide detailed study information and solicit their support. With the approval of the school administration, a questionnaire survey was conducted with the targeted teachers and their students. The questionnaire package contained a cover sheet for recording the survey details, a teacher questionnaire, and student questionnaires. Both teacher and student questionnaires included three sections: the participants' information, questions related to the actual classroom, and questions related to the preferred classroom. The questions related to the actual classroom asked about the current learning environment, while the questions related to the preferred classroom asked about students' and teachers' ideal learning environment. In this manner, teachers' perceptions of the environment can be compared to those of students. Considering the reliability of positively worded items in terms of response accuracy and internal consistency (Barnette, 2000; Schriesheim, Eisenbach & Hill, 1991), all questions were worded in a positive scoring direction and measured on a 5-point Likert-type scale with anchors from almost never (scored as 1) to almost always (scored as 5). Respondents answered the questions related to the actual classroom environment first; the questions related to the preferred classroom were completed after an interval of an hour. To capture participants' perspectives on what had happened in their science classes throughout the course, data were collected near the end of the semester.
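The actual and preferred forms described above are compared by subtracting each scale's actual mean from its preferred mean and ranking the gaps. A minimal sketch of that comparison, using hypothetical scale means (not the study's data):

```python
# Hypothetical 5-point-scale means for one class; the scale names are
# from the TICI, but the numbers are illustrative only.
actual = {"equity & friendliness": 4.2, "inquiry learning": 3.4, "order": 3.0}
preferred = {"equity & friendliness": 4.6, "inquiry learning": 4.1, "order": 4.0}

# Actual-preferred discrepancy per scale: positive values mean students
# want more of this feature than they currently perceive.
gaps = {scale: round(preferred[scale] - actual[scale], 2) for scale in actual}
largest = max(gaps, key=gaps.get)
print(largest, gaps[largest])
```

With these illustrative numbers, the largest discrepancy is order, mirroring the kind of result the study reports for students.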
Of the 1,155 student questionnaires and 23 teacher questionnaires returned from 33 science classes, 37 student questionnaires were discarded due to incomplete data. Consequently, the effective responses totaled 1,118 student questionnaires and 23 teacher questionnaires from 19 public schools. The student questionnaires were randomly divided into two sample groups; the first group (654 samples) was used for exploratory factor analysis, while the second group (464 samples) was reserved for confirmatory factor analysis.

Data Analysis

Data were analyzed using descriptive statistics, factor analyses, and correlation analyses between factor score estimates. The descriptive
statistics estimated the item means and standard deviations. The psychometric characteristics of the instrument were examined using factor analyses of students' responses to actual and preferred environments. The exploratory factor analysis was conducted to determine the items for each specific factor as well as the factorial structure of the instrument. Considering the limited time for data collection (near the end of the semester), exploratory factor analyses were performed on students' responses to actual environments first, followed by data on preferred environments, to confirm the factorial structure. The confirmatory factor analysis, using a structural equation modeling approach, was employed to measure the goodness-of-fit indices and construct reliability of the instrument. Finally, correlations between factor scores were measured.
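One of the reliability statistics used in this study, Cronbach's alpha, can be computed directly from an item-response matrix. A minimal sketch with hypothetical Likert responses (the formula is standard; the data are invented for illustration):

```python
from statistics import variance  # sample variance (n - 1 denominator)

def cronbach_alpha(items):
    """Cronbach's alpha for a respondents-by-items matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    k = len(items[0])
    item_vars = sum(variance(col) for col in zip(*items))
    total_var = variance([sum(row) for row in items])
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical 5-point Likert responses: six students, three items of one scale
responses = [[4, 5, 4], [3, 3, 2], [5, 5, 5], [2, 3, 3], [4, 4, 5], [3, 2, 2]]
print(round(cronbach_alpha(responses), 2))
```

In the study itself, alpha was computed per scale with the individual student as the unit of analysis, yielding the 0.69 to 0.91 range reported in the abstract.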
RESULTS

Psychometric Characteristics of the Instrument

The actual environment data from the first sample group (n = 654) were used to examine the psychometric characteristics of the instrument. As summarized in Table II, item means ranged from 2.61 (SD = 1.09) to 4.36 (SD = 0.94). Factors were extracted based on eigenvalues and the proportion of variance explained by each factor, using principal components analysis with varimax rotation and Kaiser normalization. Items with factor loadings below the cutoff value of 0.4 on their own scales, or above 0.4 on any of the other scales, were eliminated (Hair, Anderson, Tatham & Black, 1998). The results indicated no cross-loadings, but three items with loadings below 0.4 were excluded: "The teacher treats me like a friend," "The teacher designs different problems for students," and "The student will be punished if he/she violates the classroom rule." Thus, a total of 49 items in eight factors were extracted, together accounting for 60% of the explained variance. The preferred environment data of the first sample group were then tested for the factorial structure using the 49 items validated by the actual environment data. The results indicated that the preferred environment data accounted for 71.9% of the total explained variance of the eight factors. The factorial structure of the preferred environment was the same as the actual environment's a priori structure, except for differences in the loading of each item, as shown in Table II. Based on the average factor loadings of the two exploratory factor analyses, the two highest pattern coefficients of marker variables were used as the criteria for naming each factor. Eight factors were assigned: technological enrichment, inquiry learning, equity and friendliness, student cohesiveness, understanding and encouragement, competition and efficacy, audiovisual environment, and order. The description and sample items for each scale are provided in Table III.

TABLE II
Summary of measurement scales

Item | M | SD | Actual loading | Preferred loading

Technological enrichment (TE)
TE1 The course content is more abundant. | 4.07 | 1.08 | 0.80 | 0.66
TE2 It is easy for students to understand scientific principles. | 3.94 | 1.05 | 0.78 | 0.73
TE3 The teacher presents more real-world phenomena. | 3.78 | 1.14 | 0.76 | 0.70
TE4 Technology enables students to explain concepts in alternative ways. | 4.03 | 1.05 | 0.73 | 0.78
TE5 Time to go through the course content is shorter. | 3.76 | 1.11 | 0.72 | 0.69
TE6 The course content attracts students' attention. | 3.98 | 1.10 | 0.72 | 0.74
TE7 The course provides more information for students to learn. | 4.31 | 0.95 | 0.67 | 0.67
TE8 Students can easily identify their own incorrect ideas. | 3.66 | 1.11 | 0.65 | 0.66
TE9 Students have more opportunities to discuss issues with the teacher. | 3.44 | 1.17 | 0.65 | 0.56
TE10 Students have more opportunities to discuss issues with each other. | 3.58 | 1.15 | 0.55 | 0.61

Inquiry learning (IL)
IL1 Students collect evidence to verify their thoughts. | 3.18 | 1.11 | 0.73 | 0.70
IL2 Students consider a question from various aspects. | 3.30 | 1.05 | 0.71 | 0.73
IL3 Students design the methods to solve problems. | 3.05 | 1.07 | 0.71 | 0.69
IL4 Students try new ways to solve the problems. | 3.47 | 1.03 | 0.67 | 0.65
IL5 Students find the answers using various methods. | 3.63 | 1.02 | 0.65 | 0.66
IL6 Students' views differ from one another. | 3.59 | 0.99 | 0.55 | 0.59
IL7 Students often reflect on their own ideas. | 3.49 | 1.05 | 0.53 | 0.67

Equity & friendliness (EF)
EF1 The teacher treats students fairly. | 4.36 | 0.94 | 0.75 | 0.69
EF2 Every student gets the same chance to answer the teacher. | 4.27 | 0.97 | 0.71 | 0.59
EF3 The teacher does not embarrass students who answer incorrectly. | 4.14 | 1.07 | 0.70 | 0.56
EF4 The teacher is tolerant of students' misbehavior. | 3.39 | 1.20 | 0.58 | 0.66
EF5 The teacher is always smiling and engages students. | 4.24 | 0.94 | 0.57 | 0.73
EF6 The teacher answers students' questions zealously. | 4.28 | 0.92 | 0.43 | 0.49

Student cohesiveness (SC)
SC1 Students are friendly to each other. | 4.28 | 0.85 | 0.74 | 0.71
SC2 Students are willing to help each other. | 4.16 | 0.87 | 0.67 | 0.55
SC3 It is easy to find members for grouping. | 4.07 | 0.97 | 0.66 | 0.65
SC4 Students share information with each other. | 3.95 | 0.95 | 0.58 | 0.51
SC5 Students have opportunities to discuss questions with classmates. | 3.78 | 0.97 | 0.58 | 0.53
SC6 Group members complete assignments together in class. | 4.15 | 0.83 | 0.45 | 0.40

Understanding & encouragement (UE)
UE1 The teacher walks to students' seats to talk. | 2.61 | 1.09 | 0.78 | 0.78
UE2 The teacher cares about students' feelings. | 3.63 | 1.10 | 0.75 | 0.77
UE3 The teacher invites students to describe their ideas. | 3.43 | 1.24 | 0.72 | 0.76
UE4 Students have opportunities to ask the teacher questions. | 3.55 | 1.16 | 0.68 | 0.74
UE5 The teacher stops to help students when they are in trouble. | 3.67 | 1.14 | 0.49 | 0.64
UE6 The teacher supports or praises students' performance. | 3.54 | 1.12 | 0.48 | 0.51

Competition & efficacy (CE)
CE1 Students care about their own performance. | 3.83 | 1.15 | 0.72 | 0.70
CE2 Students work hard to outperform others. | 3.77 | 1.06 | 0.69 | 0.54
CE3 Classmates' performances push students to be more diligent. | 3.84 | 1.12 | 0.64 | 0.52
CE4 Students set up study goals on their own. | 3.55 | 1.09 | 0.52 | 0.45
CE5 Comparisons among groups occur. | 3.33 | 1.20 | 0.52 | 0.59
CE6 Students are confident of learning this subject well. | 3.73 | 1.00 | 0.46 | 0.54

Audiovisual environment (AE)
AE1 Students can see projected visuals clearly from their seats. | 4.24 | 1.03 | 0.65 | 0.54
AE2 The projections are clear without turning off the lights or closing the curtains. | 2.69 | 1.40 | 0.58 | 0.56
AE3 The equipment does not block students' view of projections. | 4.12 | 1.21 | 0.58 | 0.59
AE4 The projection size is moderate for students. | 4.19 | 1.14 | 0.58 | 0.61
AE5 The speaker volume is moderate for students. | 3.63 | 1.36 | 0.52 | 0.69

Order (OR)
OR1 Classmates maintain order in the classroom. | 3.14 | 1.10 | 0.89 | 0.78
OR2 It is rarely noisy in class. | 2.98 | 1.10 | 0.86 | 0.80
OR3 Classmates concentrate on what the teacher says. | 3.76 | 0.81 | 0.46 | 0.52

Note. M and SD are from the actual environment data; factor loadings are shown for the actual and preferred forms. Items with loadings less than 0.4 are omitted.

Internal reliability was tested using the individual student as the unit of analysis for the Cronbach's alpha coefficient. As shown in Table IV, except for the reliability coefficient of audiovisual environment, which was 0.69 (quite close to the threshold value of 0.7), all remaining Cronbach's α values for the actual environment data were over 0.76, exceeding the common threshold recommended by previous research (Nunnally, 1978). The alpha values for the preferred environment data were higher than those for the actual environment data. Thus, the reliability data suggest that the instrument has acceptable internal consistency. The mean correlation of each scale with the remaining scales was used as an index of discriminant validity. The results demonstrated that the scales overlapped somewhat, but not to an extent that would violate the psychometric structure of the instrument.

Verification of the Instrument

To validate the instrument, the second sample group (n = 464) was assessed using confirmatory factor analysis with a structural equation modeling approach. The hypothesized model was further improved by allowing a few correlations between error variances. Seven common model-fit measures were used to examine the model's overall goodness of fit: the ratio of chi-square to degrees of freedom (χ²/d.f.), goodness-of-fit index (GFI), adjusted goodness-of-fit index (AGFI), normed fit index (NFI), comparative fit index (CFI), root mean square residual (RMSR), and root mean square error of approximation (RMSEA).
As shown in Table V, with the exception of GFI, all fit indices for the measurement model exceeded the values recommended by previous research (Hu & Bentler, 1999; Jöreskog, 1993). Altogether, the fit indices indicated that the hypothesized model fitted the collected data well and that the instrument demonstrated construct validity. The model was further assessed for construct reliability, calculated as (square of the sum of the factor loadings) / [(square of the sum of the factor loadings) + (sum of the error variances)] (Reuterberg & Gustafsson, 1992). The interpretation of the resulting construct reliability coefficient is similar to that of Cronbach's alpha, except that it takes into account the actual factor loadings rather than
TABLE III
Scale description and sample items in the instrument

Technological enrichment
  Description: The extent to which technological impacts on teaching and learning are perceived.
  Sample item: "It is easy to understand scientific principles."
Inquiry learning
  Description: The extent to which processes of investigation, problem solving, reflection, and creation are emphasized.
  Sample item: "I consider a question from various aspects."
Equity & friendliness
  Description: The extent to which the teacher treats students equally and in a friendly manner.
  Sample items: "Everyone gets the same opportunity to answer the teacher's questions." "The teacher does not embarrass me when my answer is incorrect."
Student cohesiveness
  Description: The extent to which students are supportive of each other.
  Sample item: "Classmates share information with each other."
Understanding & encouragement
  Description: The extent to which the teacher understands and encourages students.
  Sample items: "The teacher cares about my feelings." "The teacher stops to help me when I am in trouble."
Competition & efficacy
  Description: The extent to which students are motivated and confident to compete with each other.
  Sample items: "I work hard to outperform others." "Classmates' performances push me to be more diligent."
Audiovisual environment
  Description: The extent to which audiovisual effects are adequate for students.
  Sample item: "I can see the projection clearly from my seat."
Order
  Description: The extent to which classroom organization is conducive to keeping students focused.
  Sample item: "Classmates maintain order in the classroom."
TABLE IV
Alpha reliability and discriminant validity for the scales

                                        Cronbach α reliability
Scale                          Items    Actual    Preferred    Mean correlation with other scales*
Technological enrichment        10      0.93      0.94         0.55
Inquiry learning                 7      0.87      0.94         0.55
Equity & friendliness            6      0.80      0.86         0.57
Student cohesiveness             6      0.84      0.94         0.53
Understanding & encouragement    6      0.85      0.91         0.53
Competition & efficacy           6      0.83      0.86         0.57
Audiovisual environment          5      0.69      0.89         0.47
Order                            3      0.76      0.81         0.26

* Mean correlation using the actual form data.
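The Cronbach's alpha values reported in Table IV are computed per scale from item-level responses. The following is a minimal illustrative sketch in Python; the responses are made up for illustration and are not the study's data:

```python
from statistics import variance

def cronbach_alpha(responses):
    """Cronbach's alpha for one scale.

    responses: one list per student, one Likert score per item.
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)
    """
    k = len(responses[0])  # number of items in the scale
    item_vars = [variance([r[i] for r in responses]) for i in range(k)]
    total_var = variance([sum(r) for r in responses])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Hypothetical 3-item scale answered by five students on a 1-5 Likert scale.
data = [[4, 5, 4], [3, 3, 4], [5, 5, 5], [2, 3, 2], [4, 4, 5]]
print(round(cronbach_alpha(data), 2))  # 0.92 for this toy data
```

In the study each scale's alpha would be computed this way over all 1,188 student responses, once for the actual form and once for the preferred form.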
assuming that each item is equally weighted in the composite load determination (Chau & Hu, 2001). As shown in Table VI, the construct reliabilities for all scales, ranging from 0.73 to 0.91, were above the acceptable threshold suggested by previous research (Bagozzi & Yi, 1988). Correlations between scales ranged from 0.22 to 0.81, all at significant levels; the only notably high correlation was between the equity and friendliness and understanding and encouragement scales. These results indicate that the discriminant validity of the measures was supported by the confirmatory factor analysis. Based on the empirical evidence of psychometric properties, the instrument demonstrated adequate validity and reliability. The final instrument, subsequently titled the Technology Integrated Classroom Inventory (TICI), contained 49 items within eight scales specifically created for the technology-integrated science learning environment.

TABLE V
Fit indices of the model

Goodness-of-fit measure    Recommended value    Measurement model
χ²/d.f.                    ≤ 3.00               1.70
GFI                        ≥ 0.90               0.86
AGFI                       ≥ 0.80               0.84
NFI                        ≥ 0.90               0.93
CFI                        ≥ 0.90               0.93
RMSR                       ≤ 0.10               0.05
RMSEA                      ≤ 0.08               0.04
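As a rough consistency check, the RMSEA point estimate can be recovered from the reported χ²/d.f. ratio and the validation sample size with the standard formula RMSEA = sqrt(max(χ² − d.f., 0) / (d.f. · (N − 1))). Exact values depend on the SEM software's conventions, so this is only an approximation, not the study's actual computation:

```python
from math import sqrt

def rmsea(chi2, df, n):
    # Standard point estimate; some packages divide by N rather than N - 1.
    return sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

# With chi2/df = 1.70 and the validation sample of n = 464, the estimate
# depends only on the ratio, so any df with chi2 = 1.70 * df gives the same value.
print(round(rmsea(1.70, 1, 464), 2))  # 0.04, consistent with Table V
```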
224
WEISHEN WU ET AL.
TABLE VI
Construct reliability and correlation matrix

Scale                                CR     TE    IL    EF    SC    UE    CM    AE    OR
Technological enrichment (TE)       0.91   1.00
Inquiry learning (IL)               0.87   0.59  1.00
Equity & friendliness (EF)          0.79   0.61  0.53  1.00
Student cohesiveness (SC)           0.82   0.58  0.69  0.64  1.00
Understanding & encouragement (UE)  0.81   0.51  0.62  0.81  0.63  1.00
Competition & efficacy (CM)         0.83   0.60  0.76  0.55  0.61  0.50  1.00
Audiovisual environment (AE)        0.73   0.71  0.41  0.53  0.55  0.41  0.40  1.00
Order (OR)                          0.76   0.22  0.23  0.29  0.30  0.25  0.30  0.25  1.00

p < .05 for all scales.
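The construct reliability formula cited from Reuterberg & Gustafsson (1992) can be sketched as follows. The loadings here are hypothetical, and for standardized loadings the error variance of each item is taken as 1 − λ²; the paper's own computation would instead plug in the estimated loadings and error variances from the CFA:

```python
def construct_reliability(loadings, error_variances=None):
    """(sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)."""
    if error_variances is None:
        # Assumes standardized loadings, where error variance = 1 - loading^2.
        error_variances = [1 - l * l for l in loadings]
    s = sum(loadings)
    return s * s / (s * s + sum(error_variances))

# Hypothetical standardized loadings for a three-item scale.
print(round(construct_reliability([0.75, 0.70, 0.65]), 2))  # 0.74
```

Unlike Cronbach's alpha, this coefficient weights items by their actual loadings rather than treating them as equally weighted.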
Comparison of Actual and Preferred Learning Environments

Mean scores on the actual and preferred forms were calculated for the entire student (N=1,188) and teacher (N=23) populations. Table VII lists the means and standard deviations for all scales on both the actual and preferred forms for students and teachers. The means in the actual environment ranged from 3.28 to 4.29, indicating that students and

TABLE VII
Means and standard deviations for the actual and preferred forms

                                        Student (N=1,188)              Teacher (N=23)
                               Actual         Preferred        Actual         Preferred
Scale                          M     S.D.     M     S.D.       M     S.D.     M     S.D.
Technological enrichment       3.84  0.85     4.34  0.81       4.04  0.44     4.75  0.38
Inquiry learning               3.40  0.69     4.13  0.90       3.29  0.70     4.74  0.42
Equity & friendliness          4.10  0.79     4.24  0.79       4.29  0.44     4.58  0.41
Student cohesiveness           4.06  0.68     4.38  0.82       4.18  0.35     4.62  0.45
Understanding & encouragement  3.43  0.85     3.90  0.98       3.80  0.46     3.97  0.44
Competition & efficacy         3.68  0.81     4.13  0.85       4.00  0.39     4.67  0.41
Audiovisual environment        3.76  0.82     4.31  0.89       3.99  0.62     4.63  0.33
Order                          3.28  0.84     4.20  0.91       4.18  0.58     4.36  0.64
Total mean                     3.69  1.12     4.20  1.09       3.97  1.18     4.54  1.13
teachers perceived all environmental scales as either sometimes or often occurring in their classrooms. In the actual environment, students ranked equity and friendliness highest (4.10) and order lowest (3.28); teachers likewise rated equity and friendliness highest (4.29) and inquiry learning lowest (3.29). These results suggest that both students and teachers perceive equity and friendliness as significant in the technology-integrated science learning environment. However, students perceive order as lower, while teachers perceive insufficient inquiry learning occurring during technology-integrated science classes. In the preferred environment, students' means ranged from 3.90 to 4.38, compared to teachers' means ranging from 3.97 to 4.75. These ranges show that teachers' expectations for the technology-integrated science learning environment were distributed higher than students'. With regard to technology integration, students and teachers expected student cohesiveness and technological enrichment, respectively, to be at the highest level. Students' and teachers' preferred-form means differed across scales, except for understanding and encouragement, which was rated lowest by both populations. All scale means for students and teachers in the actual environment were lower than those in the preferred environment. The differences between the total actual and preferred means were 0.51 for students and 0.57 for teachers. To test the actual–preferred discrepancies, the mean preferred score was subtracted from the mean actual score and evaluated with a paired t test for each scale for both students and teachers (see Table VIII); the results indicated statistically significant actual–preferred discrepancies for all scales, excluding order (−0.18, ns) for teachers. The largest actual–preferred discrepancy for students was order (−0.93, p<.001), followed by inquiry learning (−0.74, p<.001). The largest actual–preferred discrepancy for teachers was inquiry learning (−1.45, p<.01), followed by technological enrichment (−0.71, p<.001). The smallest actual–preferred discrepancy was equity and friendliness (−0.14, p<.01) for students and understanding and encouragement (−0.17, p<.05) for teachers. The student–teacher discrepancy for each scale in both environments was calculated by subtracting the teacher mean from the student mean and tested with an independent t test (see Table IX). All teachers' mean scores were higher than students', except for inquiry learning (0.11, ns) in the actual environment. Three significant discrepancies emerged in the actual environment: order (−0.90, p<.001), understanding and encouragement (−0.37, p<.001), and competition and efficacy (−0.32, p<.05). Another five emerged in the preferred environment: technological enrichment (−0.41, p<.05), inquiry learning (−0.60, p<.01), understanding and encouragement (−0.06, p<.001), competition and efficacy (−0.54, p<.01), and audiovisual
TABLE VIII
A comparison of actual–preferred discrepancies for students and teachers

                               Student                    Teacher
Scale                          Discrepancy  t             Discrepancy  t
Technological enrichment       −0.50        −12.61***     −0.71        −5.12***
Inquiry learning               −0.74        −17.41***     −1.45        −4.94**
Equity & friendliness          −0.14        −3.49**       −0.29        −2.93*
Student cohesiveness           −0.32        −8.37***      −0.44        −2.76*
Understanding & encouragement  −0.48        −10.06***     −0.17        −2.98*
Competition & efficacy         −0.45        −10.49***     −0.67        −3.23**
Audiovisual environment        −0.55        −12.15***     −0.65        −4.98**
Order                          −0.93        −18.91***     −0.18        −1.61

* p<.05; ** p<.01; *** p<.001.
environment (−0.33, p<.001). The largest student–teacher discrepancy was in order (actual environment) and inquiry learning (preferred environment). Significant discrepancies for both groups emerged in understanding and encouragement and in competition and efficacy.

DISCUSSION

This study employed exploratory and confirmatory factor analyses to validate the psychometric structure of a new instrument for the technology-integrated science classroom, namely the Technology Integrated Class-

TABLE IX
A comparison of student–teacher discrepancies for the actual and preferred environments

                               Actual                    Preferred
Scale                          Discrepancy  t            Discrepancy  t
Technological enrichment       −0.19        −1.41        −0.41        −3.37**
Inquiry learning                0.11         0.37        −0.60        −4.20**
Equity & friendliness          −0.19        −0.89        −0.34        −1.41
Student cohesiveness           −0.12        −1.11        −0.24        −0.96
Understanding & encouragement  −0.37        −5.55***     −0.06        −5.00***
Competition & efficacy         −0.32        −2.60*       −0.54        −4.21**
Audiovisual environment        −0.23        −1.41        −0.33        −4.94***
Order                          −0.90        −3.55***     −0.16        −0.59

* p<.05; ** p<.01; *** p<.001.
room Inventory (TICI). Based on the empirical results, the TICI is distinct from existing technology-oriented learning environment instruments. First, the TICI consists of critical aspects derived from current instruments. Although some items were withdrawn during instrument development, this does not threaten the factor structure. Second, the TICI redefined the technological aspects of the learning environment by incorporating additional technological impacts according to previous findings. The technological enrichment scale's significant correlation with the other scales in the TICI confirmed the role of technology integration in the science learning environment. Third, the audiovisual environment scale in the TICI was tailored to general technology usage in science classrooms. This scale differs from Zandvliet and Fraser's (2005) physical factor, which incorporates more ergonomic scales for computer laboratories or computers available for student use in classrooms. Overall, the TICI was verified as a parsimonious instrument for assessing the technology-integrated science learning environment in the general classroom. Two hybrid scales, namely equity and friendliness and competition and efficacy, emerged in the TICI; this finding is interesting and worth noting. Most items of the two scales were derived from the equity, teacher support, competition, and task orientation scales adopted in previous research (Aldridge et al., 2004; Newhouse, 2001). In the TICI, equity and friendliness combines the teacher's attitudes toward students with the equal learning opportunities presented during class. It is plausible that treating all class members equally and in a friendly manner produces respect among students. The competition and efficacy scale emphasizes the grade-based competition that motivates students to work hard or endeavor to perform better than their peers (Elliott & Dweck, 1988).
Students are easily affected by each other in the same classroom. In many countries, such as Taiwan, secondary students strive to pass the entrance examination for high school. In such competitive contexts, students' self-beliefs and self-regulatory capabilities can be enhanced in terms of academic efficacy (Zimmerman, 1995). Communication behavior in the classroom is considered a crucial influence on the student–teacher relationship and related learning outcomes. Previous studies have investigated the multidimensionality of teachers' communication behavior with students in the science classroom (She, 1998; Wubbels, 1993). In the current study, the higher correlation coefficient between the equity and friendliness and understanding and encouragement scales suggests that a second-order factor, supportive teacher behavior, might emerge in the technology-integrated science learning environment. In addition, both students and teachers
rated equity and friendliness highest, followed by student cohesiveness, and these ratings may be slightly elevated. This implies that technological enhancements such as interactivity strengthen the reciprocal relationships among students and with the teacher in the science classroom. However, both students and teachers indicated significant actual–preferred discrepancies on understanding and encouragement, suggesting that technology integration might hinder the teacher from delivering personal support to students, especially in the general classroom. The lower correlation between order and the remaining scales could be attributed to the low number of measurement items, even though it did not violate the "three measures rule" for identification (Rigdon, 1995). Two items related to order were withdrawn from the initial item pool during development and testing of the instrument. One item, "Violation of the rules will be punished by the teacher," was discarded after the pretest; another, "The teacher stipulates the rules for the class," was dropped after the factor analysis. These items seemed to create unnecessary confusion; as one teacher commented, "It seems uncomfortable to me if students are too quiet in my class." Teachers would like students to participate in class enthusiastically yet in an orderly manner. In the actual environment, the significant student–teacher discrepancy in order suggests that teaching with technology can loosen student discipline, although teachers think it is controllable in the science classroom. Except for inquiry learning in the actual environment, the results of the student–teacher discrepancies supported previous findings that teachers' perceptions of the learning environment are often more positive than their students' (Fraser, 1998).
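The student–teacher comparisons discussed here come from independent-samples t tests (Table IX). Given the very unequal group sizes (1,188 students vs. 23 teachers), a Welch-type statistic that does not assume equal variances is a reasonable sketch; the paper does not state which variant was used, and the data below are synthetic:

```python
from math import sqrt
from statistics import mean, variance

def welch_t(x, y):
    """Welch's t for two independent samples with unequal sizes and variances."""
    return (mean(x) - mean(y)) / sqrt(variance(x) / len(x) + variance(y) / len(y))

# Synthetic per-respondent scale scores: four students vs. three teachers.
students = [3.2, 3.4, 3.0, 3.4]
teachers = [4.0, 4.2, 3.8]
print(round(welch_t(students, teachers), 1))  # -5.0: teachers rate higher
```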
The largest actual–preferred discrepancy for teachers, as well as the largest student–teacher discrepancy in the preferred environment, was inquiry learning, which parallels the earlier finding that Taiwanese students and teachers perceived low levels of inquiry learning in their science classes. As with results reported almost a decade ago, the current findings demonstrate that students and teachers still sense insufficient inquiry-based science learning. Within the technology-enriched setting, our finding implies that students' inquiry capabilities will not improve unless students have hands-on experiences with technology-mediated learning activities. Hence, increasing technology accessibility must be a top priority for schools, particularly for students' use in general classrooms. Further deployment of technology-supported inquiry learning strategies is urged for secondary science classes. This study has several limitations. First, the findings and their implications are obtained from a single study that examined the factorial structure of the instrument with two sample groups rather than across different time points. Thus,
caution needs to be taken when generalizing the findings and discussion to other classroom settings. Second, most items were drawn from validated measures in existing instruments; however, the two hybrid scales, formed from four distinct scales in this study, suggest a need to reevaluate the instrument across different target samples.
CONCLUSIONS

Effective teaching and learning are linked to a sufficient fit between the person and the classroom environment. Prompted by the emergence of technology integration in the science classroom, this study developed the Technology Integrated Classroom Inventory (TICI) using data from secondary science classes in Taiwan. The validation results indicate that the TICI has a parsimonious structure and sound psychometric properties. Eight scales comprising 49 items covering psychosocial and physical factors are included in the TICI to assess students' and teachers' perceptions of technology-integrated science learning environments. The technological enrichment scale correlates significantly with the remaining scales in the TICI, confirming the essential role of technology integration in the science learning environment. Equity and friendliness as well as competition and efficacy emerged as hybrid scales, demonstrating the mixed nature of teacher–student and peer relationships in the science classroom. Using the TICI to assess the actual–preferred discrepancies of the technology-integrated science learning environment sheds light on subtle differences between students and teachers. Students perceive order, and teachers perceive inquiry learning, as showing the largest actual–preferred discrepancies within the technology-integrated science learning environment. When teaching with technology in the general classroom, students are concerned about classroom order while teachers expect more technology-supported inquiry learning. In the actual and preferred environments, both students and teachers indicated significant discrepancies in understanding and encouragement as well as competition and efficacy, suggesting that science teachers ought to pay more attention to communication behavior and student growth during technology integration.
Although the TICI is designed for the technology-integrated science learning environment in the general classroom, it can also be used in well-equipped classrooms in which students are able to use technology in learning. Further empirical investigations of the TICI with regard to students' science outcomes, such as attitudes, satisfaction, and achievement, are highly encouraged.
ACKNOWLEDGEMENTS

This research was substantially supported by the National Science Council (NSC) of Taiwan under grant number NSC 94-2511-S-212001. The authors wish to thank the reviewers for their valuable suggestions.

REFERENCES

Aldridge, J.M., Fraser, B.J., Taylor, P.C. & Chen, C.C. (2000). Constructivist learning environments in a cross-national study in Taiwan and Australia. International Journal of Science Education, 22, 37–55.
Aldridge, J.M., Dorman, J.P. & Fraser, B.J. (2004). Use of multitrait–multimethod modeling to validate actual and preferred forms of the technology-rich outcomes-focused learning environment inventory (TROFLEI). Australian Journal of Educational and Developmental Psychology, 4, 110–125.
Anderson, A.A., Hamilton, R.J. & Hattie, J. (2004). Classroom climate and motivated behavior in secondary schools. Learning Environments Research, 7, 211–225.
Bagozzi, R.P. & Yi, Y. (1988). On the evaluation of structural equation models. Journal of the Academy of Marketing Science, 16, 74–94.
Barnette, J.J. (2000). Effects of stem and Likert response option reversal on survey internal consistency: If you feel the need, there is a better alternative to using those negatively worded stems. Educational and Psychological Measurement, 60, 361–370.
Bell, P. & Linn, M.C. (2000). Scientific arguments as learning artifacts: Designing for learning from the web with KIE. International Journal of Science Education, 22(8), 797–817.
Bowman, J., Jr., Newman, D.L. & Masterson, J. (2001). Adopting educational technology: Implications for designing interventions. Journal of Educational Computing Research, 25(1), 81–94.
Chang, V. & Fisher, D.L. (2001). A new learning instrument to evaluate online learning in higher education. In M. Kulski & A. Herrmann (Eds.), New horizons in university teaching and learning (pp. 23–34). Perth, Western Australia: Curtin University of Technology.
Chang, C.Y., Hsiao, C.H. & Barufaldi, J.P. (2006). Preferred–actual learning environment spaces and earth science outcome in Taiwan. Science Education, 90(3), 420–423.
Chau, P.Y.K. & Hu, P.J. (2001). Information technology acceptance by individual professionals: A model comparison approach. Decision Sciences, 32(4), 699–719.
Chen, C.C., Taylor, P.C. & Aldridge, J.M. (1998). Study on teachers' beliefs about science and their effect on classroom environment in junior high school. Chinese Journal of Science Education, 6(4), 383–402.
Cox, M. (2000). Information and communications technologies: Their role and value for science education. In M. Monk & J. Osborne (Eds.), Good practice in science teaching: What research has to say (pp. 190–207). Buckingham: Open University Press.
Dorman, J. (2001). Associations between classroom environment and academic efficacy. Learning Environments Research, 4, 243–257.
Dorman, J. (2002). Classroom environment research: Progress and possibilities. Queensland Journal of Educational Research, 18. Queensland Institute for Educational Research. http://education.curtin.edu.au/iier/qjer/qjer18/dorman.html.
Dorman, J. & Adams, J. (2004). Associations between students' perceptions of classroom environment and academic efficacy in Australian and British secondary schools. Westminster Studies in Education, 27(1), 69–85.
Dunn, R.J. & Harris, L.G. (1998). Organizational dimensions of climate and the impact on school achievement. Journal of Instructional Psychology, 25, 100–115.
Edelson, D.C. (2001). Learning-for-use: A framework for the design of technology-supported inquiry activities. Journal of Research in Science Teaching, 38(3), 355–385.
Elliott, E. & Dweck, C. (1988). Goals: An approach to motivation and achievement. Journal of Personality and Social Psychology, 54(5), 5–12.
Fisher, D.L., Goh, S.C., Wong, A.F.L. & Richards, T.W. (1997). Perceptions of interpersonal teacher behavior in secondary science classrooms in Singapore and Australia. Journal of Applied Research in Education, 25, 125–133.
Fraser, B.J. (1986). Classroom environment. London: Croom Helm.
Fraser, B.J. (1994). Research on classroom and school climate. In D. Gabel (Ed.), Handbook of research on science teaching and learning (pp. 493–541). New York: Macmillan.
Fraser, B.J. (1998). Classroom environment instruments: Development, validity and applications. Learning Environments Research, 1, 7–33.
Fraser, B.J. (2002). Learning environments research: Yesterday, today and tomorrow. In S.C. Goh & M.S. Khine (Eds.), Studies in educational learning environments (pp. 1–25). Singapore: World Scientific.
Fraser, B.J. & Tobin, K. (1989). Student perceptions of psychosocial environment in classrooms of exemplary science teachers. International Journal of Science Education, 11, 19–34.
Fraser, B.J., McRobbie, C.J. & Fisher, D.L. (1996). Development, validation and use of personal and class forms of a new classroom environment instrument. Paper presented at the annual meeting of the American Educational Research Association, New York.
Gerjets, P.H. & Hesse, F.W. (2004). When are powerful learning environments effective? The role of learner activities and of students' conceptions of educational technology. International Journal of Educational Research, 41, 445–465.
Hair, J.F., Anderson, R.E., Tatham, R.L. & Black, W.C. (1998). Multivariate data analysis. New Jersey: Prentice-Hall.
Hu, L.T. & Bentler, P.M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling: A Multidisciplinary Journal, 6, 1–55.
Huang, S.Y. (2003). Antecedents to psychosocial environments in middle school classrooms in Taiwan. Learning Environments Research, 6, 119–135.
Jöreskog, K.G. (1993). Testing structural equation models. In K.A. Bollen & J.S. Long (Eds.), Testing structural equation models (pp. 294–316). California: Sage.
Khine, S.M. (2003). Creating a technology-rich constructivist learning environment in a classroom management module. In S.M. Khine & D. Fisher (Eds.), Technology-rich learning environments: A future perspective (pp. 21–39). New Jersey: World Scientific.
Lajoie, S.P. (1993). Computer environments as cognitive tools for enhancing learning. In S.P. Lajoie & R. Derry (Eds.), Computers as cognitive tools (pp. 261–288). Hillsdale, New Jersey: Lawrence Erlbaum Associates.
Lim, C.P., Pek, N.S. & Chai, C.S. (2005). Classroom management issues in information and communication technology (ICT)-mediated learning environments: Back to the basics. Journal of Educational Multimedia and Hypermedia, 14(4), 391–414.
Linn, M.C. & Hsi, S. (2000). Computers, teachers, peers: Science learning partners. Mahwah, New Jersey: Lawrence Erlbaum Associates.
Maor, D. (2000). A teacher professional development program on using a constructivist multimedia learning environment. Learning Environments Research, 1, 307–330.
Maor, D. & Fraser, B.J. (1996). Use of classroom environment perceptions in evaluating inquiry-based computer-assisted learning. International Journal of Science Education, 18(4), 401–421.
Mayer-Smith, J., Pedretti, E. & Woodrow, J. (1998). An examination of how science teachers' experiences in a culture of collaboration inform technology implementation. Journal of Science Education and Technology, 7(2), 127–134.
Mistler-Jackson, M. & Songer, N.B. (2000). Student motivation and Internet technology: Are students empowered to learn science? Journal of Research in Science Teaching, 37, 459–479.
Moos, R.H. (1987). The social climate scales: A user's guide. Palo Alto, California: Consulting Psychologists Press.
National Research Council (1996). National science education standards. Washington, DC: National Academy Press.
Newhouse, C.P. (2001). Development and use of an instrument for computer-supported learning environments. Learning Environments Research, 4, 115–138.
Newton, L.R. & Rogers, L. (2001). Teaching science with ICT. London: Continuum.
Nolen, S.B. (2003). Learning environment, motivation, and achievement in high school science. Journal of Research in Science Teaching, 40(4), 347–368.
Nunnally, J.C. (1978). Psychometric theory. New York: McGraw-Hill.
Reuterberg, S.E. & Gustafsson, J.E. (1992). Confirmatory factor analysis and reliability: Testing measurement model assumptions. Educational and Psychological Measurement, 52, 795–811.
Rigdon, E.E. (1995). A necessary and sufficient identification rule for structural models estimated in practice. Multivariate Behavioral Research, 30, 359–383.
Roth, W.M. (1998). Teacher-as-researcher reform: Student achievement and perceptions of learning environment. Learning Environments Research, 1, 75–93.
Schriesheim, C.A., Eisenbach, R.J. & Hill, K.D. (1991). The effect of negation and polar opposite item reversals on questionnaire reliability and validity: An experimental investigation. Educational and Psychological Measurement, 51, 67–78.
She, H.C. (1998). The development and validation of the teacher–student interaction questionnaire (TSIQ) in the secondary science classroom learning environment. Chinese Journal of Science Education, 6(4), 403–416.
Songer, N.B. (1998). Can technology bring students closer to science? In B.J. Fraser & K.G. Tobin (Eds.), International handbook of science education (pp. 333–347). London: Kluwer Academic Publishers.
Suárez, M., Pias, R., Membiela, P. & Dapia, D. (1997). Classroom environment in the implementation of an innovative curriculum project in science education. Journal of Research in Science Teaching, 35(6), 655–671.
Taylor, P.C., Fraser, B.J. & Fisher, D.L. (1997). Monitoring constructivist classroom learning environments. International Journal of Educational Research, 27, 293–302.
Tobin, K. & Fraser, B.J. (1998). Qualitative and quantitative landscapes of classroom learning environments. In B.J. Fraser & K.G. Tobin (Eds.), International handbook of science education (pp. 623–640). Dordrecht: Kluwer Academic Publishers.
Tobin, K. & LaMaster, S.U. (1995). Relationships between metaphors, beliefs, and actions in a context of science curriculum change. Journal of Research in Science Teaching, 32(3), 225–242.
Walker, S.L. & Fraser, B.J. (2005). Development and validation of an instrument for assessing distance education learning environments in higher education: The Distance Education Learning Environments Survey (DELES). Learning Environments Research: An International Journal, 8(3), 289–308.
Wang, K.H., Tuan, H.L. & Chang, H.P. (1998). Secondary school student perceptions of science teacher's knowledge. Chinese Journal of Science Education, 6(4), 363–381.
Woods, J.D. (1995). Teaching effectiveness: Using students' perceptions of teaching style and preferred learning style to enhance teaching performance. Unpublished doctoral thesis, Curtin University of Technology, Perth, Australia.
Wu, H.K., Krajcik, J.S. & Soloway, E. (2001). Promoting understanding of chemical representations: Students' use of a visualization tool in the classroom. Journal of Research in Science Teaching, 38(7), 821–842.
Zandvliet, D.B. & Fraser, B.J. (2005). Physical and psychosocial environments associated with networked classrooms. Learning Environments Research, 8, 1–17.
Zandvliet, D.B. & Straker, L.M. (2001). Physical and psychosocial aspects of the learning environment in information technology rich classrooms. Ergonomics, 44(9), 838–857.
Zimmerman, B.J. (1995). Self-efficacy and educational development. In A. Bandura (Ed.), Self-efficacy in changing societies (pp. 202–231). Cambridge: Cambridge University Press.
Weishen Wu
Department of Information Management, Da-Yeh University,
112, Shan-Jiau Road, Da-Tsuen, Chang-Hua 515, Taiwan
E-mail: [email protected]

Huey-Por Chang
Department of Physics, National Changhua University of Education,
Jin-De Campus, 1, Jin-De Road, Chang-Hua 500, Taiwan
E-mail: [email protected]

Chorng-Jee Guo
Department of Natural Science Education, National Taitung University,
684, Section 1, Chunghua Road, Taitung 950, Taiwan
E-mail: [email protected]