
Computers in the Schools, 32:240–259, 2015
Copyright © Taylor & Francis Group, LLC
ISSN: 0738-0569 print / 1528-7033 online
DOI: 10.1080/07380569.2015.1059254

Changes in Teachers’ Attitudes Toward Instructional Technology Attributed to Completing the ISTE NETS*T Certificate of Proficiency Capstone Program


RICHARD C. OVERBAUGH Old Dominion University, Chesapeake, Virginia, USA

RUILING LU Taiyuan Normal University, Shanxi, China

MARK DIACOPOULOS Old Dominion University, Chesapeake, Virginia, USA

An evaluation was conducted of teachers’ attitudinal perceptions of their confidence for implementation, stages of innovation adoption, and satisfaction as a result of participating in the International Society for Technology in Education’s National Educational Technology Standards for Teachers (ISTE NETS*T) Certificate of Proficiency Capstone Program. The Self-Efficacy and Stages of Concern instruments provide insight into the attitudinal progression of program participants. Data were collected at pre, mid, post, and follow-up points. Results showed that the capstone program is effective: participants were clearly more confident that they could implement the technology-based/enhanced teaching and learning strategies in their schools, had far fewer concerns about their level of preparedness, were eager to work with others in their schools, and believed their school/classroom climate was conducive to and supportive of technology integration. Additionally, participants’ principals and/or supervisors were asked to provide data on their perception of the influence the capstone courses had on their teachers.

KEYWORDS ISTE NETS*T, stages of concern, self-efficacy, technology integration, capstone course

Address correspondence to Richard C. Overbaugh, Department of Education, Old Dominion University, 5115 Hampton Blvd., Norfolk, VA 23529. E-mail: [email protected]

Color versions of one or more of the figures in the article can be found online at www.tandfonline.com/wcis.


The U.S. American Recovery and Reinvestment Act of 2009 provided stimulus funding to states and localities to create jobs and promote economic renewal. The Commonwealth of Virginia was awarded American Recovery and Reinvestment Act funds through the Enhancing Education Through Technology program to conduct online sections of the PBS TeacherLine/International Society for Technology in Education’s National Educational Technology Standards for Teachers (ISTE NETS*T) capstone courses. The program consists of three separate courses: an introduction to the capstone (a short, self-paced orientation), a Capstone I course, and a Capstone II course. The capstone courses prepare in-service teachers to incorporate the instructional technology skills and pedagogy reflected by ISTE’s NETS*S (for students). The courses are distributed by PBS TeacherLine through local public media stations such as WHRO, a school division-owned public television and radio syndicate in Virginia. Teachers who successfully complete both capstone courses receive a NETS*T Certificate of Proficiency, recognized by some school systems as a qualification to be an instructional technology resource teacher or equivalent.

This study used a primarily quantitative design with pre, mid, post, and follow-up measures. A multi-part survey, administered electronically, examined teacher attitude changes as a result of participating in the program. The pre-survey was delivered to participants before they started the 15-week Capstone I course, on the assumption that the orientation course had given them a basic understanding of the nature of the capstone program and so enabled them to better understand the survey questions. The mid-survey was administered after Capstone I; the post-survey, after the 15-week Capstone II course; and the follow-up survey, toward the end of the semester following Capstone II. Participants therefore had time to begin implementing what they learned in the program. Because the follow-up was designed to give participants the opportunity to implement changes, the length of time varied somewhat depending on whether Capstone II was completed in the fall or spring semester.

Our study included five components: (a) demographic information, (b) the Self-Efficacy Survey, (c) the Stages of Concern (SoC) survey, (d) the Implementation/Classroom Climate Survey (follow-up only), and (e) an administrator survey (administrator perceptions about the effect of capstone courses on participants’ classroom practice). To guide this evaluation effort, the following questions were posed:

1. What is the effect of the capstone course on participants’ levels of self-efficacy over time?
2. What is the effect of the capstone course on participants’ SoC over time?
3. How do participants perceive the state of technology and technology support in their schools?
4. How well were capstone courses received by participants?


5. How do school administrators perceive the state of technology in their schools?
6. How do school administrators perceive the effect of capstone courses on participant technology competence?

METHODOLOGY


Participant Self-Efficacy Survey

The Self-Efficacy Survey was chosen to identify changes in participants’ confidence in learning and using the new knowledge, skills, and techniques taught in the program. In this context, confidence refers to the beliefs, feelings, or expectations that might affect participants’ overt behaviors (classroom/school performance) (Bandura, 1977). Self-efficacy is an important measure: recent research shows that teachers’ job satisfaction is predicted by their self-efficacy and autonomy (Skaalvik & Skaalvik, 2014), while low self-efficacy leads to dissatisfaction and burnout (Brown, 2012). This contrast is reflected in studies that examine teachers’ self-efficacy and the integration of technology into their pedagogy.

Research has long shown that teachers’ confidence in learning and teaching with technology directly influences their level of technology use in the classroom. More than a decade ago, Albion (1999) posited that teachers’ beliefs in their ability to work effectively with technology are a strong indicator of successful and effective technology integration. Although not a direct cause of increased technology use, teacher self-efficacy is a necessary condition for adopting technology in education (Wang, Ertmer, & Newby, 2004). In addition, teachers’ self-efficacy affects how technologies are used and may have a positive effect on their teaching and on students’ learning. For example, teachers who feel competent in educational Internet use are more likely to assist with implementing this technology (Sahin, Celik, Akturk, & Aydin, 2013).

Twenty-first-century teachers are generally more knowledgeable about how to use technologies, as well as being able to assess the personal ramifications of such use. They are concerned about implementing the changes that new technologies can bring to their teaching and want to make better use of technology in their classrooms. However, teachers are less concerned about the time management and collaboration with colleagues that might promote instructional technology use (Chen & Jang, 2014; Yang & Huang, 2008). Moreover, it has been argued that the level of self-efficacy can change depending on the relationship with the technology (Stewart, Antonenko, Robinson, & Mwavita, 2013). This position was supported by Blonder, Jonatan, Bar-Dov, Benny, Rap, and Sakhnini (2013), who found that chemistry teachers had higher self-efficacy when they had a chance to practice and improve their skills; thus, teacher self-efficacy tends to be context-specific.


For example, established teachers connect student engagement to their implementation of technology, and their self-efficacy is connected to that implementation, whereas inexperienced teachers connect classroom management with their perceptions of competence (Stewart, Antonenko, Robinson, & Mwavita, 2013). It has subsequently been argued that the relationship between perceived knowledge and self-efficacy beliefs about technology changes over time (Abbitt, 2011). The implication is that positive intervention can exploit this dynamic, changing relationship between perceived knowledge and self-efficacy. Today’s teachers have a deeper and more complex understanding of instructional technology implementation than ever before, which implies that their perceived competence is often in a state of flux (Abbitt, 2011). Self-efficacy, therefore, is a key factor for predicting the level of technology implementation by classroom teachers. This is especially true for teachers who wish to become school or district leaders in instructional technology.

After significant course analysis and question development, the evaluators determined that ISTE’s Standards and Performance Indicators accurately and succinctly identify the capstone course components and clearly elucidate the intended outcomes. They therefore elected to use the indicators as written, grouped by domain, with a 5-point Likert-type scale associated with each item. For example, Performance Indicator 1-c, “Promote student reflections using collaborative tools to reveal and clarify students’ conceptual understanding and thinking, planning, and creative processes,” simply became “I feel confident that I can promote student reflection using collaborative tools to reveal and clarify students’ conceptual understanding and thinking, planning, and creative processes.” The resultant self-efficacy instrument has 23 items comprising five domains: (a) Facilitate and Inspire Student Learning and Creativity, (b) Design and Develop Digital-Age Learning Experiences and Assessments, (c) Model Digital-Age Work and Learning, (d) Promote and Model Digital Citizenship and Responsibility, and (e) Engage in Professional Growth and Leadership.
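To make the instrument’s scoring concrete, here is a minimal sketch (ours, not the authors’) of how 5-point Likert responses could be rolled up into the five domain scores analyzed in the Results section. The column names and the four-items-per-domain mapping are assumptions for illustration; the mapping mirrors the subscale item ranges reported later in the reliability section.

```python
# Hypothetical sketch (not the authors' code): aggregating 5-point Likert
# item responses into five NETS*T domain scores. The items-per-domain
# grouping mirrors the subscale item ranges reported in the reliability
# section (items 1-4, 5-8, 9-12, 13-16, 17-20); column names are invented.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
# Four fake participants x 20 items, responses on a 1-5 scale.
responses = pd.DataFrame(
    rng.integers(1, 6, size=(4, 20)),
    columns=[f"q{i}" for i in range(1, 21)],
)

# Assumed item-to-domain mapping.
domains = {
    "facilitate_inspire":  [f"q{i}" for i in range(1, 5)],
    "design_develop":      [f"q{i}" for i in range(5, 9)],
    "model_digital_work":  [f"q{i}" for i in range(9, 13)],
    "digital_citizenship": [f"q{i}" for i in range(13, 17)],
    "professional_growth": [f"q{i}" for i in range(17, 21)],
}

# Domain score = mean of a participant's responses to that domain's items,
# which puts each score on the same 1-5 metric as the means in Table 2.
domain_scores = pd.DataFrame(
    {name: responses[items].mean(axis=1) for name, items in domains.items()}
)
print(domain_scores.round(2))
```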

Participant SoC Survey

The SoC instrument assesses the attitudinal changes teachers go through when introduced to an innovation. This instrument was chosen for this study because one way to evaluate an innovation is to explore the concerns of the teachers involved and then design interventions that address those concerns (Hall, George, & Rutherford, 1977; Yip & Cheung, 2005). In this context, innovation refers to the overarching goal of ISTE’s capstone program: to empower teachers in enhancing student learning with technology. The instrument includes seven stages: (a) awareness, (b) information, (c) personal, (d) management, (e) consequence, (f) collaboration, and


(g) refocusing. These stages are intended to represent the stages through which a teacher, as a learner, passes when learning about and subsequently implementing innovations in the classroom. The first stage, awareness, is the stage in which one simply becomes aware of the existence of an innovation. The second stage, information, is when the learner wants to know more about the innovation. This is followed by the third stage, personal, in which the learner now knows enough about the innovation that he/she is concerned about how it might affect him/her personally in terms of daily work. The fourth stage, management, is when the learner is ready to implement the innovation and becomes concerned about how he/she is going to manage the change in the classroom. These first four stages are often considered internal concerns because they are self-focused. The remaining stages are external concerns because the learner has progressed through significant learning about the innovation and is moving into the implementation and practice stages (i.e., he/she has accomplished the learner role and now assumes the teacher role). The fifth stage, consequence, is when the teacher becomes concerned about how the implementation of the innovation will affect his/her students. The sixth stage, collaboration, reflects the point at which the teacher is ready to collaborate with others on how to use the innovation. Finally, the seventh stage, refocusing, is when the teacher begins to look for new and better ways to use the innovation. Those who plan to serve as technology resource teachers clearly need to be in these later stages of instructional technology adoption.

Theoretically, the internal concerns should initially be high, whereas the external concerns should be low when the learner is first exposed to an innovation. As learning progresses, the initial concerns should drop, while the external concerns rise (Fuller, 1969; Hall, George, & Rutherford, 1977). However, some recent studies involving teachers and technology integration have shown that this theoretical pattern is not typical; rather, external (later) concerns appear consistently high even before interventions begin (e.g., Giordano, 2007; Liu & Szabo, 2008; Overbaugh, Lu, & Pribesh, 2008). This change in pattern should not be interpreted as a reduction in the instrument’s effectiveness; it should instead be interpreted in light of recent research (Abbitt, 2011; Chen & Jang, 2014; Stewart, Antonenko, Robinson, & Mwavita, 2013; Yang & Huang, 2008) showing that changes in teachers’ attitudes result from learning new teaching skills or techniques.

As intended by the instrument creators (Hall, George, & Rutherford, 1977), the 35-item instrument was customized for this particular program. For example, the original item “I am concerned about students’ attitudes toward the innovation” was changed to “I am concerned about students’ attitudes toward empowering them to learn with technology.” Because defining a broad goal as a specific innovation is problematic in terms of interpretation, as with the self-efficacy instrument, we administered the pretest


at the beginning of Capstone I rather than before the introductory course. As mentioned earlier, by completing the introductory course, participants would have developed familiarity with the ISTE NETS*T framework as well as the expected outcomes of the program.


Implementation/Classroom Climate Survey

The implementation survey is a 21-item instrument designed to gather data in three domains. The first 12 items assess program participants’ perceptions of technology implementation support at the school level, including access to technological resources, access to technical support personnel, and the level of administrative support. The second section includes three rating questions measuring participants’ satisfaction with their learning experiences in the courses. The third section contains six open-ended questions about the capstone courses and instructional technology in general, with the intention of triangulating the quantitative data obtained from the surveys as well as providing a better understanding of the program’s impact on participants’ learning, teaching, and attitudes. The implementation survey is embedded in the follow-up survey, delivered a few months after participants complete the program.

Administrator Survey

The administrator survey was given at around the same time as the follow-up. Similar to the Implementation/Classroom Climate Survey, it assesses administrators’ perceptions of technology support at the school level and of the effects of the capstone program on participants’ technology competence, thus serving as a means to triangulate participant data. The two-part, 25-item survey includes 11 questions grouped into the five ISTE NETS*T domains plus 14 questions related to the current “state of technology” in the school. Administrators were directed to answer the questions according to their perception of the program’s influence on the specific participant who teaches in or serves their school.

Treatment

Participants enrolled in a capstone course delivered online by PBS TeacherLine. The course included three parts: Capstone Introduction, Capstone I, and Capstone II. The entire course was designed for K–12 educators who had experience with technology integration and wanted to become proficient in the ISTE NETS*T standards. The course particularly appealed to teachers interested in becoming school- or district-level leaders in instructional technology.


The Capstone Introduction consisted of six self-paced, one-hour sessions that provided an overview of the Capstone Certificate Program. Participants experienced the NETS*T framework and the discussion forums. They were also exposed to the content and requirements of the Capstone I and II courses, which included navigating the online portfolio tool used to showcase their progress.

Capstone I: Developing a NETS*T Technology Portfolio fulfilled two needs: first, to provide professional development through participation in communities of practice; and second, to equip students to obtain professional recognition through certification. One Capstone I goal was to enhance participants’ understanding of the ISTE NETS*T framework to allow them to demonstrate proficiency in their portfolio artifacts. The 12-week Capstone I included seven parts: Developing a Learning Community; Digital-Age Learning Experiences; Digital-Age Assessments; Student Learning and Creativity; Digital-Age Work and Learning; Digital Citizenship and Responsibility; and Professional Growth and Leadership. Course developers planned a comprehensive and intensive 45-hour course, not including individual reading and special assignments. Typical session activities included reading online articles; participating in online discussion based on those articles and writing reflections; collecting online bookmarks of useful Web sites; and developing artifacts for a portfolio that reflected the theme of the session. Course instructors administered the pretest at the beginning of Capstone I.

Capstone II: Proficient Use of Technology with the NETS*T, offered the following semester, had goals similar to Capstone I, but the level of understanding was deeper and more complex, with an increased focus on professional development and implementation of NETS*T beyond the online learning environment. For example, one of the primary goals for Capstone II was to extend participants’ thinking about effective teaching with technology and to challenge them to make ongoing improvements in their own teaching practices. Relatedly, participants were expected to understand the variety of roles that technology can play in supporting K–12 education, be comfortable discussing NETS*T standards and their application to teaching, be willing to share strategies and resources with other participants, and further develop their portfolios to demonstrate proficiency with NETS*T. At the end of the course, participants completed the posttest.

Data Collection

Data collection began in September 2010 and concluded in spring 2013. During that time, nine Capstone I and six Capstone II courses were offered. Capstone I instructors administered the pretest survey as part of the course, before any class material was covered, and the midtest as the final activity.


Capstone II instructors requested completion of the posttest. Of the 89 participants who started Capstone I and completed the pretest, 38 completed Capstone II. Ten of these had missing data and could not be included, resulting in a final N = 28. Much of the decline in sample size resulted from attrition. Administrator data were sought for each participant who completed Capstone II. The authors contacted the administrators by e-mail first, then by phone and e-mail as needed, until data were gathered for all but two participants, who had retired and for whom no contact information was available, for a total of N = 36.

RESULTS AND DISCUSSION

Data presented here were collected from the group of program participants who took ISTE NETS*T capstone courses from 2010 to 2012 and who completed all four surveys at the pre-course, mid-course, post-course, and follow-up points. One-way repeated-measures analyses of variance (ANOVAs) were performed on each subscale of the Self-Efficacy and SoC instruments, with survey administration time (pre, mid, post, and follow-up) as the independent variable. The data from the participant Implementation Survey and the Administrator Survey were analyzed descriptively. Results of the data analyses are presented next.
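Before turning to the results, a brief methodological illustration may help. The sketch below is our construction, not the authors’ (they do not name their statistical software): it runs a one-way repeated-measures ANOVA with administration time as the within-subject factor, followed by the six pairwise comparisons, on simulated long-format data whose cell means are borrowed from Table 2. Note that pingouin’s rm_anova reports the univariate repeated-measures F test, whereas the paper reports the multivariate (Wilks’ Λ) statistic, so the numbers would differ; the data layout and the follow-up comparisons are the point.

```python
# Illustrative analysis pipeline (software not named by the authors):
# one-way repeated-measures ANOVA with time as the within-subject factor,
# followed by pairwise comparisons among the four administration points.
import numpy as np
import pandas as pd
import pingouin as pg

rng = np.random.default_rng(42)
n = 28  # final sample size reported in the paper

# Fake long-format data: one self-efficacy domain score per participant
# per administration point; cell means borrowed from Table 2, row 1.
long = pd.DataFrame({
    "subject": np.repeat(np.arange(n), 4),
    "time": np.tile(["pre", "mid", "post", "followup"], n),
    "score": np.clip(
        rng.normal(loc=[4.16, 4.48, 4.77, 4.63] * n, scale=0.4), 1, 5
    ),
})

# Omnibus repeated-measures ANOVA (univariate F; the paper reports the
# multivariate Wilks' lambda instead).
print(pg.rm_anova(data=long, dv="score", within="time", subject="subject"))

# Six pairwise comparisons among the four time points, as in the paper.
print(pg.pairwise_tests(data=long, dv="score", within="time",
                        subject="subject", padjust="bonf"))
```

Applied once per subscale, the same pipeline reproduces the structure of the analyses summarized in Tables 1 through 4.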

Participant Self-Efficacy

To answer the first evaluation question, What is the effect of the capstone course on participants’ levels of self-efficacy over time?, five repeated-measures ANOVAs were performed on the five self-efficacy domains (dependent variables) to examine the changes in participants’ self-efficacy levels across the pre-survey, mid-survey, post-survey, and follow-up survey points (independent variable). The overall ANOVAs were significant on all the dependent measures, and the effect sizes were large: on Facilitate and Inspire Student Learning & Creativity, Wilks’ Λ = 0.33, F(3, 25) = 17.33, p < .01, η² = 0.68; on Design and Develop Digital-Age Learning Experiences & Assessments, Wilks’ Λ = 0.29, F(3, 25) = 20.33, p < .01, η² = 0.71; on Model Digital-Age Work and Learning, Wilks’ Λ = 0.47, F(3, 25) = 9.53, p < .01, η² = 0.53; on Promote and Model Digital Citizenship & Responsibility, Wilks’ Λ = 0.33, F(3, 25) = 17.17, p < .01, η² = 0.67; and on Engage in Professional Growth & Leadership, Wilks’ Λ = 0.39, F(3, 25) = 13.04, p < .01, η² = 0.61. Following the significant ANOVAs, six pairwise comparisons (pre versus mid/post/follow-up; mid versus post/follow-up; post versus follow-up) were conducted on each dependent variable to assess which

means differed significantly from one another. The comparisons revealed significant differences in the means between the pre-survey and the mid-, post-, and follow-up surveys on all the dependent measures except Model Digital-Age Work and Learning. Significant differences were also found between the mid-survey and post-survey on the measures of Facilitate and Inspire Student Learning & Creativity, Promote and Model Digital Citizenship & Responsibility, and Engage in Professional Growth & Leadership. On the measure of Model Digital-Age Work and Learning, significant differences were found between the pre-survey and post-survey, as well as between the pre-survey and follow-up survey, whereas no significant difference was found at the other survey measuring points (Table 1).

TABLE 1 Results of one-way repeated-measures ANOVA on participant self-efficacy

Dependent variable | Wilks’ Λ | df | F | p | η²
Facilitate and Inspire Student Learning and Creativity | 0.33 | 3 | 17.33 | .00 a,b,c,d | 0.68
Design and Develop Digital-Age Learning Experiences and Assessments | 0.29 | 3 | 20.33 | .00 a,b,c | 0.71
Model Digital-Age Work and Learning | 0.47 | 3 | 9.53 | .00 b,c | 0.53
Promote and Model Digital Citizenship and Responsibility | 0.33 | 3 | 17.17 | .00 a,b,c,d | 0.67
Engage in Professional Growth and Leadership | 0.39 | 3 | 13.04 | .00 a,b,c,d | 0.61

a Significant between pre-survey and mid-survey.
b Significant between pre-survey and post-survey.
c Significant between pre-survey and follow-up survey.
d Significant between mid-survey and post-survey.

TABLE 2 Descriptive statistics on participant self-efficacy in technology-enhanced teaching and learning

Dependent variable | Pre-survey M (SE) | Mid-survey M (SE) | Post-survey M (SE) | Follow-up M (SE)
Facilitate and Inspire Student Learning and Creativity | 4.16 (0.09) | 4.48 (0.08) | 4.77 (0.07) | 4.63 (0.08)
Design and Develop Digital-Age Learning Experiences and Assessments | 4.11 (0.08) | 4.46 (0.09) | 4.73 (0.07) | 4.80 (0.07)
Model Digital-Age Work and Learning | 4.28 (0.11) | 4.61 (0.08) | 4.71 (0.08) | 4.80 (0.07)
Promote and Model Digital Citizenship and Responsibility | 4.15 (0.09) | 4.50 (0.08) | 4.72 (0.07) | 4.64 (0.07)
Engage in Professional Growth and Leadership | 4.16 (0.10) | 4.54 (0.08) | 4.78 (0.08) | 4.63 (0.09)

FIGURE 1 Participant self-efficacy in technology-enhanced teaching and learning by time.

The descriptive statistics (Table 2 and Figure 1) show that there was a large mean increase in participants’ self-efficacy levels from the pre-survey to the mid-survey on all five dependent variables. The pattern of mean increases was quite stable across the pre, mid, and post measurements, but there were some variations at the follow-up measure, with fallbacks on three of the five dependent variables. However, the most important finding was a significant increase in self-efficacy on all five domains from the pretest to the follow-up test, which indicates that the change in confidence was maintained over time. Recall that the follow-up test was not administered until late in the semester following the completion of Capstone II. This result indicates that the ISTE NETS*T capstone courses helped participants gain competence and confidence in their professional growth and instructional technology integration.

Self-Efficacy Instrument Reliability

Internal consistency estimates of reliability were computed for the 20 items on the participant self-efficacy with ISTE NETS*T scale. The value for coefficient alpha was 0.94, indicating high reliability for the scale. As the 20-item self-efficacy scale contains five subscales (participant self-efficacy in (a) facilitating and inspiring student learning and creativity, (b) designing and developing digital-age learning experiences and assessments, (c) modeling digital-age work and learning, (d) promoting and modeling digital citizenship and responsibility, and (e) engaging in professional growth and leadership), internal consistency estimates of reliability were also computed for each subscale. The values for coefficient alpha were 0.79 (items 1–4), 0.68 (items 5–8), 0.90 (items 9–12), 0.75 (items 13–16), and 0.83 (items 17–20), each indicating satisfactory reliability.
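For readers who want to see how such estimates are obtained, coefficient alpha can be computed directly from a participants-by-items response matrix. The sketch below is ours, uses simulated data, and assumes the pingouin library; it is not the authors’ procedure.

```python
# Sketch: internal-consistency (Cronbach's alpha) estimates for a full
# 20-item scale and for one 4-item subscale, using simulated Likert data.
import numpy as np
import pandas as pd
import pingouin as pg

rng = np.random.default_rng(7)
# Simulated responses: 28 participants x 20 items. Items share a common
# "trait" component so that they correlate and alpha comes out high,
# loosely mimicking a coherent self-efficacy scale.
trait = rng.normal(4.3, 0.4, size=(28, 1))
items = np.clip(np.rint(trait + rng.normal(0, 0.5, size=(28, 20))), 1, 5)
df = pd.DataFrame(items, columns=[f"q{i}" for i in range(1, 21)])

alpha_full, ci = pg.cronbach_alpha(data=df)            # full 20-item scale
alpha_sub1, _ = pg.cronbach_alpha(data=df[[f"q{i}" for i in range(1, 5)]])
print(f"full scale alpha = {alpha_full:.2f} (95% CI {ci})")
print(f"subscale 1 alpha = {alpha_sub1:.2f}")
```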

TABLE 3 Results of one-way repeated-measures ANOVA on participant SoC

Dependent variable | Wilks’ Λ | df | F | p | η²
Awareness | 0.97 | 3 | 0.25 | 0.86 | 0.03
Information | 0.70 | 3 | 3.57 | 0.03* | 0.30
Personal | 0.83 | 3 | 1.66 | 0.20 | 0.17
Management | 0.78 | 3 | 2.36 | 0.09 | 0.22
Consequence | 0.96 | 3 | 0.32 | 0.81 | 0.04
Collaboration | 0.97 | 3 | 0.25 | 0.86 | 0.03
Refocusing | 0.96 | 3 | 0.38 | 0.77 | 0.04

* Significant between pre-survey and post-survey.

TABLE 4 Descriptive statistics on participant SoC

Dependent variable | Pre-survey M (SE) | Mid-survey M (SE) | Post-survey M (SE) | Follow-up M (SE)
Awareness | 7.96 (0.57) | 7.86 (0.70) | 7.75 (0.76) | 7.32 (0.68)
Information | 25.79 (0.83) | 23.50 (0.90) | 21.50 (1.22) | 22.78 (1.35)
Personal | 24.43 (1.45) | 23.21 (1.37) | 21.39 (1.79) | 20.32 (1.89)
Management | 15.39 (1.18) | 13.46 (1.21) | 13.82 (1.28) | 12.93 (1.50)
Consequence | 28.11 (0.99) | 28.82 (1.24) | 29.00 (1.39) | 27.64 (1.32)
Collaboration | 34.71 (0.86) | 35.00 (0.94) | 35.32 (1.06) | 35.39 (1.09)
Refocusing | 31.64 (0.71) | 31.25 (0.93) | 30.75 (1.12) | 31.86 (1.00)

Participant SoC

To answer the second research question, What is the effect of the capstone course on participants’ SoC over time?, seven one-way repeated-measures ANOVAs were performed on the seven SoC measures (the dependent variables: awareness, information, personal, management, consequence, collaboration, and refocusing) to examine the changes in participants’ concern levels from the pre-survey to the mid-, post-, and follow-up surveys (independent variable). The overall ANOVA was significant on only one of the seven dependent measures, information: Wilks’ Λ = 0.70, F(3, 25) = 3.57, p = .03, η² = 0.30. No significant differences were found on the other six dependent variables. Following the significant ANOVA, six pairwise comparisons (pre versus mid, pre versus post, pre versus follow-up, mid versus post, mid versus follow-up, post versus follow-up) were conducted on the information measure to assess which means differed significantly from one another. The paired comparison results revealed a significant mean difference between the pre-survey and post-survey (Table 3).

The descriptive statistics presented in Table 4 and Figure 2 indicate that participants’ concern patterns were quite similar at the four survey points. The lowest type of concern was awareness, followed by management. The three external concerns stayed consistently high across all four survey administration points, with collaboration being the highest. Regarding the internal concerns, the general trend was that participants’ concern levels dropped somewhat from the pre-survey point to the mid-, post-, and follow-up points, with comparatively greater decreases on the information and personal measures. These results showed that the capstone courses were somewhat helpful in lowering certain internal concerns but did not raise participants’ external concerns.

The profile in this study does not match the profile predicted by the instrument’s original authors (Hall, George, & Rutherford, 1977). Instead of a single peak at any given stage of adoption, we found consistent peaks in both the internal and external concerns, which we interpreted as meaning that, even though these educators showed an advanced stage of adoption, they continued to want to learn more while at the same time accepting that using technology is an aspect of their professional life. We interpret this pattern as a reflection of the maturation of technology integration and of contemporary teachers who are motivated to take advantage of professional development opportunities; we therefore posit that the SoC instrument remains useful. These findings are also consistent with other work with similar samples (e.g., Liu & Szabo, 2008; Overbaugh & Lu, 2008–2009; Overbaugh & Lu, 2011).

FIGURE 2 Participant SoC about course-related knowledge and skills.

TABLE 5 Descriptive statistics on participant satisfaction with the capstone program

Category | Poorly (%) | Neutral (%) | Well (%) | Very well (%)
How well did your expectations for the capstone program correspond to the learning objectives specified in the capstone courses? | 0 | 13.2 | 42.1 | 44.7
How well were your expectations for the capstone program met? | 5.3 | 10.5 | 39.5 | 44.7
How has the capstone program helped you to better prepare your students to meet SOL and other national and/or state standards? | 5.2 | 21.1 | 26.3 | 47.4

PARTICIPANT PERCEPTION OF THE ISTE NETS*T CAPSTONE PROGRAM

To understand how well the capstone courses met participants’ learning needs, participants’ responses to the three course satisfaction items were analyzed descriptively. Generally speaking, participants had very positive perceptions of the course objectives and of their learning experiences in the courses. Table 5 and Figure 3 present the data analysis.

FIGURE 3 Participant satisfaction with the capstone program learning experience.


QUALITATIVE DATA ANALYSIS RESULTS

To triangulate the quantitative data as well as to better understand the impact the ISTE NETS*T capstone courses had on participants’ learning and classroom practice, participants’ responses to the six open-ended questions in the Implementation Survey were analyzed qualitatively. Results follow.

In answering the question “Why did you choose to enroll in the capstone program?” more than half of the participants reported wanting greater knowledge of integrating technology into the curriculum so that they could better serve their students. Earning graduate credits, ISTE NETS*T certification, or an instructional technology resource teacher (ITRT) or computer resource specialist position, as well as accelerating professional growth, were also important. One teacher stated, “Initially it was for recertification purposes, but I feel that I gained a great deal of knowledge and felt accomplished after completing the courses.” Promisingly, some teachers considered the issue on behalf of their students. Typical comments were “This is the way my students learn and communicate, so I should learn more about it” and “I want learn to empower/advance student learning with technology.”

In answering the question “Thinking back over your learning experiences with the capstone program, what stands out as your key accomplishment?” participant responses mainly focused on having attained technology skills, online resources, information knowledge, and instructional strategies (e.g., portfolio development, assessment). For example: “The capstone program provided me vital and significant information and skills. I am now confident to effectively and comfortably guide students and faculty toward implementing digital technology”; “my key accomplishment is ‘learning new ways to integrate technology, and using old tools in new ways,’ and ‘knowing more about formal and informal assessment’”; “my key accomplishment was being asked at the end if my Cap II portfolio could be used as an exemplary exhibit for future participants”; “I feel that the sources and resources that I was able to use and implement helped me become an even better technology integrator.”

Some participants were excited to see that their participation in the capstone program benefitted their students: “Energy was well received by my students and at the end of the year, I revisited Waves with a similar slant, again using technology, and my students loved it. At the end of the year, I received a book, It’s Not Easy Being Green by Jim Henson, and a note from one of my students. The student thanked me for a great year and said it was more enjoyable because of all the technology projects we did.”

Responses to the question “In what ways has your teaching changed, or will change, due to the capstone program?” were more varied. As a large number of participants were ITRTs, they reported how the capstone program enabled them to be more capable and to better help teachers with technology, which in turn resulted in more technology application with and by students.


Typical comments were: “Now I have a more in-depth view of integrating meaningful technology into instruction”; “I now look toward problem-based learning and engaging the students and teachers in real-life projects and more inventive use of technology”; “I have more resources to offer, and encourage teachers to provide options and rubrics, and more collaboration”; “Capstone opened me up to a wealth of new tools and to new ways to use those tools. Being a technology teacher, I used a lot of technology before capstone, but now, the technology use in my room is like air—always there, unnoticed”; “Now I’m more skilled at empowering student learning with technology . . . and look for more ways to globally connect my students and improve the learning process.”

In terms of “Which computer technologies, online resources, or instructional strategies do you find most useful in your teaching?” the top items mentioned by participants were Web 2.0 tools, electronic assessment tools (e.g., surveys, quizzes, polls), collaborative tools (e.g., wikis, blogs, Edmodo), portals, interactive whiteboards/tables, SMART response systems, online learning resources, universal design for learning, video clips (e.g., Discovery Education, Khan Academy), digital databases, and problem-based learning strategies.

When talking about “your major concerns about using technology in your classroom,” several participants said that “choosing the right tool for the right lesson and right reason” challenged them. A few participants mentioned access issues: either “hardware is outdated,” “Internet is unreliable (often drops, no wireless),” or “mobile devices are inadequate.” Some participants worried about online safety for their students: “It seems that no matter how much we teach the students they are still naive about online safety and leaving a massive digital footprint. Make sure that students use the technology in ways that are appropriate for a school setting and reassuring administration that it is ‘safe.’” A few ITRTs expressed difficulty in “motivating teachers to use technology, getting teachers to embrace change, and getting technology-phobic teachers on board.” Another concern for some participants was a lack of time for “playing with tech,” “finding resources,” “planning and teaching with tech,” and “for students to complete tech-based tasks.”

A few teachers also raised administrative problems: “I am also very concerned about many administrators and their misconceptions and misunderstandings about effective integration and use of technology. They simply don’t see it. They don’t understand all the factors involved with using us [technology resource teachers]. More training is needed for administrators.” Another noted: “The filtering software in my district prohibits many of the newer online resources from being readily available for my students. Also, gaining access to the technology tools can be difficult for students, as there are only some of them that can be shared across classrooms.”

In answering the question “What would you say are the major limitations of the capstone program?” many participants expressed disappointment that


the program is not recognized by state or local school systems. About one third of participants thought the program was tremendously time consuming: “The course takes a surprisingly large amount of time to participate and complete assignments”; “The course requires far more than the 45 hours per session. You have to give up most of your outside life if you work full time and take a capstone course. To be honest, for the amount of work required to successfully complete the entire program, much more grad-level credit should be awarded.”

Some participants were also unhappy with “inconsistency between instructors” and a “redundant and non-learner specific curriculum.” They commented: “The biggest limitation is the fact that the course is completely modeled around a classroom teacher, but 90% of the people taking it are in other positions like me—an ITRT. There should be a branch of the course that is specifically for ITRTs, so that it is more realistic and job specific”; “Inconsistency between instructors and the curriculum is a big limitation. It is repetitive between the three courses . . . assignments that were done in Capstone I were added to in Capstone II.”

Additionally, a few participants noted the importance of the program instructors. They spoke highly of the Capstone II instructor but were not satisfied with the Capstone I instructor: “So much depends on your instructor. They can add to the stress or help alleviate it. I think so many of my colleagues dropped out of the capstone course because of our first instructor. I know I almost did. My last instructor was exceptional, however. I felt so at ease. She was great at guiding and supporting us throughout the process. The instructors/facilitators need to understand the program, requirements, and expectations. [The second instructor] did a very good job with this.”

ADMINISTRATOR PERCEPTION OF PARTICIPANT TECHNOLOGY COMPETENCE

To answer the sixth research question, How do school administrators perceive the effect of capstone courses on participants’ technology competence?, questions similar to those on the participant Self-Efficacy Survey were delivered to administrators to triangulate participant data. The results of the data analyses indicated that administrators were very satisfied with all domains of participant technology competence and professional development, which strengthens the validity of the participant data. Table 6 and Figure 4 present the data analysis.

TABLE 6 Descriptive statistics on administrators’ perceptions of the participants’ technology competence

I believe that as a result of participating in the capstone program, my faculty member is better able to . . . | Disagree (%) | Neutral (%) | Agree (%) | Do not know (%)
Facilitate and Inspire Student Learning and Creativity | 1.4 | 9.7 | 75 | 13.9
Design and Develop Digital-Age Learning Experiences and Assessments | 3.5 | 7.76 | 74 | 14.6
Model Digital-Age Work and Learning | 1.4 | 9.7 | 77.8 | 11.1
Promote and Model Digital Citizenship and Responsibility | 1.4 | 9.7 | 72 | 16.7
Engage in Professional Growth and Leadership | 4.2 | 8.3 | 73.6 | 13.9

FIGURE 4 Administrators’ perceptions of the participants’ technology competence.

DISCUSSION AND LIMITATIONS

The capstone program supports existing research regarding self-efficacy and teacher attitudes toward technology integration. The results indicated that self-efficacy went up in all domains as a result of participation in this program. This is supported by the qualitative data garnered from the Participant Satisfaction Survey; a good illustration of this sentiment is “I am now confident to effectively and comfortably guide students and faculty toward implementing digital technology.” The elevated information and personal concerns at the mid-point of the capstone program, although not consistent with the original model, make sense. We believe that because participants were learning material that was new, unfamiliar, and difficult, they were


apprehensive about Capstone II. In addition, the course workload may have been a factor at this stage: “The course requires more than 45 hours per session. You have to give up most of your life if you want to work full time and take a capstone course.”

The nature of the sample is important and should be considered a limitation; this study was based only on those who completed the entire 30-week capstone program, which we believe makes this group stand out in terms of motivation and commitment, even though some opined that the program was very time consuming and somewhat inconsistent. The comments showed that, overall, the participants were enthusiastic and dedicated. The majority of those who completed the course were ITRTs or teachers who were hoping to become ITRT specialists. They were already interested in and dedicated to technology integration and started with a higher degree of motivation. Nevertheless, their self-efficacy also increased while participating in this course, which suggests that the course was instrumental in advancing their professional development.

Another limitation was the sample size; an N of 28 is acceptable but not as large as was desirable. There was a high level of attrition between the first and second phases of the course. Those who completed it were likely more motivated to succeed for reasons beyond this study, and there are no data that can help determine the attitudes of those who did not complete the course. Future studies on the influence of a capstone course on teacher self-efficacy should also consider those who drop out or fail to complete. Was their motivation and self-efficacy already low? If the workload was a factor, what can course instructors and school administrators do to support participants? These questions need consideration in future research. With this in mind, it is worth mentioning that, even with such a large dropout, data at the pre and mid points were largely identical; therefore, it can be argued that the results were consistent.

This study has implications for teachers and school administrators alike. A capstone course such as this benefits teacher self-efficacy and provides useful professional development in technology integration. Administrators should see the value in such courses and encourage their teachers to participate. The courses are intensive and time consuming; therefore, it is hoped that administrators will be supportive of participants’ efforts. If more can be learned about why participants fail to complete the course, then perhaps adjustments can be made for those who are not on the ITRT track.

CONCLUSION

Teachers’ self-efficacy can be conceptualized as their belief in their ability to plan, organize, and carry out the activities required to attain educational


goals (Skaalvik & Skaalvik, 2010). We are confident these results show that the capstone program is effective at improving teacher self-efficacy for implementing the ISTE NETS*T standards in classroom practice. The size and durability of the increase in self-efficacy were the most significant findings: Because program completers still believed they could successfully design technology-based instructional strategies more than a semester after they finished the program, their teaching practice has most likely changed as well. In terms of stages of adoption, this study is encouraging, not because there were significant changes in concerns, but because participants showed that they were already at advanced stages of adoption and were consistently thinking about the effects of technology-based learning in their classrooms, collaborating with others, and working to find new ways to teach more effectively. Finally, most participants and their administrators expressed positive impressions of the technology climate in their schools, which enables them to continue to increase their degree of technology integration and advance their pedagogy.

REFERENCES

Abbitt, J. T. (2011). An investigation of the relationship between self-efficacy beliefs about technology integration and technological pedagogical content knowledge (TPACK) among preservice teachers. Journal of Digital Learning in Teacher Education, 27(4), 134–143.

Albion, P. R. (1999). Self-efficacy beliefs as an indicator of teachers’ preparedness for teaching with technology. In J. Price et al. (Eds.), Proceedings of the Society for Information Technology & Teacher Education International Conference 1999 (pp. 1602–1608). Chesapeake, VA: Association for the Advancement of Computing in Education (AACE).

American Recovery and Reinvestment Act (ARRA) of 2009, Pub. L. No. 111-5, 123 Stat. 115, 516 (Feb. 19, 2009).

Bandura, A. (1977). Self-efficacy: Toward a unifying theory of behavioral change. Psychological Review, 84, 191–215.

Blonder, R., Jonatan, M., Bar-Dov, Z., Benny, N., Rap, S., & Sakhnini, S. (2013). Can You Tube it? Providing chemistry teachers with technological tools and enhancing their self-efficacy beliefs. Chemistry Education Research and Practice, 14(3), 269–285.

Brown, C. G. (2012). A systematic review of the relationship between self-efficacy and burnout in teachers. Educational & Child Psychology, 29(4), 47–63.

Chen, Y.-H., & Jang, S.-J. (2014). Interrelationship between stages of concern and technological, pedagogical, and content knowledge: A study on Taiwanese senior high school in-service teachers. Computers in Human Behavior, 32, 79–91. doi:10.1016/j.chb.2013.11.011

Fuller, F. F. (1969). Concerns of teachers: A developmental conceptualization. American Educational Research Journal, 6(2), 207–226.

Giordano, V. A. (2007). A professional development model to promote Internet integration into P-12 teachers’ practice: A mixed methods study. Computers in the Schools, 24(3/4), 111–123.


Hall, G. E., George, A. A., & Rutherford, W. L. (1977). Measuring the stages of concern about an innovation: A manual for use of the Stages of Concern questionnaire. Austin, TX: Research and Development Center for Teacher Education, University of Texas. (ERIC Document Reproduction Service No. ED 147342)

Liu, Y., & Szabo, Z. (2008). A four-year study of teachers’ attitudes toward technology integration in schools. In K. McFerrin, R. Weber, R. Carlsen, & D. A. Willis (Eds.), Proceedings of the Society for Information Technology & Teacher Education International Conference 2008 (pp. 3845–3852). Chesapeake, VA: Association for the Advancement of Computing in Education (AACE).

Overbaugh, R. C., & Lu, R. (2008–2009). The impact of a federally funded grant on a professional development program: Teachers’ stages of concern toward technology integration. Journal of Computing in Teacher Education, 25(2), 45–55.

Overbaugh, R. C., & Lu, R. (2011, April). ISTE NETS*T Certificate of Proficiency Capstone Program: A pilot evaluation. Paper presented at the annual conference of the American Educational Research Association, New Orleans, LA.

Overbaugh, R. C., Lu, R., & Pribesh, S. (2008, March). The impact of an NCLB EdTech funded professional development program on teacher self-efficacy and resultant implementation. Paper presented at the annual conference of the American Educational Research Association, New York, NY.

Sahin, I., Celik, I., Akturk, A. O., & Aydin, M. (2013). Analysis of relationships between technological pedagogical content knowledge and educational Internet use. Journal of Digital Learning in Teacher Education, 29(4), 110–117.

Skaalvik, E. M., & Skaalvik, S. (2010). Teacher self-efficacy and teacher burnout: A study of relations. Teaching and Teacher Education, 26(4), 1059–1069.

Skaalvik, E. M., & Skaalvik, S. (2014). Teacher self-efficacy and perceived autonomy: Relations with teacher engagement, job satisfaction, and emotional exhaustion. Psychological Reports: Employment, Psychology and Marketing, 114(1), 68–77.

Stewart, J., Antonenko, P. D., Robinson, J. S., & Mwavita, M. (2013). Intrapersonal factors affecting technological pedagogical content knowledge of agricultural education teachers. Journal of Agricultural Education, 54(3), 157–170. doi:10.5032/jae.2013.03157

Wang, L., Ertmer, P. A., & Newby, T. J. (2004). Increasing preservice teachers’ self-efficacy beliefs for technology integration. Journal of Research on Technology in Education, 36(3), 231–250.

Yang, S. C., & Huang, Y.-F. (2008). A study of high school English teachers’ behavior, concerns and beliefs in integrating information technology into English instruction. Computers in Human Behavior, 24, 1085–1103. doi:10.1016/j.chb.2007.03.009

Yip, D. Y., & Cheung, D. (2005). Teachers’ concerns on school-based assessment of practical work. Journal of Biological Education, 39(4), 156–162.