Int Rev Educ (2010) 56:435–456 DOI 10.1007/s11159-010-9172-x

Using data for decision-making: perspectives from 16 principals in Michigan, USA

Jianping Shen • Van E. Cooley • Patricia Reeves • Walter L. Burt • Lisa Ryan • J. Mark Rainey • Wenhui Yuan



Published online: 21 September 2010
© Springer Science+Business Media B.V. 2010

J. Shen (✉) • V. E. Cooley • W. L. Burt
Department of Educational Leadership, Research and Technology, College of Education and Human Development, Western Michigan University, Kalamazoo, MI 49008, USA
e-mail: [email protected]

V. E. Cooley
e-mail: [email protected]

W. L. Burt
e-mail: [email protected]

P. Reeves
Western Michigan University, 1407 Sangren Hall, Kalamazoo, MI 49008, USA
e-mail: [email protected]

L. Ryan
161 Fairway Dr., Battle Creek, MI 49015, USA
e-mail: [email protected]

J. M. Rainey
8812 Tamarisk Circle, Richland, MI 49083, USA
e-mail: [email protected]

W. Yuan
Department of Accountability and Data Quality, Fort Worth ISD, 1407 I. M. Terrell Circle South, Fort Worth, TX 76102, USA
e-mail: [email protected]

Abstract In response to the vast amounts of data associated with the accountability movement and the rhetoric of data-informed decision-making, we interviewed 16 principals to find out what streams of data they used and what decisions they made by using the data. We found that: (a) student achievement data are predominantly used, to the extent of neglecting other streams of data such as student and community background data and school process data; (b) student achievement data are used more for accountability purposes, that is, for assessing "of" rather than "for" the learning; (c) different streams of data are rarely used together to derive rich meaning for decision-making; and (d) school districts differ in the extent to which their principals use data to improve curriculum and instruction. The study pointed both to the challenges and to the opportunities of making data-informed decisions to improve our schools.

Keywords Data • School principals • Decision-making • Accountability • School improvement

Résumé Exploiter les données pour prendre les décisions: perspectives de 16 directeurs d'écoles au Michigan (États-Unis) – En réaction aux énormes quantités de données liées au mouvement de l'obligation redditionnelle et au discours sur la prise de décision fondée sur les données, nous avons interrogé 16 directeurs d'établissements scolaires pour cerner les flux de données qu'ils utilisent et les décisions qu'ils prennent à partir de ces données. Nous avons fait les constatations suivantes : a) ils utilisent principalement les données sur les résultats des élèves, au point de négliger les flux d'autres données telles que celles relatives aux contextes des élèves et des communautés ainsi qu'aux processus scolaires; b) ils exploitent les données sur les résultats des élèves essentiellement à des fins redditionnelles pour une évaluation « de » l'apprentissage et moins « pour » l'apprentissage; c) ils analysent rarement les différents flux de données parallèlement de sorte à en tirer des conclusions précieuses pour la prise de décision; et d) le degré d'exploitation des données par les directeurs en vue d'améliorer les programmes et l'enseignement diffère selon les districts scolaires. Cette étude signale à la fois les défis et les chances d'une prise de décision éclairée par les données dans le but de perfectionner nos établissements scolaires.
Zusammenfassung Datenbasierte Entscheidungsfindung: 16 Schulleiterinnen und Schulleiter in Michigan, USA, geben Auskunft – Angesichts der gewaltigen Datenmengen, die mit der Einführung einer Rechenschaftspflicht für Schulen und mit der Formel der datenbasierten Entscheidungsfindung einhergehen, haben wir Interviews mit 16 Schulleiterinnen und Schulleitern geführt, um herauszufinden, welche Art von Daten sie benutzen und was für Entscheidungen sie anhand dieser Daten getroffen haben. Wir haben herausgefunden: a) dass Leistungsdaten von Schülerinnen und Schülern meist in gleichem Maße genutzt werden, wie andere Daten, beispielsweise Angaben über den persönlichen und sozialen Hintergrund der Schülerinnen und Schüler oder Prozessdaten der Schule, außer Acht gelassen werden; b) dass Leistungsdaten der Schülerinnen und Schüler vorwiegend für die Rechenschaftslegung eingesetzt werden – also eher für die Bewertung „des“ Lernens als „für“ das Lernen selbst; c) dass selten verschiedene Datenströme zusammengeführt werden, um daraus vielschichtige Informationen für die Entscheidungsfindung abzuleiten; und d) dass der Umfang, in dem die Schulleitungen Daten zur Verbesserung der Lehrpläne und des Unterrichts nutzen, von Schulbezirk zu Schulbezirk variiert. In der Studie werden sowohl die


Herausforderungen als auch die Chancen der datenbasierten Entscheidungsfindung für die Verbesserung unserer Schulen aufgezeigt.

Resumen El uso de datos para la toma de decisiones: la óptica de 16 directores de escuela en Michigan, EE.UU. – Como respuesta a la gran cantidad de datos que se asocian con el movimiento a favor de la rendición de cuentas y la retórica sobre la toma de decisiones basada en el conocimiento de datos, hemos entrevistado a 16 directores de escuela con el fin de comprobar qué caudales de datos han usado y qué decisiones han tomado al usar estos datos. Hemos comprobado que: a) los datos sobre rendimiento estudiantil se han usado de forma predominante hasta la medida de desatender otros caudales de datos, tales como datos sobre trasfondo de los estudiantes y de la comunidad, y datos sobre procesos escolares; b) los datos sobre rendimiento estudiantil se utilizan más bien para fines de rendición de cuentas, para realizar una evaluación más bien "sobre" y no tanto "para" el aprendizaje; c) los diferentes caudales de datos rara vez se utilizan en conjunto, con el fin de obtener una valiosa base de conocimientos para la toma de decisiones; y d) los distritos escolares difieren en cuanto a la medida en la que sus directores usan datos para mejorar los planes de estudio y la instrucción. El estudio señala tanto los retos como las posibilidades que ofrecen las decisiones tomadas sobre la base de datos para mejorar el rendimiento de las escuelas en EE.UU.

Introduction

The last decade of the accountability movement has produced "mountains of data" (Celio and Harvey 2005). Logging onto Standard & Poor's School Evaluation


Services website, we found that there are many different types of educational data collected at the individual building, district and state levels. With so much data available, data-informed decision-making has developed into a movement over the last decade. A quick Google search using the key descriptors "data", "decision making" and "education" returned more than 50 million entries; the same key terms returned slightly fewer than 5,000 entries on ERIC.

Consistent with the vast amount of literature on data-informed decision-making in education is the mandate of such a practice. For instance, the recently promulgated Michigan School Improvement Framework (Michigan Department of Education 2006) requires using data to inform the school improvement process.

The educational literature offers a succession of studies on the gap between rhetoric and practice. For example, Goodlad and Klein (1974) formulated, based on the rhetoric, some reasonable expectations for curriculum and instruction, "looked behind the classroom door" to see whether these expectations had been put into practice, and found that they had not materialised. Similarly, Cohen (1991) documented a case where a teacher was fluent in the language of a new math curriculum; her practice, however, remained unchanged. In Results Now, Schmoker (2006) offers some insight into this phenomenon of little translation of new knowledge into actual practice. He describes a process of buffering that isolates teachers and school leaders from each other and from the information needed to shed light on ineffective practices and provide the catalyst for change. These buffers explain, to some extent, why it is very difficult to change educational practice even in the current environment of accountability (Shen and Ma 2006).
Like other school reform movements, the systematic use of data to shape school-based decisions means a fundamental shift in practice for school leaders and their staffs. It means breaking down the buffer that isolates teachers and school leaders from a clearer understanding of their student achievement challenges and the potential solutions to those challenges. This study, therefore, looks at how principals access and use data for decision-making and, when (or if) they do, what kind of data they use and what kind of decisions they make with those data. In other words, how do school principals use data to better connect themselves and their teachers with the information they need to break through the buffers that inhibit the actual translation of rhetoric and knowledge into changes in practice?

Literature review and conceptual framework

The importance and necessity of data are increasingly recognised in the school setting under federal, state and even local systems of accountability. This growing focus on data related to accountability translates into a broader focus on the use of data for decision-making in schools and raises a couple of key questions. What kinds of data should be collected for decision-making? What kinds of data may be helpful for what types of decisions? These questions are essential to school leaders who are expected to be successful in the adoption and implementation of data-informed decision-making practices. Our literature search revealed that the


literature on various streams of data is well developed, while the literature on the types of decisions based on data is limited and still emerging.

Given the emphasis on accountability, it is understandable that student achievement data have been in the spotlight. Thus, the first body of literature on data streams consists of numerous articles in professional journals and other media that single out the importance of student achievement data as measured by local, state and national assessments. Schools are categorised and ranked for quality and success based, primarily, on these measures. These rankings can place schools in various phases of mandated school improvement that become increasingly more prescriptive and more high stakes as years of failing to make adequate yearly progress (AYP) add up. The improvement of schools, however, is a complex process encompassing many factors, and student achievement data are just one of the indicators of both quality and improvement progress (Salpeter 2004).

In light of this reality, authors of the second body of literature on data streams go beyond achievement data. For example, Creighton (2001) suggested collecting data on student instruction and assessment, attendance and dropout rates, college entrance tests and instructional programme evaluations. AASA's guidebook, Using Data to Improve Schools: What's Working (2006), encouraged school system leaders and their respective staffs to use data for sound decision-making. In this book, attention was placed on test scores, rigour of coursework, graduation rates, attendance rates, promotion rates and rates of participation in co-curricular activities (including community service). Some studies tried to build a wider system of indicators with a multidimensional perspective, where student achievement was no longer the only factor to be considered. For example, Celio and Harvey (2005) provided a working model of a school management guide built on seven evidence-based indicators.
The indicators are as follows:

1. Achievement (reading and mathematics).
2. Elimination of the achievement gap in reading and mathematics between subgroups of students by race, economic status, English language facility, etc. (where there are adequate numbers within a subgroup for comparison).
3. Student attraction (ability of the school to attract students where there are opportunities for choice among parents/students).
4. Student engagement with school (index of measures of school engagement, including attendance, tardiness and involvement in school activities).
5. Student retention/completion (depending on the level of the school: elementary, middle, high school).
6. Teacher attraction and retention (number of applications for teacher openings; proportion of teachers leaving the school for reasons other than scheduled retirement).
7. Funding equity (measure of whether the school receives the funding that would be predicted given the composition of the student body).

Although authors of the second body of literature go beyond student achievement data, they do not indicate how these streams of data are related to each other. Within the third body of literature, authors not only proposed relatively discrete data


streams, they also began to illustrate how the streams are related to each other. According to Wahlstrom (2006), three types of data need to be collected for data-informed decision-making: demographic data, process data and outcome data. There are assumed relationships built into Wahlstrom's three-stream typology: it is the demographic data, the process data and the interaction between demographic and process data that lead to student achievement. Currently, the most comprehensive analysis of the typology of data streams is by Bernhardt (2003, 2004, 2005). She suggested that four types of data should be collected: (a) demographic data, which provide descriptive background information on students, staff and schools; (b) school process data, which "define what we are doing to get the results that we are getting"; (c) student learning data, which describe student performance; and (d) perception data, which describe what people think about the learning environment. She not only considered these types of data with a longitudinal perspective, but also tried to analyse the result of the interaction between two, and even three, types of data.

In summarising the literature on data streams, there appear to be three major data streams: (a) student and community background data, (b) school process data and (c) student achievement data. These three streams each have two dimensions, objective and perceptual. For example, for student achievement, there are both objective standardised scores and the community's perception of student achievement. In this study, we consistently use these three data streams. The typology of three streams of data provides a conceptual framework for inquiring into what kind of data principals use. Applying the three-stream data typology for this study, we analysed the data collected from interviews of principals on their practice of using data for decision-making.
We used the framework of the three data streams to inquire into what data principals used and how data informed the decision-making processes in their schools. A number of empirical studies have recently been published on: how individuals' organisational roles and the district reform history influence the construction of the meaning of data (Coburn and Talbert 2006); how district policies impact staff use of data (Kerr et al. 2006); the elements of data systems that facilitate and inhibit effective use of data (Wayman and Stringfield 2006a); and what choices a school district faces to develop a formative assessment system (Sharkey and Murnane 2006). As yet, however, we do not know much about data-informed decision-making except for some "exemplary situations" (Wayman and Stringfield 2006b, p. 467). We still need to know and improve "leaders' expertise in accessing, generating, managing, interpreting, and acting on data" (Knapp et al. 2006, p. 39). This article contributes to this knowledge base by analysing the data collected from 16 principals. Principals play an important and unique role in school leadership (Hsieh and Shen 1998; Shen 2005; Rodriguez-Campos et al. 2008), and it is important to inquire into their practice of data-informed decision-making.

Methods

The increasing availability and amount of data, the rhetoric of using data to improve education, and the lack of empirical studies on how educators use data for


decision-making provide the impetus for this study and for our focus on principals who, due to their unique position in the educational system, play an important role in the school improvement process and, thus, in the decisions that can be informed by data. The school effectiveness research of the last 20 years affirms the role of principals in school success (e.g., Austin and Reynolds 1990; Goddard 2001; Goddard et al. 2004; Goldring and Pasternak 1994; Hallinger and Heck 1996; Heck and Marcoulides 1993; Heck 1992; Leithwood and Jantzi 1999; Leithwood and Montgomery 1986; Louis et al. 1996; Marks and Louis 1997; Marks and Printy 2003; Sebring and Bryk 2000; Sergiovanni 1995; Shen 2005; Taylor and Valentine 1985) in general and in raising student achievement in particular (e.g., Leithwood et al. 2004; Marzano et al. 2005; Witziers et al. 2003).

In this study, we asked the following questions: Which streams of data do principals use? What kind of decisions do principals make using these data? Are there any patterns in what streams of data principals use and what decisions they make?

We interviewed 16 principals in four urban school districts, with four principals (two at the elementary level, one at the middle school level and one at the high school level) from each of the four school districts. These principals would take part in a three-year funded project on data-informed decision-making, and the interview was part of the context analysis before the project started. On average, this group of 16 principals had about 12 years of experience as teachers and six and a half years as principals. Among the participants, nine were female and seven were male; further, nine were members of a visible minority group. All principals were interviewed in their offices using a consistent interview protocol, which consisted of questions on what data they have, what data they use and how data inform their decisions.
The interviews ranged from one hour to one and a half hours, and all the interviews were transcribed verbatim. For the analysis of what data principals use, we used the conceptual framework of three streams of data as the lens, and coded and displayed data accordingly. For the analysis of what decisions principals make, we progressively coded the transcription using a constant comparative method (Corbin and Strauss 2007). Once we felt that we could not find any more decisions for each of the three streams of data, we confirmed the developed codes and recoded the data from beginning to end. We then reduced the data into matrices, as shown in Tables 1, 2, 3 and 4, and patterns began to emerge from the much-reduced data. Interrater reliability was calculated: Cohen's kappa was .97 for coding types of data and .95 for coding types of decisions based on achievement data, so reliability appeared to be very high.
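The kappa statistic reported above can be made concrete with a short sketch. The following Python function is a hypothetical illustration only, not part of the study's analysis; the interview-segment codes at the bottom are invented. It computes Cohen's kappa for two coders as (p_o − p_e) / (1 − p_e), where p_o is the observed agreement and p_e is the agreement expected by chance from each coder's marginal category frequencies:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters coding the same list of items."""
    if len(rater_a) != len(rater_b) or not rater_a:
        raise ValueError("raters must code the same non-empty item list")
    n = len(rater_a)
    # Observed agreement: share of items both raters coded identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: sum over categories of the product of marginals.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n)
              for c in freq_a.keys() | freq_b.keys())
    if p_e == 1:  # degenerate case: both raters used one identical category
        return 1.0
    return (p_o - p_e) / (1 - p_e)

# Invented codes for eight interview segments, using the paper's three
# data streams: background (B), process (P), achievement (A).
coder_1 = ["A", "A", "A", "B", "P", "A", "A", "B"]
coder_2 = ["A", "A", "A", "B", "P", "A", "A", "P"]
print(round(cohens_kappa(coder_1, coder_2), 2))
```

Kappa discounts the agreement two coders would reach by chance given their own category frequencies, which is why it is preferred over simple percent agreement as a measure of coding reliability.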

Findings

Streams of data principals use

Table 1 shows the data streams that the principals reported having used. A few patterns emerged from the data in Table 1. First, the most frequently used data stream is that of student achievement, while the streams of (a) student and


Table 1 Streams of data that the principals use

Principal | Student and community background data | School process data | Student achievement data
P1 | – | – | MEAP, ITBS, DIBELS, DRA, Star Reader, Bi-Annual Curriculum Assessments
P2 | – | Suspension data | MEAP, ITBS, DRA, Bi-Annual Curriculum Assessments, Gates-MacGinitie, IRA
P3 | Free and reduced lunch data | – | MEAP, ITBS, Gates-MacGinitie
P4 | – | – | MEAP, ITBS, Prep-ACT, ACT, SAT, Bi-Annual Curriculum Assessments, Gates-MacGinitie
P5 | – | – | MEAP, Woodcock Munoz, Curriculum Based Measurement for Language Arts and Math
P6 | – | – | MEAP, Gates-MacGinitie, District's Criterion Referenced Test in Math
P7 | – | – | Tungsten Assessment, Writing Assessments, Gates-MacGinitie, District Social Studies Assessment
P8 | – | – | MEAP, Gates-MacGinitie, ACT, Work Keys, PSAT, SAT, Teacher Assessments
P9 | – | – | MEAP, NWEA, ITBS, DIBELS, Compass Learning Program
P10 | Demographic data, free and reduced lunch | Attendance data, referral data, behavioural data | MEAP, ITBS, MAP
P11 | – | – | Gates-MacGinitie, NWEA, Star Reader, Compass Learning Program, Math Reader
P12 | – | – | MEAP, ACT, SAT, Classroom Assessments
P13 | – | – | MEAP, NWEA, GISD Pilot Assessment, Star, Harcourt MAP Assessment, MAT
P14 | – | Discipline data, attendance data | MEAP, NWEA, MET, Star, Harcourt MAP Assessments
P15 | – | – | MEAP, MAT, NWEA
P16 | – | – | MEAP, MAT, Gates-MacGinitie, NWEA

Please see Appendix for an explanation of the acronyms

community background data and (b) school process data are rarely used. All 16 principals mentioned the usage of student achievement data, but only two principals indicated the usage of student and community background data and three principals mentioned school process data. Second, among the student achievement data, the most frequently used are those from standardised tests geared towards summative assessment, rather than from formative tests that are classroom-based and serve diagnostic purposes. Third, for the two principals who mentioned the stream of "student and community background data", the data relate more to student background than to community background. Fourth, even for the three principals who mentioned "school process data", the focus was limited to


Table 2 Decisions made by the principals based on student and community background data

Principal | Student and community background data | Decisions based on the data
P3 | Free and reduced lunch data | "We use the makeup of the homes to help us make decisions." "It drives our at risk funding." "Kids are denied food if their applications are not approved."
P10 | Demographic data, free and reduced lunch | "We relate it to some teaching methodology from best practice instruction." "We get a profile to track that student's achievement."

Table 3 Decisions made by the principals based on school process data

Principal | School process data | Decisions based on the data
P2 | Suspension data | "I don't really look at suspension too much because I don't have a high rate of suspension here. But if my numbers increase that might mean what programme was in place when the suspension rates were low and why was it different then?" "The school district is looking at doing a comparison between academic achievement and schools and suspension data."
P10 | Attendance data; referral data; behavioural data | "We try to find reasons for the behaviour. We cover things through interviews and that kind of thing making suggestions to the parents." "The purpose of collecting any data in this business is student achievement."
P14 | Discipline data, attendance data | "Discipline and attendance information are put into the district's computer information system." "We're trying to implement a model of positive behaviour support, so that kind of information is helpful for us."

Table 4 How principals used achievement data for decision-making (each type of decision marked with an X for the principals who reported it)

- For school accountability (AYP)
- For school improvement plan
- For working with parents and community
- For teacher accountability
- For comparing with certain norms
- For identifying achievement growth
- For grouping/placement
- For identifying weakness per state/other standards
- For assessing proficiency per curriculum taught


attendance and discipline. Other data on the school process—such as monitoring the development, implementation and evaluation of academic or non-academic programmes and the supervision of teachers' instructional practices—were not mentioned at all. Finally, some data, such as MEAP, SAT and ACT, were universally available given the state context; the availability of other data depended, to a certain extent, on the school districts. For example, in two school districts, six out of eight principals mentioned the use of Gates-MacGinitie and six of these eight principals indicated the use of the Northwest Evaluation Association (NWEA) assessment. Therefore, usage of data was dependent upon both the school district context and the larger state and national context.

The data in Table 1 raise some serious issues. The first issue is data for accountability versus data for learning. According to Reeves (2004), the issue centres on the difference between assessment "of" learning and assessment "for" learning. The predominant focus on achievement data, in general, and on standardised achievement data, in particular, points to the fact that, in the current environment, data are used predominantly for accountability purposes, i.e., the assessment "of" learning. Relating to the second issue, assessment "for" learning, some important data are not used by principals. As mentioned in the previous paragraph, none of the principals mentioned the use of data on important school processes such as monitoring the development, implementation and evaluation of academic or non-academic programmes, and the supervision of teachers. The third issue is related to Bernhardt's (2003, 2004, 2005) notion that the meaning of data derives, in part, from the intersection between and among various streams of data.
If achievement data are not related to the other data streams—(a) student and community background data and (b) school process data—the meaning of the achievement data is very limited, especially for the purpose of informing decisions that school leaders and their staffs make to improve the school's outcomes. Given that principals rarely mentioned the usage of data streams other than achievement, it is fair to point out that the notion of "intersecting data" has not been practised by the participants in this study.

Decisions principals make using student and community background data

The data in relation to the decisions made by principals using student and community background data are displayed in Table 2. It is interesting to note that of the two principals who mentioned student background data, one did mention some decisions that have instructional implications: "We relate it to some teaching methodology from best practice instruction", and "We get a profile to track that student's achievement." It is certainly encouraging to see this kind of practice. The other principal did not mention, specifically, how the data could be used for improving instruction. Generally speaking, student and community background data have been underutilised; and even when these data are used, it is not necessarily for improving teaching and learning. It is also notable that even the practice of disaggregating student achievement by subgroups, a practice mandated by NCLB, was mentioned by only one of the 16 principals. The principals' responses seem to indicate that they have not realised the benefits of disaggregating student achievement by subgroups and, therefore, they tend to


complete the task as part of the reporting process rather than as part of their practices for school improvement.

Decisions principals make using school process data

Data sources in relation to the decisions made by principals using school process data are displayed in Table 3. As mentioned in the previous section, when principals mentioned "school process" data, they tended to focus on "attendance" and "discipline". Two of the principals did mention that attendance data and disciplinary data could be used in conjunction with student achievement. The first principal mentioned that "The school district is looking at doing a comparison between academic achievement and schools and suspension data." The second principal commented that "The purpose of collecting any data in this business is student achievement." The decisions related to "attendance data" and "discipline data" are mainly about opportunity to learn and the learning environment, which are related to student achievement (Marzano 2003). Therefore, these two principals did make the connection between attendance and discipline data, on the one hand, and student achievement data, on the other. Two principals also mentioned "find reasons for the behaviour", which might have implications for disciplinary and other programmes, and "put into the district's computer information system" for reporting purposes. However, the limited scope of the school process data the principals mentioned constrains the possibilities of the decisions that principals could make. For example, principals did not mention data related to the design, implementation and evaluation of academic and non-academic programmes. Nor did they mention any supervision data related to teachers.
The result is that many high-impact strategies for improving teaching and learning are overlooked, or not linked strategically to student needs, because principals essentially neglected school process data; when they did use school process data, they tended to focus solely on "attendance" and "discipline" without linking these issues back to the teaching and learning process.

Decisions principals make using student achievement data

The data in Table 4 illustrate the purposes for which principals used student achievement data. The checks in the table indicate the patterns of purposes in using student achievement data for decision-making. The data seem to indicate that student achievement data are used for three purposes.

The first purpose is accountability and school improvement. Almost all principals mentioned the use of data, particularly MEAP, for accountability (including AYP) purposes. Related to the use of data for accountability, six out of 16 principals mentioned the use of data for school improvement. For example, one principal said that MEAP data were used to ascertain "what they know and what they don't know", and that MEAP data were discussed at staff meetings and the findings were incorporated "into our School Improvement Plan." Only one principal mentioned the use of data for working with parents and the community, and two discussed using data for teacher accountability, suggesting that these are two purposes for data use that are noticeably underutilised (and perhaps


minimally understood) by these principals. Moreover, the references principals made to using data for these purposes were not linked with any substantive decisions regarding working with parents and the community or helping teachers improve instructional practice.

The second purpose of using student achievement data is to ascertain the status of students' learning, including comparing the students with certain norms, such as the national norm, and identifying students' growth in achievement. The principals mentioned data from SAT, ACT, PSAT and ITBS in this category. Although eight principals mentioned the use of achievement data to compare students with certain norms, this percentage appears low given the availability of norm-referenced achievement data. An even lower proportion, only four of the participants, indicated the use of achievement data to identify students' academic growth. Given the burgeoning practice of using student growth to justify the attainment of AYP, there appears to be a need for professional development to orient the principals to the growth model.

The third purpose of using achievement data is to make decisions that are directly related to curriculum and instruction, including (a) grouping and placement, (b) identifying weakness per state/other standards and (c) assessing proficiency per curriculum taught. The interview data revealed a variety of specific decisions in these three areas. The following are excerpts from the transcription regarding how principals use data for grouping and placement:

Display 1 Excerpts for grouping and placement

• ITBS tells "how many kids are reading at grade level." The lowest 40% need "tutoring before or after school." (#1)
• We use ITBS to find our Title 1 target students: if they fall below the 29th percentile. (#2)
• "The informal tests like the DRA give us more specific information about the children that are falling behind." "It has opened up the eyes of our teachers." (#2)
• "We use ITBS primarily to place students. It drives a lot of our at-risk funding." (#3)
• Gates-MacGinitie is used to try to place 9th grade students "who might need some type of remediation for math or reading." (#4)
• "I use the MEAP, MAP and the Gates. What I am trying to do is bring the pieces together to where I can see three levels of students … being able to assist each level with the need that each child needs." (#11)
• We use the social studies and the science on the MAT. "It helps determine our kids that are at risk. If they are below the 50th percentile then those are the kids we need to focus on first." (#15)
• Based on NWEA data "we actually regroup our kids" in English/Language Arts and "it helps them." (#15)
• We use NWEA "like a parent aware of a child." "It tells you where you are at. So you can start grouping kids so that you are teaching to the majority of the class in the middle." (#16)

From the excerpts, it is clear that the decisions regarding grouping and placement include (a) eligibility for programmes such as Title 1, (b) selecting students for targeted interventions, (c) grouping students according to achievement level and (d) providing individualised services.


As to identifying strengths and weaknesses per state and other standards, the following excerpts reveal the kinds of decisions that principals made based on achievement data:

Display 2 Excerpts for identifying strengths and weaknesses per state and other standards
• DIBELS identifies "what reading skills K-3 students need help with." (#1)
• Star Reader gives a "reading grade level." It will give teachers "an idea of what interventions need to happen for that particular student." (#1)
• MEAP is "an indicator of how well our students are doing with the state benchmarks and proficiency levels." We look at "the benchmarks that they seem to be needing a little more instruction on and how we can make it better the next year for the next group of children that take the MEAP." (#2)
• Gates-MacGinitie is only used for the "literacy kids to determine if they have had a successful gain in literacy." (#4)
• The MEAP is used by the teachers to "adjust their instructions." (#5)
• CBM is a reading test. "They read one minute to check for sentence fluency." "The data is used to adjust instruction on a more timely basis." (#5)
• MEAP "we look at to improve instruction and also to determine whether we are meeting State standards." (#6)
• MEAP data we "tear it apart and look for trends, look for things that are happening in gaps, things that we need to put emphasis on." (#7)
• MEAP is used "to see where the gaps are and then I make my staff do gap analysis to see where do we fail, where do we begin to do our curriculum mapping for next year." (#8)
• When talking about MEAP, "I think you have to look at a variety of purposes when you look at student performances: … target focus…." (#12)
• "GISD Assessment is set up just like the MEAP. Teachers use it as a teaching tool." (#13)
• The Star Program is part of our Title I programme and we use it with all kids. "The purpose is to improve the student's reading scores, their reading comprehension, their word recognition." (#13)

The essence of these excerpts is that principals use student achievement data to identify strengths and weaknesses according to the state standards, or other standards such as sentence fluency, reading comprehension, word recognition, etc. The findings could then be used to "do our curriculum mapping for next year", "to put emphasis on" things and to "adjust instruction."

As to assessing proficiency per curriculum taught, principals made the following statements during the interviews:

Display 3 Excerpts related to assessing proficiency per curriculum taught
• Bi-Annual Curriculum Assessments tell "how well the students have done with the curriculum we were supposed to teach." (#1)
• The Bi-Annual Curriculum Assessment has an item analysis that helps teachers identify specific questions that students had trouble with. "A teacher can sit back and say, 'Gosh, I wonder why my kids all picked b and the right answer is d'." (#2)
• The District's Criterion Referenced test identifies "what standards we are weak in and what we need to strengthen at the building level." (#6)
• Tungsten "allows us to look at math and reading" in almost "real time" because we collect it monthly. (#7)
• The District Social Studies Assessment is used to "find out what areas the students have mastered and what areas they still need some work in." (#7)
• We are trying "to move from paper and pencil to project based" teacher assessments. The purpose is to "validate what a student has learned." (#8)
• When talking about classroom assessment, "I think you have to look at a variety of purposes when you look at student performances: … target focus…." (#12)
• Star is "more of an in house test because we use it to determine kids who are going to receive post mentoring." It can be used by teachers in different ways. (#14)

The practice of using real-time data to assess whether students attained proficiency in the curriculum taught is promising, because it gives timely feedback for making adjustments in teaching and learning to address deficiencies in relation to the curriculum being taught.

It is certainly encouraging to observe that principals mentioned the use of student data to make decisions directly related to curriculum and instruction, an area that has a high impact on student achievement; however, the number of principals who mentioned such usage was low. Only six of the 16 principals mentioned grouping and placement, nine mentioned identifying weaknesses per state/other standards, and seven mentioned assessing proficiency per curriculum taught. In other words, about half or fewer of the principals indicated that they made these kinds of decisions using student achievement data.

Another interesting finding from Table 4 was the impact of the school district on principals' practice of using data to make curriculum- and instruction-related decisions, a finding that confirms similar ones in the literature (Coburn and Talbert 2006; Kerr et al. 2006). The principals in Table 4 are arranged in the order of school districts: the first four were from the same district, and the same holds for the ensuing groups of four. Although the kinds of achievement data used were similar across the four school districts, among the 22 checks for "(a) grouping and placement, (b) identifying weaknesses per state/other standards and (c) assessing proficiency per curriculum taught" in Table 4, 16 came from the first two school districts while only six came from the last two. The school district thus appears to be a factor related to the extent to which principals use data for teaching- and learning-related decisions. It will be interesting to inquire further into whether factors such as professional development for principals, job requirements, evaluation practices and school district leadership are related to the pattern that we observed in the data.

Summary and discussion

In this study we inquired into what kinds of data principals used and how they used them. We conducted individual interviews with 16 principals from four urban school districts, with two elementary principals, one middle school principal and one high school principal from each school district. Our results revealed some patterns as to what kinds of data principals used and how these data informed their decisions. The following are some of the major findings and their related discussions.

First, among the data that principals reportedly used, student achievement data dominate. A quick glance at the data in Table 1 indicates that all 16 principals mentioned the use of student achievement data, while only two principals mentioned the use of student and community background data, and three principals indicated the use of school process data. The dominant focus these principals placed on achievement data is certainly understandable, given the current policy environment which emphasises accountability. Student achievement data are very important and, as the data in Table 4 illustrate, many high-impact decisions could be made based on student achievement. Our findings suggest, however, that the predominant focus on student achievement may be having a deterring effect on the use of other important data streams.

Second, the use of data is predominantly for accountability purposes. Almost all principals mentioned the use of student achievement data for accountability purposes, but fewer than half mentioned the use of the data for school improvement in general, and for improving curriculum and instruction in particular. In other words, the predominant use of data is focused on tracking and describing the outcomes of the learning process rather than on intervening in the learning process. This illustrates that the issue of data "of" learning versus data "for" learning is, indeed, indicative of a serious imbalance in how these principals currently understand and use the data available to them in the school environment.

Third, and somewhat encouraging, is the fact that some principals did mention using data for improving teaching and learning.
Among others, some principals indicated that they used data for (a) grouping and placement, (b) identifying weaknesses per state/other standards and (c) assessing proficiency per curriculum taught. For example, as far as identifying weaknesses per state and other standards is concerned, some principals identified strengths and weaknesses according to standards such as sentence fluency, reading comprehension, word recognition, etc. The findings of weaknesses and strengths could then be used to "do our curriculum mapping for next year", "to put emphasis on" things and to "adjust instruction." Using data to inform the development of high-impact strategies is a promising practice indeed; however, not all principals reported such practice. As a matter of fact, the number of principals who mentioned such usage was low: only six mentioned grouping and placement, nine noted identifying weaknesses per state/other standards, and seven specified assessing proficiency per curriculum taught. These results suggest that, for these principals and likely for the field, there is a serious challenge ahead to institutionalise such practices.

Fourth, related to the dominance of student achievement data to the neglect of student and community background data and school process data is the finding that principals rarely used multiple streams of data to inform decision-making. For example, they did not disaggregate achievement data by group membership, let alone disaggregate achievement data by both group membership and students' schooling experience. This is indeed a surprising finding, given the emphasis of NCLB on disaggregating student achievement data by subcategories of group membership. Therefore, the data suggest that Bernhardt's (2003, 2004, 2005) concept of "intersecting data" has not been practised by the principals. The power of data-informed decision-making has not been fully realised.

Finally, principals' practice of using data to make curriculum- and instruction-related decisions appears to vary by school district. The stated practices of "(a) grouping and placement, (b) identifying weaknesses per state/other standards and (c) assessing proficiency per curriculum taught" were much more prominent in the responses of principals from two districts than in the responses from principals in the other two districts, by a margin of roughly three to one. The school district appears to be a factor related to the extent to which principals used data for teaching- and learning-related decisions. On the one hand, this finding suggests that in many school districts, data are seldom used for curriculum- and instruction-related decisions. On the other hand, it indicates that, if conditions are in place, the practice of using data for making curriculum- and instruction-related decisions could be promoted.

Through interviewing 16 principals, we found that data inform principals' decisions to a limited extent. We have the following major findings: (a) student achievement data are predominantly used, to the extent of neglecting other streams of data such as what transpires in the teaching and learning process in school; (b) student achievement data are used more for accountability purposes, for assessing "of" rather than "for" the learning; (c) different streams of data are rarely used together to derive rich meaning for decision-making; and (d) there is a difference among school districts in the extent to which their principals used data to improve curriculum and instruction. Given the continuing accumulation of data and the importance of using data for decision-making, the findings of the study point out serious challenges that the field of education, and particularly the principalship, face.
The state of practice for these principals may have implications for understanding how far we need to go in converting school decision-making practices to less buffered and more data-supported processes (Schmoker 2006). In breaking down the isolation that impedes the process of translating new knowledge into changes in practice, the ability to access, interpret and utilise the rich sources of data in the school environment is critical. School principals are on the front line of the work to effect change; yet, if the practices of the 16 participants in this study are similar to those of their counterparts in other schools, principals will need continued assistance in adopting data-informed decision-making practices, especially with regard to drawing important implications for raising student achievement through the utilisation of all three data streams in such a way that they inform decisions made for learning as well as conclusions reached about learning (Reeves 2004).

Appendix

See Table 5.


Table 5 Description of the acronyms used in Table 1

American College Testing (ACT). Initiator/developer: ACT. Type of assessment: Norm-Referenced, Diagnostic, Personalised Study Plan. Content areas: ACT assesses college-bound students in the areas of English, Math, Reading, Science and optional Writing. Measurement standards: National Norm.

Compass Learning. Initiator/developer: WRC Media, Inc. Type of assessment: Diagnostic, Personalised Learning Plan. Content areas: Compass Learning is a web-based curriculum and assessment programme for students in grades PreK-12; content areas include Math, Reading, Science, Social Studies and ESL. Measurement standards: State Content Standards.

Developmental Reading Assessment (DRA). Initiator/developer: Pearson Learning. Type of assessment: Screening, Progress Monitoring. Content areas: DRA assesses fluency and comprehension skills of students in grades 1–5; it enables the teacher to observe, record and evaluate changes in students' performance.

Diagnostic Assessment of Reading (DAR). Initiator/developer: Riverside Publishing. Type of assessment: Diagnostic. Content areas: DAR assesses the reading skills (phonics, fluency, vocabulary and reading comprehension) of students in grades 1–12.

Dynamic Indicators of Basic Early Literacy Skills (DIBELS). Initiator/developer: Drs. Roland H. Good and Ruth A. Kaminski of the University of Oregon. Type of assessment: Formative, Norm-Referenced, Screening and Progress Monitoring. Content areas: DIBELS is a set of standardised, individually administered measures of literacy development, indicators of key literacy skills that are predictive of later reading achievement; DIBELS tests fluency in Initial Sounds, Letter Naming, Phoneme Segmentation, Nonsense Words, Oral Reading, Retelling and Word Use.

Gates-MacGinitie. Initiator/developer: Riverside Publishing Co. Type of assessment: Norm-Referenced, Screening, Outcome Measure. Content areas: Gates-MacGinitie provides data about the reading ability of students in grades K-12; the test helps teachers understand what beginning readers know, pinpoint students' decoding skills and ascertain which students are reading on grade level. Measurement standards: National Norm Standards.

Miller Assessment for Preschoolers (MAP). Initiator/developer: Harcourt Assessment, Inc. Type of assessment: Norm-Referenced, Diagnostic. Content areas: MAP is designed to evaluate children ages 2 years 9 months to 5 years 8 months who have mild to moderate developmental delays. Measurement standards: National Norm.

Iowa Tests of Basic Skills (ITBS). Initiator/developer: The University of Iowa/College of Education. Type of assessment: Norm-Referenced. Content areas: Primary grades (Levels 5–8): Vocabulary, Word Analysis, Reading Comprehension, Listening, Language, Math, Social Studies, Science, Sources of Information. Grades 3–8 (Levels 9–14): Vocabulary, Reading Comprehension, Spelling, Capitalisation, Punctuation, Usage and Expression, Math Concepts and Estimation, Math Problem Solving and Data Interpretation, Math Computation, Social Studies, Science, Maps and Diagrams, Reference Materials, Word Analysis and Listening. Measurement standards: National Norm Standards.

Metropolitan Achievement Test (MAT). Initiator/developer: Harcourt Assessment, Inc. Type of assessment: Norm-Referenced. Content areas: Math, Science, Reading and Social Studies. Measurement standards: National Norm.

Michigan Educational Assessment Program (MEAP). Initiator/developer: Michigan Legislature. Type of assessment: Criterion-Referenced. Content areas: Reading, Math, Science, Social Studies and Writing at selected grades. Measurement standards: Michigan Content Standards and Benchmarks.

Northwest Evaluation Association Measures of Academic Progress (NWEA MAP). Initiator/developer: NWEA. Type of assessment: Diagnostic. Content areas: NWEA MAP measures student progress in Reading, Math and Language Usage. Measurement standards: State-aligned computerised adaptive assessments.

Preliminary SAT/National Merit Scholarship Qualifying Test (PSAT/NMSQT). Initiator/developer: College Board and the National Merit Scholarship Corporation (NMSC). Type of assessment: Norm-Referenced. Content areas: PSAT/NMSQT provides practice for the SAT Reasoning Test and enables students to enter NMSC scholarship programmes. Measurement standards: National Norm.

Scholastic Aptitude Test (SAT). Initiator/developer: College Board. Type of assessment: Norm-Referenced. Content areas: SAT demonstrates to colleges students' skills in Reading, Math and Writing. Measurement standards: National Norm.

Stanford Achievement Test (SAT). Initiator/developer: ETS. Type of assessment: Norm-Referenced. Content areas: Math, Science, Reading and Social Studies. Measurement standards: National Norm.

Star Reader. Initiator/developer: Bright Education Services and Testing. Type of assessment: State-specific practice tests. Content areas: Commercially developed on-line practice tests in Language Arts, Math, Science, Social Studies and History.

Tungsten Benchmark Assessments. Initiator/developer: Edison Schools. Type of assessment: Diagnostic. Content areas: Tungsten Benchmarks are electronically administered monthly to students in grades 2–8 in Reading and Math, and quarterly to students in grades 2–8 in Science and Social Studies. Measurement standards: State Benchmarks.

Woodcock-Munoz Language Survey. Initiator/developer: Riverside Publishing. Type of assessment: Diagnostic. Content areas: Woodcock-Munoz assesses the English language proficiency of students in grades K-12 and adults. Measurement standards: English Language Proficiency.

WorkKeys Foundational and Personal Skills Assessment. Initiator/developer: ACT. Type of assessment: Diagnostic. Content areas: WorkKeys are web-based assessments designed to measure the cognitive abilities of students in applied maths, reading for information and locating information, along with predicting job-related behaviours.

References

American Association of School Administrators. (2006). Using data to improve schools: What's working. Alexandria, VA: American Association of School Administrators.
Austin, G., & Reynolds, D. (1990). Managing for improved school effectiveness: An international survey. School Organization, 10(2), 167–178.
Bernhardt, V. (2003). Using data to improve student learning in elementary schools. Larchmont, NY: Eye on Education.
Bernhardt, V. (2004). Using data to improve student learning in middle schools. Larchmont, NY: Eye on Education.
Bernhardt, V. (2005). Using data to improve student learning in high schools. Larchmont, NY: Eye on Education.
Celio, M. B., & Harvey, J. (2005). Buried treasure: Developing a management guide from mountains of school data. New York: Wallace Foundation.
Coburn, C. E., & Talbert, J. E. (2006). Conceptions of evidence use in school districts: Mapping the terrain. American Journal of Education, 112, 469–496.
Cohen, D. K. (1991). A revolution in one classroom: The case of Mrs. Oublier. Educational Evaluation and Policy Analysis, 12(3), 311–329.
Corbin, J. C., & Strauss, A. C. (2007). Basics of qualitative research (3rd ed.). Thousand Oaks, CA: Sage Publications.
Creighton, T. B. (2001). Data analysis in administrators' hands: An oxymoron? School Administrator, 58(4), 6–11.
Goddard, R. D. (2001). Collective efficacy: A neglected construct in the study of schools and student achievement. Journal of Educational Psychology, 93, 467–476.
Goddard, R. D., et al. (2004). Collective efficacy beliefs: Theoretical developments, empirical evidence, and future directions. Educational Researcher, 33(3), 1–13.
Goldring, E. B., & Pasternak, R. (1994). Principals' coordinating strategies and school effectiveness. School Effectiveness and School Improvement, 5(3), 239–253.
Goodlad, J. I., & Klein, M. F. (1974). Looking behind the classroom door. Worthington, OH: Charles A. Jones Publishing Company.
Hallinger, P., & Heck, R. H. (1996). Reassessing the principal's role in school effectiveness: A review of empirical research, 1980–1995. Educational Administration Quarterly, 32(1), 5–44.
Heck, R. H. (1992). Principals' instructional leadership and school performance: Implications for policy development. Educational Evaluation and Policy Analysis, 14(1), 21–34.
Heck, R. H., & Marcoulides, G. A. (1993). Principal leadership behaviors and school achievement. NASSP Bulletin, 77(553), 20–28.
Hsieh, C.-l., & Shen, J. (1998). Teachers', principals', and superintendents' perceptions of leadership. School Leadership and Management, 18(1), 107–121.
Kerr, K. A., et al. (2006). Strategies to promote data use for instructional improvement: Actions, outcomes, and lessons from three urban districts. American Journal of Education, 112, 496–520.
Knapp, M. S., et al. (2006). Data-informed leadership in education. Seattle, WA: University of Washington, Center for the Study of Teaching and Policy.
Leithwood, K. A., & Jantzi, D. (1999). The relative effects of principal and teacher sources of leadership on student engagement with school. Educational Administration Quarterly, 35(supp.), 679–706.
Leithwood, K. A., & Montgomery, D. J. (1986). Improving principal effectiveness: The principal profile. Toronto: Ontario Institute for Studies in Education.
Leithwood, K., et al. (2004). How leadership influences student learning. Minneapolis: University of Minnesota, Center for Applied Research and Educational Improvement. Retrieved September 2007, from www.wallacefoundation.org/wf/knowledgecenter/knowledgetopics.
Louis, K. S., et al. (1996). Teachers' professional community in restructuring schools. American Journal of Education, 33(4), 757–798.
Marks, H. M., & Louis, K. S. (1997). Does teacher empowerment affect the classroom? The implications of teacher empowerment for instructional practice and student academic performance. Educational Evaluation and Policy Analysis, 19, 245–275.
Marks, H. M., & Printy, S. M. (2003). Principal leadership and school performance: An integration of transformational and instructional leadership. Educational Administration Quarterly, 39(3), 370–397.
Marzano, R. J. (2003). What works in schools. Alexandria, VA: Association for Supervision and Curriculum Development.
Marzano, R. J., et al. (2005). School leadership that works. Alexandria, VA: Association for Supervision and Curriculum Development.
Michigan Department of Education. (2006). Michigan school improvement framework. Lansing, MI: Michigan Department of Education.
Reeves, D. (2004). Assessing educational leaders: Evaluating performance for improved individual and organizational results. Thousand Oaks, CA: Corwin Press.
Rodriguez-Campos, L., Rincones-Gomez, R., & Shen, J. (2008). Do teachers, principals, and superintendents perceive leadership the same way? A structural equation modeling test of a multi-dimensional construct across groups. Frontiers of Education in China, 3(3), 360–385.
Salpeter, J. (2004). Data: Mining with a mission. Technology and Learning, 24(8), 30–32, 34, 36.
Schmoker, M. (2006). Results now: How we can achieve unprecedented improvements in teaching and learning. Alexandria, VA: Association for Supervision and Curriculum Development.
Sebring, P. B., & Bryk, A. (2000). School leadership and the bottom line in Chicago. Phi Delta Kappan, 81, 440–443.
Sergiovanni, T. J. (2005). The principalship: A reflective practice perspective (5th ed.). Boston: Allyn and Bacon.
Sharkey, N. S., & Murnane, R. J. (2006). Tough choices in designing a formative assessment system. American Journal of Education, 112, 572–588.
Shen, J. (Ed.). (2005). School principals. New York: Peter Lang.
Shen, J., & Ma, X. (2006). Does systemic change work? Curricular and instructional practice in the context of systemic change. Leadership and Policy in Schools, 5(3), 231–256.
Taylor, A., & Valentine, B. (1985). Effective schools: What research says about S series (Data Research Report No. 1). West Haven, CT: National Education Association. (ERIC Document Reproduction Service No. ED 274 073).
Wahlstrom, D. (2006). Using data to improve student learning. Retrieved September 2007, from http://www.successlineinc.com/MichiganDataWorkshops.
Wayman, J. C., & Stringfield, S. (2006a). Technology-supported involvement of entire faculties in examination of student data for instructional improvement. American Journal of Education, 112, 549–571.
Wayman, J. C., & Stringfield, S. (2006b). Data use for school improvement: School practices and research perspectives. American Journal of Education, 112, 463–468.
Witziers, B., Bosker, R. J., & Kruger, M. L. (2003). Educational leadership and student achievement: The elusive search for an association. Educational Administration Quarterly, 39(3), 398–425.


The authors

Jianping Shen is currently the John E. Sandberg Professor of Education in the Department of Educational Leadership, Research and Technology at Western Michigan University. He is a former National Academy of Education/Spencer Foundation postdoctoral fellow. His research interests include leadership theory, policy analysis and research methods. He has published widely, including more than 60 journal articles and several books in English.

Van E. Cooley is Professor and Chair of the Department of Educational Leadership, Research and Technology at Western Michigan University. He has written over 50 articles and presented at over 60 national conferences and meetings. Through external funding, he has worked with a number of urban principals and professional organisations. He has also served on national issue groups focusing on data-informed decision-making and cohesive leadership strategies. Dr. Cooley has served as a teacher, building-level administrator, assistant superintendent and superintendent.

Patricia Reeves is an Assistant Professor of Educational Leadership, Research and Technology in the College of Education and Human Development at Western Michigan University. She joined the WMU faculty in 2004 with 20 years' experience as a K-12 deputy superintendent and superintendent. Since joining the faculty, Dr. Reeves has also worked with the Michigan Department of Education and the Michigan Association of School Administrators to write new statutes for and implement performance and impact credentialing systems for school leaders. Her research focuses on superintendent practice, data-informed school improvement, administrator development/credentialing and qualitative research methods.

Walter L. Burt is Assistant Professor of Educational Leadership in the Department of Educational Leadership, Research and Technology. He has experience as a teacher, building and central office administrator and school superintendent. He has published in professional journals and presented at national conferences. His research interest is in educational leadership, with a particular emphasis on leadership development in urban settings.

Lisa Ryan is a science teacher at Calhoun Community High School, an alternative school for at-risk students, in Battle Creek, Michigan. She also works as an educational research consultant, writing grant applications and conducting project evaluations. Most recently, she served as a co-principal investigator on a National Science Foundation-funded project investigating students' perceptions of scientists on television. Her research interests include schools for at-risk students and science education for underrepresented groups.

J. Mark Rainey holds a Doctor of Education in Educational Leadership from Western Michigan University (WMU). He is a Senior Research Associate in the Department of Educational Leadership, Research and Technology at WMU. He serves as a Senior Consultant to the Michigan Association of School Boards. He was formerly the Executive Director for the Kalamazoo Regional Educational Service Agency (KRESA). Prior to his KRESA position, he was a teacher, principal and district administrator for the Saginaw Public Schools. He has also acted as a school board trustee for the Gull Lake Community Schools in Richland, Michigan.

Wenhui Yuan is a research analyst for the Fort Worth Independent School District, Texas. He served as a faculty member in the Department of Education, East China Normal University, before enrolling in the Ph.D. programme in Educational Leadership at Western Michigan University. Currently, his research interests include educational practice and policy in closing the achievement gap, building family, school and community partnerships, and improving teacher quality.
