SIMULATE AND STIMULATE TO UNDERSTAND: LEARNING STATISTICS WITH FATHOM™

by Sharon J. Lane-Getaz

A Capstone submitted in partial fulfillment of the requirements for the degree of Master of Arts in Teaching

Hamline University, Saint Paul, Minnesota
September 2002

Committee: Professor Peg Lonnquist (chair), Robert Rumppe, Donna Paige


TABLE OF CONTENTS

TABLE OF FIGURES
CHAPTER 1
  HISTORICAL BACKGROUND
    Situation: Real-data curriculum using graphing calculators
    Problem Statement: Limited understanding of statistical concepts
    Solution: Add dynamic, graphic statistical software—Fathom
    Question: Did the labs deepen students’ understanding?
  SUMMARY
CHAPTER 2
  THEORETICAL FRAMEWORK: CONSTRUCTIVISM MEETS TECHNOLOGY
    Statistics and constructivism
    Statistics and technology
    Fathom: Dynamic, graphic simulation for deeper understanding
  SUMMARY
CHAPTER 3
  INTEGRATIVE ACTION RESEARCH
    The setting
    The participants
    Action Research Plan
    Limitations
  SUMMARY
CHAPTER 4
  RESULTS OVERVIEW
  QUANTITATIVE RESULTS
    Mean Comparison
    Exploratory Data Analysis
    Tests of Significance
  QUALITATIVE RESULTS
    Reflections prior to the first lab
    Student KWL feedback
    Interpretation of student KWL feedback
    Reflections after the first lab
  SUMMARY
CHAPTER 5
  FINAL OBSERVATIONS
    Positioning within existing research
    Implications of the study
    Generalizability
    Observations as a Fathom user
    What’s next? Modifications to the existing curriculum
  SUMMARY
APPENDIX A. Cognitive Learning and Instructional Design
APPENDIX B. Tabular Demographic Comparison: No Lab versus Lab Year
APPENDIX C. Math Studies Second Semester Syllabus
APPENDIX D. Lab Handouts: Linear Regression, Inferring the Mean
APPENDIX E. Samples of Student Output
APPENDIX F. IB Internal Assessment Criteria Summary
APPENDIX G. Analysis T-Test—Eliminates Outliers
APPENDIX H. Sample of KWL Survey
APPENDIX I. Qualitative Results: KWL Data from 2001 Lab
APPENDIX J. Analysis, Structure, and Evaluation Distribution
APPENDIX K. Human Subjects Research Forms
REFERENCES

TABLE OF FIGURES

Figure A: Cognitive Constructivism meets Technology
Figure B: Sample Fathom Display
Figure C: Participant Demographics—Program
Figure D: Participant Demographics—Ethnicity
Figure E: Participant Demographics—Gender
Figure F: Participant Demographics—Grade Level
Figure G: Second Semester Math Studies Grades—Grade Level
Figure H: Second Semester Math Studies Grades—Gender
Figure I: Cognitive Learning and Fathom Lab Design
Figure J: Higher-Order Skills No Lab vs. Lab Group—Mean Comparison
Figure K: Analysis Scores Distribution
Figure L: Structure and Communication [Synthesis] Scores Distribution
Figure M: Evaluation Scores Distribution
Figure N: Mean Probability t-test on Analysis
Figure O: Mean Probability t-test on Synthesis
Figure P: Mean Probability t-test on Evaluation
Figure Q: What do you Know?—Quantitative
Figure R: What do you Want to know?—Qualitative
Figure S: What have you Learned?—Quantitative
Figure T: What have you Learned?—Qualitative
Figure U: What Students Learned using Fathom—Qualitative


CHAPTER 1 Introduction

Historical Background

Whether running statistical software on a personal computer or using graphing calculators, statistics educators have used technology to make computations easier and to depict statistical ideas graphically. Statistical software is widely recognized as effective scaffolding to teach students statistics (Packard, 1993; Keeler and Steinhorst, 1995; Cabilio and Farrell, 2001; Mills, 2002). However, there is little research that shows whether statistical software is actually effective in teaching statistical concepts. This integrative action research project addresses the question: Did adding Fathom Dynamic Statistics™ software to my introductory statistics course deepen students’ conceptual understanding? My action research begins with a little personal history about this question. In the following discussion I elaborate on these points:

• The situation: Inherited real-data curriculum using static graphing calculators,
• The problem: Found students had limited understanding of statistical concepts,
• The solution: Added dynamic, graphic statistical software (Fathom), and
• The question: Did the labs improve students’ conceptual understanding?

I started teaching statistics two years ago in the International Baccalaureate program at an upper-Midwest, urban high school. I found that while students were able to produce the “right answer” using graphing calculators, they had merely rehearsed the proper sequence of buttons to push but remained unclear on the concepts underlying the computations. Having entered teaching after a career in the computer industry, I was predisposed to think that just the right technology would help students not only reach the right answer but understand why it was the right answer. I had other concerns that just the right technology might also address. My students were and are a heterogeneous group. Both IB and non-IB students who do not intend to pursue mathematics or science-related careers tend to sign up for this Standard Level Mathematical Studies course called Math Studies. Many of these students plan to study the social sciences in college and will need to use statistics in these pursuits. At one end of the spectrum there are seniors taking this course who are merely fulfilling a needed math credit for college admission, and at the other end are juniors (and a few sophomores) who are working diligently toward the prestigious IB diploma. Meeting the needs of this diverse group of learners is part of what motivated me to want to bring some new approaches to this course. Not only might a dynamic statistics lab deepen understanding of statistical concepts but it could also inspire some students whose interest might not otherwise be piqued by the existing curriculum.


Situation: Real-data curriculum using graphing calculators. The curriculum I inherited has many merits. It is real-data based, which makes the topics of greater interest to the students, and it employs graphing calculator technology, which reduces the need for students to struggle with complex mathematical computations. As both a teacher and a learner of statistics, I do value the graphing calculator as a great educational tool. (There were no graphing calculators in the classrooms I left some twenty years ago!) The TI-83 graphing calculator (i.e., Texas Instruments TI-83) is the tool of choice in the IB Math Studies curriculum that I teach. Students are expected to use these calculators in class, for homework, and when they take their comprehensive IB exam at the end of the course. I was fortunate to be able to teach a course that has already integrated technology to a significant degree. I am also fortunate to have inherited a real-data based textbook, David S. Moore’s (1995) Basic Practice of Statistics, to teach this course. In the preface of the sixth printing (1997), Moore states, “The Basic Practice of Statistics is an introduction to statistics for students in two-year and four-year colleges and universities that emphasizes working with data and statistical ideas…” (p. xiii). Moore, as a member of the joint committee of the American Statistical Association and the Mathematical Association of America, helped develop guidelines for teaching introductory statistics. The guiding principles reflect a constructivist approach:

• Emphasize statistical thinking;
• Emphasize more data and concepts, less theory, fewer recipes; and
• Foster active learning.

The emphasis on statistical thinking requires students to explain why they are doing a particular statistical test and to explain how the results could be useful in a real-world setting. The emphasis on data, not theory, frees students from the calculus that underlies some of the statistical computations and asks them to focus on real data. Students are expected to learn concepts inductively through their experiences with data. The underlying philosophy is one of active learning, a constructivist method that engages students in hands-on, problem-solving activities. This textbook was an excellent choice for teaching the statistics content in this IB Math Studies course. The statistics content in IB Math Studies includes most of the fundamental statistics offered in Moore’s text:

• understanding data in terms of examining center, shape, and spread of distributions, correlation, linear regression, conditional and marginal distributions, and designing samples and experiments to produce data;
• understanding inference in terms of sampling distributions, sample proportions and means; and
• introducing the inference concepts of confidence intervals, and the z-test and χ²-test (chi-square test) of significance.
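For reference, the standard large-sample forms behind that last bullet can be written as follows (generic textbook notation of my own choosing, not quoted from Moore’s text or the IB syllabus): a confidence interval for a mean, \bar{x} \pm z^{*}\frac{\sigma}{\sqrt{n}}; the one-sample z statistic, z = \frac{\bar{x}-\mu_{0}}{\sigma/\sqrt{n}}; and the chi-square statistic for observed and expected counts, \chi^{2} = \sum_{i}\frac{(O_{i}-E_{i})^{2}}{E_{i}}.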

Moore illustrates examples in his 1995 textbook using Minitab® software—rated the best software of its kind at the time (Webster, 1992). In my classroom today I illustrate these same examples using our handheld graphing calculator and overhead projection. The text, which was updated with new data in 1996, is a good, but dated, selection for the IB Math studies curriculum because of its focus on stimulating student thinking. The IB Math Studies curriculum allows for some flexibility in the offerings that a particular school may teach. My school, “Upper Midwest High School,” has emphasized statistics. We dedicate one full semester to statistics instruction and during the third quarter, while reviewing a broad range of mathematics topics in class, students spend a considerable amount of time outside of class on a statistics project, the so-called “IB internal assessment,” designed to assess students’ conceptual understanding. The “internal assessment,” a statistics project, is designed by the IB program to evaluate student understanding of the research process and it comprises one third of the International Baccalaureate grade for Math Studies while the remainder of their grade comes from written exams. The statistics project spans five weeks of the third quarter. Students spend 25 hours, primarily outside of class, researching, analyzing, synthesizing, evaluating, and reporting on a statistical topic of their choice. Students pose as researchers, define a task statement, conduct a statistical study or collect data from reputable sources, analyze, and evaluate the data, and then draw appropriate
conclusions. Finally, students write a research paper reflecting their work which is evaluated using an IB rubric. In my first year of teaching this course, students used their graphing calculators to compile their statistical analyses for the internal assessment. Many students transposed the statistical output from their graphing calculators on to paper by hand for inclusion in their final papers. Where possible, a few students used spreadsheet applications to generate histograms, scatter plots, and line graphs, but bell curves had to be hand drawn. So, the statistics-capable graphing calculator technology (the TI-83) left a couple of things to be desired. First of all, students could master the TI-83 to generate the right answers without knowing exactly why they were doing what they were doing. Despite the fact that calculator technology relieved students from computational drudgery, they were not thinking deeply to solve statistical problems. Secondly, the accurate graphics generated on the calculator screens were typically being hand drawn, with less accuracy, into students’ final project papers. Problem Statement: Limited understanding of statistical concepts. Graphing calculator technology alone had not sufficiently facilitated students’ attainment of deeper understanding of statistical ideas in my class. I was gratified to find that I was not alone in my perception that students were missing the concepts behind their statistical analyses. Like-minded educators state that students, and professionals, oftentimes have little understanding of the underlying basic statistical ideas (Cabilio and Farrell, 2001; Garfield, delMas, and Chance, 1997). My background in the
computer industry as a systems engineer, and my calling toward constructivist methods, led me to think that just the right statistical software could not only free students from the computational grind of statistics but also open the door to the conceptual world of ideas and relationships within statistics. Could adding concept-specific personal computer labs foster a deeper understanding of fundamental statistical concepts? I decided to try it…for free!

Solution: Add dynamic, graphic statistical software—Fathom. A Fathom site license was given to our school as part of a deal with Key Curriculum Press in exchange for sponsoring a workshop last summer (2001). The people at Key Curriculum Press are the developers of Fathom. They also developed The Geometer’s Sketchpad, which other teachers in our department and I already use enthusiastically to supplement our inductive geometry courses. Although I did not attend the workshop, the newest teacher of Math Studies, who joined me in teaching the course last fall, was able to attend the summer session on Fathom and gave it good reviews. So, my teaching colleague and I signed up for a couple of days of lab time at the end of each quarter in anticipation of adding a Fathom lab or two. I found Fathom easy to learn and the Data in Depth workbook published by Key Curriculum Press easy to follow. I installed Fathom at home and used the well-written workbook to modify a lab on linear regression that would smooth our students’ transition from the TI-83 environment to this new dynamic interface. I designed the labs so that the first day was primarily a day to “get to know the software.” On Day Two of the lab all the student pairs were able to successfully
complete the lab handout. Based on the success of this first lab, I designed a second computer lab on statistical inference that was added at the end of the next quarter. Again, the prepackaged lab in the workbook was easy to follow and easy to adapt for my class. The ease of the Fathom lab was complemented by the convenience of the new wireless laptop technology which we brought into the classroom for lab day. Many of the students had not yet used the new wireless lab environment but they had prior experience with the network software from the hardwired LAN labs and the PCs available to students in the media center. Almost without realizing it, students were enjoying additional exposure to and practice with statistical concepts. During the linear regression lab one student started singing when I passed his desk with my video camera, “What’s slope got to do, got to do with it?” (He was singing the lab instructions to the tune “What’s love got to do with it?,” made popular by Tina Turner.) So, students were having fun. The larger question remained. Was the lab effective?

Question: Did the labs deepen students’ understanding? As will be shown in Chapter Two, some researchers believe the dynamic nature of Fathom may be key to improving students’ understanding of underlying statistical concepts. The static graphs of the TI-83 are being superseded by dynamic environments in which data are no longer fixed in a table or on a graph but can easily be manipulated to investigate a myriad of “what if?” scenarios.
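To make the “what if?” idea concrete, the sketch below shows, in ordinary Python rather than Fathom, how moving a single data point immediately changes a fitted least-squares line. The data values and variable names are invented for illustration; this is not code from the lab handouts or from Fathom itself.

import numpy as np

def fit_line(x, y):
    """Return the least-squares slope and intercept for the points (x, y)."""
    slope, intercept = np.polyfit(x, y, 1)
    return slope, intercept

# A small, made-up data set (e.g., hours studied vs. quiz score).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([52.0, 58.0, 61.0, 67.0, 70.0])
print("original fit:             slope=%.2f  intercept=%.2f" % fit_line(x, y))

# "What if?" scenario: drag the last point well below the pattern and refit.
# Both the slope and the intercept shift immediately.
y_dragged = y.copy()
y_dragged[-1] = 40.0
print("after dragging one point: slope=%.2f  intercept=%.2f" % fit_line(x, y_dragged))

In a dynamic environment this refitting happens continuously as the point is dragged, so students see the line respond in real time rather than recomputing after the fact.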


The TI-83s are beginning to be introduced in earlier classes (e.g., Algebra II), so future students will come to Math Studies more adept at using the graphing calculator technology. This prior knowledge should enable me to move more quickly through some of the curriculum and add more dynamic labs over time. Eventually, I envision a class that is primarily lab-based. In order to convince my department and school to make a new textbook investment, it would be very helpful to have some quasi-empirical data to back up my ideas. There is precious little empirical data available on this subject, as will be shown in Chapter Two. There are bound to be skeptics who will need some convincing. The perspective that computer simulation labs like Fathom improve students’ understanding of statistical concepts has some cautiously optimistic proponents among respected statistics educators. Avid statistics researchers like delMas, Garfield, and Chance (1999) who authored the action research model I used for my study, cautioned about using simulation software without verifying its effectiveness. (D)espite the accepted approach used to integrate simulation software into a statistics class, there is little published research describing and evaluating such an approach. While the motivation for these programs is to provide improved instructional experiences, most of the authors describe the nature of the program and demonstrate how it can be used in the classroom, but they do not report any evidence that the program improves learning or understanding…. (delMas, Garfield, & Chance, 1999, ¶ 7)


Clearly, more studies need to be conducted that address the effectiveness of integrating statistical simulation software in to the classroom. In the conclusion of her review of the literature on computer simulation methods (CSMs), Mills (2002) speculated that: The lack of empirical support may be related to the fact that it can be very difficult and labor-intensive to conduct empirical or experimental educational research. Randomly assigning students in the same class to different teaching methods is also an ethical concern. Still, there is a growing emphasis on assessment in higher and statistics education (Garfield, 1995), especially using computer technology, and documenting our research through empirical methods will be necessary as we progress into the computer education era. My action research study is one attempt to offer some quantitative, as well as qualitative data, to the discussion. It is important to note that while adding this new dynamic statistical technology, graphing calculators will remain an important technology to teach statistics in the IB Math Studies courses. In spite of the limitations of the TI-83 calculators such as the small screen, which prohibits simultaneous display of data and graphs for all practical purposes, the inability to save separate scenarios for comparisons, and the inability to directly create report-quality output, there are distinct advantages of the TI-83 technology. TI-83s are more affordable (students can afford to buy them) and portable (students can take them home to do homework.) However, in addition to the limitations enumerated above, this graphing calculator
technology apparently did not help students understand the concepts behind the calculations that they could do so efficiently. Fathom’s animated, activity-based labs might bridge this gap. Fathom, like many PC-based statistical software predecessors (e.g., Minitab®, Systat), provides students with the ability to save and produce graphic output for their culminating statistics study papers. Fathom’s unique advantage is its animated software environment, which allows students to create innumerable “what if” scenarios and dynamically see the results. The data and the graphics are linked so that as soon as the user changes the data, the graphs reflect the results dynamically. The labs are designed to facilitate learning the underlying statistical concepts that many students still struggle with, even after mastering the TI-83s. As will be discussed in greater detail in Chapter Two, the dynamic nature of this program, which facilitates testing numerous “what if” scenarios by merely dragging a data point, might prove to help students deepen their understanding of statistical concepts. As an example, consider the concept that “area under the curve” is probability. This concept can be elusive to introductory statistics students. “One of the biggest hurdles in teaching statistics is convincing students that the area under curves has anything to do with all the samples, histograms, and various other indicators to which they have been exposed” (Lee, 1999, p. 670). One of the Fathom labs added to the Math Studies course had students “grabbing, dropping, and sliding” data across the screen while immediately seeing the impact on the area under the curve, the probability of the event occurring this far from the mean. The dynamic, animated “look and feel” of the program might have facilitated a more intuitive understanding of this abstract idea.
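In symbols (standard notation of my own choosing, not taken from the lab handout): if f(x) is the density curve of a variable X with mean \mu, then P(a \le X \le b) = \int_{a}^{b} f(x)\,dx, and the probability of an observation falling at least a distance d from the mean is the two-tail area P(|X-\mu| \ge d) = 1 - \int_{\mu-d}^{\mu+d} f(x)\,dx. The shaded region the lab displays is this kind of tail probability, recomputed as the data are dragged.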
If the computer labs were effective, I might change the statistics textbook being used in Math Studies and move to a more lab-driven class. This is one of my underlying motivations that has not yet surfaced in this paper. I was given lead responsibility over Math Studies last year and think this would be a great modernization of the curriculum. I would choose a textbook that is real-data driven, like Moore’s text, but one that is more up-to-date in terms of its technology focus. I might also choose a text with more multi-cultural and gender-neutral sources of data. Perhaps my textbook selection could be compatible with both a dynamic Fathom approach for teaching concepts and using graphing calculator technology for portability and affordability.

Summary

The dynamic software environment of Fathom Dynamic Statistics™ moves beyond static graphics. Fathom provides students with opportunities to see innumerable examples—to dynamically manipulate statistics parameters without being bogged down in calculations—and might thereby facilitate students’ grasping statistical concepts more deeply. My hypothesis for this quasi-experimental action research project was that Fathom would prove to foster deeper understanding of statistical concepts and that deeper understanding would be reflected in the quality of students’ final statistics projects, the IB internal assessment. After a review of the literature (Chapter Two), I
present the methods I used to evaluate the Fathom labs’ effectiveness (Chapter Three). In Chapter Four, I report and interpret the results of my data analysis, as well as interpret my reflections and observations, and students’ responses to a KWL survey from our first lab day.1 I also report themes that emerge from students’ responses including whether students believed they could use Fathom to create scatter plots and linear regressions after the two day lab was completed and whether the lab was fun. As is stated in Chapter Five, by examining the effectiveness of adding the Fathom lab to our real-data, static technology-based curriculum this action research project may not only be of use to me but also to other secondary and some tertiary statistics educators.

1 The KWL survey (what do you Know, what do you Want to know, what have you Learned) is a metacognitive approach adapted from Vacca and Vacca, 1999.


CHAPTER 2 Literature Review

Theoretical Framework: Constructivism meets Technology

In order to determine whether adding a dynamic computer lab to my introductory statistics course deepened students’ conceptual understanding, I reviewed the literature to shape and inform my action research study. My literature search revealed a twenty-year migration of statistics educators toward constructivist reform, emerging technologies, and finally to using dynamic, graphic simulation activities to deepen students’ understanding of statistical concepts. The concept map in Figure A illustrates the interlocked relationships among these themes.

Figure A: Cognitive Constructivism meets Technology (a concept map linking Statistics Education Reform, Cognitive-Constructivist Methods, Emerging Technologies, and Dynamic, Graphic Simulation Software to Deep Understanding)

My literature review findings are presented within these three interlocked themes:

(1) Statistics educators adopted collaborative, active-learning approaches and found themselves aligned with cognitive constructivism;

(2) Statistics educators explored emerging technologies to foster deeper conceptual understandings; and

(3) Dynamic, graphic statistical software, like Fathom, designed to teach statistical concepts for deeper understanding, is now commercially available.

My literature search on “teaching statistics” in ERIC (Educational Resources
Information Center) yielded 98 resources, eight of which are cited below. Linking technology instruction to achievement of deeper conceptual understanding arose from searches on “cognition and technology” in INFOTRAC (5 of 68 are cited within this paper), “technology instruction and cognition” in ERIC (1 of 49 cited within), and “cognitive processing and technology” in PsycINFO (2 of 65 cited within). Additional sources were discovered using the ancestry method and reputable online resources. The bibliography on the Key Curriculum/Fathom web site pointed me to the online Journal for Statistics Education on the UCLA statistics department web site. From this journal I have drawn from 10 of 130 articles. Invaluable to my research were the descriptive literature reviews of Mills (2002) and Becker (1996). Statistics and constructivism. The very wording of my question, did adding a dynamic computer lab to my introductory statistics course deepen students’ conceptual understanding, implies that I am a cognitive constructivist. I am drawn both to the social-constructivist ideas of Vygotsky—that social interaction is required
for cognitive development—from which active, collaborative learning strategies emerge, as well as the use of tools and teaching aids as encouraged by Bruner, founder of the Center for Cognitive Studies (Lagemann, 2000). More fundamentally I am interested in students’ thinking, the one feature of constructivism upon which proponents of virtually all branches of constructivism agree (Olsen, 1999). In researching this topic I found that I join a large number of statistics educators who are focusing their efforts on improving students’ thinking. Student thinking is of particular concern in the introductory statistics course where the pedagogical focus is not on computational methods but on statistical reasoning. The shift in statistics education toward improving statistical reasoning predates the 1989 NCTM (National Council of Teachers of Mathematics) reform movement when statistics education became a priority in the K-12 arena. Snee (1993) cites the beginning of the move away from a computational approach as early as 1979. Watts, Snee, and Hogg were among the voices who called for a change in methods and focus back in the early 1990s (Bradstreet, 1996). There was general agreement that statistics education, especially introductory statistics education, needed to focus on student thinking about real research issues, using statistically sound processes, to solve problems. To stimulate statistical thinking, a growing chorus of statistics educators encouraged their colleagues to use real data, graphics, situated learning and workshops, i.e., active-learning approaches (Snee, 1993; Garfield, 1993; Moore, 1995; Steinhorst and Keeler, 1995; Bradstreet, 1996; Yilmaz, 1996; Gnanadesikan,
1997). The popularity of activity-based learning can be gleaned from the large number of articles published on the topic. In Becker’s 1996 literature review on “teaching statistics” she discovers that the most prevalent of all teaching materials discussed were activities (30%) and the most prevalent teaching strategies (32%) “involved group work, cooperative instruction, or individualization” (Becker, 1996, p. 78). Articles on active-learning approaches were prevalent in my search as well (Fernandez and Leping, 1999; Dolinksy, 2001; Cabilio and Farrell, 2001; Lovett and Greenhouse, 2000; Mills, 2002). These constructivist approaches to teaching and learning are grounded in cognitive learning theory. Snee (1993), Olsen (1999), and more recently, Lovett and Greenhouse (2000) suggest that cognitive psychologists’ work can help the statistics instructors in their efforts in the classroom. Lovett and Greenhouse claim that instructors can “increase their successes, if they apply the five principles of learning to guide their instructional design.” The five principles of learning are that 1) students learn best when they practice and perform on their own, 2) knowledge tends to be specific to the context in which it is learned, 3) learning is more efficient when students receive real time feedback on errors, 4) learning involves integrating new knowledge with old knowledge, and 5) learning is less efficient when students’ mental load increases. (See Appendix A: Cognitive Learning and Instructional Design.) Lovett and Greenhouse claim that using these principles helps “predict how various new learning activities will impact students’ learning. The essential idea is
that students’ processing (e.g., how students learn, what processes they engage while performing various learning activities) is an important part of what makes instruction effective. Focusing on student learning, therefore, can help us understand how best to improve it” (Lovett and Greenhouse, 2000, pp. 204-205). Since my objective was to improve student learning, I verified that these cognitive principles were present in the instructional design of the Fathom computer lab implementation. (See Chapter Three for a mapping of these principles to the Fathom implementation design.) The computer lab is simply an aid, an active-learning method that creates a vicarious experience in which students construct knowledge. The content and purpose of a computer lab clearly reflects cognitive-constructivist principles. It is an active-learning, collaborative exercise that employs an aid (technology) to improve students’ learning—it fits all the criteria needed to “stimulate statistics learning” (Fernandez and Leping, 1999). “(M)any of the instructional techniques that have blossomed in recent reform movements (e.g. collaborative learning) are consistent with current cognitive psychological research but (not) … developed directly from those results” (Lovett and Greenhouse, 2000, p. 196). In addition to the social-constructivist emphasis on a collaborative-learning approach, the cognitive-constructivist classroom employs as many instructional aids as possible (LeFrancois, 1999). Today these aids might take the form of manipulatives, videos, models, computer simulations, and combinations thereof, to create pseudoexperiences that will foster deeper conceptual understanding.


Of course not all researchers would agree that technology is the proper method to engage students in active learning. Snee shares the cognitivist view that a “key is experiential learning ... experience in working with real data (please, not computer simulations) … and solving real problems and improving processes…” (Snee, 1993, pp. 2-3). However, Snee’s discomfort with computer simulations is not shared by many of his colleagues. I believe that even pseudo-experiences like computer simulations, e.g. a lab, can be an effective constructivist teaching method in the introductory statistics classroom. The answer to my research question will either bolster or counter my belief but I am supported in this belief by Ben-Zvi (2000) who characterized computers as cognitive tools which shift statistical thinking to a higher cognitive level by focusing the statistics activities on “transforming and analyzing representations” (pp. 140-141). In addition, using real data in computer simulations brings the simulated experience a little closer to a real environment. When using real data, the only difference between simulating statistical research and doing statistical research is intention, i.e., whether the researcher is passionately committed to his or her question. Statistics and technology. As personal computing technology became ever more affordable, many statistics educators began experimenting with it as an activelearning tool to teach statistics from a more constructivist perspective. “Increasingly, researchers and educators are linking constructivism and learning with technology … since many see a strong support for the principles of constructivist philosophy in computer-based learning environments” (Ferguson, 2001, p. 5). Khamis’ “1991
survey indicated that 70% of introductory statistics courses used computer software…” (Fernandez and Leping, 1999, p. 176). Cobb (1993) documented a dozen NSF-sponsored projects which were developing resources for statistics education: Nearly all of these projects involve activities for statistical laboratories: for analysis of archival data, or for hands-on production of data, or for simulation-based learning. The prominence of the lab approach accords with the movement of statistics back towards its roots in science, and with research in education that demonstrates the importance of active learning. (¶ 13) Ten of twelve projects “seek to improve the teaching not just of statistical thinking but of statistical methods as well by involving students actively in the practice of statistics as science” (Cobb, 1993, ¶ 17). In addition, ten of these projects were implementing some form of computer laboratory to accomplish their goals. Technology helped facilitate active-learning environments, simulated real problems, and utilized real data, but there was little empirical evidence or general theory of learning to support this method. The popularity of computer use among statistics teachers was well documented in Becker’s 1996 literature search as well as the ironic scarcity of empirical evidence about their effectiveness. In a more recent literature search, Mills looked for “simulation” and “statistics” in articles between 1983 and 2000 and yielded 18 references which related statistics education to computer simulation methods, only one of which was
empirical. Mills’ (2002) own contribution to the research includes a reiteration of definitions of four computer simulation methods that mirror the historic development of personal computer software in general. The first simulation method requires some programming expertise, but these simulators (e.g., a BASIC® program) are very flexible in their function. The second method provides a window-driven, userfriendly computer environment (like Minitab® or Excel®) that allows students to concentrate on statistics but can lose some of the flexibility. The third definition, a combination of the first two, enables students to conduct “what if” scenarios with the user-friendliness of a window-driven environment and the flexibility of a programmer using software like SAS® or SPSS®. The fourth definition, or “the fourth generation,” employs commercial software packages designed exclusively for simulation purposes, like the “Sampling Distribution” program written by delMas (Garfield, delMas, and Chance 1997). This fourth generation computer simulation begins to describe the Fathom environment, which was not yet generally available at the time of Mills’ research. Regardless of the flexibility or user-friendliness of the computer simulation method (CSM), Mills (2002) concludes that they “are being used in all areas of statistics to help students understand difficult concepts. … Second, although many of the authors recommended using CSMs as a teaching tool in the statistics classroom, a few also advocated using some type of simulation exercise prior to the use of CSMs. …The results from the empirical studies reviewed in this paper revealed that CSMs appeared to be effective for lower-ability students” (Mills, 2002). Thus, Mills’ work
implies that a computer simulation lab may deepen students’ conceptual understanding and address the difficulty of teaching to disparate learning abilities. Fathom may actually take computer simulation a step further than Mills’ four definitions or generations. Fathom may be a fifth generation computer simulation, since it has been specifically designed not just for simulation but for teaching. A more accurate description of Fathom’s niche may be described by Yilmaz (1996): Although the use of computer software is common today, typical statistics software is meant for computation and not for teaching. The kind of software (Yilmaz, et al.) envision(ed) would be a more completed teaching tool by which basic technical skill can be learned at each student’s own pace, and significantly outside the classroom. … Capabilities of the data analysis module would be similar to the student editions of statistics software such as Minitab®, SPSS, Systat, and StataQuest. (¶ 18–21) The software Yilmaz envisioned would have three main modules: specific learning topics, test administration, and data analysis with a “consistent graphical interface” as a primary requirement. He asserts that the key to more effective teaching of introductory statistics is to “develop…interactive computer software specifically designed for teaching statistics” (Yilmaz, 1996, ¶ 3). Fathom developers may have answered Yilmaz’s call. In addition to calling for teaching software designed for statistics, Yilmaz specified three competencies introductory statistics students should achieve to be prepared to use statistics in the real-world.

(1) Ability to link statistics and real-world situations,

(2) Knowledge of basic statistical concepts, and

(3) Ability to synthesize the components of a statistical study and to communicate the results in a clear manner. (Yilmaz, 1996, ¶ 5)

The real-data focus of the Math Studies course I teach already helped students link statistics to the real-world (Yilmaz’s first competency.) The optimal software to augment my course would pull in data from real-world databases, including the Internet. It would also improve knowledge of basic statistical concepts, (Yilmaz second competency) which were still unclear to many of my students after mastering many of the TI-83 statistical functions, and would improve students’ ability to synthesize their knowledge and communicate it well, reflecting deeper conceptual understanding, (Yilmaz’s third competency). Yilmaz, like Laviolette (1994), claims that interactive teaching software is the main vehicle for developing the second competency, knowledge of basic concepts. I think the optimal statistical software for education would address all three competencies identified by Yilmaz. Thus, Mills (2002) calls for computer simulation to teach statistics concepts, and Yilmaz (1996) calls for customized interactive, graphic teaching software. Even earlier, Packard, Holmes, and Fortune (1993) recognized that using a more animated, as opposed to static, graphics environment for computer simulation increased the effectiveness of teaching statistical concepts. A highly interactive, animated environment facilitates conducting “what if” scenarios. Packard’s Virginia Techbased study included several hypotheses. The most important hypothesis, from my
perspective, was this: Packard assumed that the nature or quality of the computer interaction that is, animated versus static would have an impact on how well his participants would learn the statistical concepts. Packard specifically selected participants who had little or no previous statistical experience. His study differentiated three computer simulation interfaces: (1) text, graphics, plus static interaction; (2) text, graphics, plus animated interaction; (italics my emphasis) and (3) text, graphics, plus passive video. (Packard, et al., 1993, p. 5) Packard concluded that the participants were less enthusiastic about static interaction with text and graphics, while text and graphics with animation was the most appealing. Thus, the works of Mills (2002), Yilmaz (1996), and Packard, et al. (1993) lead me to conclude that in order to deepen students’ conceptual understanding my computer lab needs to be based on commercially-available, interactive, graphic teaching software that allows flexible computer simulation and data analysis in a highly animated, user-friendly interface. Before I go on to look specifically at Fathom in light of these requirements, it is of interest to mention that during the development of statistical software to foster deeper conceptual understanding, some middle school educators were looking more generally at the impact of a technologyenriched curriculum on their students’ higher-order thinking. In a study very similar to the one I conducted, Hopson, Simms, and Knezek (2001) investigated the effect of a technology-enriched classroom on fifth- and sixth-
grade students’ development of higher-order thinking skills. The treatment group received a technology-enriched environment which included training, and access to word processing, spreadsheets, the internet, and various electronic online resources (like a thesaurus and dictionary). Students in the treatment group were taught the same curriculum as the comparison group. There was an increase in evaluation scores for the technology-enriched classroom versus the comparison group, significant at the p
