Assessing Inquiry Learning in a Circuits/Electronics Course

John C. Getty
Instructor and Lab Director, Montana Tech Petroleum Engineering, Butte, MT 59701
[email protected]
Abstract – Physics education research and the development of “inquiry” teaching methods over the last 20 years have, by all measures, significantly reduced misconceptions and improved student understanding in physics. But adoption of these methods in upper division science and engineering courses has been slow. This paper describes the modification of an electronics/circuits course designed for physics majors to determine whether the benefits of the inquiry method can be extended to upper level circuits/electronics courses. Formative and summative assessments, and methods for evaluating student misconceptions and understanding, including the application of a standardized electric circuit concept inventory instrument, are described. The results suggest that inquiry teaching methods implemented in upper level university courses contribute significantly to extinguishing common misconceptions and improving long-term understanding of the material. With so much to be gained, it is time for the engineering education community to embrace these concepts and implement the fruits of physics education research for the benefit of our students and the world community.

Index Terms – Assessing inquiry, batteries and bulbs, circuits, electronics

INTRODUCTION

Most undergraduate physics programs include an upper level course intended to introduce their students to electronic methods for data acquisition in the laboratory. The fall course in the full-year Laboratory Electronics sequence at Montana State University (MSU) is an introduction to DC and AC electric circuits, controlled sources (such as operational amplifiers), and time-varying signals and their interaction with the reactive elements, capacitors and inductors. This course was modeled after the classic “Circuits I” found in most Electrical Engineering programs. While the physics majors targeted by these courses tend to be scientifically sophisticated, they share many characteristics with students from across the academic spectrum. Even with their strong background in the physical sciences, a substantial number come to this class with misconceptions about voltage and current.
Substantial experience teaching this sequence has shown that this course and its students are susceptible to many of the pitfalls of conventional instruction described in the literature. A 1983 paper by Cohen, Eylon and Ganiel [1] describes students “mechanistically” applying element constraints (such as Ohm’s law) and circuit constraints (Kirchhoff’s laws) to arrive at the operating point for a circuit. With each additional offering, anecdotal evidence accumulated indicating the existence and tenacious persistence of fundamental misconceptions among students in the Laboratory Electronics sequence, despite the technical sophistication of the audience. This paper describes an action research project [2] in which the curriculum for Laboratory Electronics was modified to include tenets gleaned from the physics-by-inquiry methodology, based primarily on material designed by Lillian McDermott, Peter Shaffer [3] and their Physics Education Research group at the University of Washington. The goal was to determine whether these pedagogical methods improve student understanding and retention of this material.

BACKGROUND
Year after year, at a point in the semester when the students had become quite skilled at solving simple DC circuit problems, they would often reveal a mental model of voltage and current that contained one or more serious flaws. It also became apparent that a student’s level of academic performance was not a good predictor of these misconceptions. It became clear that students learn to apply Ohm’s law and Kirchhoff’s laws with machine-like efficiency, yet can simultaneously retain a fundamental misunderstanding of the science. An April 2007 survey of Laboratory Electronics students confirmed these observations. Only 50% of the students were able to correctly rank the brightness of the bulbs in the simple DC circuit shown in Figure 1. In describing their experience with this problem, Shaffer et al. wrote [4]: “This (batteries-and-bulbs, B&B) question has been administered to more than 500 university students and has proved fruitful for eliciting some common misconceptions. Almost every possible bulb order has appeared. When the question has been asked on course examinations, approximately 10% of the students in algebra-based courses and 15% of the students in calculus-based courses are able to correctly rank the bulbs.
Whether the question is administered before or after instruction does not seem to affect the outcome. We have found the same success rate among university graduates. A recent administration of this question to more than 100 science and science education faculty yielded similar results.”

FIGURE 1 SHAFFER’S B&B PROBLEM

The first segment of a three-part video documentary, “Minds of Our Own” [5], describes a project that presents a batteries-and-bulbs (B&B) problem to Harvard students on the day of their graduation. Most of the students had difficulty creating a circuit that would light the bulb. In an indictment of our educational system, Prof. Philip Sadler of Harvard comments, “it goes to the fundamental understanding of electricity. If one cannot light a light bulb with a battery and wire then everything built on those ideas has problems.” It is precisely this most fundamental understanding of electricity that students in Laboratory Electronics lacked, even though most could apply Kirchhoff’s laws to virtually any circuit.

In a 1994 paper, “Implications of cognitive studies for teaching physics,” Edward Redish [6] wrote that “it is reasonably easy to learn something that matches or extends an existing mental model.” He observed, nevertheless, that “it is very difficult to change an established mental model substantially” unless the student is provided “a clear and compelling contradiction.” It is this process of confronting a student’s stubborn and persistent misconceptions with contradictory evidence, often via Socratic dialogue, that provides the foundation of the inquiry method.

By 1998 researchers began reporting that inquiry methods have a long-term impact on students’ world view of physics. In the article “Do they stay fixed?”, Francis et al. [7] describe a longitudinal study that examines student understanding of mechanics. In the study, students who had completed an inquiry “reformed” physics course were asked to take the Force Concept Inventory (FCI) four years post-instruction. The conclusion was that the inquiry method of instruction used in this course contributed to a long-term shift in their “students’ world view from ‘Aristotelian’ to ‘Newtonian.’” These researchers have provided a proven method of maximizing the educational value to our students. We must allow each student to discover compelling evidence that leads them to form a workable model. But our role as guide can be challenging, since we often must first identify their misconceptions about the material before we can point them toward experiments that directly contradict their mistaken beliefs about the topic.
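For reference, the correct ranking in the classic version of this problem follows from modeling each bulb as a fixed resistance and comparing dissipated power. The sketch below assumes the arrangement used in Shaffer and McDermott’s study [4] (bulb A alone across an ideal battery, B and C in series across an identical battery, D and E in parallel across a third); if Figure 1 differs, only the topology lines change.

```python
# Brightness ranking for the classic five-bulb B&B problem, assuming
# ideal identical batteries and bulbs modeled as fixed resistance R.
# Brightness is taken to track dissipated power.
V, R = 1.5, 1.0  # arbitrary units; only the ratios matter

power = {
    "A": V**2 / R,               # A sees the full battery voltage
    "B": (V / (2 * R))**2 * R,   # B and C in series: half the voltage each
    "C": (V / (2 * R))**2 * R,
    "D": V**2 / R,               # D and E in parallel: full voltage each
    "E": V**2 / R,
}

for bulb, p in sorted(power.items(), key=lambda kv: -kv[1]):
    print(f"{bulb}: {p:.3f}")
# Ranking: A = D = E > B = C. Real bulbs have temperature-dependent
# resistance, but the qualitative ordering is unchanged.
```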
TREATMENT AND METHODS

With substantial evidence that the misconceptions observed in Laboratory Electronics could be extinguished, attention turned to finding ways to measure the expected improvement. In a 2004 paper, Engelhardt and Beichner [8] describe the development of a knowledge inventory instrument, the Determining and Interpreting Resistive Electric circuits Concepts Test (DIRECT), designed to evaluate student understanding and misconceptions about electric circuits. Their instrument was used to study several groups of university-level physics students, in which two of the courses, one algebra-based and another a calculus-based general physics course, used traditional curricula. A third group of students in an inquiry-based course outperformed the other two groups on the authors’ inventory.

The changes in pedagogy used for this project were based primarily on material from the two-volume Physics by Inquiry workbook/text [3]. These methods are perhaps best described as a guide to help students discover physics concepts on their own, rather than “teaching by telling.” This latter technique of formal lecture remains quite common in the post-secondary educational system. However, research by McDermott and others shows that the Physics by Inquiry (PBI) technique results in improved depth of understanding, reduced misconceptions and longer-term recall of the principles. Their technique is referred to in this document generally as “inquiry.”

The treatment plan was straightforward; the first three weeks of well-worn, well-tested “cookbook” laboratory exercises and lectures about Ohm’s and Kirchhoff’s laws were replaced with new exercises based on McDermott’s Physics by Inquiry curriculum. The class time previously allocated for lectures on basic circuit analysis was instead devoted to conducting course business and offering pre-tests to help gauge the effectiveness of the modified curriculum. Learning was left entirely to the three-hour laboratory sessions. By the fourth week of the course the presentations returned to the traditional lecture format to introduce the critical concepts of element and circuit constraints via Kirchhoff’s laws, while connecting these topics to the models the students had developed during the first three weeks. From this point on, the course schedule was very similar to that of previous offerings. One ongoing change was a cultural shift away from heavy reliance on formal circuit analysis techniques. In addition, when new concepts were introduced, it was done using the terminology developed during the inquiry portion of the course. The goal was to continue to provide the students with the needed understanding of formal circuit analysis, including techniques such as voltage and current division, while supporting these students’ efforts to develop and expand a robust and accurate mental model of voltage and current.
The hope was that improved fundamental understanding might make up for reduced practice time with formal analysis methods.

Several data collection methods were used to triangulate toward an assessment of the value of the treatment. These methods included the DIRECT concept inventory test, a practical exam that was used as cover to conduct individual assessments, and both formative and summative written assessments. Perhaps the most unequivocal data in this study were provided by the DIRECT instrument. This instrument also offers the benefit of a fairly large number (n > 1000) of samples acquired by its designers, to which the students in the treated and untreated Laboratory Electronics courses could be compared.

Another useful tool was the combination of the two circuits in Figures 2 and 3. The first, a B&B problem designed by Dr. Greg Francis at MSU, was presented as an in-class assessment. Because a fundamental understanding of both voltage and current is required to arrive at a correct answer, the problem has become known as the Fabulous Francis 8 Bulb Problem (FF8BP).

FIGURE 2 FABULOUS FRANCIS 8 BULB PROBLEM

The Quantitative 8 Resistor Problem (Qnt8RP) shown in Figure 3 is the electrical twin of the FF8BP. The results reconfirmed previous observations that the students in the untreated offering of Laboratory Electronics approached the Qnt8RP mechanistically.

FIGURE 3 QUANTITATIVE 8 RESISTOR PROBLEM
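The “mechanistic” solution path is easy to illustrate. The Qnt8RP schematic is not reproduced in this paper, so the sketch below applies the same procedure, element constraints (Ohm’s law) plus circuit constraints (Kirchhoff’s current law at each node), to a small hypothetical resistor ladder with invented values.

```python
# Nodal analysis of a hypothetical 4-resistor ladder: a 12 V source
# feeds node 1 through R1; R2 ties node 1 to ground; R3 connects
# node 1 to node 2; R4 ties node 2 to ground. (Illustrative only;
# not the actual Qnt8RP network.)
import numpy as np

Vs = 12.0                                      # source voltage (V)
R1, R2, R3, R4 = 100.0, 220.0, 330.0, 470.0    # resistances (ohms)

# KCL at each non-reference node, arranged as G @ v = i:
#   node 1: (v1 - Vs)/R1 + v1/R2 + (v1 - v2)/R3 = 0
#   node 2: (v2 - v1)/R3 + v2/R4               = 0
G = np.array([
    [1/R1 + 1/R2 + 1/R3, -1/R3],
    [-1/R3,              1/R3 + 1/R4],
])
i = np.array([Vs / R1, 0.0])

v1, v2 = np.linalg.solve(G, i)
print(f"v1 = {v1:.3f} V, v2 = {v2:.3f} V")
print(f"I(R3) = {(v1 - v2) / R3 * 1e3:.3f} mA")  # Ohm's law on R3
```

A student can grind through exactly this arithmetic and arrive at correct numbers while still holding a flawed mental model of what voltage and current are.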
Two additional assessments were used in an effort to elicit misconceptions. The first was a question posed during one of the lab periods about a faux “Expensivo Sensor Company” product that was specified as requiring 12V at 50mA. The students were told that the only voltage supply available to operate the sensor was rated at 12V and 500mA. The goal was to determine whether they had misconceptions about the voltage and current behavior of a voltage source and a load.

Near the end of the course a more formal individual performance assessment of student work was offered, reusing the circuit shown in Figure 1. In the format of a “practical exam,” students were first asked to predict the relative brightness of the bulbs in the five-bulb problem in Figure 1. The students were then asked to confirm their ranking by constructing the circuits and making appropriate measurements. If they discovered that their initial prediction was wrong, they were asked to “explain where your logic broke down.” This opportunity for one-on-one interaction with the students was also used to informally interview them about their understanding of these circuits.

DATA ANALYSIS

The baseline group for this study was students enrolled in the Fall 2006 offering of Laboratory Electronics I. The treatment group consisted of students enrolled in the Fall 2007 offering of the class. Table 1 below provides a comparison of these two groups. A t-test on the average GPA of these two populations indicates no statistically significant difference between them.

TABLE 1
COMPARISON OF TREATED AND UNTREATED POPULATIONS

                                            Fall 2006    Fall 2007
Group description                           (baseline)   (treatment group)
Number of students enrolled                 19           20
Number of non-physics majors                4            3
Class GPA                                   3.39         3.22
Standard deviation in GPA                   0.45         0.68
Non-traditional students (mature learners)  1            3
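Because Table 1 reports the means, standard deviations and sizes of both groups, the GPA comparison can be reproduced from the summary statistics alone. A minimal sketch using SciPy (the paper does not state whether equal variances were assumed; Welch’s variant is shown):

```python
# Two-sample t-test on class GPA from the Table 1 summary statistics.
from scipy.stats import ttest_ind_from_stats

t_stat, p_value = ttest_ind_from_stats(
    mean1=3.39, std1=0.45, nobs1=19,  # Fall 2006 (baseline)
    mean2=3.22, std2=0.68, nobs2=20,  # Fall 2007 (treatment)
    equal_var=False,                  # Welch's t-test
)
print(f"t = {t_stat:.2f}, p = {p_value:.2f}")
# A large p-value is consistent with the paper's conclusion that the
# two groups do not differ significantly in GPA.
```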
DIRECT Instrument

The DIRECT inventory asks 29 questions, each grouped into one of five topics or “objectives”:

I. Physical aspects of circuits (schematics)
II. Equivalent circuits
III. Energy and power
IV. Current and conservation of current (KCL)
V. Voltage and Ohm’s law (KVL)

The results are shown in Figure 4 below. The first five bars, left-to-right, are grouped by objective. The last set of bars on the right (green section) represents each cohort’s average score on the inventory instrument. The four individual rows of bars represent four distinct student groups that were evaluated using the DIRECT instrument. The first row (light yellow) represents Engelhardt’s original data, obtained from high school and college students. The second row (blue bars) represents the Laboratory Electronics pre-test performance measured in the Fall of 2007 (treatment group). The third row (violet) reports the results for the untreated baseline group. The last row represents the performance of the treatment group, measured at about the same time post-instruction as the baseline group.
The pre-test data from the Fall of 2007 show that students entering the Laboratory Electronics course are virtually indistinguishable from the population studied by Engelhardt. These students start the course with the same misconceptions and similar abilities. As would be expected, post-instruction evaluation of this group of students showed growth on the DIRECT instrument.
FIGURE 4 DIRECT INSTRUMENT RESULTS
The class averages on the DIRECT instrument for the treated group show improvement over the untreated population, and the t-test results come tantalizingly close to the critical value needed to claim a statistically significant improvement in student performance. However, the small sample size and the standard deviation in the DIRECT scores make any such claim impossible.

In Figure 5 the post-instruction DIRECT results for both the baseline and treatment groups are plotted as a function of each student’s grade point average (GPA). This graph highlights the generally better performance of the treatment group (lighter lavender squares) on the DIRECT, especially among those with higher GPAs. Except for the two outliers, it appears that a direct relationship might exist between GPA and performance on the DIRECT instrument. This result is not unexpected, but it does offer some evidence that the DIRECT instrument is in fact reliable.

FIGURE 5 INDIVIDUAL PERFORMANCE ON DIRECT INSTRUMENT VS. GPA
The untreated baseline group (dark diamonds) does not appear to show this same relationship between GPA and performance on the DIRECT exam. This group also shows a wider scatter from high to low scores on the inventory. Perhaps most interesting is that, on average, the better students in the untreated group (as measured by GPA) actually scored worse on the DIRECT instrument. The average DIRECT score of those with a GPA below 3.5 (n=9) was 66%, while those with a GPA above 3.5 (n=6) averaged only 55%. This result raises the question of whether students with a better GPA might simply be better at “mechanistically” finding solutions to quantitative problems, and whether those who, on average, struggled more with their academic program tended to have developed a more fundamental understanding of electric circuits. Unfortunately, the data collected during this research is insufficient to confidently address this question.

Other analysis methods using the DIRECT data indicate that the improvement is a result of reduced misconceptions about electric circuits. The graph in Figure 6 plots, for each student enrolled in the treated class, the post-treatment score as a function of the score earned on the pre-test. For a point to show gain it must lie above the dashed line; points toward the upper left represent the greatest gain in knowledge as measured with the DIRECT instrument.

FIGURE 6 INDIVIDUAL POST-TREATMENT VS. PRE-TREATMENT ON DIRECT INSTRUMENT

Two outliers exist that would apparently indicate that these two students lost knowledge about electric circuits. In one case, the student who scored the highest on the pre-test (26/29) missed one additional question on the post-test, scoring 25 out of 29. This student apparently started the course with significant knowledge about electricity. The second student’s loss is more significant and more difficult to explain. With a GPA above 3.5, this student’s performance on summative assessments does not appear to be an issue, and none of the other methods offer insight into what might have caused this student’s poor performance on the post-test.
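The construction behind Figure 6 is worth making explicit: each student is plotted as a (pre-test, post-test) point, and the dashed identity line separates gain from loss. A minimal sketch of such a plot, with invented placeholder scores rather than the study’s data:

```python
# Pre/post gain plot in the style of Figure 6 (placeholder data).
import matplotlib.pyplot as plt

pre  = [12, 15, 18, 20, 22, 26]   # pre-test scores (out of 29)
post = [20, 22, 21, 25, 27, 25]   # matching post-test scores

fig, ax = plt.subplots()
ax.scatter(pre, post)
ax.plot([0, 29], [0, 29], linestyle="--")  # identity line: post == pre
ax.set_xlabel("pre-test score (of 29)")
ax.set_ylabel("post-test score (of 29)")
ax.set_title("Points above the dashed line show gain")
plt.show()
```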
Practical Exam

The B&B problem used for the practical exam (Figure 1) proved to be useful both for the students and in the effort to understand their misconceptions. This test was taken from Shaffer’s large study [4], which found that only 15% of calculus-based physics students were able to correctly rank the brightness of the bulbs. In the thirteenth week of the Laboratory Electronics I course, 16 of the 20 students (80%) ranked the brightness of the bulbs correctly. As measured by the DIRECT instrument, these cohorts began the course with a typical level of understanding of electric circuits. Thus, the results of the practical exam show significant growth.

However, the real power of the practical exam came from the opportunity to evaluate a number of characteristics of student performance. A selection of these observations includes:

• Seven of the twenty papers (35%) contain clear evidence that the student substituted a resistance for each bulb and then used classical numerical methods to arrive at a solution.

• The most common misconceptions (3 of 20 students, 15% of the class) revealed during the practical exam involved the role of current in electric circuits. These errors map well to those described by Shaffer (1992) [4], including the insidious misconception that “current is used up.”

• Among students who did not correctly predict the bulb brightness, the hands-on portion of the exam universally guided them to independently understand the error in their original thinking. This was accomplished by requiring the student to actually build the circuit and compare the results to their prediction.

One student who incorrectly predicted that the two bulbs in parallel would be the brightest wrote a revealing explanation for the error: “I think I was wrong about D & E being brighter than A because I forgot that the current would only increase from the source and then split evenly to each bulb making them equal in brightness to each other and A.” At least one other student described this same kind of misconception, in which their thinking appears to be that the source current determines the current in all paths of the network. This is, of course, true only in a series circuit.

Two of the three students who demonstrated misconceptions about current on the practical exam also scored poorly on the follow-up offering of the DIRECT test. Comparing these two assessment methods shows that misconceptions can be identified using various techniques.

FF8BP vs. Qnt8RP as Formative and Summative Assessments
As discussed above, the FF8BP was designed so that the solution requires students to have a working model of both voltage and current. Its quantitative electrical twin (Qnt8RP) uses resistors with specified values rather than bulbs. The circuit solution to the Qnt8RP can be achieved with the appropriate application of Kirchhoff’s laws and thus can be accomplished “mechanistically” even by students with significant misconceptions about electricity. These two problems were offered to the students in both summative and formative assessments.
The results for these assessments (Table 2) support the prediction that physics students are better able to arrive at a result if a circuit problem is quantifiable.

TABLE 2
QUALITATIVE VERSUS QUANTITATIVE TEST QUESTIONS

Assessment structure     % correct (n=20)
Qualitative (FF8BP)      73%
Quantitative (Qnt8RP)    85%
But the non-numerical observations from these assessments are also very useful. Only one of 15 students described using both the current and voltage models, which is required to evaluate the FF8BP. A second student correctly noted that a current model alone was insufficient to solve the problem. With such low use of the Physics by Inquiry models, how were students able to achieve success on the FF8BP? It appears from the student work that once their current model had failed, most made a gut-level stab at the answer, and in most cases that guess was correct. This high success rate at divining the answer might be explained by an inherent understanding of voltage behavior in a circuit, without the student actually being able to explain that understanding. One student wrote “(it is) very intuitive that (bulbs) E&F have high resistance and lower current.” This comment reveals both that the student has developed a useful informal model and that his thinking involves substituting a resistance value for each bulb.

Expensivo Sensor Company

At week 8 in the course the students were asked to address the following scenario: they had purchased an expensive sensor rated at 12V and 50mA, and the only power supply available was labeled 12V and 500mA. The goal was to determine how many students held the misconception that a voltage source also has a fixed current. Of the 11 useful responses, six initially stated (incorrectly) that the sensor would be damaged by using this source. One student wrote “pushing too much current through the... sensor will damage it.” This comment reveals an apparently common misconception that current is determined by the source, regardless of the load. Two students self-corrected their original answer, including the student quoted above, who wrote in a second paragraph, “maybe the sensor will only pull 50mA... because it is a volt source and current varies depending on the load.”

This assessment reveals that using inquiry methods in the classroom does not eliminate misconceptions. Lillian McDermott wrote (1993) that “meaningful learning... connotes the ability to interpret and use knowledge in situations different from those in which it was initially acquired.” The argument can be made that this assessment shows that “meaningful learning” indeed occurred: in the end, 64% of the students got the right answer on a test of misconceptions, using questions that they had not previously encountered.
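The physics behind the correct answer is a one-line application of Ohm’s law: the load, not the supply’s current rating, sets the current drawn. A minimal sketch, modeling the sensor as a simple resistive load (an illustrative assumption):

```python
# Expensivo Sensor scenario: a supply's current rating is a maximum
# it can deliver, not a current it forces through the load.
V_RATED, I_RATED = 12.0, 0.050   # sensor rating: 12 V at 50 mA
I_SUPPLY_MAX = 0.500             # supply rating: 12 V, up to 500 mA

R_sensor = V_RATED / I_RATED     # effective load resistance: 240 ohms
I_drawn = V_RATED / R_sensor     # current the load actually pulls: 50 mA

assert I_drawn <= I_SUPPLY_MAX   # the supply has headroom to spare
print(f"sensor draws {I_drawn * 1e3:.0f} mA "
      f"of the {I_SUPPLY_MAX * 1e3:.0f} mA available")
```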
CONCLUSIONS

These results suggest that incorporating inquiry methods into the Laboratory Electronics course improved the ability of physics students to qualitatively evaluate electric circuits, even while they continue to depend on their strength in finding mathematical solutions to confirm their results. Interpretation of the results of the several data collection methods leads to the following additional conclusions.

• Students develop the skills needed to solve qualitative circuits problems when they are exposed to inquiry techniques in the classroom and laboratory. Results from the DIRECT instrument and the practical exam show an improvement among the treated class in nearly all categories evaluated.

• Data sufficient to quantify the effect of inquiry techniques on students’ ability to solve quantitative problems was not obtained. However, personal observations indicate that even short exposures to inquiry methods encourage students to develop models that are useful in solving qualitative circuits problems.

• Physics students will often restate a qualitative circuit problem in terms of quantifiable variables to produce a solution.

• The laboratory is a powerful tool in the quest to help students develop a robust mental model. Evidence for this is provided by the fact that all of the students who initially erred in ranking the brightness of bulbs in Shaffer’s circuit (Figure 1) independently discovered their error.

The analysis of the data collected for this project leads to the conclusion that introducing inquiry methods into the Laboratory Electronics course provided significant benefits. Student performance on the DIRECT evaluation showed an improvement in understanding of electric circuits when compared to an untreated group. The work of students in the treated group, on both summative and formative assessments, contained fewer obvious misconceptions about circuits. Even so, the treated group had a lingering tendency to rely on quantitative methods for solving these qualitative problems. This provides evidence that the approach typically taught in physics classes, that is, computing a numerical result, remains for students the default method of solving problems.

The data also leads to the conclusion that student success on B&B problems and improvements in DIRECT evaluation scores seem to be directly related to the use of inquiry methods in the classroom. The data was insufficient to determine whether the approach affected students’ ability to find solutions to quantitative problems. Higher rates of success on the quantitative twin (Qnt8RP) of an advanced batteries-and-bulbs problem (FF8BP) indicate that even after exposure to inquiry methods, students are better prepared to solve quantitative than qualitative problems.

It was shown that designing questions to expose specific misconceptions is possible.
The “Expensivo Sensor” problem demonstrates one such construct. However, it can be argued that the number of possible misconceptions present in a classroom is directly related to the number of students. Because each student brings a unique and interacting set of misconceptions, precisely identifying specific errors in a mental model can become a hit-or-miss process that is fraught with unreliability. Nevertheless, designing a question to induce a “clear and compelling contradiction” does not require a deep understanding of the precise misconception.

The practical exam utilized a question originally designed to test whether students had a working mental model of electricity. When used in the context of a performance-based assessment, the results of actually constructing the circuit provided a powerful learning opportunity. In this study, 100% of the students who made an initial error in predicting the brightness of the bulbs were able to self-correct their flawed logic.

The efforts to measure long-term effects of the inquiry method were encouraging, but did not pass a statistical test for significance. These results, taken from the DIRECT instrument, showed some improvement in most areas when comparing treated and untreated populations.

While this project was conducted in a physics classroom, there is every reason to believe it would be just as effective in an engineering course. And determining the need for such an intervention is as simple as asking students to rank the brightness of five light bulbs.

REFERENCES
[1] Cohen, R., Eylon, B., & Ganiel, U., “Potential difference and current in simple electric circuits: A study of students’ concepts,” Am. J. Phys., 51, 1983, pp. 407-412.
[2] Mills, G. E., Action Research: A Guide for the Teacher Researcher, Pearson, ISBN 013172276X, 2007.
[3] McDermott, L. C., & the University of Washington Physics Education Group, Physics by Inquiry: An Introduction to Physics and the Physical Sciences, Vol. 2, John Wiley and Sons, 1996.
[4] Shaffer, P. S., & McDermott, L. C., “Research as a guide for curriculum development: An example from introductory electricity. Part I: Investigation of student understanding,” Am. J. Phys., 60, 1992, pp. 994-1003.
[5] Harvard-Smithsonian Center for Astrophysics, “Minds of Our Own,” Annenberg Media Learner.org, 1997. Retrieved May 15, 2008, from http://www.learner.org/onesheet/series26.html
[6] Redish, E. F., “Implications of cognitive studies for teaching physics,” Am. J. Phys., 62, 1994, pp. 796-803.
[7] Francis, G. E., Adams, J. P., & Noonan, E. J., “Do they stay fixed?” The Physics Teacher, 36, Nov. 1998, pp. 488-490.
[8] Engelhardt, P. V., & Beichner, R. J., “Students’ understanding of direct current resistive electrical circuits,” Am. J. Phys., 72, 2004, pp. 98-115.