Identifying Engineering Students’ Design Practices Using Process Data Camilo Vieira Purdue University, West Lafayette, USA
[email protected]
Alejandra J. Magana Purdue University, West Lafayette, USA
[email protected]
Senay Purzer Purdue University, West Lafayette, USA
[email protected]
Abstract: Assessing the development and quality of students’ engineering design competencies is a challenging task. The open-ended nature of the design process, with multiple paths and possible solutions for a given problem, requires creative approaches to assessment. Technology-based assessment techniques are a promising way to achieve this goal. This paper uses clickstream data from an educational CAD tool to identify which elements of student design processes change after instruction. Specifically, we focus on two design strategies: (1) generating ideas and (2) conducting experiments. Colombian engineering students from a Caribbean public university worked on a design challenge to create an energy-efficient building using an educational CAD tool. This tool logs all student interactions with the software, which the researchers processed to identify student design strategies. The results from this study suggest that the process data from an educational CAD tool can be analyzed both visually and with common statistical techniques to identify changes in student design strategies.
Introduction Engineering design is an important process for solving real-world complex problems that have neither a unique solution nor a unique path to a solution. Such complexity and its open-ended nature make the engineering design process difficult to measure. Qualitative approaches that have been used to assess this process include: (1) student descriptions of how they used the design process (Hirsch, Berliner-Heyman, Carpinelli, & Kimmel, 2012); (2) verbal protocol analysis (Atman, Chimka, Bursic, & Nachtman, 1999); and (3) student critiques of a given design process (Bailey, 2008). However, these approaches are often time consuming and, therefore, not scalable. Technology-based assessments have recently emerged as an opportunity to assess these complex processes. Students’ process data can be captured from their interactions with computational tools while working on a design task, and then analyzed using pattern recognition and statistical analysis tools. For instance, Worsley and Blikstein (2014) used screenshots captured while students worked on a design task to understand their process. Xie and colleagues (2014) studied the sensitivity of CAD logs to instructional interventions related to the science behind the design challenge. Goldstein and colleagues (2015) explored screencast reproductions and students’ interactions with a CAD tool to describe their ideation strategies. Vieira, Goldstein, Purzer, and Magana (2016) used students’ interactions with an educational CAD tool to characterize student experimentation strategies in design,
and Seah and colleagues (2016) expanded this work by exploring the effect of providing an instructional intervention on students’ experimentation strategies in design. This study explores the sensitivity of CAD logs to characterize student design process before and after instruction related to effective design strategies.
Context The study took place at a Colombian institution through a collaboration among researchers from Colombia and the United States. In the United States, engineering design has acquired great relevance at both the high school and the college level. The design process is introduced both as a learning outcome and as a pedagogical strategy to support student learning. Conversely, Colombian institutions and Colombian educational researchers have not formally adopted this approach to science and engineering education (Vieira, Aguas, Goldstein, Purzer, & Magana, 2016). To begin addressing this gap, engineering educators from the United States conducted a workshop in Colombia during the summer of 2015. The workshop was offered to engineering students and focused on completing a design challenge related to energy efficiency using an educational CAD tool called Energy 3D. Students’ interactions with the CAD tool were captured in log files, which were analyzed to identify students’ design strategies.
Research question The research question for this study is: How do engineering students’ design strategies change while solving a design challenge, as captured by process data from CAD logs?
Theoretical Framework The theoretical framework that guides this study is the informed design teaching and learning matrix (Crismond & Adams, 2012). This matrix describes nine design strategies that beginning and informed designers approach differently: Understand the Challenge; Build Knowledge; Generate Ideas; Represent Ideas; Weigh Options and Make Decisions; Conduct Experiments; Troubleshoot; Revise/Iterate; and Reflect on Process. For instance, beginning designers do not spend much time understanding the challenge and building knowledge because they usually treat the problem as well-structured, with a single solution. Conversely, informed designers delay design decisions until the challenge is well understood. Informed designers also identify associated “key issues” to make sure they do not commit to a final design decision at a very early stage. This framework has two implications for the present study. First, students were introduced to these differences and were asked to reflect on where their own design process stood within each of the design strategies. Second, student interactions with the software were analyzed to identify whether they depict any changes that represent students moving from beginning to informed designers. For the context of this study, we focused on two of the design strategies that have been previously explored using process data. Table 1 describes these design strategies, how designers with different levels of expertise approach them, and how they can be characterized using process data.
Methodology Participants and Procedures During the summer of 2015, an eight-hour workshop was delivered to two groups of engineering students from two different programs: Systems Engineering and Industrial Engineering. The sixteen students from Systems Engineering who participated in these activities are included in this study. The goals of the workshop were (Vieira, Aguas, Goldstein, Purzer, &
Magana, 2016): (1) to promote student understanding about engineering practice; (2) to foster acquisition of solar science knowledge; and (3) to increase student understanding about engineering design. The workshop was created as an active learning pedagogical strategy guided by Learning by Design™ (Kolodner et al., 2003). The purpose of this design was to provide engineering students with active learning strategies that were more engaging than traditional classroom lectures.

Table 1: Characterization of beginning and informed design strategies with process data.

Generate Ideas
- Beginning designers’ approach: get fixated on early ideas, with little interest in changing them.
- Informed designers’ approach: generate multiple ideas for their design, avoiding fixation.
- Process data: number of designs generated using the CAD tool; number of actions on a given object.

Conduct Experiments
- Beginning designers’ approach: perform no experiments, or confounded ones.
- Informed designers’ approach: perform multiple controlled experiments using relevant variables.
- Process data: the Experimentation Strategies in Design model (Vieira et al., 2016).
Students worked on finding a solution to a design challenge using a computer-aided design (CAD) tool called Energy 3D (Concord, 2017). This is free, open-source software that allows students to create energy-efficient buildings. The challenge consisted of building a house, located in Boston, Massachusetts, that generated enough energy for itself over a one-year period. Besides the net-zero energy goal, the challenge involved other criteria and constraints: a maximum cost of $60,000; an area of 100-200 m2; at least one window on each side of the house; tree trunks at least two meters away from the house; and an attractive exterior, or “curb appeal”. The workshop was divided into four two-hour sessions, as illustrated in Figure 1. During the first session, the software was introduced and students were allowed to get familiar with it. During the second half of this session, students were introduced to the design challenge and started working on it. During the second session, the differences between beginning and informed designers, as described by Crismond and Adams (2012), were discussed with the group. To avoid fixation on ideas, students were asked to initially create three different designs and to make an informed decision to choose their best design during the third session. During the fourth and final session, students finished their selected design and presented it to the group.
Figure 1: Sequence of the four workshop sessions
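The challenge’s criteria and constraints could be encoded as a simple automated check. The sketch below is purely illustrative: the function and field names are hypothetical and do not correspond to the Energy 3D data model.

```python
# Hypothetical constraint check for the design challenge described above.
# The design representation (dict fields) is an assumption for illustration.
def meets_constraints(design):
    """Return a list of violated challenge constraints (empty if all pass)."""
    violations = []
    if design["cost"] > 60000:
        violations.append("cost exceeds $60,000")
    if not 100 <= design["area_m2"] <= 200:
        violations.append("area outside 100-200 m2")
    if any(n < 1 for n in design["windows_per_side"]):
        violations.append("a side of the house has no window")
    if any(d < 2.0 for d in design["tree_distances_m"]):
        violations.append("a tree trunk is closer than two meters")
    return violations

example = {
    "cost": 58000,
    "area_m2": 150,
    "windows_per_side": [2, 1, 1, 3],
    "tree_distances_m": [2.5, 4.0],
}
print(meets_constraints(example))  # → []
```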
Data Analysis We analyzed the data on student interactions with Energy 3D (herein called process data) before and after the discussion in session 2 to identify how students’ design strategies changed. For instance, Figure 2 represents the number of actions of a single student over time during the second session. The top four line plots represent construction actions on four key elements that affected the energy efficiency of the building. The slope of the roof changed both the volume and the cost of the building, as well as the efficiency of the solar panels for a given location. The trees could shade the building to keep it cool during the summer months, but inappropriate choices of trees (e.g., deciduous vs. evergreen) or tree locations could also increase the need for heating during the winter months. The size and location of the windows are also important elements of this design, because windows let sunlight into the house and heat escape from it. The bottom line plot corresponds to the number of energy consumption analysis actions this student carried out. The blue box in the middle of the figure represents the time when the discussion and reflection about effective design strategies took place.
Figure 2: Number of actions from a single student over time

A couple of differences in this student’s design process before and after the intervention can be identified visually. For instance, this student started to focus on installing and moving solar panels after the intervention, always in parallel with data collection procedures to identify changes in the energy efficiency of the building. This suggests that the student was informing her decisions with data collected from the CAD tool. This exploration seems to have taken place in parallel with changes to the roof, which can potentially affect the efficiency of the solar panels. That connection between the solar panels and the roof is not evident before the intervention. Besides the visual analysis of students’ interactions, this study also explores the two design strategies in terms of changes in specific objects. The generation of ideas is explored at the building level and at the object level. At the building level, we explore whether students started working on a completely new design after reflecting on their fixation on a single idea. At the object level, we identify whether they started exploring completely new objects or interactions between objects (e.g., windows and trees), or stopped being fixated on a particular object (e.g., the solar panels in Figure 2).
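Time series like those in Figure 2 can be derived by binning the logged actions per minute and per object type. The sketch below assumes a simplified log of (timestamp in seconds, object type) pairs; the actual Energy 3D log format is richer than this.

```python
from collections import Counter

# Sketch of the aggregation behind the per-object plots, assuming each log
# entry is a (timestamp_in_seconds, object_type) pair (an assumption made
# for illustration, not the real Energy 3D log schema).
def actions_per_minute(log, object_type, bin_size=60):
    """Count actions on one object type in consecutive time bins."""
    counts = Counter(t // bin_size for t, obj in log if obj == object_type)
    n_bins = max(t for t, _ in log) // bin_size + 1
    return [counts[b] for b in range(n_bins)]

log = [(10, "Window"), (30, "Window"), (75, "SolarPanel"), (95, "Window")]
print(actions_per_minute(log, "Window"))  # → [2, 1]
```

Plotting one such list per object type, stacked vertically, reproduces the layout of Figure 2.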
Students’ experimentation strategies were analyzed using the Experimentation Strategies in Design model (Vieira et al., 2016), which describes a continuum of strategies from beginning to informed designers. We hypothesized that students would conduct more controlled, systematic experiments after the intervention.
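As a rough illustration of how such strategies might be detected in process data, the sketch below treats every analysis action as closing an “experiment” made up of the edits since the previous analysis, and labels it controlled when those edits touch a single object type. This is a simplification of the model, with hypothetical event names.

```python
# Illustrative classifier: an "experiment" is the set of edits between two
# consecutive analysis actions; it is controlled when only one object type
# changed, confounded otherwise. Event names ("edit", "analyze") are
# assumptions, not the real Energy 3D event vocabulary.
def classify_experiments(actions):
    """Return 'controlled'/'confounded' labels, one per data-collection action."""
    changed = set()
    results = []
    for kind, obj in actions:
        if kind == "edit":
            changed.add(obj)
        elif kind == "analyze" and changed:
            results.append("controlled" if len(changed) == 1 else "confounded")
            changed = set()
    return results

stream = [("edit", "Window"), ("analyze", None),
          ("edit", "Roof"), ("edit", "Tree"), ("analyze", None)]
print(classify_experiments(stream))  # → ['controlled', 'confounded']
```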
Findings We present the study results in three subsections. First, we discuss the visual analysis of student interactions during the second session, before and after the instructional intervention, presenting two representative examples. Then, we discuss the changes in two of the design strategies, averaging these numbers across all students.
Visual Representation of Students’ Interactions with Energy 3D Figures 3 and 4 depict two representative examples of students’ interactions before and after the intervention. For Student A in Figure 3, most of the actions initially involved windows, solar panels, and the roof. This student seemed to be modifying the three variables at the same time between minutes 20 and 45. Starting at minute 25, this student collected data several times, but changing all of these variables at the same time should not give her an informative result: the process becomes a confounded experiment where results cannot be linked to a specific variable. After the intervention, this student started to explore other ideas that did not involve solar panels. For instance, between minutes 60 and 75, this student evaluated the relationship between windows and trees in what seems to be an iterative experimentation process: (1) modify window/tree variables; (2) collect data; (3) modify window/tree variables; and (4) collect data again.
Figure 3: Number of actions from Student A over time

Student B’s interactions with the CAD tool can be analyzed visually in Figure 4. In contrast to Student A, this student continued with very similar actions after the intervention as before. For instance, the annual energy analysis was carried out only every 10 minutes or so, after several changes had been made to the design, which represents confounded experiments. There are also no important changes in which objects this student focused on.
Figure 4: Number of actions from Student B over time
Generating Ideas To identify how ideation strategies changed after the effective design strategies were discussed in the classroom, we considered two different levels. The first level of idea fluency corresponds to students’ intention to work on a completely new design, avoiding fixation. Thirteen of the 16 students started to work on a new design after the intervention. Moreover, four of these students worked on three different designs during this second session of the workshop. The average numbers of actions for the first, second, and third designs were 209.8, 136.8, and 137.8, respectively. This result suggests that students started exploring idea fluency at the design level after the intervention. The second level of idea fluency corresponds to scenarios in which students started focusing on different objects after the intervention. Table 2 presents the difference (after minus before the intervention) in the number of actions on a given object. A negative number means that the student performed fewer actions on that object after the intervention, while a positive number represents more actions after the intervention. Since the amount of time students had to work on their design was comparable before and after the intervention (i.e., approximately 55 minutes each), a large change in the number of actions on an object should suggest the exploration of new ideas. Students 1, 2, 4, 5, 8, 13, and 15 substantially increased their number of actions with solar panels, some of them (e.g., 4 and 15) reducing their focus on windows. Student 15 also performed substantially more solar panel actions (106) and roof actions (30) after the intervention. Students 8, 11, 12, and 13 started to explore the effect of windows and trees on the energy efficiency of the building. Students 10 and 16 do not seem to have explored many different things after the intervention, as their numbers of actions on most objects remained stable.
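A difference table of this kind could be computed from the logs along the following lines; the log representation and the intervention timestamp below are hypothetical, chosen only to illustrate the computation.

```python
from collections import Counter

# Sketch of how per-object before/after differences (as in Table 2) could be
# derived. Assumes each student's log is a list of (timestamp, object_type)
# actions and that the intervention time within the log is known.
def action_differences(log, intervention_time):
    """Return {object_type: count_after - count_before}."""
    before = Counter(obj for t, obj in log if t < intervention_time)
    after = Counter(obj for t, obj in log if t >= intervention_time)
    return {obj: after[obj] - before[obj]
            for obj in sorted(set(before) | set(after))}

log = [(5, "Window"), (12, "Window"), (40, "SolarPanel"), (55, "SolarPanel")]
print(action_differences(log, 30))  # → {'SolarPanel': 2, 'Window': -2}
```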
Conducting Experiments The experimentation strategies within the student design process were analyzed by identifying the number of experiments students carried out before and after the intervention, as well as the number of systematic experiments. The average number of experiments (i.e., data collections) after the intervention (mean = 21.06, SD = 12.82) was significantly different (t(15) = -2.44, p = 0.03) from the number of experiments before the intervention (mean = 12.18, SD = 12.84).
Table 2: Difference between the number of actions for a given object before and after the intervention
(Roof types: Custom, Hip, Pyramid; tree types: Maple, Oak, Pine, Dogwood)

Student | Custom |  Hip | Pyramid | Maple | Oak | Pine | Dogwood | Window | Solar Panel
      1 |      0 |    6 |      -8 |     0 |   0 |   34 |      -5 |     -2 |         100
      2 |      0 |    3 |      12 |    -2 |  -2 |   17 |       0 |      1 |          45
      3 |     13 |    0 |       0 |     2 |   6 |    8 |       4 |      2 |          10
      4 |     -6 |   -5 |       0 |     7 |  -2 |   -4 |       1 |    -31 |          34
      5 |      4 |   -5 |      15 |     0 |   0 |    0 |       0 |     32 |          49
      6 |      0 |    0 |      13 |     0 |   0 |    4 |       6 |      7 |           7
      7 |    -14 |    1 |       5 |     0 |   1 |    1 |      -2 |     -5 |          -7
      8 |      1 |  -12 |      10 |     4 |   4 |   -1 |       9 |     56 |          26
      9 |      0 |  -15 |       0 |     0 |   0 |   -2 |      -9 |    -23 |         -22
     10 |      0 |    0 |      -4 |    -1 |   3 |   -7 |       0 |      5 |          20
     11 |      0 |    6 |       2 |     4 |   1 |   17 |      14 |     73 |         -54
     12 |     -1 |    0 |       1 |     0 |   0 |    8 |       2 |     31 |          19
     13 |      0 |    5 |      -4 |     0 |   0 |    5 |       8 |     20 |          27
     14 |      0 |    0 |      -3 |     2 |   0 |    9 |       3 |      2 |           9
     15 |     30 |   -2 |       2 |     1 |  21 |    5 |       2 |    -21 |         106
     16 |      0 |    4 |       7 |     0 |   0 |    0 |       9 |     -5 |          21
Likewise, the number of systematic experiments after the intervention (mean = 10.88, SD = 8.41) was significantly different (t(15) = -2.57, p = 0.02) from the number of systematic experiments before the intervention (mean = 5.81, SD = 6.75). These results suggest that students started doing more controlled experiments after the intervention, and that the clickstream data was able to capture these changes in students’ design behaviors.
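The comparisons above are paired-samples t-tests with 15 degrees of freedom (16 students). The paper does not state which statistical tool was used; the following standard-library sketch shows the computation of the t statistic, with made-up numbers rather than the study’s data.

```python
import math
import statistics

# Paired-samples t-test statistic, t = mean(d) / (stdev(d) / sqrt(n)),
# where d are the per-student before-after differences (df = n - 1).
def paired_t(before, after):
    """Return the t statistic for a paired-samples t-test."""
    diffs = [b - a for b, a in zip(before, after)]
    return statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(len(diffs)))

# Made-up experiment counts for six hypothetical students.
before = [10, 8, 15, 4, 9, 12]
after = [18, 14, 20, 9, 15, 16]
print(round(paired_t(before, after), 2))  # → -10.16
```

A negative t here indicates, as in the study, that counts increased after the intervention.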
Implications and Future Work This study explored changes in student design strategies as characterized by process data from student interactions with an educational CAD tool. Specifically, we identified changes in the number of ideas generated and in the experimentation strategies before and after an instructional intervention about effective design strategies. The results suggest that process data can be used to identify changes in student design behaviors after a given condition. However, additional studies are required to validate what has been identified from the process data against what students were actually considering when making these changes. A future study could implement a think-aloud protocol that would enable us to compare this process data with students’ rationale for their design process. The implications of this study concern the use of technology-based assessment techniques to characterize open-ended tasks such as engineering design. The study has demonstrated how this process data can be analyzed both visually and using common statistical approaches to identify changes in student design strategies.
References
Atman, C. J., Chimka, J. R., Bursic, K. M., & Nachtman, H. L. (1999). A comparison of freshman and senior engineering design processes. Design Studies, 20(2), 131-152. http://dx.doi.org/10.1016/S0142-694X(98)00031-3
Bailey, R. (2008). Comparative study of undergraduate and practicing engineer knowledge of the roles of problem definition and idea generation in design. International Journal of Engineering Education, 24(2), 226-233.
Concord. (2017). Energy3D: Learning to build a sustainable future. http://energy.concord.org/energy3d/
Crismond, D. P., & Adams, R. S. (2012). The informed design teaching and learning matrix. Journal of Engineering Education, 101(4), 738-797.
Goldstein, M. H., Purzer, S., Vieira, C., Douglas, A., & Zielinski, M. (2015). Assessing idea fluency through the student design process. Proceedings of the 45th Annual Frontiers in Education (FIE) Conference. El Paso, TX. October 21-24, 2015.
Hirsch, L. S., Berliner-Heyman, S. L., Carpinelli, J. D., & Kimmel, H. S. (2012). Introducing middle school students to engineering and the engineering design process. Proceedings of the American Society for Engineering Education Annual Conference & Exposition. San Antonio, TX.
Kolodner, J. L., Camp, P. J., Crismond, D., Fasse, B., Gray, J., Holbrook, J., ... & Ryan, M. (2003). Problem-based learning meets case-based reasoning in the middle-school science classroom: Putting Learning by Design™ into practice. The Journal of the Learning Sciences, 12(4), 495-547.
Seah, Y. Y., Vieira, C., Dasgupta, C., & Magana, A. J. (2016). Exploring students’ experimentation strategies in engineering design using an educational CAD tool. Proceedings of the 46th Annual Frontiers in Education (FIE) Conference. Erie, PA. October 12-14, 2016.
Vieira, C., Aguas, R., Goldstein, M. H., Purzer, S., & Magana, A. J. (2016). Assessing the impact of an engineering design workshop on Colombian engineering undergraduate students. International Journal of Engineering Education, 32(5), 1972-1983.
Vieira, C., Goldstein, M. H., Purzer, Ş., & Magana, A. J. (2016). Using learning analytics to characterize student experimentation strategies in the context of engineering design. Journal of Learning Analytics, 3(3), 291-317.
Worsley, M., & Blikstein, P. (2014). Analyzing engineering design through the lens of computation. Journal of Learning Analytics, 1(2), 151-186.
Xie, C., Zhang, Z., Nourian, S., Pallant, A., & Bailey, S. (2014). On the instructional sensitivity of CAD logs. International Journal of Engineering Education, 30(4), 760-778.
Acknowledgements This paper was partially supported by the NSF under the awards DUE 1348547 and DRL 1503436. The research team would also like to thank our colleague Roberto Aguas and the Universidad del Magdalena for inviting us to do these activities at their campus.
Copyright statement Copyright © 2017 Camilo Vieira, Alejandra J. Magana and Senay Purzer: The authors assign to the REES organisers and educational non-profit institutions a non-exclusive licence to use this document for personal use and in courses of instruction provided that the article is used in full and this copyright statement is reproduced. The authors also grant a non-exclusive licence to REES to publish this document in full on the World Wide Web (prime sites and mirrors), on portable media and in printed form within the REES 2017 conference proceedings. Any other usage is prohibited without the express permission of the authors.