Developing Problem Solving Skills: The McMaster Problem Solving Program

Journal of Engineering Education, April 1997

DONALD R. WOODS, Department of Chemical Engineering, McMaster University

ANDREW N. HRYMAK, Department of Chemical Engineering, McMaster University

ROBERT R. MARSHALL, Director, Ontario Region, MettNet Inc.; McMaster University

PHILIP E. WOOD, Department of Chemical Engineering, McMaster University

CAMERON M. CROWE, Department of Chemical Engineering, McMaster University

TERRENCE W. HOFFMAN, Department of Chemical Engineering, and Senior Consultant, AspenTechnology Inc., Houston, TX

JOSEPH D. WRIGHT, Department of Chemical Engineering, and President, Pulp & Paper Research Institute of Canada

PAUL A. TAYLOR, Department of Chemical Engineering, McMaster University

KIMBERLY A. WOODHOUSE, Department of Chemical Engineering, University of Toronto

C.G. KYLE BOUCHARD, Toronto Institute of Pharmaceutical Technology; Chemical Engineering Department, McMaster University

ABSTRACT

This paper describes a 25-year project in which we defined problem solving, identified effective methods for developing students’ skill in problem solving, implemented a series of four required courses to develop the skill, and evaluated the effectiveness of the program. Four research projects are summarized in which we identified which teaching methods failed to develop problem solving skill and which methods were successful in developing the skills. We found that students need both comprehension of Chemical Engineering and what we call general problem solving skill to solve problems successfully. We identified 37 general problem solving skills. We use 120 hours of workshops spread over four required courses to develop the skills. Each skill is built (using content-independent activities), bridged (to apply the skill in the content-specific domain of Chemical Engineering) and extended (to use the skill in other contexts and contents and in everyday life). The tests and examinations of process skills, TEPS, that assess the degree to which the students can apply the skills are described. We illustrate how self-assessment was used.

I. INTRODUCTION

Educators expect that students will graduate from their courses and their programs with good problem solving and group skills. Teachers do their best to develop these skills: they assign many open-ended problems; they use many group projects. We expected the graduates from our four-year, accredited program in Chemical Engineering at McMaster University to be skilled problem solvers. Yet, in the late 1960s, the faculty observed that our students had unexpected difficulties. They could not solve problems if we changed the wording and context of the problem statement. The students seemed to be “collectors of sample solutions”: they seemed to try to solve problems by patching various parts of previous solutions together to match the new problem situation. Nor could they bring together ideas from different courses to solve realistic problems from industry.

Reports from industry confirmed our dissatisfaction with our graduates’ ability to solve problems. The reports suggested that university graduates did not possess sufficient expertise in communication, in solving problems, in working effectively with people, and in applying lifetime learning skills.1-7 Resnick8 reviewed the need for the development of higher-order thinking skills.

Since the late 1960s, we have perused the research literature on teaching problem solving and completed four research projects to answer the questions “What is problem solving?” “Can it be taught?” and “Isn’t what we currently do in the classroom sufficient?”

To clarify our meaning of problem solving, here is the working definition we have developed. We define “problem solving” as the processes used to obtain a best answer to an unknown, or a decision subject to some constraints. The problem situation is one that the problem solver has never encountered before; it is novel. An algorithm or procedure for solving the problem is unclear; problem solving requires much mental work.
By contrast, the term “exercise solving” refers to recalling familiar solutions from previously-solved problems. The terms “a best answer” and “subject to some constraints” emphasize that for real-world problems we never have perfect information. We lack information; there are errors in the information we do have; we never have enough resources. We must apply the law of optimum sloppiness (or optimum accuracy, as some prefer to call it) and obtain the best possible answer with the resources we have.

Problem solving skill includes the following 12 attributes:
• being aware of the processes used;
• using pattern matching to decide quickly whether a situation is a problem or an exercise;
• applying a variety of tactics and heuristics;
• placing an emphasis on accuracy (as opposed to speed);
• being active: writing down ideas, creating charts and figures;
• monitoring and reflecting on the process used;
• being organized and systematic;
• yet being flexible (keeping options open, seeing the situation from many different perspectives and points of view);
• drawing on the pertinent subject knowledge, and objectively and critically assessing the quality, accuracy and pertinence of that knowledge and data;
• being willing to take risks and cope with ambiguity, welcoming change and managing distress;
• being willing to spend time reading, gathering information and defining the problem (as opposed to equating problem solving with “doing something,” regardless of its pertinence); and
• having an overall approach built on fundamentals rather than trying to combine various memorized sample solutions.

These, then, characterize the skills we seek to develop in our students. In this paper we first summarize our research to define the task of developing problem solving skills. Then we describe our approach to developing them: the MPS training program and its implementation. Our 20 years of program evaluation are summarized. Finally, we describe how the MPS program has been extended to other schools and contexts.

II. BACKGROUND RESEARCH

We conducted four research projects. In the first project, 1972-73, faculty members sat in on colleagues’ lectures and assessed the learning environment used to develop problem solving skills, and our colleagues’ attempts to illustrate and develop those skills. Our impression was that our students should have been acquiring good problem solving skills. Colleagues carefully displayed many sample solutions, tried to engage the students in the problem solving process, and tried to help students “discover” the process. Colleagues used open-ended problems and gave explicit suggestions about how to solve problems. For example, they suggested that students “try working backwards from the goal,” “try dimensionless numbers,” and “work the problem in symbol form and then later substitute the numerical values.” The faculty asked students to show how they solved problems by having the students work problems on the board. We hoped that students would develop problem solving skills from these activities. Yet they did not. The students still could not solve problems when faculty changed the conditions slightly. They continued to depend extensively on sample solutions, even when these were shown to be inapplicable. From this research project we could not identify why the students were not learning problem solving skills.

In the second project, funded by the Ontario Universities Program for Instructional Development (OUPID), 1974-1978, one of the team (DRW) became an undergraduate again. He enrolled in 1974 as a first-year student and “graduated” in 1978. Over that four-year period, he attended eighteen hundred hours of classes in all the required courses (English, Chemistry, Differential Equations, Physics, etc.). The two purposes of attending classes were 1) to monitor weekly the subject knowledge that the students were learning and 2) to note the explicit suggestions that faculty offered to help students become skilled in solving problems. In addition, about 20 of his student classmates volunteered to attend four hours a week of problem solving workshops. The four purposes of the workshops were as follows.

1. To identify, and answer, the students’ most pressing needs each week in the four-year program. Each workshop session began by asking the students to identify their current greatest challenge. For example, near the beginning of the first year the needs were mostly about self-management and basic learning skills: how to manage time; how to read the text; how to take lecture notes; how to sort out lecture notes.9,10 Later their needs were more related to problem solving: for example, how to solve problem 5.3 in Physics. In each workshop, we addressed their needs first.

2. To assess their comprehension of the subject knowledge. By attending class and taking lecture notes, Woods knew what subject knowledge had been “covered” in the lectures. In the workshop, we continually assessed the students’ comprehension of that subject knowledge. In this way we could find out whether students could not solve problems because they did not understand the subject knowledge or because they were missing a problem solving skill.

3. To help the students see the structure in that subject knowledge and to help them classify the information into different types of knowledge, heuristics, and suggestions for problem solving. To achieve this objective, each week the students created coded cards summarizing the knowledge learned in each class. These cards were then added to the “knowledge board” hung on the walls of a dedicated “problem solving room.” Photographs of the knowledge board are given elsewhere.10

4. To see how the students went about solving problems (when they knew the basic knowledge needed to solve the problem). We asked them individually to do one task important in solving a homework problem (such as “define the goal,” “draw a diagram,” “choose symbols,” “explain what the problem is really asking you to do”). When individuals were finished, each displayed his/her work simultaneously on the walls of the room in what we called the everybody-share method. Photographs of this everybody-share method are available.10 We then discussed the variety of approaches taken and extracted guidelines for successful problem solving.

Thus, we observed, videotaped, tried various interventions, and did protocol analysis of over 800 hours of student activity. From 1974 to 1978, we also observed groups at other universities: the University of Western Ontario, McGill University, Virginia Polytechnic Institute and State University, and Denison University. We analyzed the problem solving approaches taken by our seniors, who had not attended the problem solving workshops, via the everybody-share method. They exhibited the same approaches as those taken by first-year students. Indeed, students from other universities displayed the same approaches as students who had received no formal training in problem solving. Throughout the project we sought advice on what to do and what to look for from colleagues.*

In the third project, 1975, a second team member (CMC) repeated the second project described above. He enrolled as a first-year student in 1975 and repeated the first half year. He attended all of the lectures and held four-hour weekly workshops with student volunteers. Although this was a second independent observer with a second cohort of students, Crowe gained no new insights about the students’ skills. Because the payoff from this faculty-intensive project was negligible, we stopped the third project in January 1976.

In the fourth project we gathered evidence from our Ph.D. qualifying examinations. Candidates for the Ph.D. program orally describe the processes and fundamentals they use to solve original, unique conundrums in Chemical Engineering. The candidates had three such two-hour examinations. The purpose is to identify:
• those candidates who do not know the three major subject domains in Chemical Engineering but who can think critically and solve problems based on what they do know;
• those who know the subject but whose problem solving approach is based on memory: they can recall every problem they ever solved, but they cannot solve a problem that is new to them; and
• those who know the subject and can apply it to solve new situations.

Three faculty members observed each candidate. To prepare for the examinations, students typically studied cooperatively in small groups for about 600 hours. We gave students the assessment criteria and simulated examinations; therefore, the performance conditions and forms of evidence were clear. In any given year, the same professor asked all candidates the same questions in the same format. Three teachers assessed each student by consensus. A few candidates knew the subject knowledge but had difficulty displaying their problem solving skills.
For them, we offered 40 hours of intensive workshops on “problem solving.” Later, they retook the exams. From 1964 to date, we have observed more than 150 candidates.

The results from these four research projects were as follows.9,11,12

1. In class, we rated the faculty as doing an excellent and conscientious job of showing their approach to problem solving. They worked problems, supplied sample solutions, and showed a variety of problem solving heuristics. However, when tested, our students were unable to recognize, transfer or apply the skill to the process of solving their homework problems. In-class demonstrations of the faculty’s skill in solving problems did not transfer that skill to the students.

2. We tested students on their “comprehension” of the subject knowledge. We found that they “knew the subject knowledge.” Yet they could not solve homework problems. We called the general, content-independent, missing skill “problem solving.” This skill embodies the twelve characteristics given at the beginning of this article. These characteristics summarize the findings of the novice-versus-expert literature.13-35 This extensive set of references is given to indicate the breadth and depth of the basic research underlying this work. The citations illustrate that the research is not just in scientific problem solving, but also in the social sciences,25,34 behavioral sciences,22 and writing.20,30 The same findings have been reported in Australia28 and in Europe.27,33 Our work with the everybody-share method with our volunteer students, our seniors and students at different levels at four other institutions showed that all displayed the same weaknesses. These findings were confirmed in our fourth project: observing students during their Ph.D. qualifying examinations.

3. Despite individual professors’ dedication and efforts to develop problem solving skill, “general problem solving skill” was not developed over the four years of our undergraduate program. Students graduated showing the same inabilities that they had when they started the program. Some could not create hypotheses; some misread problem statements. During the four-year undergraduate engineering program studied, 1974-1978, the students had worked over 3000 homework problems, had observed about 1000 sample solutions being worked on the board by either the teacher or by peers, and had worked many open-ended problems.36 In other words, they showed no improvement in problem solving skills despite the best intentions of their instructors. Caillot37 and Meiring26 confirm these findings.

4. The workshop-style interventions** that we offered our volunteers, and our Ph.D. candidates, made a difference. Their confidence and skills improved compared with the performance of students who did not experience these workshops.

In summary, our research showed that 1) there is an identifiable, subject-independent skill set called problem solving, and 2) students do not develop the skill in a four-year program by having teachers display how they solve problems, by giving out sample solutions, by using open-ended problems, or by having peers show their problem solving. Workshops that explicitly develop the skill seemed to hold potential for improving students’ problem solving skill and confidence.

* We appreciate the input during those early days from Jack Lochhead, Jill Larkin, Fred Reif, Alan Schoenfeld, Eric Hewton, Joe Novak, Donna Mitchell, Elizabeth Brain, Vic Neufeld, Maynard Fuller, Robert Sparks and Art Whimbey.

III. PROGRAM DEVELOPMENT: ISSUES

During the development of what we call the McMaster Problem Solving (MPS) program we addressed issues such as: How extensive would our interventions be? When and how would we intervene? How would we assess the students’ skill? How could we ensure that the students could use their problem solving skills in other contexts? Would the MPS program be effective? Could other teachers use the MPS program in other contexts? We address each in turn.

** The programs that influenced us include Alverno College’s outcome-based curriculum, deBono’s CoRT thinking, Lipman’s Philosophy for Children, the American Institute of Chemical Engineers’ two-day “Applied Problem Solving” course, the Chemical Institute of Canada’s two-day “Problem Solving” course, the Kepner-Tregoe week-long program, Xerox’s Problem Solving Program, Feuerstein’s Instrumental Enrichment, Rubinstein’s “Patterns in Problem Solving,” Wales and Stager’s Guided Design, Ford’s “Team Oriented Problem Solving,” Creative Problem Solving, Paul’s Critical Thinking program, Conger’s Life Skills Program, Chaffee’s critical thinking program, the Structure of Intellect, Covington’s productive thinking program, the Ontario Mental Health program on stress management, the Whimbey-Lochhead talk-aloud pairs problem solving (TAPPS) activity and materials, and McMaster Medical School’s small-group, self-directed, problem-based learning program. For details about most of these programs and their potential use for program development, see the reviews Woods has published over the past ten years in his column, PS Corner, in the Journal of College Science Teaching.


A. How Extensive?

Based on our research we identified 37 separate skills:* four related to self-management; 14 related to personal problem solving ability for “well-defined, well-stated ordinary homework problems”; five for solving ill-defined problems; seven for interpersonal and group skills; two for self-assessment; one for change management; and four for lifetime learning skills.** The skills are both attitudinal and cognitive. Furthermore, the skills overlap; some skills are needed in several different categories. For example, stress management is needed for self-management as well as for personal problem solving, interpersonal and group skills, and change management.

Table 1 lists the MPS units by number, the skill being developed, the hours usually used to develop the skill, and the distribution of the MPS units among the four courses. Thus, MPS unit 1, on Awareness, usually requires about one to four hours and is part of the first course, Chemical Engineering 2G2, which develops individual skill in solving traditional homework problems. The 48 contact hours in that course are used for in-class workshops. Which of the first 18 units are selected depends on the needs of the students. For example, in some years all students have achieved the goals for MPS unit 1 within two hours; other groups of students require four hours. Our goal is not to cover material. Rather, it is to develop the students’ confidence in the set of skills they need most. To develop all the skills given in Table 1 would require in excess of 200 contact hours. We have 120 hours available. In addition, students need time outside the workshops to reflect on and practice the skills in different contexts. This time is not included in the contact hours listed in Table 1. The units usually selected from the 37 in the four courses are indicated by *, † and ‡ in Table 1.

Table 1. List of the MPS units and the various courses.

* Over the 25 years of this project, in response to the needs of organizations, industries and educational institutions, we have developed MPS workshops for over 55 topics (on such additional topics as implementing change, dealing with difficult behaviors, networking and procrastination). Nevertheless, the 37 basic units given in Table 1 have remained the core of our undergraduate program. Of the 37 units, 23 (designated with *, †, and ‡ in Table 1) are usually included in the program.

** We define lifetime learning skill as assuming responsibility for all elements of learning: deciding what to learn, creating goals and objectives, selecting the resources to be used, and assessing the extent to which the goals have been achieved. We expect lifetime learners to accept people as probably the best resource for new knowledge. We expect learners to be skilled in learning independently and interdependently.

B. When?

Students take these workshops beginning in the first semester of our departmental program, namely, the first semester of the sophomore year. The first course focuses on skills for solving well-defined, ordinary homework problems. A second course, in the junior year, gives added practice in applying these skills. A third course, in the second semester of the junior year, adds skills related to group or team problem solving. A fourth course, in the senior year, addresses the skills needed for open-ended problem solving and for lifetime learning. Thus, skill development starts with our first contact with the students and continues through most of the remaining semesters. We asked all teachers to reinforce the application of the skills in all courses. Table 1 summarizes these courses and illustrates how and when the different MPS units occur.

We have found that the needs of each cohort of students differ. Therefore, the emphasis, the MPS units selected, and the amount of time needed by each cohort to develop the skill vary. For example, the class of 1995 did not receive explicit workshops on MPS units 13, 14, 16 or 18. They spent extra time on MPS units 3, 6 and 11 (on “self assessment,” “analysis” and “the unique you,” respectively). The needs of the students set the pacing of all the courses.

C. How?

Providing students with opportunities to solve interesting problems is ineffective in developing problem solving skill. Watching others solve problems is ineffective. We therefore developed workshop-style activities for each explicit skill, based on research in cognitive and behavioral psychology. Each workshop followed a standard pattern: the skill was defined and its importance explained; the skill was put into the context of the other skills being developed; we gave students learning objectives and a brief pretest; and an overview was given of the activities in the workshop. The key was to provide explicit activities that a) gave students a chance to see how they perform the skill in a content-independent domain, b) allowed them to compare their behavior with target behavior, and c) helped them develop the target behavior through practice and immediate feedback. We needed to help them collect evidence about how they did the task. Bridging activities were included to link the skill’s application to the subject domain and to everyday life. When the participants had acquired the skill, the workshop closed with a post-test. Participants summarized change and growth and identified times and situations when they would apply the acquired skill. Throughout the workshop, participants reflected on what they were experiencing, following the ideas suggested by Schon.39 Following each workshop, participants monitored the weekly application of each new and previously-developed skill, kept a journal, and provided evidence of growth.

In developing the program and the workshops, our greatest challenge was to write observable and testable learning objectives for the “soft” skills. It took us over three years to evolve our current set. For example, consider “creativity.” What explicit objectives can one use to distinguish a creative student from an uncreative one? How can one use these objectives as the basis for developing a workshop, and how can one devise Tests or Examinations of Problem Solving (TEPS) to assess the degree of skill acquisition? Tables 2 and 3 give example objectives and assessments for creativity. The objectives are structured following Bloom’s taxonomy.39 Hence, in Table 2, objectives 3.1-3.3 require application, objective 5 requires synthesis, and so on. Objectives and example TEPS are available for the other skills.40 We used Sternberg’s41 criteria in selecting and developing the methods, procedures and activities. Descriptions of most of the key workshops are available.42

D. Assessment of Students

Assessment centered on the published, explicit objectives and criteria created for each unit and on the synthesis of these units. Formative assessment consisted of the journals that each student completed for each MPS unit and two personal interviews.
In addition, participants elected to identify personal enrichment goals. For these, they created objectives, criteria and forms of evidence, and kept a journal of progress. Table 4 lists examples of the topics selected; these illustrate typical student needs in the second, third and final years. For summative assessment, students wrote two- or three-hour tests and examinations for process skills, TEPS.

Self-assessment was a significant component of the program.43 It included the following dimensions: the personal interview, in which each student identified the degree to which he/she had achieved each MPS objective, substantiated by evidence; the personal enrichment activity; and the personal choice of the weighting that the formative and summative components carried in the final mark. Thus, students contracted to have the final examination count for any weighting between 10 and 90% of their final mark. The default contract was that the student received the best of a 60 to 40 split (exam and team work).42,43

E. Student Transference of the Skill

A major concern was to develop the process skills such that students could use the MPS materials in any context. For example, students who experienced a separate course on “processing skills” using context-independent materials usually could not apply the skills in other contexts.44-47 Teaching a separate, stand-alone course does not develop transferability of the skill. On the other hand, developing the skills while students are learning new subject knowledge is challenging. Students self-reported that the processing skills are so complex that they cannot develop them while they are learning subject knowledge. This was particularly true for problem solving skills. Perkins and Salomon45 give an excellent overview of the issues.

To transfer the skill, we have found that the processing skills are best developed through a three-stage process:42,48
1. Build the skill in a stress-free exercise so that students can focus on the mental processes being used (instead of thinking about both the subject knowledge and the processing skill);
2. Bridge those processing skills by applying them in a simplified problem situation in a target subject domain, and reflect on the process used to solve the simplified problem;
3. Extend the application of those process skills to any type of problem situation.

We enriched this three-stage development process by requiring the students to write a reflective journal on the build, bridge, and extend experiences. To build the process skill, the students assessed the degree to which they developed the skill in the context-independent portions of the workshop. For the bridge stage, they reflected on the workshop task elements that included subject-rich activities. In the extend stage, they reflected on their use of the skill in their subject courses and in their everyday life. Alumni report that this is an extremely important and critical characteristic of the program.

The starting activities were “exercises” in subject-independent contexts so that the participants could focus on the “process” and not get hung up on the content. For example, IQ exercises, detective stories and everyday activities were used first. Then, once students had internalized the “process,” they used it on “exercises” in the subject discipline. Then, the participants used the process on “problems” in the subject discipline.42,48 Table 5 summarizes our approach to bridging and extending and illustrates these activities for some example MPS topics.

F. Effectiveness?
We planned to assess program effectiveness by controlled studies and by longitudinal studies involving our alumni. The results are discussed in Section V.

G. Program Transference

We wrote instructors’ guides and planned to assess the ease with which we and others could run MPS workshops in other contexts. The results are given in Section VI.

IV. PROGRAM IMPLEMENTATION

The program was set up in 198246,49 and has operated essentially according to the original plan since then. Class size has ranged from 20 to 60 Chemical Engineering students. We dovetailed the problem solving, interpersonal skills, and self-directed learning components into four required courses: one sophomore, two junior and one senior course, which we now refer to as the MPS program.

The introduction of the four required courses with explicit problem solving skill development was part of an overall curriculum revision. In this revision, we removed duplication of topics among courses, introduced design across the curriculum, integrated computer skill development and communication skills throughout the program, presented an integrated laboratory experience, and tried to have all of the core subjects (including process control and reaction kinetics/reactor design) completed by the end of the junior year. This required that in all of the Chemical Engineering courses in the sophomore and junior years we focus on the fundamentals and leave the interesting enrichments to later specialty courses. All in all, the contact hours and the student load remained the same. Achieving this required a combined effort among students and faculty, a willingness by faculty to allow their favorite courses to become electives, and an acceptance that we provide the core fundamentals plus skill in lifetime learning. We realized that we could not provide all the electives and specialty options graduates would need throughout their careers. We accepted the analogy: “Instead of giving the students more fish, we needed to teach them how to fish.” Details of the curriculum revision are available.49 Concerning the introduction of the MPS program in particular, all faculty accepted the need for the four required courses together with integration of the concepts throughout the program. About 30% of the faculty spearheaded the development.

Table 2. Example learning objectives for one MPS unit.

Table 3. Example assessment activities for an MPS unit.

Here are more details of the four required courses. The emphasis in the first required course, ChE 2G2, was on developing students’ skill in solving ordinary homework problems. We selected the homework problems from concurrent, required courses in Chemical Engineering. In this course the focus was on self-awareness and problem solving, with the MPS units given in Table 1. The students developed and then used the skill to solve homework problems in two independent, separate Chemical Engineering corequisite courses: the four-credit mass and energy balance course (ChE 2D4) and the two-credit communication course (ChE 2C2). In this way, students bridged to apply the problem solving skill to their homework problems in other contexts. (Originally, we asked students to solve the computer programming homework assignments.46 However, students did not find those homework problems difficult. Instead, they found the homework in the mass and energy balance course (ChE 2D4) challenging. Therefore, we changed the focus of the bridging activity to the mass and energy balance course.) Students reflected on the application of the skills in other courses and in their everyday life to extend the skill application. They documented evidence of this bridging and extension in their weekly journal.

The second required course, ChE 3E4, introduced no new problem solving skills. Students used the problem solving skill set to create mathematical models for different Chemical Engineering situations.50 Therefore, students bridged and extended the problem solving skill directly in Chemical Engineering. Thus, the skills that students had developed previously were reinforced, and bridging occurred continually within the course.

The third required “project” course, ChE 3G3, was team-taught. One faculty member focused on mathematical modelling of a local chemical plant. The other faculty member focused on developing the students’ interpersonal and problem solving skills. The students worked in teams and usually interacted with the plant operators. Table 1 lists these skills: group skills, broadening perspectives, how to ask questions, and small-group, problem-based learning. The students developed new skills and then applied (or bridged) these to the team project within the context of the same course. Students reflected on the application of the skills in other courses and in their everyday life to extend the skill application. They documented evidence of this bridging and extension in their weekly journal.

Table 4. Example topics selected by the students for self-enrichment.

The fourth required course, ChE 4N4, focused on new skills in problem solving and team skills and on subject-discipline knowledge in engineering economics, ethics, and chemical process plant design and analysis. Table 1 summarizes the context and the content. The new skills were defining open-ended problems, troubleshooting, chairperson skills, optimum sloppiness, tacit subject knowledge, and developing subject knowledge structure. We developed the problem solving skills, interpersonal skills and subject knowledge concurrently. Bridging and extending occurred continually. Students reflected on the application of the skills in other courses and in their everyday life to extend the skill application. They documented evidence of this bridging and extension in their weekly journal.

Through these four courses, we explicitly developed 30 to 40 skills in about 120 workshop contact hours. Bridging in the subject discipline of Chemical Engineering was done continually through the four different patterns given in Table 5 of build, bridge

and extend to subject-specific courses and to everyday activities. We used the threefold pattern of subject-independent “exercises,” subject-specific “exercises,” and subject-specific “problems” in the workshops. Once the students have strong skills in problem solving and interpersonal skills, small group, self-directed, problem-based learning could be used to develop skill in learning how to learn. That is, instead of lecturing on engineering economics, groups of five students are given a series of five problems. For each, they identified what they knew already and what the vital issues were and what they needed to learn. Then, they allocated their team resources so that each student contracted with the group to learn a given topic and return to the group and teach others. The group then synthesized all the knowledge and used it to solve the problem.51-55 The activities were modeled after Knowles,55 Tough,56 and McMaster Medical School.57-59* Initially, this activity occurred only in the senior level course ChE. 4N4. At the students’ insistence, we moved this earlier in the program to be part of the junior level course. * We thank the reviewers for bringing to our attention the 25 year old problembased learning program implemented at Aalborg University. 2,60,61 Whereas in our MPS program, the PBL activity is limited, in their program PBL is used extensively.

Table 5. Developing the transference of the skill.


Some key components in our success in implementing the program were:
• the team effort among Woods, Crowe, Hoffman, and Wright during the research phase with projects 2 and 3;
• the impact of the intervention with the Ph.D. candidates in project 4, where all faculty could see the effectiveness of a 45-hour block of skill development;
• the team teaching of some of the courses, with one teacher focusing on developing the problem solving skills and the other on bridging to computer programming, mass and energy balances, modelling and applied mathematics, and computer modelling;
• the clarification sessions that we ran for all faculty to help them understand the skills being developed and the tasks they could therefore ask students in any class to do to reinforce the skill application;
• the very positive response from alumni and recruiters, and the host of foreign visitors who have come to observe our classes;
• our inclusion of extensive self-assessment and reflective journal writing;
• treating the development of these “soft skills” with the same academic rigor that we use in the learning of any knowledge: objectives, assessment, standards of performance, and accountability;
• using instructors in these courses whose personality matches this style of course.

An early challenge was to find teachers to facilitate the problem solving sessions. Running these sessions requires shifting a teacher’s attitude from being the lecturer and focal point of the classroom to being the facilitator/coach working on the sidelines to help the students develop the skill and to ensure that performance standards are high. The focal point of the classroom is the student. Indeed, in the 1980s, when some faculty remained uncomfortable with the unfamiliar role of facilitator, we hired alumni from the program to serve as instructors for these courses to cover for research leaves.

V. PROGRAM EVALUATION

For the total program, we report on eight different evaluations: 1) marks improvement in other courses, 2) student acceptance of the learning environment, 3) their confidence in their problem solving skills, 4) their skill in problem solving, 5) their attitude toward lifetime learning, 6) development of self-assessment skill, 7) alumni, recruiter and employer response, and 8) student and faculty acceptance. We use t-tests and the “d” statistic to report the data, where “d” is defined as the arithmetic average of the performance of the treated group minus the arithmetic average of the performance of the control group, divided by the standard deviation of the performance of the control group. The “d” statistic is used so that the current work can be put in the context of previous studies on developing problem solving and thinking skills. In particular, Herrnstein et al.,62 in a major study evaluating the effectiveness of the Odyssey/Project Intelligence program to develop problem solving, used the “d” statistic. In their comprehensive study they found “d” values in the range 0.10 to 0.72.
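The definition of the “d” statistic above can be sketched as a short computation. The marks below are hypothetical, made up purely for illustration; they are not data from the study.

```python
import statistics

def effect_size_d(treated, control):
    """The "d" statistic as defined in the text: (mean of treated group
    minus mean of control group) divided by the standard deviation of
    the control group."""
    return (statistics.mean(treated) - statistics.mean(control)) / statistics.stdev(control)

# Hypothetical course marks, for illustration only
treated = [72, 78, 81, 75, 80]
control = [68, 70, 74, 66, 72]
print(round(effect_size_d(treated, control), 2))  # → 2.28
```

Note that, like Herrnstein et al.'s values of 0.10 to 0.72, “d” is expressed in units of control-group standard deviations, which is what makes results comparable across studies.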

A. Marks Improvement in Other Courses

In our program, Chemical Engineering and Applied Chemistry students take a required 4-credit course (ChE 2F4) in the second semester of the second level. We set up the new curriculum in the fall of 1982. Chemical Engineering students were required to take a course in problem solving (ChE 2G2) concurrently with ChE 2F4. The Applied Chemistry students did not take ChE 2G2 and therefore served as the control group. Thus we had data for both groups before 1982 (before the treatment) and pooled data for 1982-83 and 1983-84, when the Chemical Engineering students received the problem solving course and the control group did not. For both groups we measured the difference between the marks in ChE 2F4 after 1982 and before 1982. The null hypothesis was that the difference in Chemical Engineers’ marks before and after 1982 would be the same as the difference in Applied Chemists’ marks before and after 1982; in other words, that the problem solving course would have no significant effect on the students’ marks in ChE 2F4. A t-test (t = 1.56, p
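The comparison described above can be sketched as a two-sample t-test on the before-versus-after mark changes of the two groups. This is a minimal illustration of the pooled-variance t statistic with made-up numbers, not the study's data or its exact statistical procedure.

```python
import math
import statistics

def pooled_t(a, b):
    """Two-sample pooled-variance t statistic for the difference of means."""
    na, nb = len(a), len(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    # Pooled variance, weighting each group's variance by its degrees of freedom
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
    return (statistics.mean(a) - statistics.mean(b)) / math.sqrt(sp2 * (1 / na + 1 / nb))

# Hypothetical mark changes (after 1982 minus before 1982), for illustration only
che_change = [4, 6, 5, 7, 3]   # Chemical Engineering (took ChE 2G2)
chem_change = [1, 2, 0, 3, 1]  # Applied Chemistry (control, no ChE 2G2)
t = pooled_t(che_change, chem_change)
# A positive t suggests larger mark gains for the treated group; whether it is
# significant depends on comparing t with the critical value for the chosen p.
```

Under the null hypothesis stated in the text, the expected difference between the two groups' mark changes is zero, so t should be near zero.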
