Copyright 1998 IEEE. Published in the Proceedings of the Tenth Conference on Software Engineering Education and Training, Virginia Beach, VA, April 13-16, 1997. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works, must be obtained from the IEEE. Contact: Manager, Copyrights and Permissions / IEEE Service Center / 445 Hoes Lane / P.O. Box 1331 / Piscataway, NJ 08855-1331, USA. Telephone: +Intl. 908-562-3966.
Designing Process-based Software Curriculum

Richard L. Upchurch
Computer and Information Science Department

Judith E. Sims-Knight
Psychology Department

University of Massachusetts Dartmouth
N. Dartmouth, MA 02747-2300
Abstract

Computer science education traditionally has stemmed from its mathematical roots and has been related to practice through instruction in programming languages. Good software engineering practice, in contrast, requires expertise in a complex of activities that involve the intellectual skills of planning, designing, evaluating, and revising. Cognitive research has revealed that developing such intellectual skills requires (a) explicit instruction and practice, (b) in the context in which the skills will be applied, (c) in carefully structured ways. We are applying the techniques of cognitive apprenticeship, situated cognition, and reflective practice, based on our earlier successful application of such techniques, to the development of laboratories to accompany two undergraduate classes. The first section of this paper provides the foundations from the computer science/software engineering domain that justify our effort. The second section provides the background in cognitive research we use to structure the learning environment and activities for the students. Section three provides an overview of the goals we have established as part of this development activity. Section four describes the activities we have implemented in the sophomore computer science course. We conclude with a discussion of problems and intended directions.
1: Introduction

Many of the problems in the software industry appear to stem from an assumption that technology is a panacea [1]. The industry continues to search for a technological fix to issues of low productivity and poor quality. This approach manifests itself in a failure to attend to the development context, which includes the people, the problem, and the product characteristics [2][3]. Researchers in software process take a different perspective, and have identified the need for project organization to take precedence over methodology, and methodology over technology [4][5][6]. The software process argument leads to a focus on process improvement. A clear objective in the process improvement model is to make the current process visible to all those involved [7]. With the process visible, one
can begin to assess actions that claim to offer improvement, based firmly on measurement of process activities.

The assimilation of good practices into the existing process, or the modification of existing processes to accommodate new practices and methods, will require a differently educated software engineer [8]. A software engineer must be prepared to a) understand the development process or processes; b) conceptualize a desired process; c) establish process improvement actions; d) plan the improvement activities; e) find the resources needed by the plan; f) execute the plan; and g) repeat the improvement process [9]. This, coupled with the need to assess and measure [10], provides the software engineer with the intellectual tools to cope with the complexities of the work. It will not suffice merely to impose modifications on an individual's practice; the mandate is to create a process whereby software engineers evaluate and modify their work processes based on metrics and measurement. Results from NASA's Software Engineering Laboratory [4] and the University of Maryland [11] provide positive support for this view. Likewise, to teach students how to practice effectively, their process(es) must be made visible to them [12].

Yet computer science education does not teach all the intellectual and cognitive skills necessary for the task of software development. While many prominent authors [8][13] see the education of future practitioners as the most critical factor in resolving the problems of the software industry, the educational preparation given our students remains in the mathematical tradition. Faculty hope that these foundations are sufficiently mature and robust to support emerging and existing commercial practices, a hope that has its critics [14][15]. The transfer from the mathematical foundations to actual practice typically takes the form of an exercise in programming. Students are expected to apply classroom knowledge (how it works) to tasks (how to do it), even though Anderson (e.g., [16]) has shown convincingly that such transfer does not occur automatically (see [17] for a review of the conditions that affect transfer of learning). This also leaves students with a product orientation; that is, they are primarily concerned with whether the program works or whether it includes the "good" algorithm ("good" by the criteria of supporting efficient use of hardware resources, for example). Little attention is paid to process (i.e., how to go about solving the problem), either explicitly or implicitly. Topical areas and knowledge units throughout the curriculum ignore software process and thus conceal the need to give it systematic attention. For instance, students typically code first and write documentation (including the high-level design) at the end, just before turning in the project. In a recent survey of sophomore computer science majors at our institution, only 10% of the students reported that they always created a written design for a project. The simple step of requiring students to document their design plan before beginning to code would make them attend to the planning process.

The perspective we suggest requires a paradigm shift and, as such, strikes at the heart of what we do in computer science education. A process orientation is not typically found prior to a capstone course, usually software engineering or a project practicum, taught late in the student's academic career [18][19][20].
In this context the students begin to see process, the interaction of process with product, and the influence various factors have on the product. This is typically accomplished by assigning real, or at least realistic, projects with customers to students working in groups. The requirement is typically to specify, design, implement, and deliver a product. In this one course, students must learn 1) various software engineering methodologies, 2) to work in teams, 3) to work with customers, 4) to create high-level designs and implement them, and 5) to create a large-scale product. In our view, one course, late in a student's program of study, is not sufficient to allow the development of the complex cognitive skills needed. By the latter part of the program of
study students have developed numerous bad habits. Perhaps worse, they have been successful with these habits; thus, it is increasingly difficult to persuade them that different approaches are more effective. Moving students toward the new cognitive abilities we are suggesting would be difficult. What is needed is a strategy that first identifies the underlying foundations regarded as process-centric, then integrates these foundations in a principled way into the fabric of the educational program. Richardson [21], at the Air Force Academy, detailed a spiral approach to software engineering education in which particular software development topics or issues were introduced early, and later elaborated and refined. This spiral approach provided the mechanism for controlling the level of detail and repeatedly reinforcing the identified behaviors or best practices [22][23]. More recently, the University of Virginia [24][25] attempted a similar construction. Prey et al. divided the intellectual content into two major categories (knowledge and skills) and established three levels of "depth of treatment" (exposure, familiarity, mastery). These were mapped to the courses in the curriculum, providing an approach that resembles Richardson's [21]. Our approach differs from those described in that we provide explicit structure for teaching best practices as part of the laboratory activities in our courses. These best practices are introduced early in the program, then revisited and refined later. Second, the approach we are implementing insists on reflective activity by the students in assessing both processes and products. The explicit structure we provide in our laboratories is based on the results of considerable research in cognition.
2: Development Principles

Several principles should guide the construction of a curriculum that supports the development of competent practitioners. The first principle addresses the realism problem. Those who desire curriculum reform that merges computer science with software engineering (e.g., [26][27]) argue convincingly for a tighter coupling between the type of problems given students and those encountered in the work environment. This is the situated cognition principle (by which we mean learning within an authentic context). Most software engineering educators further suggest that these development activities be tackled by groups, which allows students to experience the problems associated with group activity and thus gain an appreciation for the process of large-scale software development.

A second principle is to focus on the processes underlying the development activity [28][29]. Although most process educators still advocate "realistic" problems, they are willing to compromise on realism to permit more focus on process activities. In particular, Werth [28] encourages the use of lab time for students to practice team techniques and advocates providing team members with role definitions and responsibilities [30]. Other aspects of the process-oriented approach focus on making explicit the roles and tasks needed by the students during development activities. For example, rather than leaving students to debug their programs any way they see fit, good practice requires a series of tasks: creating a test plan, determining test cases and expected results, conducting the tests, evaluating the results, evaluating the test plan, and creating an improvement plan for the next iteration (a minimal test-plan template is sketched below, after the apprenticeship method list). Students need to understand how these processes work and be given an instructional context in which to learn them.

The framework for that instructional context is provided by a third principle, that of cognitive apprenticeship [31][32]. Its techniques are designed to explicitly help students develop into skilled practitioners, much as medieval artisans developed their skills through apprenticeship to masters. As with medieval apprenticeship, students first watch while the expert demonstrates the activity and explicates what he is thinking as he does it. Then the students, in small groups, practice on a similar task. The expert acts as coach to these groups, providing guidance, not answers. The group context implicitly, and the coach explicitly, encourage students to articulate both their knowledge and their understanding of the process by which the task gets done, which facilitates performance (e.g., [33][34]).
Likewise, the situation encourages students to reflect upon the proposed solutions and to think about alternatives in non-punitive ways.

The framework supporting the cognitive apprenticeship approach defines the structure of the activities and specifies their content to a high degree:
Domain Knowledge
subject matter specific concepts, facts, and procedures
Heuristic Strategies
generally applicable techniques
Control Strategies
guidelines to focus direction during the various activities
Learning Strategies
knowledge about how to learn concepts, facts and procedures
In addition to the content model, cognitive apprenticeship introduces a method that should support process education. The method consists of six basic activities:

Modeling
The expert demonstrates the particular task, attempting to work through it while describing decisions and rationale. The component tasks are made explicit during the activity.
Coaching
The expert/coach observes the novice conducting the task, offering hints, evaluating partial results, and providing encouragement.
Scaffolding
The expert provides support for the novice during the conduct of the task; the type and form of support are sensitive to the nature of the learner and the current state of learning. Guzdial et al. [35] discuss the use of different types of scaffolding in cognitively distinct tasks (such as collaboration, design, and case interpretation).
Articulation
Encouraging the learner to verbalize knowledge and thinking. Much like making the process visible, this activity works toward making students' thinking visible to them.
Reflection
Learners are encouraged to review their performance critically and non-punitively.
Exploration
Learners pose and solve problems of their own creation.
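To make one of these process artifacts concrete, the test-plan task series described under the second principle (create a plan, determine cases, run the tests, evaluate the results, plan improvements) can itself be handed to students as a scaffolded template that is gradually faded. The following is a minimal sketch in Python; all names are our own illustration, not taken from the course materials:

    # Illustrative sketch only: a scaffolded template for the test-plan
    # task series (plan, cases, execution, evaluation, improvement).
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class TestCase:
        description: str
        inputs: str
        expected: str
        actual: str = ""       # filled in when the test is conducted
        passed: bool = False

    @dataclass
    class TestPlan:
        project: str
        cases: List[TestCase] = field(default_factory=list)
        plan_evaluation: str = ""   # how well did the plan uncover defects?
        improvement_plan: str = ""  # what to change for the next iteration

        def summary(self) -> str:
            passed = sum(c.passed for c in self.cases)
            return f"{self.project}: {passed}/{len(self.cases)} cases passed"

As scaffolding, the instructor can supply such a template fully worked, then partially filled in, and finally leave students to construct their own.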
We have used cognitive apprenticeship successfully in our previous research (National Science Foundation Grant Number MDR-9154008) to teach software design to a variety of students [36][37][38]. We even found it effective with novices (e.g., high school students without prior programming knowledge or skills). This was possible because we developed our instructional approach and material according to principles of user-based design. Thus, we started with a simple prototype of the curriculum, tried it out with students, used formative evaluations to assess its effectiveness, and revised the curriculum to
fill in the gaps in the students' understanding. We discovered that students do not exhibit their best thinking unless the situation both forces them to do so and makes it possible for them to succeed. Therefore, it is crucial to start with a simple situation, to make the process concrete and visible, and then to increase the complexity of the task as students' skills develop. By the end of our program the students were competent enough to develop designs for programs of their own choice.

Our prior experience and success with the cognitive apprenticeship approach as an instructional model leads us to believe it is the most viable approach to software process education, since it 1) structures the learning experiences by identifying the skills and decomposing them into subskills that can then be modeled by experts, 2) provides supportive coaching as students engage in the various activities, 3) intentionally provides less support as the students develop their skill, 4) encourages the students to articulate aspects of their performance, and 5) requires students to reflect on their performance (in our approach, based on measurement data). These latter activities, articulation and reflection, are essential if students are to develop the ability to modify their own processes.
3: Curricular Construction

We are currently constructing the laboratory component for two courses in the software track of the computer science major with a software process focus. We adapted portions of the goal-question-metric approach [39], as suggested by Gresse et al. [40], to structure our exploration of the issues and outcomes. This approach seems justified based on our need first to determine the kinds of activities to incorporate in the laboratories (practices derived from the goals), and then to systematically assess our success (determined by the metrics and measures). The primary goal is to improve the effectiveness of the student's development process.

Table 1

Subgoal 1: Improve the effectiveness of development activities, from the student's perspective.
Subgoal 1.1: Increase the use of roles and responsibilities during development, by the student.
Subgoal 1.2: Improve the utilization of roles and responsibilities during development, by the student.
Subgoal 1.3: Develop accurate plans for development activities, by the student.
Subgoal 1.4: Improve the use of development resources, by the student.

Specifically, the intent of the development activity we have undertaken is to help the students learn to work smarter and to exercise more control over their development process. Our perception of this involves first helping the students make the way they work explicit (similar to the personal software process [41]). Subgoal 1 (Table 1) focuses on moving the students toward developing and implementing plans for their activities. One of the primary movers for planning is the control of resources. In making plans students will see how scheduling resources is required to meet their deadlines, and the consequences of contention for scarce resources. Furthermore, a focus on planning provides the opportunity to explore issues related to risk management and project characteristics. Whereas the first subgoal is written from the perspective of the student, where the emphasis must lie if curricular change is to be effective, the second subgoal is from the instructor's perspective.

Table 2

Subgoal 2: Improve the effectiveness of development activities, from the instructor's perspective.
There may be some question regarding the difference between Subgoals 1 and 2 (Table 2). This is not a cosmetic distinction. The focus in Subgoal 1 is the student. It is our working principle that the students must attend to their work habits. Focusing their attention on the nature of their work should be followed by activities in which students reason about strategies that support more effective development. This implies structuring laboratories to require the student to engage in the articulation and reflection activities suggested by the cognitive apprenticeship approach.

Table 3

Subgoal 3: Improve the quality of programs, by the student.
Subgoal 3.1: Develop effective quality assurance procedures, by the student.
Subgoal 4: Improve the quality of student programs, from the instructor's perspective.
While Subgoal 1 is aimed at making the students aware of their development process, Subgoal 2 insists that the instructor attend to changes in students' development behaviors: the instructor should perceive changes in the development process engaged in by the students, and should notice differences in the approaches and strategies they employ. Just as Subgoals 1 and 2 relate to process, Subgoals 3 and 4 (Table 3) relate to product. Learning to work smarter means improving the quality of the expected products. The pressing need in this area is to provide the students with a perspective from which to assess their work products. We will use a two-pronged approach: 1) reading and rating of experts' solutions characterized as "good", and 2) evaluating other students' products. The first suggests that our students could learn much from studying complete examples [42]. It is also important to define and apply measures [43] that capture the differences between student and expert solutions. By evaluating both expert solutions and those of other students, students should learn how to use metrics to evaluate designs and programs. They will also learn to learn from errors and to approach the review process from a task-oriented rather than an ego-oriented perspective, which research has found to lead to better evaluative processes. These skills can probably be learned more easily in the context of reviewing the designs of others than in evaluating their own work. In addition, the feedback from the students whose products they review will itself help them evaluate their review skills.

Table 4

Subgoal 5: Develop the use of reflective practice, by the student.
The final goal (Table 4) is perhaps the most ambitious and the most difficult to achieve. Our hope is that by the end of the curricular intervention students will not only understand the best practices [22][23] but also be able to use them in a reflective [45], flexible, and self-regulating fashion. To maximize the probability of attaining this goal, we use best practices that are closely related to key processes of the Capability Maturity Model [46]. Our approach includes the control strategies needed to help students decide when to use particular practices, and how to tailor practices to their development context.
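Each subgoal above instantiates the four-slot goal template of the GQM approach [39]: purpose, issue, object, and viewpoint. As a minimal illustration (our own sketch in Python, not part of the course materials), Subgoal 1 can be expressed as:

    # Hypothetical sketch: the GQM goal template behind Tables 1-4.
    from dataclasses import dataclass

    @dataclass
    class Goal:
        purpose: str    # e.g., "Improve"
        issue: str      # e.g., "the effectiveness of"
        object: str     # e.g., "development activities"
        viewpoint: str  # e.g., "from the student's perspective"

        def statement(self) -> str:
            return f"{self.purpose} {self.issue} {self.object}, {self.viewpoint}."

    subgoal_1 = Goal("Improve", "the effectiveness of",
                     "development activities", "from the student's perspective")
    print(subgoal_1.statement())
    # -> Improve the effectiveness of development activities, from the
    #    student's perspective.

Questions and metrics then hang off each goal, which is how we derive both the laboratory practices and the assessment of our success.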
4: Current Activity

The first course to receive the treatment was taught in the fall of 1996. This course is required of all computer science majors and is part of their program of study in the sophomore year. Students enter this course having completed the equivalent of CS1 and CS2. The focus in this course is the individual and individual skills. This perspective is consistent with activities suggested in the Personal Software Process [9][41][44]. We ask these students to use the process activities we are developing both in creating their programs and in the manner in which they produce them. The process activities we developed are usually executed during the laboratory portion of the course. The laboratory activities use the programming normally required in this course as the object of study, and thus do not compromise the content traditionally taught in the course.
Pursuing our objective of a process focus, we asked the students to complete a survey regarding their development activities. In the initial survey, we presented a list of possible development activities and asked how frequently the students engaged in each activity when given a software project. The results of the initial survey are provided in Table 5. The intent of the survey was to get the students thinking about how they conduct their software development activities.

Each student had an electronic design notebook in which all writing assignments given during the class were posted. Each writing assignment, such as the survey mentioned above, was in electronic form and posted in the design notebook. The design notebook was username and password protected, and therefore viewable only by the student and the instructor. The design notebooks were accessible through a web browser. The design notebook provided a persistent store of information regarding student development activities, and the web technology gave students access to their notebooks from any web-capable computer on campus. By the end of the semester all material was available to the student for review.

The first industrial-strength practice we added to the course was the project postmortem. The goals of our postmortem activity are to:
- provide an environment that supports openness and constructive criticism;
- provide an environment that encourages improvement;
- provide a means that encourages students to share lessons learned;
- provide an environment that supports easy collection and archival of information;
- provide a process that links the activities to future projects.
We adapted the postmortem process from [47] and provided it to our students (see Appendix C). Our process included a short survey (the surveys changed somewhat from project to project as the students' understanding increased; see Appendix A for a sample) and the collection of data on the project. The concluding activity in the process was the establishment of goals for the next project. Again, all material the students submitted was via web-based methods and recorded in their personal design notebooks. The postmortem process is vital for students in that it allows them to reflect upon the project just completed and to consider alternative approaches that may work better or be more effective. Furthermore, postmortems provide an excellent context to support the introduction of measurement into their activities [48][49]. The students completed three project postmortems during the course of the semester.

The second major addition to this sophomore course was special attention to quality assurance activity in the form of reviews [50][51]. We believe the review process provides a wealth of opportunities for students to assess their products and to learn about others' approaches to development activities. Prior attempts at using reviews with students had taught us the value of a highly structured process. We defined a review process [52][53] that specified clearly defined roles, responsibilities, and outcomes. Two particular outcomes of the review process were an assessment of the product being reviewed, in the form of a defect report from the reviewee, and an assessment of the relative benefits of the review process from all team members. In particular, the students were asked to consider how well the process uncovered defects, the amount of time spent in preparing for the review and in the review itself, and whether the activity was worth the time invested. The students rated the review experience at 4.4 (on a 1 to 5 scale with 5 being very positive). On the same survey they rated the experience very worthwhile (average rating of 4.7 on a 1 to 5 scale with 5 being very worthwhile). Apparently students recognize the value of this practice. They may even see its benefits from an educational perspective. For example, one student commented, "I liked the whole process of having to explain what each section of the code means. If I couldn't explain it adequately, then that meant that I might have some problem with the code or my understanding of the problem."
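The structure of the two review outcomes can be pictured as simple records; the following Python sketch is our own illustration (the field names are hypothetical, not taken from the course forms):

    # Hypothetical sketch of the two structured review outcomes: a defect
    # report from the reviewee and a benefit assessment from each member.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Defect:
        location: str       # where in the product the defect was found
        description: str
        severity: str       # e.g., "major" or "minor"

    @dataclass
    class ReviewAssessment:
        defects_found: List[Defect] = field(default_factory=list)
        prep_minutes: int = 0      # time spent preparing for the review
        review_minutes: int = 0    # time spent in the review itself
        worth_rating: int = 3      # 1-5, with 5 = very worthwhile

Records like these make the review's costs and benefits explicit, which is exactly the data the students were asked to weigh.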
Table 5: Student use of activities (N=25)

Activity                                                            Never  Sometimes  Always
spend time thinking about the requirements to be sure you
  understand what the program should do                               0%      10%      90%
create a specification based on all the inputs and all the
  outputs the program should have                                     5%      70%      25%
create a written design                                              10%      80%      10%
review the design before implementing                                10%      65%      25%
ask peer to review design before implementing                        80%      20%       0%
ask instructor to review design before implementing                  75%      25%       0%
compare your design to an expert's design                            60%      35%       5%
create a written plan for testing the program                        60%      35%       5%
review the test plan                                                 60%      30%      10%
write the code                                                        0%       5%      95%
debug until the program runs                                          0%      15%      85%
systematically test using inputs                                      0%      30%      70%
systematically review the pre- and post-conditions                   10%      70%      20%
ask peer to review code                                              45%      50%       5%
compare your program with an expert's program                        55%      45%       0%
evaluate program                                                      0%      45%      55%
read and review the program documentation                             5%      30%      65%
write an improvement plan                                            45%      50%       5%
review requirements from the user's perspective                       5%      45%      50%
Supporting the review process meant explicit attention to design and code reading [54] by the students. This activity provides an excellent opportunity to introduce the cognitive apprenticeship model. The course instructor included code reading activities for the class in which he modeled the reading and comprehending activity. The use of explicit cues provided the students with initial strategies as they engaged in the activity. Checklists can serve as a form of scaffolding to help organize the students' early reading activities; later the checklists can be faded, allowing the students to determine the amount of external support they need to complete the task.

The culminating activity of the semester in this sophomore class was to reflect consciously on the semester's work. Each student had completed an initial survey, three project data sets, and three postmortem surveys; thus, the students had an enormous amount of information in their design notebooks. The final task was to have students assess their personal progress over the course of the semester by completing a survey identical to the initial survey (refer to Table 5 for a sample). The students were then asked to compare the two surveys, noting areas of significant change, and to speculate on the causes of that change.
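The comparison the students were asked to make is mechanical enough to sketch; the snippet below (our own illustration, with hypothetical data) flags activities whose reported frequency changed between the two surveys:

    # Illustrative only: comparing initial and end-of-semester survey
    # responses on the Table 5 scale (Never / Sometimes / Always).
    LEVELS = {"Never": 0, "Sometimes": 1, "Always": 2}

    initial = {"create a written design": "Never",
               "write an improvement plan": "Never"}
    final = {"create a written design": "Sometimes",
             "write an improvement plan": "Sometimes"}

    for activity, before in initial.items():
        change = LEVELS[final[activity]] - LEVELS[before]
        if change:
            direction = "more" if change > 0 else "less"
            print(f"{activity}: reported {direction} often at semester's end")

The point of the exercise, of course, is not the computation but the speculation about causes that follows it.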
The second course, scheduled to begin in January 1997, includes aspects of team-oriented processes. It is a junior-level software engineering course that involves group projects. The development of materials for this course involves specifying and implementing process roles and responsibilities akin to those presented by Werth [30]. We believe this sequencing is appropriate given our learning model and the skills and attitudes the students must acquire: it seems most reasonable to focus the students early on the individual at work, then progress toward the concerns of teams [41]. Furthermore, from our view as developers, the experience of developing individual process activities will certainly inform the development of comparable team activities.
5: Conclusions

We have described our initial efforts at integrating software process activities into the undergraduate computer science program. The results and activities described are preliminary in the sense that they represent outcomes from our first prototype. We have received sufficient positive indications from the students to suggest that our approach is viable and will produce some of the results we have discussed. In particular, the major impact of this project, if successful, will be to facilitate student use and understanding of best practices and reflective practice. Students who develop such skills will undoubtedly be successful in the software industry and, in the long run, improve the software industry itself. Although we do not downplay the importance of on-the-job training, such training in the current software industry has inadequacies similar to those of computer science curricula.

A second, perhaps more important, aspect of the project is the provision of a learning environment that helps students develop as reflective practitioners. Though we understand that reflection requires considerable time to evolve effectively, we feel that if students do not begin to develop reflective reasoning in their computer science major, it is unlikely that they will receive adequate compensatory training later. Some people, of course, develop such skills on their own, but recent assessments of the software industry indicate that most do not [55]. We do feel that competent software developers armed with significant skills in reflective practice are what the doctor ordered for the ailments currently afflicting the industry.

Our preliminary results indicate that an early focus on best practices, coupled with explicit attention to reflective activity by the students, can provide an excellent educational environment for computer science. This approach does not require massive reconceptualization of the computer science offerings, nor does it require that students learn less core computer science theory to accommodate the attention to practice-oriented skills. Adding laboratory components to pre-existing courses is efficient, because the lessons of process can be incorporated into assignments designed to teach programming or algorithmic skill. Furthermore, because the laboratory components can be piggybacked on a variety of specific exercises, they will be highly portable to other institutions. Students can, and should, be empowered early in their careers.

We believe, and when available our results should confirm, that the approach we have defined modifies in significant, positive ways the approaches our students use in program development, and that these changes are accompanied by an ever-increasing ability to plan and coordinate their activities. If the curricular changes proposed here do not result in more effective software engineering practices by students, the extensive formative and summative evaluations will point the way to improvements. In that case, the major outcome of the project will be an assessment of what works and what doesn't, plus a significant cognitive analysis of the reasons for failures and a series of recommendations that direct us toward solutions. From our perspective this is clearly a win-win situation. But much work remains to be done.
Note This project is supported, in part, by grant DUE-9555042 from the National Science Foundation. The material discussed in this paper is freely available at http://www2.umassd.edu/SWPI/NSF/material.html.
References

1. Curtis, B., M. I. Kellner, and J. Over (1992) Process Modeling. Communications of the ACM, 35. pp. 75-90.
2. Lai, R. (July 1994) The Move to Mature Processes. IEEE Software. pp. 14-17.
3. Cain, B. G. and J. O. Coplien (1993) A Role-Based Empirical Process Modeling Environment. In Proceedings of the Second International Conference on the Software Process. Los Alamitos, California: IEEE Computer Press. pp. 125-133.
4. McGarry, F., R. Pajerski, G. Page, S. Waligora, V. Basili, and M. Zelkowitz (1994) Software Process Improvement in the NASA Software Engineering Laboratory. CMU/SEI-94-TR-22, Software Engineering Institute, Carnegie Mellon University.
5. Perry, D. E., N. A. Staudenmayer, and L. G. Votta (July 1994) People, Organizations, and Process Improvement. IEEE Software. pp. 36-45.
6. Fischer, G., D. Redmiles, L. Williams, G. I. Puhr, A. Aoki, and K. Nakakoji (1995) Beyond Object-Oriented Technology: Where Current Approaches Fall Short. Human-Computer Interaction, 10. pp. 79-119.
7. Humphrey, W. S. (1989) Managing the Software Process. Reading, MA: Addison-Wesley.
8. Mills, H. (1988) Strategic Imperatives in Software Engineering Education. In G. A. Ford (ed.), Software Engineering Education: SEI Conference 1988. New York: Springer-Verlag. pp. 9-19.
9. Humphrey, W. S. (July 1995) Why Should You Use A Personal Software Process? Software Engineering Notes, 20(3). pp. 33-36.
10. Pfleeger, S. L. and H. D. Rombach (July 1994) Measurement Based Process Improvement. IEEE Software. pp. 9-11.
11. Basili, V. R., G. Caldiera, and H. D. Rombach (1994) The Experience Factory. Experimental Software Engineering Group, Department of Computer Science, University of Maryland.
12. Hsia, P. (September 1993) Learning to Put Lessons Into Practice. IEEE Software. pp. 14-17.
13. Brooks, F. (April 1987) No Silver Bullet: Essence and Accidents of Software Engineering. IEEE Computer. pp. 10-19.
14. Shaw, M. (November 1990) Prospects for an Engineering Discipline of Software. IEEE Software. pp. 15-24.
15. Fenton, N., S. L. Pfleeger, and R. L. Glass (July 1994) Science and Substance: A Challenge to Software Engineers. IEEE Software. pp. 86-95.
16. Anderson, J. (1987) Skill acquisition: Compilation of weak-method problem solutions. Psychological Review, 94. pp. 192-210.
17. Reder, L. and R. Klatzky (1994) The Effect of Context on Training: Is Learning Situated? CMU/CS-94-TR-187, School of Computer Science, Carnegie Mellon University.
18. Boardman, D. B. and A. P. Mathur (1994) A Two-Semester Undergraduate Sequence in Software Engineering: Architecture and Experience. In J. L. Díaz-Herrera (ed.), Software Engineering Education: 7th SEI CSEE Conference. New York: Springer-Verlag. pp. 5-22.
19. Modesitt, K. L. (1994) When the Golden Arches Aft Agley: Incorporating Software Engineering into Computer Science. In J. L. Díaz-Herrera (ed.), Software Engineering Education: 7th SEI CSEE Conference. New York: Springer-Verlag. pp. 35-61.
20. Gotterbarn, D. and R. Riser (1994) Real-World Software Engineering: A Spiral Approach to a Project-Oriented Course. In J. L. Díaz-Herrera (ed.), Software Engineering Education: 7th SEI CSEE Conference. New York: Springer-Verlag. pp. 119-150.
21. Richardson, W. E. (1988) Undergraduate Software Engineering Education. In G. A. Ford (ed.), Software Engineering Education: SEI Conference 1988. New York: Springer-Verlag. pp. 121-144.
22. Brown, N. (July 1996) Industrial-Strength Management Strategies. IEEE Software. pp. 94-103.
23. Yourdon, E. (1996) Rise & Resurrection of the American Programmer. Upper Saddle River, NJ: Prentice Hall.
24. Prey, J. C., J. P. Cohoon, and G. Fife (1994) Software Engineering Beginning in the First Computer Science Course. In J. L. Díaz-Herrera (ed.), Software Engineering Education: 7th SEI CSEE Conference. New York: Springer-Verlag. pp. 359-374.
25. Knight, J., J. Prey, and W. Wulf (1994) Undergraduate Computer Science Education: A New Curriculum Philosophy and Overview. Proceedings of the ACM SIGCSE Symposium. New York: ACM Press. pp. 155-159.
26. Denning, P. J., D. Menasce, and J. Gerstner (1995) Re-engineering the Engineering School. ASEE Conference Proceedings.
27. Moore, M. and C. Potts (1994) Learning by Doing: Goals and Experiences of Two Software Engineering Project Courses. In J. L. Díaz-Herrera (ed.), Software Engineering Education: 7th SEI CSEE Conference. New York: Springer-Verlag. pp. 151-164.
28. Werth, L. (1994) An Adventure in Software Process Improvement. In J. L. Díaz-Herrera (ed.), Software Engineering Education: 7th SEI CSEE Conference. New York: Springer-Verlag. pp. 191-210.
29. Robillard, P. N., J. Mayrand, and J. Drouin (1994) Process Self-Assessment in an Educational Context. In J. L. Díaz-Herrera (ed.), Software Engineering Education: 7th SEI CSEE Conference. New York: Springer-Verlag. pp. 211-225.
30. Werth, L. (1995) Software Process Improvement for Student Projects. IEEE 1995 Frontiers in Education Conference.
31. Collins, A., J. S. Brown, and S. E. Newman (1989) Cognitive apprenticeship: Teaching the crafts of reading, writing, and mathematics. In L. B. Resnick (ed.), Knowing, learning, and instruction: Essays in honor of Robert Glaser. Hillsdale, NJ: Erlbaum. pp. 453-494.
32. Collins, A., J. S. Brown, and A. Holum (Winter 1991) Cognitive apprenticeship: Making thinking visible. American Educator. pp. 6-11, 38-46.
33. Bielaczyc, K., P. L. Pirolli, and A. L. Brown (1995) Training in Self-Explanation and Self-Regulation Strategies: Investigating the Effects of Knowledge Acquisition Activities on Problem Solving. Cognition and Instruction, 13. pp. 221-252.
34. Chi, M. T. H., N. De Leeuw, M. Chiu, and C. LaVancher (1994) Eliciting Self-Explanations Improves Understanding. Cognitive Science, 18. pp. 439-477.
35. Guzdial, M., J. Vanegas, F. Mistree, D. Rosen, J. Allen, J. Turns, and D. Carlson (1995) Supporting Collaboration and Reflection on Problem-Solving in a Project-Based Classroom. GIT-EduTech-95-15, The EduTech Institute, Georgia Tech.
36. Sims-Knight, J. E. and R. L. Upchurch (1992) Teaching object-oriented design to nonprogrammers: A progress report. Proceedings of OOPSLA-92 Educators' Symposium. Vancouver, British Columbia, Canada.
37. Sims-Knight, J. E. and R. L. Upchurch (April 1993) Teaching software design: A new approach to high school computer science. Paper presented at the annual meeting of the American Educational Research Association, Atlanta, GA.
38. Sims-Knight, J. E. and R. L. Upchurch (1993) Teaching Object-Oriented Design Without Programming: A Progress Report. Computer Science Education, 4. pp. 135-156.
39. Basili, V. R., G. Caldiera, and H. D. Rombach (1994) The Goal Question Metric Approach. Experimental Software Engineering Group, University of Maryland.
40. Gresse, C., B. Hoisl, and J. Wüst (1995) A Process Model for GQM-Based Measurement. STTI-95-04-E, Software Technology Transfer Institute, University of Kaiserslautern.
41. Humphrey, W. S. (1995) A Discipline for Software Engineering. Reading, MA: Addison-Wesley.
42. Linn, M. C. and M. J. Clancy (1992) Can Experts' Explanations Help Students Develop Program Design Skills? International Journal of Man-Machine Studies, 36. pp. 511-551.
43. Pfleeger, S. L. (1996) Integrating Process and Measurement. In A. Melton (ed.), Software Measurement. London: International Thomson Computer Press. pp. 53-74.
44. Humphrey, W. S. (May 1996) Using a Defined and Measured Personal Software Process. IEEE Software. pp. 77-88.
45. Schön, D. and J. Bennett (1996) Reflective Conversation with Materials. In T. Winograd (ed.), Bringing Design to Software. New York: ACM Press. pp. 171-184.
46. Paulk, M. C., B. Curtis, M. B. Chrissis, and C. V. Weber (1993) The Capability Maturity Model for Software, Version 1.1. CMU/SEI-93-TR-24, Software Engineering Institute, Carnegie Mellon University.
47. Collier, B., T. DeMarco, and P. Fearey (July 1996) A Defined Process for Project Postmortem Review. IEEE Software. pp. 65-71.
48. Ford, G. (1993) Lecture Notes on Engineering Measurement for Software Engineering. CMU/SEI-93-EM-9, Software Engineering Institute, Carnegie Mellon University.
49. Bieman, J. M., N. Fenton, D. A. Gustafson, A. Melton, and L. M. Ott (1996) Fundamental Issues in Software Measurement. In A. Melton (ed.), Software Measurement. London: International Thomson Computer Press. pp. 39-52.
50. Fagan, M. E. (1976) Design and Code Inspections to Reduce Errors in Program Development. IBM Systems Journal, 15. pp. 182-211.
51. Fagan, M. E. (1986) Advances in Software Inspections. IEEE Transactions on Software Engineering, 12. pp. 744-751.
52. NASA (April 1993) Software Formal Inspection Standard. NASA, Office of Safety and Mission Assurance, NASA-STD-2202-93.
53. NASA (August 1993) Software Formal Inspections Guidebook. NASA, Office of Safety and Mission Assurance, NASA-GB-A302.
54. Deimel, L. E. and J. F. Naveda (1990) Reading Computer Programs: Instructor's Guide and Exercises. CMU/SEI-90-EM-3, Software Engineering Institute, Carnegie Mellon University.
55. Bryan, G. E. (1995) Not All Programmers Are Created Equal. UCIrv-95-PROC-CSS-006, University of California, Irvine.
Appendix A - Postmortem Survey

1. Were the computing resources you needed to complete the project by the due date available, and consistent with your schedule and work habits?
   Always Available / Most of the Time / Some of the Time / Never Available

2. In your estimation, was the time allotted by the instructor adequate for the project?
   More than adequate / Adequate / Not Sufficient / Needed much more time

3. In your estimation, was the time you needed to complete the project consistent with the time provided?
   Was given too much time / Approximately the same / Somewhat rushed / Needed more time

4. Was the project explained adequately prior to your initiating work?
   More than adequate explanation / Adequate explanation / Less than adequate explanation / Insufficient explanation

5. Did you seek sufficient clarification of the problem prior to or during the project?
   Adequate explanation given / Questions and answers posted helped clarify / Sought additional clarification prior to beginning / Never received adequate explanation

6. Once the project was assigned, did you establish a plan identifying important milestones to achieve in order to complete the project by the due date?
   Detailed Plan / Outline of Plan / Mental Plan / No Plan

7. Did you complete a preliminary design of the project as either a structure chart, pseudocode, or program skeleton prior to coding?
   Did a design first / Alternated between design and code / Coded the complete project / Alternated code and debug

8. List at least two things you would repeat, if you had to do this project over again.

9. List at least two things you would change about your development activity, if you had to do this project over again.
Appendix B - Exit Survey

1. I enjoyed the course
   not at all / a little / some / quite a bit / a lot

2. I learned
   nothing / a little / some / quite a bit / a lot

3. What experience(s) in this course have been the least rewarding?

4. What material in this course did you find most interesting, and why?

5. Did this course meet your expectations? Explain.

6. What skills related to constructing better and more robust programs did you gain from this course?

7. What skills related to coordinating and controlling your resources in getting your work done did you gain from this course?

8. What do you consider to be the ideal characteristics of a program? If you were the instructor, what criteria would you establish and then use to assess the worth of the programs submitted?

9. What one positive experience during the course stands out most?

10. What one negative experience during the course stands out most?

Appendix C - Postmortem Process Guide
1. What is a project postmortem?
   Read the material provided and participate in the discussion during the first part of lab. This sets the stage for the activities that follow. You will conduct a postmortem for each project in this course.

2. Personal Survey
   Complete the postmortem survey provided at the above link.

3. Software Measurement

4. Data Collection
   Take the program you submitted for the project, and collect the data requested based on the definitions of the measures provided.

5. Results
   What do you see? Does reviewing your own work help you see and think about things you can focus on? Write your results as comments on your postmortem material page.

6. Personal Goals
   Define your personal goal for the next project. Be sure to define the goal and those actions you believe you can take to achieve that goal.

7. Debriefing