Appeared in the 8th Conference on Software Engineering Education (CSEE'95), New Orleans, LA, Lecture Notes in Computer Science, Rosalind Ibrahim (Ed.), Springer-Verlag, April 1995.

Teaching More Comprehensive Model-Based Software Engineering: Experience With Objectory's Use Case Approach

Robert F. Coyne, Engineering Design Research Center ([email protected])
Allen H. Dutoit, Department of Electrical and Computer Engineering ([email protected])
Bernd Bruegge, School of Computer Science ([email protected])
David Rothenberger, School of Computer Science ([email protected])

Carnegie Mellon University, Pittsburgh, PA 15213

Abstract. This paper is an experience report and discussion of an experiment in teaching and using a comprehensive model-based methodology and tool (Objectory) in a large software project. The paper describes in detail the preparation of the experiment, states our assumptions during the planning phase, and presents the results together with a discussion of the most important issues. We observed certain improvements in the productivity and understanding of the students, while discovering a number of non-trivial organizational and pedagogical issues that remain to be solved (e.g., communication latency and breakdown, project set-up time, training time). We believe that this paper will provide valuable insights to the reader at a number of different levels: 1. to those interested in state of the art software engineering methodologies; 2. to those interested in Objectory per se; 3. to those interested in the issues of introducing a sophisticated modeling methodology into a pilot project involving a large number of participants.1

1 Introduction

The intent of this paper is to report on an experiment in introducing new technology, in the form of a new object-oriented development methodology2 that offers more extensive artifact3 modeling, in a recent offering of our software engineering course in the School of Computer Science at Carnegie Mellon University. We expected the methodology to enhance support for teaching and practicing more comprehensive model-based software engineering, and here we report on the results.

1. This research was sponsored in part by the Division of Undergraduate Education, National Science Foundation grant number USE-92511836, by grants from Bellcore Communication Research and the Wireless Research Initiative at Carnegie Mellon's Information Networking Institute, and by the Engineering Design Research Center, an NSF Engineering Research Center.
2. According to the Oxford Dictionary, methodology can mean both the science of methods and a body of methods. We use the term methodology to mean a body of methods applied to the development of software systems.
3. As defined in [6], artifact modeling denotes the set of activities covering pre-development and development. This includes pre-development processes such as concept exploration and requirements elicitation.

Our goal in teaching software engineering in the School of Computer Science of Carnegie Mellon University has been to provide a realistic software engineering experience to students. We have done this by immersing students in a single, team-based system design project targeted to build and deliver a complex software system for a real client. We have previously reported on our goals, "successes", emerging issues and problems, and continuing plans with teaching a project-based software engineering course [4, 5, 6, 7]. In particular, we decided to extend the exposure of the students to requirements engineering and its influence on iterative design, system decomposition and object identification. It should be noted that we had already been using object-oriented software development methodologies and CASE tools effectively, but found certain limitations with most of the available methodologies and tools, such as OMT (Object Modeling Technique [23]), particularly in the area of requirements engineering [5]. In [6] we described some of the complexities of enabling iterative design and decomposition within the context of a single-semester, team-based project that has the firm goal of moving through the whole development cycle and delivering a prototype system to a client. There we conjectured that the OOSE/Objectory™ methodology,4 with its central focus on use cases as a requirements engineering thread that runs through all development phases, appeared to be a strong candidate to help address many of these issues and had the potential to satisfy the objective of enabling the teaching of more comprehensive artifact modeling.

We considered the risks of introducing Objectory into the course not primarily on the merits of Objectory itself -- we had enough knowledge of and experience with it to be comfortable with its quality. The larger sources of risk were the limited preparation time and the large number of tasks typical for launching a project-based course, the constrained timeframe for the project overall (5 months, counting 1 month of pre-course requirements modeling), and the limited overall experience of the participants with the methodology. Nevertheless, we considered the attempt to use a new "state of the art" methodology to be a well-motivated "experiment" with some risks for the students but higher potential benefit in comparison to reusing other methodologies that we had previously experimented with. In this context, we report here on our experience with using Objectory.

In Section 2, we give an overview of Objectory and the significant features that motivated our use of it. This is followed by an introduction to the course project in Section 3 in terms of a summary of the requirements for the system to be built. In Section 4, we present how we set up the project organization, development process and teaching structure to manage the project and the introduction of Objectory into the course. This is presented in an "as planned and ideally how we hoped it would go" mode. Section 5, paralleling each of the major areas presented in Section 4, gives an experience report on what actually happened. We then discuss the consequences, observations and insights that emerged from the overall experience in Section 6. The paper concludes, in Section 7, with an outline of our future plans and considerations with respect to continuing to use Objectory or other methodologies in our course, and with a list of general recommendations for those considering the use of Objectory.

Though we report here on the use of Objectory in an academic setting, the issues and insights should be of general interest because (1) the course reflects the "state of the art" both in the approaches and tools used and in the problems encountered, and (2) our use of Objectory was as intensive and complex as any application of the methodology and tool that we were aware of as of Fall 1993 in the USA. We believe that the insights gained will be valuable to others at several different levels, to those who are:
• interested in state of the art software engineering methodologies,
• considering using Objectory per se,
• concerned with how to incorporate more requirements modeling in a software engineering course,
• interested in the issues and consequences of introducing a sophisticated modeling methodology into large scale projects involving a large number of participants,
• interested in the dynamics of the interaction of team-based processes, communication, and negotiation with the artifact modeling phases and products of a methodology, and how to manage iteration in that context.

4. Objectory™ is a trademark of Objective Systems SF AB. It is a process and CASE tool supporting the Objectory object-oriented analysis and design methodology. In our use of it in Fall 1993, as reported here, it was in Version 3.3. In that version, the process and methodology were supported by ~1200 pages of manuals [20] (primarily Overview, Process, Guide and Tool). OOSE (Object-Oriented Software Engineering) is the name given to the methodology in a ~200 page book [14] that is an introduction to Objectory. In the remainder of this paper, unless we are referring specifically to OOSE, we shall refer to both OOSE and Objectory generically as Objectory.

2 Motivation for using Objectory: the Use Case approach

Currently, there are many competing object-oriented methodologies and tools with widely varying degrees of maturity -- even the "best" are all still rapidly evolving. However, by all accounts, Objectory is a methodology that is being taken seriously by commercial developers and by educators.5 In comparisons with other major object-oriented methodologies (such as OMT [23] and Rational Rose [2, 3]) it consistently rates well [19, 17] as a comprehensive methodology. The book popularizing the methodology as OOSE [14] has received a number of positive reviews in a wide variety of the popular forums for object-oriented issues [1]. Additionally, the use case approach, as an early, up-front focus in system modeling, is now being adopted or adapted by many other OO methodologies6 [9, 22], and behavior modeling of systems from the user's perspective is included in some form and at some stage in most current methodologies [6]. However, our consideration and choice of Objectory was guided more by our positive experience in using it on complex industry and research projects [10, 12] and by the limitations of other object-oriented methodologies that we had used [5, 6]. This section briefly describes the methodology and discusses its relevance to the issues and problems we have encountered in moving toward our overall goal of teaching (and practicing) effective model-based software engineering in a project course. In particular, we believe it has great potential for satisfying the objective of supporting more comprehensive artifact modeling. In comparison to OMT and other object-oriented (software engineering) methodologies, Objectory appears to provide more and better integrated modeling methods and notations covering the software engineering lifecycle.

5. In addition to our use of Objectory at Carnegie Mellon University in Fall 1993, it was used by Dr. Gregory Abowd in teaching the 5-week segment on object-oriented analysis and design as part of a comparative Methods course in the MSE program of the Software Engineering Institute and the School of Computer Science.
6. Personal conversation with Grady Booch, Carnegie Mellon University, Spring 1994.

2.1 A brief overview of Objectory

Objectory provides modeling and documentation facilities to represent the system being designed at various levels of abstraction and detail. Fig. 1. shows the entry port for the tool and describes two major components which allow users to do modeling and to create documents based on the models. Systems enables selection of a database of models for a particular software artifact. System browser provides facilities for the creation of models and structured annotations for actors, use cases and objects; through it one can create links between use cases and actors, or between one object and another, browse through any model in any detail, and cross-index items from one model to another. Document archive assembles all of the textual information generated in the model, and some of the graphical information, into structured documents; additional information about the system and models is collected in special sections and user-defined sections.

Fig. 1. Objectory Entry Port

Objectory is distinguished from other current object-oriented approaches primarily by its reliance on system behavior modeling to identify and classify the objects in the system. Behavior modeling of a system results in a high-level description of the system's functionality from a user's point of view. Its main purpose is to communicate the intended uses of the system to clients (especially potential users) and among system developers. The behavioral description is based on actors and use cases. Actors model prospective users -- everything external communicating with the system, including other systems. An actor represents a certain role, or class description of behavior, rather than an actual person who uses the system. Actors are the main tool for finding use cases. Each use case describes a possible sequence of interactions among the system, one initiating actor and possibly other actors -- the use case follows a thread of control in and out of the system. Each use case is a description of a set of scenarios in the same sense that a class is a description of a set of instances (objects). Use cases are the central concept running through the whole Objectory development process. Taken together, actors and use cases define the complete functionality of the system: each system will have a finite number of use cases, but they must all be enumerated in order to understand what the system does. In addition, they support understandability, extensibility and maintainability in the face of evolving system requirements.

System development in Objectory is viewed as model building based on three main development processes: analysis, construction, and testing. Each of these processes produces one or more associated models: the analysis process produces the requirements model and the analysis (object) model; the construction process produces the design and implementation models; and the testing process produces the testing model. The production of each of these models constitutes a phase within an overall iterative process, as illustrated in Fig. 2., adapted from materials provided in the Objectory manuals and training materials:

(Figure: across the analysis (requirements analysis and robustness analysis), construction and testing processes, the use case model is expressed in terms of the domain object model, structured by the analysis object model, realized by the design object model, implemented by the implementation model, and verified by the test model.)

Fig. 2. Phases, iterations and the influences of use cases in the Objectory methodology

• The requirements model (the result of requirements analysis) aims at delimiting the system and defining what functionality the system should offer in terms of actors and use cases. Use cases are natural language descriptions of all possible scenarios involving the system. They are written to be readable by all stakeholders in the system. Fig. 3. shows a draft of the main section of a use case, called the Flow of Events.7 (A hypothetical sketch of such a flow of events is given after this list.) An integral part of the development of use cases is a glossary of key terms and concepts, which forms the basis for communication between developers, sponsors, and users. This glossary later becomes a basis for the identification of objects in the system. The requirements model thus describes the developers' view of what the customer wants.
• The analysis object model (the result of the robustness analysis phase) aims at structuring the system independently of the implementation environment. It captures information, behavior and presentation in three respective object types: entity objects, interface objects, and control objects. The analysis model derives from the requirements model and forms the basis of the system architecture. Fig. 4. is a typical view used to show communication associations between the analysis objects participating in a use case.

7. Figures 3 - 5 show a sequence of models or views from different phases of Objectory related to the same use case. This modeling was produced by students in the course that we report on here, for the use case Close Emergency Incident (as listed in Section 4.1). We make no claims with respect to the content, quality or consistency of these models, but for realism we include them as examples of the use of Objectory rather than the characteristic illustrative examples such as ATM machines.

Fig. 3. Text of a use case as seen in Objectory

• The design model refines and formalizes the analysis model. Analogous to the analysis model, it is formulated in terms of objects and associations. However, it takes into account the implementation environment and is much closer to the code produced in the next phase. It defines explicitly the interfaces of objects and the semantics of their operations.
• The implementation model consists of the source code written for the system. It implements each specific object specified in the design model.
• The test model is developed to support the verification of the developed system. It consists mainly of documentation of test specifications and test results. Testing starts at the lower levels (unit testing), in order later to cover the use cases and finally the whole system (integration testing). Thus the requirements model supports and is verified by the testing process.
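As a hypothetical illustration of the flow-of-events format (our own sketch, not the students' model shown in Fig. 3), a flow of events for the Report Emergency Situation use case listed in Section 4.1 might read:

Use case: Report Emergency Situation
Initiating actor: Field Supervisor
Flow of events:
1. The Field Supervisor selects "Report Emergency Situation" on the mobile unit.
2. The system prompts for the incident location and a short description of the situation.
3. The Field Supervisor enters the information (e.g., by touch screen, pen, or voice input).
4. The system timestamps the report, attaches the reporting unit's current location, and transmits it as a FRIENDgram to the Dispatcher at headquarters.
5. The system archives the transaction for future analysis.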

Fig. 4. A view of robustness analysis objects participating in a use case as seen in Objectory

Fig. 5. An Interaction Diagram as seen in Objectory

In actual development, each of these models will undergo many changes before it becomes stable. Generally, work on a successive modeling phase is not initiated until results from the previous phase have stabilized; however, iteration between phases is the norm. Each of these modeling phases has a methodology associated with it to build and refine the model in an iterative fashion.

2.2 Significant features

In general terms, Objectory offers support in many areas relevant to the issues and problems we are currently encountering in our course, such as support for teaching iterative system design.
• Use cases, as a single thread running through all the modeling levels of the Objectory methodology, support the following activities: uncovering system requirements; clarifying issues and properties in the application domain, which provides a context for the identification of objects; guiding the detailing and evaluation of design alternatives; and testing and validating designs.
• Most object-oriented analysis and design techniques claim that the most stable systems are built by using objects that correspond to real-life entities. Objectory augments this practice by basing the object model on three object types: entity, interface and control objects. These correspond roughly to the model, view and controller dimensions of the Model View Controller paradigm, or more generically to presentation, task (use case sequencing), and domain (application subject) modeling. The rationale for the three object types is to localize changes occurring during maintenance and upgrades, thus providing greater stability. For example, behavior that is placed in control objects would in other methodologies be distributed over several entity objects, which makes the behavior harder to change.
• At the design level, a notation and modeling methodology called Interaction Diagrams is provided for synthesizing and describing the interactions between objects; these are initially specified as operations on the objects and ultimately refined into message sends (see Fig. 5.).
• Objectory provides a framework for establishing the architecture of the system (focusing on components and levels that will typically be part of any system [8]), operationalized as interface, control and entity levels and their corresponding objects. By decoupling these components in the system architecture, modifications and iteration are more easily accommodated. Also, by isolating these components from one another and establishing early the principles and protocol for their interaction, more rapid prototyping of the user interface is possible, and the development group may gain experience quickly by fast-tracking a vertical slice of the system through all phases of the methodology.

More specifically, Objectory addresses several of the deficiencies we cited in our experience with OMT in [6]: it supports requirements engineering through use case modeling; it differentiates between the analysis, design and implementation levels of objects by providing new notations, methodologies and terminology for each level that capture appropriate detail while preserving the modeling information from the levels above; and modeling concepts and methodologies similar to those for identifying and designing objects can be applied recursively to groups of objects packaged as subsystems.

Based on our experience with applying Objectory in an industrial software engineering project and our knowledge of other existing methodologies, we expected it to provide a more comprehensive set of modeling notations and methodologies at appropriate levels of abstraction with traceability between them to adequately support artifact modeling.
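To make this traceability from analysis objects down to code more concrete, the following minimal C++ sketch (C++ was the implementation language used in the course) shows how the three object types and a communication association might eventually surface in an implementation. It is a purely hypothetical illustration under our own assumptions: the class names and members are invented, and it is neither the students' FRIEND code nor output generated by Objectory; only the roles of entity, interface and control objects, the realization of an association as a pointer, and the realization of an interaction-diagram message as a member function call are intended to carry over.

// Hypothetical sketch of the three Objectory analysis object types in C++.

#include <string>
#include <vector>

// Entity object: domain information that outlives any single use case.
class EmergencyIncident {
public:
    explicit EmergencyIncident(const std::string& location)
        : location_(location), open_(true) {}
    void addReport(const std::string& text) { reports_.push_back(text); }
    void close() { open_ = false; }
    bool isOpen() const { return open_; }
private:
    std::string location_;
    std::vector<std::string> reports_;
    bool open_;
};

// Interface object: presentation toward an actor (here, the Dispatcher).
class DispatcherConsole {
public:
    void showIncident(const EmergencyIncident& /*incident*/) {
        // Rendering details (a Tcl-based user interface in the course) omitted.
    }
};

// Control object: behavior specific to the Create Emergency Incident use
// case, kept out of the entity objects so that it is easier to change.
class CreateIncidentControl {
public:
    explicit CreateIncidentControl(DispatcherConsole* console)
        : console_(console) {}               // communication association -> pointer member

    // Each step corresponds to a message in the interaction diagram.
    EmergencyIncident* execute(const std::string& location) {
        EmergencyIncident* incident = new EmergencyIncident(location);
        console_->showIncident(*incident);   // message -> member function call
        return incident;                     // ownership handling omitted in this sketch
    }
private:
    DispatcherConsole* console_;
};

In such an arrangement, a change to how incidents are presented is confined to the interface object, and a change to the steps of the use case is confined to the control object, which is the change-localization argument made in Section 2.2.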

3 Course project: the FRIEND system and client

In order to provide as realistic a software engineering experience as possible, the system that the students were asked to build was directed toward solving an actual problem specified by a real client. Chief Bookser, of the Bellevue Police Department, attended several meetings with the instructors and teaching assistants (TAs) of the course to formulate the problem statement for an emergency response system. The problem statement was then used by the instructors and the students to develop a complete set of requirements for the system. A summary of the generated problem statement is provided in Table 1.

Table 1. Problem statement summary

Weaknesses of today's emergency handling. Most local government emergency handling systems are established for normal call loads and are immediately overloaded when an unusual event or multiple events occur. During an emergency, responders may acquire information useful to the decision makers. This information is not transmitted by radio, unless asked for, due to limited air time. Government services possess a wealth of information invaluable during emergencies. Fire, police, emergency medical services, public works and building inspection departments compile large amounts of paper-based information. During an emergency, this paper-based information is extremely hard to use, and is rarely used. Crucial information is not always immediately available during emergencies. Incident location information, such as hazardous materials, gas line locations, etc., is not available or does not exist. There is a need for a system which would speed up response time to emergencies and provide information to make better decisions.

Functional requirements. The objective of the FRIEND project is to build a system that handles multiple emergency incidents at one time under unusual call loads. The system would be prepared for a worst case emergency scenario. The system would allow quick and remote response between police officers, fire fighters, maintenance workers, and the headquarters dispatcher. A wide bandwidth will be used, so that the need for acknowledgment before transmission would not be a constraint. The system would provide instant use of information, such as:
• Geographic information - such as city maps including above- and underground utilities.
• Hazardous materials information (Hazmat).
• Building-specific information - such as gas lines, water lines, fire hydrant locations, exit doors, floor maps, etc.
• Available government resources - such as the Emergency Operations Plan (EOP) and other resources that may help.
Following guidelines and the Emergency Operations Plan, the system would automatically notify personnel in the appropriate department, create task lists to follow and allocate resources, as well as perform any other tasks that would save time during an emergency. Every command post vehicle interacting with the FRIEND system is to have a mobile computer that uses wireless communication to report to the headquarters dispatcher. The goal is to replace the first responders' reporting input mechanism with an easy and responsive user interface: voice recognition, a touch screen or a pen-based system. All transactions on the system must be archived for future analysis.

Non-functional requirements. The FRIEND system is to be used by the Bellevue Police Department as a stand-alone product. FRIEND is also to be used later by a group of municipalities, with the ability to provide higher levels of government with local information. The hardware used by field service personnel will be exposed to weather and rough duty. FRIEND should be portable to existing hardware available in local governments. The prototype is expected to be scalable. System design should, therefore, be engineered to be expandable to deal with accidents at the state and federal levels.

Acceptance criteria. The client considers this problem statement to be a broad definition and does not expect all functionality to be delivered this semester. During the requirements engineering phase of the project the client will negotiate with the software engineers an acceptable prototype for delivery. After the negotiation phase the specific requirements for the client acceptance test will be frozen. The client expects to sign off on the negotiated deliverables within 4-6 weeks of the client presentation. As a minimum the client expects the delivered prototype to be expandable in a future course. As a minimum acceptance test the client expects the negotiated prototype to be successfully demonstrated on the Andrew system (campus network) with a wireless component. As a desired acceptance test the client expects the prototype to be successfully demonstrated in a field trial at the Bellevue Police Department.

It is to be noted that the target system described by the above problem statement poses problems similar to those of complex systems developed in industry. FRIEND is a distributed system, spanning a wide spectrum of implementation technologies (Global Positioning System (GPS), speech recognition, wireless communication) and operating environment constraints (rough duty use, less than ideal communication media, backward compatibility with existing platforms, and so on).

4 Project organization, development process and teaching structure

As we have characterized elsewhere [6] and related to the IEEE 1074 standard on the software process life cycle [13], there are three central issues in the structuring and management of a project course: (1) the software process model utilized, (2) selection and adaptation of the software development methodology, and (3) the project management infrastructure (including information and communication structures).

Introduction of Objectory to the course most directly impacts category 2. Our use of it required an adaptation to the class context: tight time constraints of a single semester, intensive use by instructors and TAs with little or no experience with the methodology, and students with no experience in object-oriented concepts or software engineering methodologies. The introduction of Objectory also required some reconsideration and adjustments in terms of category 1, but we expected Objectory to help in better supporting our commitment to a spiral model and iterative development process. (We have already explained in the overview of Objectory how it divides the system development process into phases and employs use cases as a guiding thread for transitioning and iterating between phases.)

We considered that using Objectory was orthogonal to how we would handle category 3. Objectory does not provide much guidance about project management or communication issues. For that reason, we adapted our previous approach to project organization (team decomposition, team meetings, use of electronic bboards, and so on)

to the structure resulting from our initial decomposition of the system into groups of use cases. Special-purpose teams were also formed to handle technological problems such as the communication layer and the user interface layer. We were aware of limitations of this approach to system decomposition, organization and communication [6, 7], and in retrospect this did contribute to problems that will be discussed in Section 5 and Section 6. However, we chose to minimize the risks to students by not introducing too many changes to the course during the same semester.

We concentrated on which parts of Objectory to use, and on when and how to manage them, both from the project development and the teaching perspective. One of the problems encountered when adapting Objectory is the very richness and complexity that it provides; it is a sophisticated modeling methodology with many options, and one must understand their significance in order to decide whether to use or exclude them. Objectory actually provides more functionality than we used. For the purpose of customization, Objectory provides the concept of Development Cases and some guidance as to how to adapt the methodology to the local context and needs. We made only very informal use of these materials, judging that they would require more expert consultation (as Objectory regularly provides to clients in this regard) and time to fully detail and document a development case. Instead, we relied upon our own experience with Objectory and informally adopted a modeling sequence that nearly matches one of their primary development cases. The sequence we used is:
• use cases,
• analysis object modeling (robustness analysis),
• design object modeling, including object interactions,
• code generation and coding.
Our plan for system development was to go through two major iterations of these major phases of Objectory, primed by an initial pre-engineering of use cases by the project management. We discuss how we planned to execute these two cycles in Section 4.6, and how they actually went in Section 5.6. The rest of this section elaborates on the preparation of the course in an "ideally and as planned" mode. The next section (Section 5) describes what actually happened.

4.1 Problem statement and initial use cases

The course started on August 26, 1993 and the final client review took place on December 15, 1993. Two months before the course, the instructors and a subset of the TAs met with Chief Bookser of the Bellevue Police Department to express the initial requirements. A problem statement describing the context and the problem to be solved was written first. From the problem statement, an initial set of use cases was then developed (with the participation of the client), structuring the high-level classes of functionality that the FRIEND system was to provide. The purpose of the initial requirements was twofold: on one hand, it served as hands-on Objectory training for the TAs and instructors; on the other hand, it provided an actual system decomposition which enabled us to establish corresponding project teams, and it gave the students a starting point in developing the full requirements of the system, thus saving some time by setting the limits of the system. The initial set of primary actors identified was: Data Maintainer, Dispatcher, Field Supervisor, FRIEND Operator, and Resource Allocator. System Administrator was identified as a secondary actor.
The initial use case groups and use cases for the primary actors were:

Communication: Receive FRIENDgram, Receive High-Priority FRIENDgram, Send FRIENDgram, Send High-Priority FRIENDgram.
Coordination: Close Emergency Incident,8 Create Emergency Incident, Document Emergency Situation, Report Emergency Situation.
Location: Get Location of Mobile Resources, Get Location of Static Unit, Identify Site Location, Update Location of Resources, Get Building-Specific Information.
EOP: Request Emergency Operations Plan Information, Create Request.
GIS: Highlight Building on Street Map, Highlight Mobile Resources on Street Map, Show Street Map Around Emergency Incident Area.
Hazmat: Report and Identify Hazardous Materials Involved in Emergency Incident, Request Hazardous Material Information.
Resources: Allocate a Resource, Deallocate a Resource, Disable a Resource, Enable a Resource, Reallocate a Resource.

In presenting this initial set of use cases, we communicated to the students that we had engaged in a process of requirements engineering in order to provide a modifiable and extensible core system. Emergency response is a vast and complex context for system design, and the scope of the proposed system could grow enormously to encompass many interrelated functionalities, including expansion from local to state and federal emergency management agencies. To support our choice of targeted use cases for the project, and to emphasize that use case modeling helps to support incremental and iterative development, we also described possible growth paths for the core system by identifying optional and potential sets of use cases. For instance, a group of use cases for the System Administration actor was named but not further developed. Students were told that they were responsible only for use cases related to core functionality, and that this use case model should be refined and updated continuously throughout the development as needed.

4.2 Background and experience of participants

One of the instructors and two of the TAs had experience with Objectory prior to the class. The rest of the project management team had prior experience with object-oriented software engineering methodologies (e.g. OMT). Most of the students had little or no background in object-oriented methodologies or in object-oriented technologies (e.g. C++, OODBMS,9 etc.). Most students had no experience with large scale software development. The students who did have prior experience in object-oriented programming were carefully assigned to different teams in order to even out the strength of the teams.

8. Figures 3 - 5 show a sequence of parts of student models and views produced for this use case in Objectory.
9. Object-Oriented Database Management Systems.

4.3 Project and team organization

Based on the initial requirements, a preliminary decomposition of the system into subsystems was done by the TAs and instructors. A team of students supervised by a single TA was then assigned to each subsystem. Each team also had responsibility for the refinement of a number of use cases (and their associated object models and interaction diagrams) which exercised their subsystem. The primary responsibility of each team was the design, construction and testing of their assigned subsystem.

Each team met formally on a weekly basis. Team members communicated with teammates and with other teams through bboards, personal email or informal meetings. Issues requiring special treatment were dealt with during special meetings attended by representatives of the relevant teams. Students were responsible for producing meeting agendas and minutes, which were then made available to the rest of the class through bboards and through a FrameMaker10 hypertext document browser. Project management also met weekly to deal with organizational and educational issues. Project management did not deal with artifact issues, responsibility for which was left entirely to the students.

Aside from the subsystem teams, a documentation team, a user interface team and an integration team were formed to deal with specific tasks (respectively, documentation integration, user interface design and construction, and model integration). The documentation team was composed of a TA and a student from the English department. The integration team was composed of representatives from each of the subsystem teams. Each user interface team member was responsible for attending the weekly meetings of one of the subsystem teams. The project organizational structure is illustrated in Fig. 6.

(Figure: the project was organized as a management group of 2 instructors and 8 TAs, together with the client and an industrial advisor, overseeing the following teams: Communication (1 TA, 6 students), Coordination (1 instructor, 6 students), Location (1 TA, 3 students), EOP (1 TA, 5 students), GIS (1 instructor, 4 students), Hazmat (1 TA, 5 students), Resource (1 TA, 4 students), Input (1 TA, 5 students), UI (1 TA, 7 students), Documentation (1 TA, 1 student), and Integration (1 TA, 9 students). Total: 2 instructors, 8 TAs, 1 client, 1 industrial advisor, 45 students.)

Fig. 6. Project organization chart

10. FrameMaker® is a desktop publishing software developed and distributed by Frame Technology Corporation. Recent versions of FrameMaker provide hypertext functions.

4.4 Computing environment

Students had 24-hour access to a cluster of DECstations (3100 and 5000) running the Andrew operating system.11 They used them for running Objectory and development tools (compilers, mailers, databases, revision control, etc.). They also had access from all machines to a single afs (Andrew File System) disk partition of 1 gigabyte for storing Objectory databases, C++ code and other documents and tools.

4.5 Project support functions

In addition to being responsible for a subsystem and a set of use cases, each team was assigned a support function. Each function consisted of the evaluation and choice of a tool for solving a specific implementation or organizational problem. Each choice was then ratified by project management. The project support functions, the group responsible for each, and the due dates were:

Coding standards: Location group, due 10/1
Class library: Coordination group, due 10/15
Version control: Hazmat group, due 10/1
Makefiles: EOP group, due 10/15
Database: Resource group, due 10/1
User interface toolkit: UI group, due 10/1
Input modes: Input group, due 10/1
Communication toolkit: Communication group, due 10/1
Code integration: GIS group, due 11/30 - 12/14
Browser: Documentation group, due 9/25

4.6 Development process and training sequence

We based the learning and work structure for the project on our prior experience in teaching system modeling methodologies in academia and industry, and in working with them. Our experience included teaching and use of Objectory. The course textbooks included [14] and [20].

Objectory advises that new developers do a quick iteration through the whole methodology by taking a vertical slice through some system development project (preferably a pilot project). The purpose of this first iteration is to gain familiarity with, and understand the motivation for, each phase in the modeling process. This level of understanding is necessary for the developers to put the appropriate amount of effort into each step and product of the process. The developers can also learn the usefulness of each intermediate product.

11. The Andrew operating system is a version of Unix developed at Carnegie Mellon University. One of the innovations of Andrew is afs (the Andrew File System) which is a distributed file system accessible uniformly from any Andrew workstation.

Ideally, this structure iteratively interleaves training and use of a new methodology in the following steps (annotated with our planned approach for each step):
• See the methodology. Objectory Overview and subsequent lectures; ATM tutorial (Objectory manual).
• Learn the methodology. First pass through system development with each project team focusing on their set of use cases. This phase was scheduled to last 4-5 weeks and to be concurrent with the series of lectures on the Objectory modeling phases, homeworks, guidance in the project meetings from TAs, and a required set of deliverables in the form of system models and documentation.
• Apply the methodology. Iterating, refining and integrating system development processes and products, with cross-team coordination, for the rest of the semester, resulting in the delivered system.

The lectures devoted to Objectory (listed in Table 2.) were designed around the development sequence that we had targeted for using Objectory.

Table 2. Objectory lectures

Overview I: Objectory Methodology -- Concepts; Phases; Methodologies; Overview of first pass through methodology; Detailed use cases and team assignments.
Overview II: Documentation -- The documentation process; Products and different users of documentation; Project documents; Hypertext; Objectory products and documents.
Requirements Model -- Actors; Use cases: motivation, structure and roles; Object identification: Domain Object Catalog.
Analysis Object Model -- Robustness analysis: three object types; Object identification: entity, interface and control objects; Roles in system.
Design Model I: Object Interactions -- Analysis objects to design objects; Object interactions: Interaction Diagrams.
Design Model II: Detailed Design -- Homogenization of objects; Detailed object design.
Implementation Model: Introduction to C++ -- Features of OO languages; C vs. C++; Overloading; Compile time type checking; Base and derived classes; Calling mechanisms; Exception handling; Inheritance and encapsulation in C++.
System Design (map to Objectory, packages...) -- Decomposition into subsystems; System topology; Concurrency; Data management; Software control; Boundary conditions; System design; Document template.

The remaining lectures were devoted to various general software engineering topics, including the Software Lifecycle, Software Development Process, Project Management, Prototyping, Configuration Management, Testing, Delivery and Maintenance, and a number of internal and external (client) reviews and meetings.

In parallel with the lectures, students were made familiar with the tool. They first went through the ATM machine tutorial (provided with the Objectory tool) under the guidance of a teaching assistant. At that time, they were introduced to the interface of the tool and learned to browse existing models and to print documentation. In a second step, the students were asked, in the context of a two-week homework assignment, to add a use case to the ATM machine Objectory database. In groups of two, they added the new use case to the requirements model, created robustness analysis objects and associations, presented them in views, generated the corresponding design objects and created an interaction diagram picturing the control flow during the new use case. They were asked to submit PostScript files corresponding to the documentation products of each phase.

Finally, students were asked to use Objectory in the context of their respective teams for a month-long iteration on the requirements, robustness analysis and design of the FRIEND subsystem assigned to their team. Each team worked on a separate database which would then be merged with the other teams' databases by the integration team (see Fig. 7.). Each team was required to produce all the products of each phase of the methodology for a subset of the use cases assigned to their team, knowing that those products would then be refined and completed during a second iteration involving more inter-team coordination. By the end of the first iteration, the students were to have learned enough of the FRIEND system requirements and of the Objectory methodology and tool (and of how products from each phase feed into the next phases) to effectively carry out a second iteration leading to the construction and testing of the prototype system.

(Figure: a timeline from 10/04 to 11/22 showing the phases of the first and second iterations -- requirements and robustness analysis, Design I (object interaction diagrams), and Design II (detailed objects) -- with database merges planned at the end of the first iteration and at the completion of each phase of the second iteration.)

Fig. 7. Planned schedule for merges

4.7 Inter-team coordination

Students were provided with a number of electronic bboards, one allocated to each team, in addition to a number of bboards for project-wide announcements and coordination:

announce: lecture announcements
handin: homework submissions
help: student help
discuss: project level discussion
communication: Communication group
coordination: Coordination group
eop: EOP group
GIS: GIS group
hazmat: Hazardous material group
input: Input group
location: Location group
resource: Resource group
resource.db: database help
sys-integration: code integration status
ui: User interface group
writer: Documentation group

Each member of the user interface team was also responsible for attending the weekly meeting of one subsystem team. Occasionally, the user interface team would uncover inconsistencies between two subsystem teams and would raise integration issues. At the end of the first iteration, an integration team was formed with a representative of each subsystem team and a representative of the user interface team; it met on a weekly basis to discuss system-wide model integration issues.

Subsystem teams were also allocated disk space for an Objectory database. Each team worked independently on its own team database until merge points, at which time all the team databases were merged into a single project-wide database by the integration team. After the merge point, each team copied the project-wide database and continued refining its portion of the database based on the other teams' contributions. The refinement process occurred independently in each team until the next merge point. The first merge was planned for the end of the first iteration. Thereafter, merges would occur at the completion of each phase of the second iteration (i.e. about every second week). Fig. 7. illustrates the planned schedule for the merges.

After the merge following the detailed design phase of the second iteration, the teams were to copy the project-wide database and generate the C++ header files. They would also propagate into their local copy any design changes made during implementation. At the end of the implementation, all the databases were to be merged one last time and checked for inconsistencies before all documents were generated and handed to the documentation team.

4.8 Documentation and deliverables

Each team was responsible for the document sections relevant to the use cases it was assigned. Each individual in the team was responsible for individual use cases. Document sections were composed of raw text, PERT charts (for the software project management plan), use case event flows, Objectory model views (for the requirements and design documents), etc. The models produced by each team were integrated by the integration team and then handed to the documentation team together with the other relevant document sections.

The source code was integrated by the GIS team, whose project support function was system integration. The code integration process was defined by the GIS team itself. Table 3. lists the deliverables to be produced by the students together with their planned deadlines.

Table 3. Documentation and deliverables for FRIEND

Requirements & Documentation -- User Guide: describes how to use the FRIEND system. Due 12/15.
Planning -- Software Project Management Plan: defines the technical and managerial processes necessary for the development and delivery of the FRIEND system. Due 12/15.
Requirements & Robustness Analysis -- Requirements Document (Use Case Survey, Use Case Descriptions, Robustness Analysis Survey, Analysis Subsystem Description): describes completely the functionality of the system in terms of event flows, classified per user; part of the project agreement, representing a contract between the client and the developers. This document also describes the functionality of the system in terms of three types of objects, their attributes, and their associations, and describes the high-level decomposition of the system independent of the implementation platform. Due 10/22.
Design I & II -- Design Document (Design Survey, Design Object Descriptions, Design Attribute Types): describes completely the implementation of the system in terms of design objects, their attributes, associations and operations; results in the detailed specification of each class used by the programmers during the implementation phase. Due 11/22.
Test -- Test Manual (Unit Tests, Use Case Tests, System Tests): describes completely the tests performed on the system before delivery, along with expected and actual results; used by the developers and maintainers. Due 11/29.
Implementation -- Source Code: source code for all subsystems of FRIEND. Due 12/15.

The Objectory methodology is model-driven, not document-driven: most of the time is spent developing models rather than editing documents. Objectory generates most of the content of the documents from the models they describe. The developer only has to complete the documents by writing introductory paragraphs and drawing figures (called views) to enhance the understandability of the document. Once a document is generated, Objectory maintains consistency between models and documents as the models are modified. This supports the iterative development process encouraged by Objectory. For this reason, students were asked to use Objectory to generate the bulk of the documentation. Only the User Guide, the Test Manual and the Software Project Management Plan were to be written outside Objectory. At the completion of the development process, the Objectory documents were to be converted and formatted as FrameMaker documents by the documentation team.

4.9 Summary of project organization and process

The requirements of the FRIEND system were pre-engineered by the project management and an actual client prior to the course. A set of use cases and a decomposition into subsystems were produced. At the beginning of the class, students were divided into a number of teams based on their academic background and technical skills. A team was formed per subsystem, in addition to a documentation team, an integration team and a user interface team.

The first month of the course was spent learning Objectory and the other support tools used by the project. Students were guided through an Objectory tutorial, and then went through a quick iteration including use case development, robustness analysis and design for FRIEND. Concurrently, lectures were given on the relevant Objectory and software engineering topics. Project support functions were also performed by the teams during the first month of the course; these included revision control, choice of a database, choice of a user interface toolkit, etc. After the first iteration, the integration team was put into place to merge the subsystem teams' databases. The remainder of the semester was spent iterating on the use cases, robustness analysis, design, implementation and testing of FRIEND using Objectory. The integration team served as a forum for resolving inter-team and project-wide issues. The integration team was also responsible for integrating the various pieces of documentation and writing the project-wide documents.

5 Experience report: what happened?

In the end, the students did produce a prototype version of the system with much functionality and a rich user interface. They successfully demonstrated it to the client as a proof of concept. The prototype served as good input for the next round of development in the subsequent semester. The same cannot be said for the status of the system models: in most cases they did not drive the final implementation, nor was back-propagation from the implementation to the model base completed, so the models do not reflect the final state of the implemented system.

The software prototype included multiple emergency incident processing, resource allocation management, detailed city maps, hazardous materials information, the emergency operations plan, and logging of emergency incidents. The students also demonstrated support for wireless communication, a Global Positioning System and speech recognition. The overall size of the software was estimated at about 21,000 lines of C++ code for the underlying system and about 10,000 lines of Tcl code for the user interface.12 A total of 41 use cases were implemented in the prototype.

5.1 Problem statement and initial use cases

Only about half of the TAs participated in the development of the problem statement, because this phase occurred prior to the beginning of the semester. Of those who participated in this effort, only a few had experience with a use case driven methodology. The resulting use cases were unevenly developed, some of them having only one-line descriptions, others having complete event flows. Teams who were assigned the brief use cases felt more inclined to throw them away and model the requirements of their subsystem from scratch, while the teams inheriting complete use cases spent little time revising them. The training of the TAs in the Objectory methodology was also delayed, given that a large proportion of them were not available before the start of the semester. The training of some TAs occurred at about the same time as that of the students they were responsible for.

5.2 Background and experience of participants

Given the scant familiarity of the students with the tools and methodologies used in the course, the first month witnessed five major concurrent learning curves (Objectory, FrameMaker, C++, object-oriented concepts in general, and software engineering). The breadth and depth of the material students had to become familiar with lowered the pedagogical impact of the first iteration on the modeling itself. As a result, a majority of students concentrated mainly on learning how to use the tools instead of focusing on the methodology. The rationale for many of the steps of Objectory and the benefits of the methodology were only understood during the last month of the project.

5.3 Project and team organization

The integration team meetings started only after the completion of the first iteration. A representative of each team was responsible for attending the integration meetings and helping with integration tasks. The original task of the integration team was to mechanically merge all the team databases into one consistent database which would be used by the teams as a starting point for the next phase. The main focus of the team should have been to review the merged database, detect inconsistencies due to inter-team miscommunication and ensure model completeness at the system level. The integration team meetings were also meant to be a platform for raising any issues concerning multiple teams. Finally, the integration team was also responsible for writing and integrating the system-wide documents.

12. Tcl [21] is an interpreted environment for building user interfaces. A distinctive feature of Tcl is that it can be integrated with applications written in C or C++.

It was observed that the integration team meetings started too late in the process. Most of its members were already assigned to other tasks within their teams and were reluctant to spend much time on integration. In some instances, a team would send a different representative to each integration meeting, making it hard to maintain any consistency in the process. Deadlines within the teams were usually considered higher priority by the students than the deadlines of the integration team. Another problem related to the late start of the integration team was inconsistent requirements. Each team refined its use cases with little interaction with other teams, leading to a system specification with redundant functionality in some areas and missing functionality in others. This problem was aggravated by the communication structure with the customer: each team contacted the customer separately to deal with issues specific to their use cases, sometimes leading to inconsistent answers and use of terminology by the customer.

5.4 Computing environment

A number of problems due to the environment occurred in the first few weeks of the course. Although the Andrew operating system is backward-compatible with Ultrix, its poor performance made it difficult for students to use Objectory effectively. Students experienced start-up times of up to ten minutes and struggled whenever they had to use a diagram editor. Moreover, the file locking mechanism used by Objectory was not implemented by afs, and this opened the door to internal database inconsistencies during concurrent use of Objectory. Those inconsistencies later resulted in a number of tool crashes. After Objective Systems provided a solution for the locking problem, further inconsistencies were avoided. However, a clean database had to be reconstructed to get rid of anomalies from the original databases. Since the cause of the problem was detected only after several weeks, the size of the database to be reconstructed was significant. Once the locking problem was solved, performance degraded even more due to the large number of students working on the same database. Eventually, due to the locking and performance problems, the use of Objectory was scaled down to one representative per team, each team having a local copy of the database on an Ultrix machine (rather than an Andrew machine). Only after four weeks into the course did the software/hardware environment become fully usable by the students for project development.

5.5 Project support functions

Most project support functions carried out by the teams were due early in the semester, thus competing with the development of the use cases. In most instances, project support functions were perceived as more concrete than modeling tasks and gave the students a greater sense of progress and accomplishment. Use case requirements progressed more slowly than planned, leading in some cases to significant modifications of the use cases as late as the second design iteration.

5.6 Development process and training sequence

All of the lectures on Objectory were given before the environment problems were solved. As a consequence, most of the material presented during those lectures was not internalized by the students, since they had only limited access to the tool at that time. In a survey at the end of the course, students expressed the need for a series of refresher

lectures which would present certain aspects of Objectory more in depth at the same time that they were using the tool for building FRIEND (e.g. a Robustness Analysis lecture during the robustness analysis of FRIEND including example problems with the analysis models of the students). Another problem discovered late in the course was that the students were not fully aware of the benefits of the tool and methodology before they generated the C++ header files and started implementation. Only then did they realize the mapping between the concepts of each of the phases from requirements to code and gain a clearer idea of how much more effectively they could have used the tool for design support (until then, the widespread belief was that Objectory was helpful only as a documentation tool). For example, it was very late in the project when students understood the rationale for finding communication associations in the robustness analysis model which would then be modeled as associations and operation paths in the design phase and finally translated into C++ object pointers and methodology invocations. This understanding did not occur after the first iteration since the generated design models were too weak to allow any meaningful code generation. 5.7 Inter-team coordination The first Objectory database merge at the end of the first iteration took more time than intended, due to internal inconsistencies left in some team databases after the locking problem. The version of Objectory which was used also had a number of problems with the basket facilities. 13 The first merge problem literally stalled the project for a week, given that all of the teams were waiting on the result of the merge to start the second iteration. Eventually, the integration team spent most of its time mechanically merging the databases and had little time left for the actual review process, thus failing to detect a number of misunderstandings between several teams. Other organizational problems lengthened the propagations of changes from the team databases to the merged databases. Some teams missed integration deadlines, thus postponing the release of their changes by a two week cycle. Other teams continued modifying their database during the merge process, making it impossible for them to use the merged database as a starting point for the next phase. The merged process itself sometimes introduced inconsistencies in the database when teams named different objects with the same names. Other times, teams modified object models they were depending on, but which belonged to a different team, without notifications introducing additional mechanical problems during the next merge. After the problems associated with database merges were solved, the developers had to spend time learning where to find relevant models and how the products of the Objectory database related to the system decomposition. This learning curve was not taken into account during the planning phase. Most of these problems resulted in little inter-team communication through Objectory models. Developers were also unable to effectively use Objectory as a communication channel due to the lack of experience (on the part of the project management as well as on the part of the developers) in using a central model-based repository.
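To make the requirements-to-code mapping described in Section 5.6 concrete, the following minimal C++ sketch shows how a communication association found during robustness analysis ends up, after design, as an object pointer plus a method invocation. The class and method names are hypothetical and are not taken from FRIEND.

```cpp
// Hypothetical illustration: a communication association between two analysis
// objects becomes an association in the design model and, in C++, an object
// pointer plus a method invocation. Names are invented, not from FRIEND.
class FieldOfficer {
public:
    // Operation reached by traversing the association.
    void notify(const char* message) { (void)message; /* e.g. update the officer's terminal */ }
};

class Dispatcher {
public:
    explicit Dispatcher(FieldOfficer* officer) : officer_(officer) {}

    void broadcastAlert(const char* message) {
        officer_->notify(message);  // the communication association in action
    }

private:
    FieldOfficer* officer_;         // association realized as an object pointer
};
```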

13. The basket facility is a set of functions provided by Objectory to export and merge models taken from different databases.

5.8 Documentation and deliverables
The students generated all the requested documents, and submitted them at the completion of the project (i.e. most of them past deadline). The following table illustrates the size of the documentation.

    Deliverable                          Size
    User Guide                           14 pages
    Software Project Management Plan     33 pages
    Requirements Analysis Document       76 pages
    Design Document                      32 pages
    Test Document                        33 pages
    Source Code                          ~21k lines of C code
                                         ~10k lines of Tcl code (user interface)
    Total                                195 pages of documentation
                                         31k lines of source code

The students did not follow the planned process for producing documentation using Objectory and FrameMaker. Objectory was mainly used to develop model structures rather than model documentation. In most teams, object brief descriptions and use case flows of events were limited in content, leading to one-line paragraphs in the generated document. The relationship between Objectory models and documentation was not understood, or understood only after the final documents were generated. Given the time pressure at the end of the semester, teams preferred to modify the final FrameMaker document itself instead of correcting the model and regenerating the Objectory documents, leading to inconsistencies between different representations of the documents. On the other hand, modifications of the object structures in the source code were sometimes propagated to the Objectory models but not to the final FrameMaker document, leading to an inconsistent state in which no representation of the models was both complete and up to date with the latest changes. Finally, the complexity of the documentation process was exacerbated by the general perception that the model documents were merely deliverables for the client and not a communication medium across teams. Consequently, the few teams that relied on either Objectory models or documents describing other subsystems based their subsystem on erroneous or obsolete assumptions from other teams, leading to significant modifications late in the implementation phase and a loss in functionality.

5.9 Summary of experience report
A working prototype was demonstrated and delivered with complete documentation. In many participants' estimation, the quality of the prototype and the level of understanding of the requirements by the students were superior to those of the previous year. However, a number of unexpected technical and organizational problems threatened the success of the project, sometimes for extended periods of time. The first month was characterized by a general feeling of chaos. The installation of the Objectory tool was not fit for concurrent use by such a large number of inexperienced participants. This was exacerbated by a large number of concurrent learning phases (new tools, new procedures, new methodology, new programming language, etc.). The project support functions competed with the development and understanding of the FRIEND requirements, given that they were more concrete to students and thus more appealing. By the end of the first iteration, few students had an encompassing view of the requirements of the system or of the methodology and the rationale for many of its products. The set-up of the integration team at the end of the first iteration helped bring out project-wide issues and made the students aware of the poor status of the project. However, this awareness came fairly late in the project, given that the set-up of the integration team took more time than planned. The second iteration was completed in less time than planned (given that it started late and had to finish in time for students to go on holiday). However, near the end of the client review, a number of problems arose during the documentation process. The integration of FrameMaker documents with the Objectory-generated documentation was harder than expected, given that students did not use the tool effectively enough to allow automatic generation of the documentation. The documentation was not as detailed as it would have been had the students been aware of the benefits of the tool in that respect at the end of the first iteration.

6 Discussion (Observations and insights)

In general terms, the consequences, observations and insights that we present here regarding our experience with Objectory fall into three categories:
1. those that result from employing a methodology with a requirements modeling component like use cases within a teaching/training project (that is, including "student" developers in requirements modeling);
2. issues and problems stemming from the Objectory methodology/tool itself; and
3. consequences of our specific (possibly misdirected and/or mismanaged) application of the methodology and tool.
Of course, if we did misapply it, that may also indicate that the methodology is too hard to learn to apply within a highly constrained, limited time-frame teaching/training project. In conjunction with the above, we hope that the observations and insights presented here may prove useful to anyone who is considering or planning to use Objectory in any context, whether educational or otherwise.
The course project, process and use of the Objectory methodology were designed to be as realistic as possible without compromising pedagogical concerns. However, in generalizing our experiences to broader software engineering contexts, the following similarities and differences between the course process and an industrial software engineering process should be kept in mind:
Similarities:
• The problem to be solved was a real problem to which no solution already existed. Although some of the requirements of the system were pre-engineered by the instructors and TAs, they did not build a test version of the system a priori. This made the process as unpredictable in terms of planning and realization as an industrial software project.
• Domain expertise was provided by an outsider to the course who did not have expertise in computer science or software engineering. During the requirements phase of the project, students interacted with a real client and not with the instructor.
• Given the size of the class and the parties involved in the project, the communication web was large and strongly connected. Every team on the project depended on at least two other teams to successfully design and test their subsystem. This resulted in a communication structure as complex as that of an industrial software project where teams from different divisions (R&D, QA, Marketing) have to interact effectively in order for the project to be completed successfully.
• The problem to be solved was large enough that no single person in the project (including project management) had a complete view of the system and its issues. Rather, each team had a different perspective on the system and had to negotiate with other teams in order to successfully integrate their subsystem into the global picture.
• We observe that the impact of the learning curve for introducing and applying a new methodology, in terms of time, resources and results, appears to be similar whether in an industrial or an educational context. The most important factor here is the scale and criticality of the project in which the new methodology is introduced, and the negative consequences of not allowing developers to learn the methodology iteratively and incrementally, for example in a pilot project where the methodology can be more easily tested, applied and integrated into practice.14
Differences:
• The time-line of the project was compressed to four months for the course to be teachable in one semester. A consequence was that the delivery deadline was not movable, whether the planned deadline was realistic or not. (Occasionally, industry projects duplicate this pattern, with typically disastrous results!)
• The philosophy of this project was to provide a vehicle for students to get hands-on experience with the technical and managerial aspects of a complex software problem. The emphasis was on learning how to work in a team and encouraging individual teams to work together. In fact, we would not have accepted a system built by an individual team or a single heroic programmer. This is emphasized by our grading policy: we give a project A to everybody who participates in the project if the complete software system passes the client acceptance test as defined in the requirements analysis document. If the complete system fails the acceptance test, an individual team can still get a project A if it can demonstrate that its subsystem passes the acceptance test while accessing the other subsystems via stubs and drivers. This type of incentive would not be acceptable in an industry project.
• All the teams (including project management) are constituted at the beginning of the semester. Most of the students do not have prior experience working with their teammates. In an industrial context, most of the teams preexist the project, and the proportion of new members to be integrated into a team is relatively small.
• One characteristic of teaching a project in a university is that the staffing, i.e. the class enrollment, is flat and usually slightly falling during the development of the project. We started immediately with 60 designers and, for pedagogical reasons, expected all of them to participate in all stages. Software development in industry is generally realized with a phased staffing plan: only a few people are involved during the requirements engineering and design phases, and staff is added gradually to carry out the detailed design and implementation.

14. Unfortunately, our experience of consulting for industrial software engineering projects suggests that time is rarely taken to introduce new methodologies in pilot projects. Rather, they are often introduced on highly constrained and critical projects.

• The project time-line included training and tool setup for every member of the project (including most of the project management). This would be more representative of a start-up company than of a mature one.
• Few students had any software engineering, object-oriented development, or process background prior to the course. This made team integration easier (since team members had little or no prejudice about how things should be organized), but it introduced a number of problems related to the inexperience of the students (e.g. underestimation of the importance of inter-team communication).

6.1 Training
Student training could be improved by structuring the methodology lectures to follow the process more closely. In addition to overviews of all the phases of the methodology during the tutorial/ramp-up period, a more in-depth lecture per phase could be given in parallel with the students working in the corresponding phase during a first development iteration. Ideally, the lecture could use example models from the students themselves to point out typical mistakes or misuses. The course should also be preceded by more in-depth training of the teaching assistants (TAs). Detailed modeling questions are best answered in the context of a team meeting rather than during lectures. More knowledgeable TAs would minimize the number of times a student's question has to be redirected before reaching the right instructor or teaching assistant, thus encouraging more detailed and context-sensitive questions. As recommended by the Objectory manuals, the first iteration through the methodology should focus on a narrower and deeper slice of the system, instead of requiring students, as we did,15 to produce an initial model for all use cases (i.e. broader and more superficial). The first iteration should also be carried through to code generation. A summary lecture could also be given at the end of the first iteration to emphasize the mapping between the different models throughout the process, thereby giving the students better motivation to use the tool more effectively during the second iteration.

6.2 Inter-team coordination
In the context of team-based development, when the initial requirements of the system are not complete or are to be refined by the students, there is a need for an integration team from the very start of the process. The integration team would then be responsible for interacting with the client and enforcing some consistency at the requirements level. Members of the integration team should have a permanent role throughout the semester, providing the project with a number of people who have a global understanding of the system at the functional level. This shared understanding would make the integration of the models more effective in discovering inconsistencies and misunderstandings between teams earlier. The integration team could also use this shared knowledge for code integration (in the course, code integration was handled by a different team, introducing another ramp-up period and serious delays in the system integration).

15. Our decision was based on balancing the learning curve for Objectory against the extremely short time frame for the project and wanting to go through two iterations on as much of the system as possible. In retrospect, we now believe that more efficient and effective iterations will be enabled by having developers who are new to the methodology experience all of its modeling phases as completely and quickly as possible.

6.3 Tool
Performance. It is clear that the Objectory tool, which is presently implemented in Smalltalk, suffers from some performance problems when used on the scale we attempted (multiple teams of six or more people). To be accepted and used effectively by a large team-based project as the central repository for modeling and documenting the artifact, it requires enough dedicated servers of adequate speed and memory. Given the platforms we used (i.e. DEC 3100 & 5000), we would estimate that three running copies of Objectory per machine is a maximum. Given a distributed file system (such as afs or nfs), we estimate that 15 concurrent copies of Objectory accessing the same database is a maximum. In a 50-participant development effort, assuming everyone is accessing Objectory models (which is the preferable use of the methodology), this means the models would have to be split into a number of independent databases (with at most 15 users per database, at least four of them). The next subsection suggests how a central repository could be partitioned in order to minimize redundancy and inconsistencies.
Separate subsystem development. The vertical division of the models into separate databases for each subsystem team introduced numerous consistency problems at all levels. The names and roles of the actors stabilized relatively late in the semester; the dependencies between subsystems were agreed upon only during the second iteration; the interfaces of public design objects were inconsistent until the beginning of the implementation phase. We felt the need for a higher level of modeling, centralized in a single database, which would serve as a reference for subsystem dependencies, actors, use cases and public interfaces. In a more recent release of the Objectory documentation, Objective Systems strongly recommends that the interfaces between subsystems be agreed upon before the modeling repository is split. Pushing this idea further, we propose the following horizontal database organization and development sequence (a rough sketch is given below): a single central database (level 1) contains all the actors of the system together with all the use cases which exercise more than one subsystem. The methodology is carried out for one short iteration on this set of higher-level use cases to produce the subsystem decomposition and the subsystems' interactions, modeled as interaction diagrams involving only subsystems (not objects). Once the relationships between the subsystems and their interfaces are defined, one new database per subsystem is created (level 2). Each of these databases contains only the use cases which exercise that subsystem. Each such use case corresponds to an operation defined in the database above. Actors in level-2 databases represent the subsystems which depend on this subsystem, as modeled in the database at the level above. If the subsystem is large enough that more than one team is responsible for its development, the same process can be applied recursively, producing databases at level 3. Only the bottom of the hierarchy would contain interaction diagrams describing operations and design objects which are to be implemented as classes (e.g. in C++). The advantage of this approach is that all the interfaces and dependencies for level n are centralized in one place (at level n-1), avoiding inconsistencies by suppressing duplication. Also, a higher-level view of the system is factored out in level 1, allowing new developers to become familiar with the system without having to browse through enormous amounts of lower-level information.
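As a rough illustration of the proposed horizontal organization, the following C++ sketch models the level-1/level-2 relationship and the consistency rule that every level-2 use case corresponds to an operation declared at level 1. The structures and names are hypothetical and do not reflect Objectory's internal representation.

```cpp
// Hypothetical sketch of the proposed horizontal database organization.
#include <string>
#include <vector>

struct UseCase {
    std::string name;
    std::vector<std::string> actors;      // actors (or client subsystems) exercising it
};

struct SubsystemInterface {
    std::string subsystem;
    std::vector<std::string> operations;  // agreed-upon public operations
};

struct CentralDatabase {                  // level 1
    std::vector<std::string> actors;
    std::vector<UseCase> systemUseCases;  // use cases exercising more than one subsystem
    std::vector<SubsystemInterface> interfaces;
};

struct SubsystemDatabase {                // level 2: one per subsystem
    std::string subsystem;
    std::vector<UseCase> useCases;        // only use cases exercising this subsystem
};

// Consistency rule from the text: every level-2 use case must correspond to an
// operation of that subsystem's interface as defined in the level-1 database.
bool consistentWithLevelAbove(const CentralDatabase& level1,
                              const SubsystemDatabase& level2) {
    for (const SubsystemInterface& iface : level1.interfaces) {
        if (iface.subsystem != level2.subsystem) continue;
        for (const UseCase& uc : level2.useCases) {
            bool found = false;
            for (const std::string& op : iface.operations)
                if (op == uc.name) { found = true; break; }
            if (!found) return false;     // use case with no corresponding operation
        }
        return true;
    }
    return false;                         // subsystem unknown at level 1
}
```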

It is to be noted that such a decomposition also implies a different organization, in which some teams would only be responsible for defining and maintaining the reference databases describing subsystem dependencies. At the beginning of the process, fewer students would actually work concurrently on the requirements, which would leave more person-power to deal with project support functions and training.
Coordination. When Objectory is used by a large number of users, not all of whom necessarily communicate directly with each other, it is difficult to identify the user responsible for an object or for other entities in the models. This problem could be partially resolved by storing and providing access to the time of last modification of an object along with the id of the user who modified it. More generally, when the number of users grows, a general access control mechanism should be provided, allowing or denying, on a per-user basis, the access and modification of parts of the model. It is also difficult to implement conventions and standards for editing models when the number of members of a project is relatively large.
Database merge. Baskets were a functionality newly introduced in the version of Objectory used by the students and still showed teething problems.16 The major problem introduced by baskets was a different identification scheme for model entities. In an Objectory database, entities can be accessed through browsers and direct links. Objectory does not prevent a user from creating multiple entities with the same name. On the other hand, entities imported or exported through baskets need to be named uniquely. If duplicate names have occurred during the creation of a database, it is fairly time-consuming to remove and rename all duplicate entities prior to the export/import of baskets. Given that the functionality provided by Objectory for searching associations is limited, the problem is aggravated in the case of duplicated associations. Another problem noticed during the merge of databases was the difficulty of detecting occurrences of entities with the same name but different semantics. If two objects created by two different teams have the same name, both objects are merged without notification when the baskets are imported (assuming no duplicate names in the respective databases). If the objects represented different conceptual entities, the resulting model becomes incorrect.
Publication. Reviewing the merged Objectory model was made difficult by the sheer size of the generated documents. In other instances, large documents had to be published even when only a part of the model needed to be reviewed. More fine-grained access to different parts of documents is needed.17 In general, functionality is needed to allow developers to customize the output by restricting the type or the scope of the published information as needed. Another problem that occurs during reviews is detecting which parts of the models were changed and which remained constant. Adding change bars with dates to a document would solve part of the problem of noticing and selecting the new parts.
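As a sketch of the kind of bookkeeping that could address the ownership and silent-merge problems above, the structures below record an owner and last-modification information per entity and flag name collisions before baskets are imported. This is a hypothetical illustration; Objectory 3.3 provides none of it, and the types and names are invented.

```cpp
// Hypothetical bookkeeping to ease coordination and basket merges; invented
// for illustration only, not part of Objectory.
#include <ctime>
#include <set>
#include <string>
#include <vector>

struct EntityRecord {
    std::string name;            // entity name; baskets identify entities by name
    std::string owner;           // team or user responsible for the entity
    std::string lastModifiedBy;  // id of the user who last changed it
    std::time_t lastModified;    // time of last modification
};

using TeamDatabase = std::vector<EntityRecord>;

// Before importing baskets, report names that occur in both databases so a
// reviewer can decide whether they denote the same concept or must be renamed.
std::vector<std::string> nameCollisions(const TeamDatabase& a,
                                        const TeamDatabase& b) {
    std::set<std::string> namesInA;
    for (const EntityRecord& e : a) namesInA.insert(e.name);

    std::vector<std::string> collisions;
    for (const EntityRecord& e : b)
        if (namesInA.count(e.name) > 0) collisions.push_back(e.name);
    return collisions;
}
```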

16. We are now aware that Objective Systems is working on a new facility that will replace baskets for exporting and importing between model databases. It is scheduled to be released in Version 3.5 in late 1994.
17. Objectory Version 3.4 has opened up the structuring, formatting and relationship of documents to the models to complete user customization. We have no direct experience yet with this facility, but we have heard reports from other users that it goes a long way toward solving some of the documentation problems we experienced with the methodology/tool.

6.4 Broader participation
An important point, given only minor emphasis in this report, is the effect that giving requirements modeling greater emphasis in teaching and practicing software engineering has on bringing a broader range of participants, from different backgrounds (cross-cultural, women, minorities, etc.) and disciplines, into the system design process. This is an increasingly important pedagogical and pragmatic goal because, in building tomorrow's complex systems, we cannot rely only on "star designers" but must train and engage a wide range of people in system design. Within our teams, we observed that the use case approach, with its emphasis on initially modeling a system from the "outside in" (from the user's viewpoint) rather than from the "inside out" (from a technical, developer's viewpoint), enabled the majority of our students to move more easily into the role of active, vocal, capable and responsible "stakeholders". With this confidence, in many cases they then successfully took on increasingly technical roles through the course of the project.

6.5 Requirements vs. object modeling
Most of the teams did not focus enough on object modeling.18 Complete models appeared only in the last two revisions of the Objectory database, and in most cases, few inter-team issues were resolved by that time. It appeared that most of the time was spent initially in understanding the system requirements and the domain of application. Later, the time was spent mostly on technological problems. Only in the last quarter of the project did a global system architecture appear and were control flow problems addressed. In general, the teams viewed object modeling and inter-team communication issues as lower priority tasks. Some of the reasons were the following:
• This was the students' first experience with object-oriented development, methodologies and modeling. In general they had considerable experience programming and building small systems by themselves (or with one or two others) without formal modeling. Only a few had any object-oriented programming experience, but they were much more inclined to employ it instead of modeling per se. In addition, they did not have a sense of the relative importance of the different phases of the methodology or of the mapping between modeling concepts and code before the start of the implementation.
• Some teams viewed the completion of the project as unrealistic; they then focused more on implementing their subsystem (for "successful" demonstration as a decoupled module) and viewed integration issues as a secondary problem. Consequently, teams believed that they worked well within themselves but not with one another. This attitude somewhat compromised the general emphasis on and understanding of model-based software engineering as a whole, and this was reflected in the quantity and quality of the Objectory object modeling produced.
• Objectory does not focus on object modeling as exclusively as OMT. In OMT, the object modeling is primary and nearly the only thing supported in the tool. On the other hand, we had turned to Objectory precisely because of certain typical limitations in OMT and most other methodologies, namely 1. that there is only one object model for the whole system, which must span both the analysis level (user and domain concepts) and design (implementation and environment details), and mixing these generally leads to confusion and poor results; and 2. that there is no explicit requirements modeling support as a precursor to object identification and detailing (object modeling). Objectory does provide adequate object modeling notations and tools for both analysis and design objects (which are separate object models), but these are not given primacy in the methodology or the tool. The object modeling grows out of and in support of the use case modeling, and this is reflected in the structure and interfaces of the tool. Again, it appears to take more experience (and knowledge of the whole methodology and modeling tools) to understand how to create, access, manipulate and communicate the object models in Objectory.
• Objectory does not effectively support inter-group communication; intra-group communication was handled during meetings and on bboards. Since students did not see a priori any advantage in recording their design decisions in Objectory other than pleasing the project management, object models stayed at a fairly high level.

6.6 Objectory models and documents
Objectory is a model-driven methodology. However, documents remain the main medium for communication with the outside world. The mapping of the modeling that takes place in Objectory (and is available in various forms on-line in the repository) to the documentation that can be produced in external form is non-trivial. For new users, it is difficult to keep in mind that everything in a model will end up in a document and to know in which document(s) to find things. Said another way, it takes a lot of experience to relate the documentation and the models in order to effectively coordinate the day-to-day activities of developers and to support the periodic team-wide and project-wide internal and external reviews; and this problem grows much more complex very quickly when there are multiple teams and possibly separate databases. With considerable experience, discipline and proper use of the guidance in the Objectory manuals, the current documentation facilities are workable (just!) albeit far from ideal (see however Footnote 17 in Section 6.3). This again motivates the practice of having project developers go as quickly and completely as possible through a full pass of the methodology, or a first iteration through all the modeling phases, so that they will understand the relationships among all modeling components and between these and the documents. Otherwise, the documentation produced, which can be massive for a complex 60-person project, will be relatively useless in informing and managing further development. In short, Objectory makes it possible to produce better and consistent documentation directly from the modeling. However, as we discovered, it also makes it easily possible to contribute unnecessarily to the decimation of tropical forests!

6.7 Process vs. methodology vs. tool
For a methodology to work well in team-based system modeling, it must be supported by an adequate tool and embedded in a well-managed process. Our experience is that, at its best, Objectory is a very good methodology supported by a tool that currently works adequately well if it is used in a good setup, in a project of modest scale (< 20 developers actively using it), and by users with enough experience.

18. Similar observations were made by Dr. Gregory Abowd, in conversation with one of the authors, concerning his experience in teaching/using Objectory in a short module (~5 weeks) within a Master of Software Engineering Methods course at the Software Engineering Institute, Fall '93.

Though parts of Objectory, for instance the use cases, can no doubt be used successfully without a tool, to use the methodology effectively as a whole it must have tool support. On the other hand, to use the tool effectively, it is important to understand the interconnections between the modeling phases (which, as we have discussed, did not occur to a sufficient degree on our project). Due to conceptual difficulties (resulting from an inadequate first pass through the methodology/tool) and installation difficulties, the students found the tool to be cumbersome and more of a bureaucratic burden than a useful aid. A significant lesson, then, is that the tool must quickly be perceived as useful; otherwise, it will be discarded.
The Objectory tool also has certain interface peculiarities. It does not seem to follow the same conventions that we are accustomed to for applications in the USA, but has perhaps more of a "European" look and feel. (With familiarity, this may turn out to be an advantage rather than a problem.) More seriously, it is sometimes difficult to find things once they are entered into the modeling repository. There are many places to put information and it is not always clear where it goes. There are sometimes many ways to do the same thing. There is also something of a multiple-level indexing problem: to add something new or change something in the modeling space, you often have to go down several levels of structure, and you may be unsure whether others will see it, know how to access it, or notice changes in general.
As we have discovered, in projects with an emphasis on methodologies, you have to be flexible enough to deal with questions such as: what happens when you cannot support the methodology as planned? What happens when the tool does not work properly? Possible answers to these problems are to have tool and methodology "gurus" distributed across the project teams and to put in place well-understood policies and standards for using the tool and methodology.
To use the Objectory methodology fully, there must be a clear relation between the models in an on-line repository, the published documents, and the team processes. Ultimately, however, the methodology itself, as strong as it is, is a necessary but not nearly sufficient component for comprehensive model-based software engineering. This will require a more mature process supported by conceptual, organizational and information infrastructures. For instance, in our project, communication and negotiation between the teams were insufficient, but the methodology could not do much about that. In some of these areas involving the intimate relationship of methodology to process, Objectory appears to be on the right track, but there is a lot more involved than what Objectory currently provides. It does not do a great job of highlighting what you need to focus on; that is buried everywhere in the manuals, in tons of documentation. It has little information regarding appropriate human organization, what to communicate back and forth, how to divide up the development work, how and when to do reviews, and so on. Although Objectory provides most of the parts, it does not tell you how to put the parts together. For example, having a means for the hierarchical reduction of complex systems into subsystems is good, but it does not tell you how to use subsystems, only how to mechanically divide up the artifact into subsystems.
In defense of Objectory, we know that they are well aware of the gap between methodology and process and seem determined to understand and close that gap [15, 16].

6.8 Student and TA feedback
Our experiences with Objectory, and its possible continued use, depend not only on an evaluation of the methodology itself but on how well it suits the constraints of teaching and applying it in a single-semester course. Also, in our curriculum, we have for several years now carried the project used in the Fall over to an advanced course in the Spring. This has several benefits. For instance, though we never reveal this to students in the Fall, we really do not expect that the system modeling and implementation will be complete in any sense, even as a first prototype. The Spring course offers an opportunity to iterate again, starting with a fresh perspective, new resources, and the products from the previous course. It also enables some students to continue on the same project, and they provide continuity and experience to the Spring session. Finally, it provides other dimensions and context for testing how well a methodology performs, in terms of the products produced and the reactions of the students who used it; sometimes students have used the same methodology in the Fall and the subsequent Spring advanced course.
In Spring, 1994, Objectory was used again in the advanced software engineering course. We do not have space in this paper to discuss its role or impact in that course, nor to bring in additional issues or measures for evaluating methodologies across our course sequence, though we do have a lot of data and some ideas in this area. We are fortunate, however, that one of the authors participated in four consecutive sessions of the course: two as a student (Fall, 1992 and Spring, 1993, where the OMT methodology was used) and two as a TA (Fall, 1993 and Spring, 1994, where the Objectory methodology was used). The following are his comments, which also reflect a distillation of the reactions of many students, on Objectory in comparison to OMT.
Objectory vs. OMT from the developers' point of view. OMT's scripts were not used in Fall, 1992. Only object models and screen mock-ups were used to describe the system from the users' point of view. This resulted in an ill-defined picture of the system at the end of Requirements Analysis in Fall, 1992. This was demonstrated in two ways. First, developers continued to hold widely different views of the system all the way up to implementation. Second, students entering the project in the Spring Advanced SE class had a very difficult time understanding the functionality of FRIEND as it was designed. In contrast, the use cases of Objectory were used in Fall, 1993. The use cases provided as a starting point at the beginning of the semester helped students to gain some initial understanding of the intended functionality. They were also useful to the Spring, 1994 students in gaining an understanding of the system. Overall, I believe the student developers found OMTool to be more useful than Objectory. OMTool is a closer match to C++ than Objectory. Its usefulness during detailed design and implementation far exceeded Objectory's, primarily because OMTool's expressiveness enabled it to capture more of the detailed design. Also, OMTool's C++ code generation is much better than Objectory's. OMTool also did an adequate job of modeling the dynamic and functional properties of the system, although probably not much better than Objectory. Most importantly, OMTool did not suffer from the performance and reliability problems which prevented Objectory's success.19 In the end, once all the designs had been captured in the various tools, I believe that the Objectory database provides a more detailed and complete documentation of FRIEND than the documents produced by the Fall, 1992 effort. Mainly this is due to the use cases, which describe the system in a way that was not done using OMT.
Although the OMT object models from Fall, 1992 were more complete and detailed than anything ever entered into Objectory, they are still not useful without the big picture, which was not captured at all by OMT.
Value of Objectory models to Spring, 1994. At the end of the Fall, 1993 semester, the Objectory database had fairly complete use cases, mostly inaccurate object models, and few interaction diagrams. It was most useful to the students in the Spring who had not taken the Fall class. While the Objectory database did not provide all, or even most, of the details of FRIEND to these students, it provided a starting point that had been missing for the Spring, 1993 people. Overall, I think the new students in Spring, 1994 became more comfortable with FRIEND faster than those in Spring, 1993. Most of the object models were redone during the Spring, 1994 class. Many of the object models were incomplete or inaccurate. Those that were both complete and accurate were redone because they were redesigned. I do not believe they provided much help at all to the Spring, 1994 class; whether this is Objectory's fault or the developers' is an open question.

19. Performance and reliability problems mentioned in this quotation were primarily due to the installation of Objectory in the course context, not to the tool itself. OMTool has a much more limited application to the system modeling process; the broader application of Objectory to the system modeling process places more demand on the tool.

6.9 Summary of observations and insights
In our adaptation and application of the Objectory tool and methodology to a large project-based course, there were a few problems with many consequences. Objectory is a very open-ended and demanding methodology in the sense that it requires much more thought and experience than other methodologies. Objectory provides most of the parts needed for successful software development; however, little guidance is provided on how to integrate them. The Objectory tool does not support the methodology as well as needed. Instead of being a vehicle for teaching the methodology, it became an obstacle which discouraged the students from using it, even though the methodology was not at fault. In order to increase the students' experience with and awareness of the methodology, the first iteration should focus on a deeper cut of the system (more detail, less functionality) and be completed all the way to code generation and first document drafts.
We did not accomplish as much as we had hoped with respect to model-based software engineering. Little focus was put on object modeling by the students, mainly because of competing tasks (e.g. project support functions) and mismanagement. A richer design object model and a more reliable code generation function (such as the one OMTool provides) would also help in focusing the teams on object modeling. The integration team did not serve as a catalyst for the modeling as expected, due to its late set-up. In general, inter-team communication problems can defeat any methodology, regardless of its usefulness. As suggested previously, a horizontal decomposition of the teams' Objectory databases, together with a better database merge facility, could drastically improve inter-team communication and minimize inconsistencies and duplications. Having one developer per subsystem team focus exclusively on the requirements during the first part of the semester and then become part of the integration team would increase shared understanding.
Finally, it is relatively easy to store enormous amounts of information in Objectory. However, reviewing successive versions of the models or the documentation is difficult, due to the complexity of the relationships between models and documents, and the complexity of the Objectory-generated documentation in general. Experience with the documentation process could be improved during the first iteration by requiring students to produce first drafts of the documents before the start of the second iteration. Despite the above-mentioned problems with the Objectory methodology and tool, we estimate that the students' productivity and their understanding of system requirements improved significantly relative to the previous course, where OMT was used.

7 Future plans and recommendations

7.1 Future plans
Despite the limitations we experienced during the Fall '93 and Spring '94 semesters, we plan to continue our use of Objectory, with the following differences:
• Several performance and reliability problems mentioned in this paper have been corrected in the current version of Objectory (3.4).
• We will dedicate more hardware resources (8 HP workstations) exclusively to running Objectory.
• We will employ a horizontal decomposition of Objectory databases instead of our previous vertical decomposition (see Section 6).
• We will retain the two-iteration process; however, we will ensure that the first iteration includes the generation of class headers in C++ and a first draft of the documents.
In the longer term, we plan to address communication structure and information management issues. We have already found that metrics based on the number and content of the electronic bulletin board posts sent by the students (about 2500 posts) can give valuable insights into communication latency and breakdown. We are currently developing requirements for and building an information management system called IWEB (for Information Web) [11], based on the n-dim (n-Dimensional Information Modeling) tool and methodology [18]. We have illustrated the useful relationship of information modeling to project management by experimenting with n-dim on an industrial project that used Objectory [12]. IWEB will support structured negotiation of inter-team issues as well as integrated management of the various artifacts produced during the development process (documents, figures, presentations, source code, and so on). IWEB also provides a more integrated means of developing and computing metrics on the communication, for use by the project management. Deployment of IWEB internally in the EDRC is planned for Fall '94 and, if successful, use in the advanced software engineering course is planned for Spring '95.

7.2 Recommendations
We found Objectory to be a very appropriate methodology to use in a software engineering course or training project. Objectory is comprehensive, covers corners and details that other methodologies do not, and provides fairly extensive tool support. However, there are a number of crucial points which have to be addressed when adapting and using Objectory in a course context:
• Objectory supports well a process involving a small enough number of people that communication is not a problem; however, communication structures should be created and monitored when the number of participants increases (e.g. more than 10 people); the tool alone will not solve that problem.
• Objectory provides a large number of rich models and documents. Although this provides additional flexibility and expressive power, it creates a serious information management problem for large projects (e.g. above 10,000 lines of code). Objectory alone cannot address this issue.
• For the same reason, effective use of Objectory requires experience and thought. A quick and complete iteration through a narrow vertical cut of the system (as advised by Objectory) is the fastest way we know of to provide participants with the required experience.

• The Objectory tool is intended to support the methodology and to serve as a vehicle for teaching. However, when it fails, it can become an obstacle and provoke a negative reaction from the participants against the methodology, even though the methodology itself is not at fault. We strongly recommend that the installation of the tool be tested for performance at scale prior to the project, and that a sufficient number of expert users be trained to provide adequate support to the participants.
In summary, we warn against combining a methodology as sophisticated as Objectory with a relatively immature process and organization. From our experience, we believe that adequate organization and process structures should be put in place before making substantial methodological changes. However, attempting to introduce a sophisticated methodology into an immature process may also trigger the awareness that the process must be improved.

8 Acknowledgments

The success of the software engineering courses (15-413 & 15-499) at Carnegie Mellon University would have been impossible without the support, work and dedication of a large number of people. We would like to specifically thank:
• All the students of the course, who managed to keep a very positive attitude through times of chaos and successfully completed the prototype system beyond expectations.
• Objective Systems, SF AB, for free licenses of Objectory and for extensive support and short-turnaround maintenance during the deployment of Objectory in the course.
• The TAs of the course in Fall, 1993, namely (aside from two of the authors) Bongin Choi, Pepe Galmes, Todd Kulick, Kevin O'Tool, James Uzmack and Minh Tue Vo.
• Chief Michael Bookser of the Bellevue Police Department, who spent valuable time expressing the FRIEND requirements and interacting with the students.
• Industrial advisor Michael Ehrenberger, who gave us expert advice during the design of the syllabus, contributed to the series of lectures on the Objectory methodology, and taught the Robustness Analysis lecture.
• The School of Computer Science and the EDRC at Carnegie Mellon University, for their contributions in computing and other resources.
Finally, we would like to thank the anonymous conference reviewers of this paper for their insightful comments and suggested edits.

9 Bibliography

1. S. Bilow: "Reflections on the state of object-oriented literature", Journal of Object-Oriented Programming (JOOP), June 1993, pp. 72-73.
2. G. Booch: Object-Oriented Design with Applications, Benjamin Cummings, NY, 1991.
3. G. Booch: The Booch Method: Process and Pragmatics, Rational Inc., Santa Clara, 1992.
4. B. Bruegge, J. Cheng and M. Shaw: "A software engineering course with a real client", Carnegie Mellon University, Technical Report CMU-SEI-91-EM-4, July 1991.
5. B. Bruegge: "Teaching an industry-oriented software engineering course", C. Sledge (Ed.), Software Engineering Education, Lecture Notes in Computer Science, Vol. 640, pp. 65-87, Springer Verlag, 1992.
5. B. Bruegge, J. Blythe, J. Jackson and J. Shufelt: "Object-oriented system modeling with OMT", Conference on Object-Oriented Programming Systems, Languages and Applications (OOPSLA '92), Vancouver, October 1992.
6. B. Bruegge and R. Coyne: "Model-based software engineering in larger scale project courses", Proceedings of the IFIP Working Conference on Software Engineering Education, Hong Kong, September 1993, Elsevier Science.
7. B. Bruegge and R. Coyne: "Teaching iterative and collaborative design: lessons and directions", Software Engineering Education, Lecture Notes in Computer Science, J. L. Diaz-Herrera (Ed.), Springer Verlag, 7th SEI Conference on Software Engineering Education (CSEE), San Antonio, Texas, USA, January 1994.
8. P. Coad and E. Yourdon: Object-Oriented Analysis, Prentice Hall, 1991.
9. A. Cockburn: "In search of a METHODOLOGY", Object Magazine, July-August 1994, pp. 52-56, 76.
10. R. Coyne, U. Flemming, P. Piela and R. Woodbury: "Behavior modeling in design system development", Proceedings of the CAAD Futures '93 Conference, Carnegie Mellon University, July 1993.
11. R. Coyne, A. Dutoit, J. Uzmack and K. O'Toole: "Requirements analysis for Information WEB (IWEB): an issue-based, information modeling and management environment for software engineering", Technical Report EDRC-05-87-94, Engineering Design Research Center, CMU, 1994.
12. R. Coyne and M. Ehrenberger: "Information modeling for software engineering: an illustrative project history", Technical Report EDRC-05-88-94, Engineering Design Research Center, CMU, 1994.
13. IEEE Computer Society: IEEE Standard for Developing Software Life Cycle Processes, IEEE 1074-1991, Institute of Electrical and Electronics Engineers, Inc., 345 East 47th St., New York, NY, 1992.
14. I. Jacobson, M. Christerson, P. Jonsson and G. Overgaard: Object-Oriented Software Engineering: A Use Case Driven Approach, Addison-Wesley Publishing Company, NY, 1992.
15. I. Jacobson and D. Bennett: "Life after methods: the software development process", Object Magazine, Nov-Dec 1993.
16. I. Jacobson: "Yes, there is life after methods: processes", Keynote Address, Object Methods Conference, Objex, Boston, October 18-22, 1993.
17. J. Jeffcoate and I. Wesley: "Objects in use: meeting business needs", Internal Report, Ovum, Ltd., London, England, 1992.
18. S. Levy, E. Subrahmanian, R. Coyne, S. Konda, A. Westerberg and Y. Reich: "An overview of the n-dim environment", Technical Report EDRC-05-65-93, Engineering Design Research Center, Carnegie Mellon University, January 1993.
19. T. F. Maleki and N. Hashemi-Nejad: "Evaluation of two major object-oriented methodologies and related tools", Internal Report, AVEMCO Corporation, Frederick, Maryland, 1993.
20. Objective Systems, SF AB: Manuals for Objectory, Version 3.3, 1992, Box 1128, S-164 22 KISTA, Sweden.
21. J. Ousterhout: An Introduction to Tcl and Tk, Addison-Wesley, NY, 1994.
22. J. Rumbaugh: "Getting started: using use cases to capture requirements", Journal of Object-Oriented Programming (JOOP), September 1994, Vol. 7, No. 5.
23. J. Rumbaugh, M. Blaha, W. Premerlani, F. Eddy and W. Lorensen: Object-Oriented Modeling and Design, Prentice-Hall, 1991.
