HANDS ON SOFTWARE ENGINEERING: BUILDING A JAVA SOFTWARE ENGINEERING DEVELOPMENT ENVIRONMENT

Don Shafer1

Abstract - This paper describes the team and software engineering experiences of a group of undergraduate computer science students doing their final project class before graduation. The project was to build and deliver a Java software engineering development environment. The students were organized into teams that had specific assignments to deliver portions of the product. The students established a web page to track their progress and kept metrics on all of their effort. They defined and followed a software development life cycle. This was the first time these students had worked on a real deliverable product. The result of the semester-long class was a bootable system running under Linux that is a complete Java software engineering development environment [1]. It is being used today to drive a web server and a shared development environment for a consulting organization and their clients. The final delivered cost, including the cost of the CDROM, was less than $10.00. This team oriented, product focused class was successful in having the students demonstrate to themselves the value of software engineering disciplines and practices.

Index Terms - software engineering, Java, Linux, team oriented, software development environment
HANDS ON SOFTWARE ENGINEERING

Software engineering is the discipline whose aim is the production of quality software: a software product that is delivered on time, within budget, and satisfies its requirements, built by software engineers following a defined, repeatable process. Understanding this is a prerequisite for the senior software engineering projects course at Southwest Texas State University (SWT). All 28 students who completed this final project class had taken one introductory class in software engineering. They were all computer science bachelor's degree candidates, and several were pursuing dual bachelor's degrees and Texas secondary school teaching certificates. Software engineering, or for that matter any engineering, was known to them only from a limited academic perspective. None of them had ever had any experience in a software engineering laboratory or project environment. This course moves students from the theoretical, programming-language-oriented world of computer science into the professional world of engineering. The critical difference for undergraduates is that this is a team effort. All of the easy, one-person projects in computer programming have been completed. These students now must learn to work together in teams and cooperate for both project completion and a group grade. Unlike graduate students in software engineering, undergraduate computer science students cannot be dragged too far away from code. The Java software engineering development environment (JSEDE) project was designed to address this "code comfort" issue. It was not until the project plans and requirement specifications had been reviewed that the students became comfortable with non-code deliverables. It was obvious from the behavior of the integration and tools teams that they were more than ready to write code long before the project needed any.
JSEDE PROJECT INITIATION

The statement of work for the project was to build and deliver a Java software engineering development environment with these requirements:
1. Must run on any Pentium 1 based hardware or better
2. Must run on a machine with < 4 Gig hard drive
3. Must load from CDROM(s)
4. Machine will have access to the Internet
5. Must load into a green-field machine
6. Must include the entire software system (yes, this does mean the OS!)
7. End User Delivered Software System must cost less than $100
8. Hardware is NOT included in the system cost
9. All documentation must be browser based
10. No pirated software allowed
11. Tools must cover the entire software life cycle
12. All project support processes must be implemented
13. Must support a team development environment
14. Full system and user documentation must be delivered

At the first class session the students kicked off the project by (1) working in six assigned teams to review the initial statement of work and define a project life cycle, (2) defining a set of deliverables from each life cycle phase, and (3) determining a minimum set of support processes.
Don Shafer, Southwest Texas State University, Graduate School of Engineering, 5608 Parkcrest Drive, Austin, Texas 78731,
[email protected]
Figure 1. JSEDE Project Development Life Cycle: process steps (Requirements Definition, High Level Design, Detail Design, System Construction, System V&V, System Delivery), each paired with a prototype version where applicable and followed by a process gate (Requirements Review, HL Design Review, Detail Design Review, Construction Review, V&V Review, Delivery Review, Project Analysis Review), supported throughout by the project management support processes: Risk Reduction, Planning, Estimating, Metrics, Training, CM, and Quality.
Figure 1, the JSEDE Project Development Life Cycle, is the life cycle agreed upon by the teams. Although several argued for either the spiral model or the evolutionary prototyping model, the classic waterfall model with reviews and a set of prototypes won out. Once the teams were faced with a set of requirements and a project deadline, the simplest process set for the project was derived. What they had learned about software process had seemed extremely academic to them until they had a project to deliver. The pragmatic viewpoint of an engineer took over when cost, schedule and deliverables were brought sharply into focus. The next step for the teams was to define all of the deliverables from each phase and activity of the life cycle. For the six life cycle phases, this was the outline of the deliverables:
• Requirements Definition
  o System Requirements Specification (SRS)
• High Level Design
  o Preliminary System Design Specification (SDS)
• Detail Design
  o SDS
• System Construction
  o JSEDE running on a target machine
• System V&V
  o System Test Plan
  o Completed System Tests
• System Delivery
  o JSEDE on a CDROM
For the first five of the project reviews, teams would present their deliverables to the entire class for review. The Delivery Review would be done for the client, represented by the professor, in a laboratory on a target machine using the final deliverable media, CDROM. The three prototypes were thoroughly discussed as to their real purpose in a system that was mostly a selection of existing packages. The teams decided to forego the third prototype for reasons of time and need. They felt that with two prototypes they would have a complete grasp of the requirements and the capabilities of the tools selected. The actual form and format of the two prototypes would be based on the life cycle phase to which they were most directly attached. For the project management support processes, these deliverables would be produced:
• Risk Reduction
  o Specific risk matrices in the Software Project Management Plan (SPMP)
  o Issue tracking system
• Planning
  o SPMP
• Estimating
  o Specific estimate sections in the SPMP
• Metrics
  o Collection of hours worked per team member on the project
• Training
  o One integrated Web based users manual
• Configuration Management (CM)
  o Based on the operating system selected, a configuration management and release control software system
• Quality
  o A working system that meets the requirements
FORMING THE "REAL" TEAMS

As a result of the first engineering process session, the students had become acquainted and were introduced to the concept of software development teamwork. They also realized that not only would they have to cooperate within their teams, but the teams would not be building six separate deliverables. There was not enough time. Therefore the teams had to further divide up the project so that they could work on parallel pieces of the overall project. The second session began with the professor leading the entire class in partitioning the JSEDE project into team-sized pieces. These five partitions developed:
• Operating System
• Java Tools
• Software Engineering Tools
• Testing
• Documentation
The students were then given time to self-select into five teams. They discussed who had the various skills for the teams, who had worked together before and knew their respective work habits, who had similar work schedules and who lived in the same areas. Once the five teams were formed, the class discussion was directed toward the issue of how the product would be integrated for the final delivery. This was resolved by forming a sixth team consisting of one member of each of the other teams. This was a very real-world solution to a common problem in complex software systems development and was an easy lesson for the students to learn.
THE JSEDE PROJECT DELIVERABLES

The new, self-selected teams spent several hours forming and determining the deliverables that their teams would produce. This table shows the deliverables:

Table 1 JSEDE Team Deliverables

Team                       | Deliverables
Operating System           | SPMP, SRS, SDS, Metrics, Prototype 1, an operating system on which to base the remainder of the JSEDE
Java Tools                 | SPMP, SRS, SDS, Metrics, Prototype 1, Prototype 2, all the software development tools for a Java environment
Software Engineering Tools | SPMP, SRS, SDS, Metrics, Prototype 1, Prototype 2, all the software engineering tools for CM, documentation, testing, analysis and packaging
Testing                    | SPMP, SRS, SDS, Software Test Plan, Metrics, Prototype 1, Prototype 2
Documentation              | SPMP, SRS, SDS, Metrics, web based, integrated users manual
Integration                | SPMP, Metrics, Prototype 3, JSEDE running on target, JSEDE on CDROM
The deliverables had these formats. For the SPMP, this outline was followed:
1. Introduction
  1.1 Project Overview
  1.2 Project Deliverables
  1.3 Evolution of the SPMP
  1.4 Reference Materials
  1.5 Definitions and Acronyms
2. Project Organization
  2.1 Process Model
  2.2 Organizational Structure
  2.3 Organizational Interfaces
  2.4 Project Responsibilities
3. Managerial Process
  3.1 Management Objectives and Priorities
  3.2 Assumptions, Dependencies, and Constraints
  3.3 Risk Management
  3.4 Monitoring and Controlling Mechanisms
  3.5 Staffing Approach
4. Technical Process
  4.1 Methods, Tools, and Techniques
  4.2 Software Documentation
  4.3 Project Support Functions
5. Work Packages, Schedule, and Budget
  5.1 Work Packages
  5.2 Dependencies
  5.3 Resource Requirements
  5.4 Budget and Resource Allocation
  5.5 Schedule
6. Additional Components
  6.1 Index
  6.2 Appendices
The SRS followed this outline:
1. Introduction
  1.1 Purpose
  1.2 Scope
  1.3 Definitions, Acronyms, and Abbreviations
  1.4 References
  1.5 Overview
2. The General Description
  2.1 Product Perspective
  2.2 Product Functions
  2.3 User Characteristics
  2.4 General Constraints
  2.5 Assumptions and Dependencies
3. Specific Requirements
  3.1 Functional Requirements
  3.2 External Interface Requirements
    3.2.1 User Interfaces
    3.2.2 Hardware Interfaces
    3.2.3 Software Interfaces
    3.2.4 Communications Interfaces
  3.3 Performance Requirements
  3.4 Design Constraints
    3.4.1 Standards Compliance
    3.4.2 Hardware Limitations
  3.5 Quality Characteristics
  3.6 Other Requirements
    3.6.1 Data Base
    3.6.2 Operations
    3.6.3 Site Adaptation Requirements
4. Supporting Information

For the design specifications, this format was followed:
1. Introduction
2. System Overview
3. Design Considerations
  3.1 Assumptions and Dependencies
  3.2 General Constraints
  3.3 Goals and Guidelines
  3.4 Development Methods
4. Architectural Strategies
5. System Architecture
  5.1 Subsystem Architecture
6. Policies and Tactics
7. Detailed System Design
  7.1 Detailed Subsystem Design

Each of the teams produced a schedule for the delivery of these engineering specifications that was included in their SPMP. The eventual owner of this schedule was the integration team. As the group that would assemble all the pieces into a product for the CDROM, they were the "internal" customer for all of the teams.
OPERATING SYSTEM TEAM

In parallel with the process deliverables, the operating system team, together with the integration team, was making a decision on the operating system to use. Since the entire project budget was to be under $100 and the cost to the end user could not exceed $100, any commercial operating system was out of the question. At the very beginning of the team meetings, the operating system choice was determined to be
Linux. In order to meet overall project timelines, the operating system team had to secure a copy of the operating system and get it installed on one of the laboratory systems. For a cost of $2.00, the Red Hat [2] 6.2 version of Linux was acquired and installed within 24 hours of the team decision. It was interesting to observe how the operating system and integration teams worked together through the set up of Linux. Of the 28 total students, only one had any prior experience with Linux. It is a tribute to both the hard work of the teams and Linux's ease of use that they had the system up and running so rapidly.
JAVA TOOLS TEAM

Since the decision had been made that the project was to be a Java development environment, this team had to find the development tools available for Linux. The first stop was the Sun [3] web site to get the latest Java Development Kit and all the Java run time libraries. This was the basis for running and developing in Java, but it was not a developer's environment or toolbox. The team then used the web based meta search engine Copernic [4] to find Linux based Java development environments. The search resulted in over 100 hits, with 10 tools from which to choose. The team downloaded and evaluated each tool, eventually selecting AnyJ [5]. This tool provided advanced code completion, and its internal HelpAgent could display JavaDoc at the method level from any point in the source, leading to a fast growing knowledge of the Java platform's huge class libraries. AnyJ's visual GUI builder allowed Swing GUI applications to be created quickly, without the requirement to deal with layout managers. This would support the integration team and the documentation team in their work. The test team would profit from AnyJ's various source navigation, browsing and analyzing capabilities. This improved source quality and eliminated redundancy caused by lack of project knowledge as the test team imported various Java projects to test the tools.
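The flavor of development the environment was meant to support can be sketched with a small, self-contained Swing program of the kind a student could build and run inside the JSEDE. This is a minimal sketch assuming only the standard Java 2 platform classes; the class and component labels are hypothetical illustrations, not code taken from the project.

```java
// A minimal Swing sketch, assuming only the standard JDK Swing/AWT classes;
// all names here are illustrative only.
import java.awt.event.ActionEvent;
import java.awt.event.ActionListener;
import javax.swing.JButton;
import javax.swing.JFrame;
import javax.swing.JLabel;
import javax.swing.JPanel;

public class JsedeDemo {
    public static void main(String[] args) {
        JFrame frame = new JFrame("JSEDE demo");   // top-level window
        JPanel panel = new JPanel();               // default FlowLayout
        final JLabel status = new JLabel("Ready");
        JButton build = new JButton("Build");

        // Update the status label when the button is pressed.
        build.addActionListener(new ActionListener() {
            public void actionPerformed(ActionEvent e) {
                status.setText("Build started");
            }
        });

        panel.add(build);
        panel.add(status);
        frame.getContentPane().add(panel);
        frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        frame.pack();
        frame.setVisible(true);
    }
}
```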
SOFTWARE ENGINEERING TOOLS TEAM

The software engineering tools team had the greatest amount of searching to do because of the project and product tools requirements. Table 2 shows all of their individual end product deliverables. This was a true lesson in product developers "eating their own dog food": the teams were required to begin using a tool as soon as it was selected. The tools team was not well liked for some early, beta tools they had the integration and documentation teams use.
Table 2 Tool Selection Table

Tool Name             | URL Reference                                                       | Tool Use
ABS                   | http://www.ping.be/bertin/abs.shtml                                 | spreadsheet
CVS                   | www.sourceforge.com                                                 | configuration management tool
Corel WordPerfect     | linux.corel.com                                                     | word processor
PSP tool              | http://www.cm.deakin.edu.au/~peter/PSP_data/Contributor_index.html  | metrics tool
GIMP                  | http://www.gimp.org/                                                | graphics tool
EMACS                 | http://www.emacs.org/                                               | system development editor
VIM                   | http://www.etsimo.uniovi.es/vim/                                    | programmers text editor
PICO                  | http://www.math.fu-berlin.de/~guckes/pico/                          | email text editor
Netscape Communicator | www.netscape.com                                                    | browser/html viewer
Magic Point           | http://www.mew.org/mgp/                                             | presentation viewer
Autoconf              | http://sources.redhat.com/autoconf/                                 | automatic configuration scripts
GNU M4                | http://www.seindal.dk/rene/gnu/                                     | macro processor
TESTING TEAM

Along with producing the basic documents, the testing team delivered the test matrices and the validation matrix. All the test matrices are available on the project web site [1]. The validation matrix memorialized the critical tests, because it is where the requirements were tested against the final system. The requirements against which the JSEDE was validated were:
1. Must run on any Pentium 1 based hardware or better
2. Must run on a machine with < 4 Gig hard drive
3. Must load from CDROM(s)
4. Machine will have access to the Internet
5. Must load into a green-field machine
6. Must include the entire software system
7. End User Delivered System must cost less than $100
8. Hardware is NOT included in the system cost
9. All documentation must be browser based
10. No pirated software allowed
11. Tools must cover entire software life cycle
12. All project support processes must be implemented
13. Must support team development environment
14. Full system and user documentation must be delivered
15. 100% Java IDE
16. Plug-In Architecture
17. Rapid GUI Development
18. Commercial Edition (Linux only)
19. Click & Run
20. Servlet support
21. Help Agent (Help & Java Doc)
22. Code Completion (Intelli*sense)
23. Java 1 & 2 support
24. Java Bean, XML, JSP support
25. Application Templates
26. Java 2 Swing / JFC components
27. Switchable JDKs
28. UniCode enabled (international)
29. Syntax Coloring
30. OS is required to run any tools provided by the other teams
31. OS is to support only one computer and one user at any given time
32. Editor
33. Spreadsheet
34. Configuration management system
35. Programming metric
36. Produce shell scripts
37. Graphics Editor
As in all real world projects, we began with 14 requirements and ended up validating 37. There were more than 37 tests: each requirement was tested against the 15 tools eventually delivered. The testing team also selected two tools to automate the testing process:
1. nBench - This tool is Linux release 2 of BYTE Magazine's BYTEmark benchmark program (previously known as BYTE's Native Mode Benchmarks). These are Native Mode (a.k.a. Algorithm Level) tests: benchmarks designed to expose the capabilities of a system's CPU, FPU, and memory system. Running this test on the target system after the full CDROM install validated that the installation had induced no conflicts with the hardware BIOS.
2. CaffeineMark 3.0 - The CaffeineMark is a series of tests that measure the speed of Java programs running in various hardware and software configurations. CaffeineMark scores roughly correlate with the number of Java instructions executed per second, and do not depend significantly on the amount of memory in the system or on the speed of a computer's disk drives or internet connection. When this test ran successfully, the Java environment was correctly installed.
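The style of check these tools performed can be sketched as a short Java program that reports the installed runtime and times a simple computation. This is only an illustration assuming the standard JDK; it is not the nBench or CaffeineMark code, and the class name is hypothetical.

```java
// Illustrative environment check: confirm which Java runtime and OS are
// installed, then do a rough, benchmark-flavored timing loop.
// Not the actual nBench or CaffeineMark suites.
public class EnvironmentCheck {
    public static void main(String[] args) {
        // Report the installed Java runtime and host operating system.
        System.out.println("java.version = " + System.getProperty("java.version"));
        System.out.println("java.vendor  = " + System.getProperty("java.vendor"));
        System.out.println("os.name      = " + System.getProperty("os.name"));

        // Rough throughput measurement, loosely analogous to an
        // instructions-per-second style score.
        long iterations = 10000000L;
        long start = System.currentTimeMillis();
        double sum = 0.0;
        for (long i = 1; i <= iterations; i++) {
            sum += Math.sqrt(i);
        }
        long elapsed = System.currentTimeMillis() - start;
        System.out.println("checksum = " + sum);
        System.out.println(iterations + " iterations in " + elapsed + " ms");
    }
}
```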
DOCUMENTATION TEAM

Collecting all the produced documentation, converting it to a format compatible with the final delivered environment, and making it readable in a browser was the challenge for the documentation team. The final web site is the culmination of their efforts. As in all projects, they were the
last to know when a tool was changed or an installation script abandoned. Regular class reviews and the integration team mitigated the worst of the potential documentation disasters.
Table 3 Metrics on Student Team Hours Worked

Deliverable       | OS    | Java Tools | SE Tools | Testing | Documentation | Integration
Project Plan      | 24    | 10         | 15       | 9.5     | 12            | 8
SRS               | 18    | 8          | 10       | 6       | 6             | 0
Total 1st week    | 42    | 18         | 25       | 15.5    | 18            | 8
Prototype 1       | 42    | 128        | 15.5     | 3       | 6             | 8
SDS               | 8     | 4          | 9        | 10      | 8.5           | 0
Total 2nd week    | 50    | 132        | 24.5     | 13      | 14.5          | 8
Prototype 2       | 8     | 35         | 9        | 8       | 6             | 11
Test Plan         | 9     | 2          | 25       | 20      | 2             | 0
Total 3rd week    | 17    | 37         | 34       | 28      | 8             | 11
User Manual       | 0     | 10         | 0        | 2       | 19            | 2
Installation Plan | 2     | 57         | 45       | 37      | 3             | 27
Total 4th week    | 2     | 67         | 45       | 39      | 21            | 29
Product           | 2     | 20         | 15       | 16      | 9             | 30
Total 5th week    | 2     | 20         | 15       | 16      | 9             | 30
Total             | 113   | 274        | 417.5    | 111.5   | 70.5          | 86
INTEGRATION TEAM

This team consisted of one member of each of the other teams. By default, their main deliverable was the overall project plan. They integrated the other teams' plans into one comprehensive schedule and monitored each team's progress against it. They were responsible for making it all come together, so they were the project's "internal" customer for all deliverables. Their success meant the success of the overall project.
SOFTWARE ENGINEERING PROCESS

The goal of this course was to introduce undergraduate students to software engineering as a discipline. Students completed a real software development project. They worked in teams, writing the requirements and design documents, and then the teams produced and tested the software. This course expanded upon the software engineering fundamentals learned earlier and provided real-world experience applying software engineering principles. Students were involved with real internal and external users who had to be satisfied with the project results. The project taught teamwork, participation and the responsibility to meet commitments. All grades were by team; there were no individual performance grades in this course. Overall, it was writing-intensive: students learned to produce software engineering process deliverables directed at meeting user requirements. The real question in software engineering is how to know how much work was completed. The students, being engineers, learned to keep track of the hours worked by each team on each deliverable. Table 3 shows the amount of student effort that went into producing the final product.
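A minimal sketch of the bookkeeping behind such a table is shown below: it simply accumulates hours per team and per deliverable and reports a team total. The class, team names and figures are illustrative assumptions, not the project's actual records.

```java
// Hypothetical effort log: accumulate hours per team and per deliverable,
// mirroring the row/column structure of Table 3.
import java.util.LinkedHashMap;
import java.util.Map;

public class EffortLog {
    // team -> (deliverable -> hours)
    private final Map<String, Map<String, Double>> hours = new LinkedHashMap<>();

    public void record(String team, String deliverable, double worked) {
        hours.computeIfAbsent(team, t -> new LinkedHashMap<>())
             .merge(deliverable, worked, Double::sum);
    }

    public double teamTotal(String team) {
        double total = 0.0;
        Map<String, Double> perDeliverable = hours.get(team);
        if (perDeliverable != null) {
            for (double h : perDeliverable.values()) {
                total += h;
            }
        }
        return total;
    }

    public static void main(String[] args) {
        EffortLog log = new EffortLog();
        log.record("Operating System", "Project Plan", 24);   // example figures only
        log.record("Operating System", "SRS", 18);
        log.record("Java Tools", "Prototype 1", 128);
        System.out.println("Operating System total: " + log.teamTotal("Operating System"));
    }
}
```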
THE JSEDE SUCCESS

This paper described the team and software engineering experiences of a group of undergraduate computer science students doing their final project class before graduation. The project successfully built and delivered a Java software engineering development environment. The students were organized into teams that had specific assignments to deliver portions of the product. The students established a web page to track their progress and kept metrics on all of their effort. They defined and followed a software development life cycle. This was the first time these students had worked on a real deliverable product. The result of the team-oriented class was a bootable system running under Linux that is a complete Java software engineering development environment. It is being used today to drive a web server and a shared development environment for a consulting organization and their clients. The final delivered cost,
including the cost of the CDROM, was less than $10.00. This team oriented, product focused class was successful in having the students demonstrate to themselves the value of software engineering disciplines and practices.
ACKNOWLEDGMENT

Thanks go to a great set of undergraduates working on their last software engineering project, over a long summer session: Documentation Team - Chris Doughty, Ian Martines, Michael Shroyer, Prentice Mooney, Thuong Nguyen, Julio Vargas; Operating System Team - Wayne McBroom, Oliver, Srivathanakul, Boedeker; Software Engineering Tools Team - Theo Grier, Ed Pecson, Cody Kutac, Claire Hermann, Brian Young; Testing Team - Anthony Barahona, Jorge A. Quintanilla, Kasey Trott, Kevin Gilland, Robert Anderson, Rod Borden; Java Tools Team - Princeston Brown, Neil Cheshire, C. Patrick Maples, Carol Neel, Steven Oakes, Cliff Schramm, Ted Weston.
REFERENCES

All of the references for this paper are web sites. Where known, authors or companies are cited. If the main web page has been date stamped, a publication date is provided.
[1] Shafer, Donald F., http://www.cs.swt.edu/~donshafer/JSEDE/index.html, 9 March 2001
[2] Red Hat Linux Operating System, http://www.redhat.com/, 2001
[3] Sun Microsystems, Inc., www.sun.com, 2001
[4] Copernic 2000, www.copernic.com, 2000
[5] Network Computing, GmbH, http://www.netcomputing.de/html/main.html, 2 January 2000