CERC TECHNICAL REPORT SERIES Technical Memoranda

CERC-TR-TM-93-006. Assessing Organizational Readiness for Implementing Concurrent Engineering Practices and Collaborative Technologies, H. M. Karandikar et al., 1993. Proceedings of the Second Workshop on Enabling Technologies: Infrastructure for Collaborative Enterprises, April 20-22, 1993, Morgantown, WV, pp. 83-93. Los Alamitos, CA: IEEE Computer Society Press, 1993.

ACKNOWLEDGEMENT: This effort was sponsored by the Defense Advanced Research Projects Agency (DARPA) under Grant No. MDA972-91-J-1022 for the DARPA Initiative in Concurrent Engineering (DICE).

Concurrent Engineering Research Center West Virginia University P.O. Box 6506, 886 Chestnut Ridge Road, Morgantown, WV 26506

Assessing Organizational Readiness for Implementing Concurrent Engineering Practices and Collaborative Technologies

H. M. Karandikar, M. E. Fotta, M. Lawson, R. T. Wood

Concurrent Engineering Research Center
West Virginia University, P.O. Box 6506
Morgantown, WV 26506

Abstract

The barriers to concurrent engineering (CE) are cultural, organizational, and technological in nature. A successful implementation of CE requires that these barriers be identified up-front. The Concurrent Engineering Research Center (CERC) has developed a model, a measurement tool, and a methodology – the Readiness Assessment for Concurrent Engineering (RACE) – to assist CE implementors in identifying the barriers and prioritizing implementation actions. In this paper we discuss the readiness model, the process, and the questionnaire associated with RACE. We also identify some open issues concerned with conducting a readiness assessment and interpreting the results. Finally, the concept for a computer-based tool for readiness assessment is described.

1.0 The Need for a CE Readiness Assessment Tool

Experience has shown that the successful adoption of CE by an organization requires a phased transformation that inculcates the fundamental principles and practices of CE as well as an enduring commitment to improving all aspects of the organization and its processes. Once a supportive culture and policies for CE are in place, improvements may be sought from technologies that enhance the capabilities of product developers to work cooperatively and to achieve the early resolution of lifecycle issues. CERC has proposed an organizational CE transformation strategy that comprises four stages: awareness, readiness, deployment, and improvement (see Figure 1). A set of tools and methodologies is being developed to facilitate this structured implementation strategy via a detailed and comprehensive study of the product development processes employed by organizations. This paper discusses one such method and a supporting tool – the Readiness Assessment for Concurrent Engineering (RACE).

Assessing an organization's readiness and maturity (we use the terms interchangeably) to adopt CE is one of the key preliminary steps in CE implementation. Readiness assessments coupled with a second-level analysis consisting of product development process characterization (capture, modeling, and measurement) provide a quantitative basis for recommending CE implementation strategies. Therefore, an accepted CE readiness assessment procedure would help organizations in adopting CE practices and the supporting collaborative technologies. A number of CE assessment models exist in the literature, and some are discussed next.

1.1 Some existing assessment models

A CE maturity classification scheme for organizations, in terms of levels, is presented in the CALS/CE Task Group's report [1]. A two-step process for developing a roadmap for implementing CE is described. The first step deals with determining the level of CE appropriate for an organization or a program based on its goals and competitive position. For example, a complex, advanced-technology program involving a number of participants from different organizations, spread out geographically, will require comprehensive CE capabilities. To facilitate assessment, nine influencing factors for CE are defined, together with four levels of complexity. The second step involves the identification of the required characteristics of the CE approach. The attributes of CE are classified into four major categories: organization, requirements, communication, and development methodology. This classification is similar to the one presented by the Mentor Graphics Corporation [2]. The approach presented in [1] has yet to be validated and lacks a replicable procedure to follow when conducting assessments. The methodology is complex and does not clearly distinguish between organizational and technical factors. Additionally, the method seems to be based mostly on knowledge of the electronics industry.

The model by Mentor Graphics [2] helps determine, via a questionnaire, a company's current product development environment in relation to four dimensions of CE as well as to key areas within each dimension. These dimensions and areas are as follows:

• Product development: component engineering, design process, optimization.

• Organization: team integration, empowerment, training and education, automation support.

• Communication infrastructure: product management, product data, feedback.

• Requirements: requirements definition, planning methodology, planning perspective, validation, standards.

This questionnaire may be followed by applying a "methods matrix" to determine the CE methods needed by an organization, a "dimensions map" to determine the variation between the existing status and needs, and a "priority roadmap" for implementing CE. This model, while replicable, is limited in its coverage of the CE elements. The elements of management systems, agility, leadership commitment, and discipline are not addressed. Additionally, the examination of the technological infrastructure is not detailed.

[Figure 1 – CE Implementation Strategy: the CE transformation process, shown as four stages – awareness, readiness, deployment, and improvement – with the activities of each stage (establish business case; develop vision and plan; readiness assessment; process capture and modeling; change management strategy; process improvement guidelines; collaborative information technology; team launch; process reengineering; process measurement; metrics) and the enablers (RACE template, process tools, workshops, case studies, economic models).]

The assessment approach presented by the Software Engineering Institute (SEI) [3] is applicable to the software engineering process. The assessment is made in two categories: process and technology. The basis of the assessment is a Capability Maturity Model (CMM) comprised of five process stages and two technology stages. The CMM may be used for organizational self-assessment and improvement, as well as for external evaluation of an organization's capability to implement good software engineering practices and to deliver quality software products on time. An assessment questionnaire is provided. The analysis of the responses to the questions places the organization into one of the process and technology stages. The assessment approach does not incorporate any quantifiable metrics that can be tracked to determine the progress of an organization. CERC, in its own assessment model, has begun to identify such metrics for each of the CE critical elements.


A detailed assessment, including recommendations for improvement, of the industrial processes in defense systems development is provided in the U.S. Navy's Best Practices Template [4]. The criteria considered for a defense weapon system are Funding, Design, Test, Production, Facilities, Logistics, Management, and Transition. White and Patton describe sixteen accelerators that impact time-to-market for new products [5]. The sixteen accelerators are mapped into a numerical screening measure (from 0% to 100%) that can be used as an assessment tool for an organization's new product development process. The procedure for computing the acceleration index is extremely simplistic. However, the index measure has been positively correlated with time-to-market reduction at three companies.
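White and Patton's procedure is not reproduced here, but since the text describes it as a simple mapping of sixteen accelerators onto a 0%-100% score, its arithmetic character can be illustrated. The sketch below scores each accelerator as present or absent and reports the percentage present; the equal weighting, the placeholder accelerator names, and the function name are our assumptions for illustration, not the published procedure.

```python
# Hypothetical sketch of a White/Patton-style acceleration index.
# Equal weighting and the dummy accelerator names are assumptions;
# the published screening procedure may differ in detail.

def acceleration_index(accelerator_ratings):
    """Map presence/absence ratings of the accelerators to a 0%-100% score."""
    if not accelerator_ratings:
        raise ValueError("at least one accelerator rating is required")
    present = sum(1 for is_present in accelerator_ratings.values() if is_present)
    return 100.0 * present / len(accelerator_ratings)

# Dummy data: sixteen accelerators, every second one rated present.
ratings = {f"accelerator_{i}": (i % 2 == 0) for i in range(1, 17)}
print(f"Acceleration index: {acceleration_index(ratings):.0f}%")
```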

Finally, the criteria incorporated in the self-assessment methodology of the Malcolm Baldrige National Quality Award [6] are relevant to an organization's readiness for CE. The Baldrige Award criteria cover seven categories and over 28 sub-categories. The categories are Leadership, Information and Analysis, Strategic Quality Planning, Human Resource Development and Management, Management of Process Quality, Quality and Operational Results, and Customer Focus and Satisfaction.

The approach discussed in this paper focuses on the application of CE in the product development process and is not focused on a specific product or any class of products, e.g., software. We have drawn upon all of the models discussed earlier in establishing the content and style of our model, a preliminary version of which is presented in [7] and discussed here in Section 2.

1.2 The objectives of a CE readiness assessment

Our objective is to develop an accepted CE readiness assessment procedure that yields a clear understanding of the current state of product development in an organization – the culture, management practices, and technological infrastructure. A number of objectives are associated with the assessment itself:

• Develop CE implementation plans. This includes, among other things:
  • the identification of the CE training needs and of applications of collaborative computer technologies; and
  • the identification of the barriers to deploying CE and its enabling collaborative technologies.
• Facilitate a shared understanding of the product development process;
• Enable effective process management; and
• Support process improvement using metrics.

2.0 A CE Readiness Assessment Model

Like the SEI's CMM, CE readiness (or maturity) is viewed in five stages in the process area and in three (instead of two) stages in the technology area. A questionnaire is employed as part of the assessment, and the scoring of the results follows SEI's pattern. However, the list of critical elements is broader and more general in the case of the CE readiness assessment model.

The authors believe that CE readiness can best be conceptualized in terms of two major components: the product development process and practices, and technology [7]. The process component encompasses nine major elements and follows a readiness scale adapted from SEI's CMM [3]. These elements are customer focus, process focus, strategies for team formation and development, accommodation of teams within the organization, management systems, mechanisms for rapid product assurance, agility, senior leadership commitment, and discipline. The technology component covers five areas, namely, application tools, communication, coordination, information sharing services, and integration.

The list of CE critical elements is based on the definition of CE and on the consulting experience of the authors with CE implementation. For example, the CE definition stresses that "CE is a systematic approach to integrated product development that emphasizes responsiveness to customer expectations and embodies team values of cooperation, trust and sharing in such a manner that decision making proceeds with large intervals of parallel working by all lifecycle perspectives synchronized by comparatively brief exchanges to produce consensus" [8].

The notion of a systematic approach gives rise to the critical elements of Process Focus, Management Systems, and Discipline. Responsiveness to customer expectations motivates the categories of Customer Focus, Process Focus, Agility, and Product Assurance. Team values are reflected in Strategies for Team Formation and Development, Accommodation of Teams within the Organization, Leadership, and Discipline. Parallel working, synchronization, and consensus are facilitated by Process Focus, by appropriate team management practices and Management Systems, and by leveraging Communication, Coordination, and Information Sharing technology.

For each critical element, a set of key criteria, maturity stage definitions, issues to be addressed, and metrics to be tracked have been identified. The set of these factors (see Table 1) for all CE critical elements constitutes the RACE Template. The key criteria for all critical elements are listed in Appendix A. For example, for the critical element of Customer Focus, some of the key criteria are interaction with customers, methodology to capture customer requirements, and extent of understanding of the customer requirements. As an organization progresses toward heightened CE readiness, the interaction with its customers should move from occurring only early in development to continuous contact where the customer is part of the product development teams.
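To make the template's organization concrete, here is a minimal sketch in Python of how one critical element's slice of the RACE Template might be represented. The field names mirror Table 1 and the Customer Focus examples in the text, but the class layout and the abbreviated stage definitions are our illustrative assumptions; CERC's own mock-up, described in Section 5, was written in Prolog.

```python
# A minimal sketch of the RACE Template structure described in Table 1.
# Field names follow the text; the representation itself is an assumption.
from dataclasses import dataclass

@dataclass
class CriticalElement:
    name: str
    component: str                     # "process" or "technology"
    key_criteria: list[str]
    stage_definitions: dict[int, str]  # maturity stage -> definition
    questions: list[str]               # database of assessment questions
    metrics: list[str]                 # metrics to be tracked

customer_focus = CriticalElement(
    name="Customer Focus",
    component="process",
    key_criteria=[
        "interaction with customers",
        "methodology to capture customer requirements",
        "extent of understanding of the customer requirements",
    ],
    stage_definitions={
        # Abbreviated endpoints paraphrased from the text above.
        1: "interaction with customers occurs only early in development",
        5: "continuous contact; customer is part of the development teams",
    },
    questions=[
        "Is the customer continuously involved as a member of the "
        "product development teams?",
    ],
    metrics=["Customer Satisfaction Index", "Follow-on Business",
             "Time-to-Market"],
)
```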


For the Customer Focus critical element, some examples of the assessment questions asked are:

• Is the customer continuously involved as a member of the product development teams?
• Are there well-documented procedures to follow in gathering requirements from customers?
• Do most team members know the requirements of their customers?
• Are procedures in place to ensure that the product development teams respond to customer input and feedback?

The metrics that may be tracked are the Customer Satisfaction Index, Follow-on Business, and Time-to-Market. For a detailed description of these metrics see [9].

• CE Critical Elements
• Process and Technology Maturity Stages
• For each critical element:
  • Key Criteria
  • Maturity Stage Definitions
  • Database of Questions
  • Metrics
  • Procedure for mapping questionnaire responses to maturity stages

Table 1 – The RACE Template

3.0 CERC's Current Assessment Procedure

CERC has attempted to assess organizations according to the model presented in Section 2.0, but the current method is labor intensive and relies on the experience and insight of the assessor. The current techniques that are used include:

• Individual interviewing;
• Structured interviews with groups of people;
• Observation; and
• Analysis of internal company documents and external evaluations.

The assessment method is primarily structured around the use of the Nominal Group Technique (NGT). This technique is a group brainstorming process that elicits inputs from all group members on a topic and results in a prioritized list of issues or solutions. CERC currently uses NGT as follows:

1. Conduct NGT with a cross-section of the organization. The output is the opinions of members of the organization on what needs to be done to implement CE. These opinions offer positive solutions for what is currently wrong with the organization but are unstructured.
2. Prune the list of issues to eliminate redundancies and have the group develop the top few priorities from the remaining list. This information is used to synthesize and prioritize implementation plans.
3. Categorize the issues into the CE critical elements (some have to be placed under more than one element). This is done by the consultant's team, and the process is subjective.
4. Once grouped, compare the issues to the maturity stage definitions within each critical element to determine the organization's maturity stage. Again, this step is subjective.

Although the NGT method is an efficient method of data collection, the post-processing of the information is somewhat subjective. Also, leading a group through the NGT process takes experienced personnel who may have to travel repeatedly to the organization. With this in mind, CERC realized that the development of a measurement tool, similar to the software engineering questionnaire developed by SEI [10], could improve the assessment process. Such a tool could decrease the subjectivity of the CE readiness assessment, reduce the amount of on-site consultant involvement, and reduce the dependence on CE assessment experts. The latter factor would extend the application of CE assessment to a much wider population of organizations.

In the future we propose to supplement the NGT data collection with the RACE questionnaire, which is discussed next.

4.0 The RACE Questionnaire

Based on knowledge gleaned from interactions with a number of organizations and from the literature, case studies, survey results, conferences, and seminars, CERC has developed a questionnaire for assessing the readiness of an organization to adopt CE. This questionnaire enables an organization to be assessed according to the model discussed in Section 2.0 and then placed in one of the maturity stages for each of the nine process elements and five technology elements.

The questions have been designed to gather information on all the key criteria for each process and technology element. At present, each question is designed to give evidence for or against an organization's classification at a certain level of CE maturity within one critical element only. An organization may be at different levels for different elements. An example of a possible outcome of an application of the RACE questionnaire is shown in Figure 2 as the thick shaded line (current state).
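The paper states only that each question gives evidence for or against classification at one stage of one element; the mapping procedure itself is part of the RACE Template but is not spelled out here. The sketch below shows one plausible scoring rule, assuming (hypothetically) that an organization is placed at the highest stage whose questions are mostly answered "Yes", with "Don't Know" and "Does Not Apply" responses excluded from the denominator.

```python
# Hypothetical mapping of questionnaire responses to a maturity stage for
# one critical element. The 75% threshold and the treatment of "Don't Know"
# and "Does Not Apply" are assumptions; the paper does not fix these rules.

def stage_for_element(responses, threshold=0.75):
    """responses: {stage: [answer, ...]} with answers in
    {"Yes", "No", "Don't Know", "Does Not Apply"}.
    Returns the highest stage satisfied, scanning upward from stage 1."""
    achieved = 1
    for stage in sorted(responses):
        scorable = [a for a in responses[stage] if a in ("Yes", "No")]
        if not scorable:
            continue  # no scorable evidence at this stage
        if scorable.count("Yes") / len(scorable) >= threshold:
            achieved = stage
        else:
            break  # treat stages as cumulative: stop at the first unmet one
    return achieved

profile = {
    2: ["Yes", "Yes", "No", "Yes"],          # 3/4 "Yes" -> stage 2 satisfied
    3: ["Yes", "Don't Know", "No", "No"],    # 1/3 scorable "Yes" -> not met
}
print(stage_for_element(profile))  # -> 2
```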

The present set of questions comprises our initial questionnaire. More work needs to be done before the RACE questionnaire can be put into widespread use. First, we need to gather evidence supporting our contention that RACE has content validity; failing that, we will have to modify RACE to achieve content validity. Content validity refers to the extent to which RACE provides a representative sampling of the key issues in the implementation domain of CE.

Thus, to establish content validity, we need to provide evidence for two assumptions: 1) that our definition of the domain, namely the critical elements and key criteria of our model, is complete; and 2) that the questions we propose representatively sample this domain. In order to test these assumptions we will be asking knowledgeable CE practitioners whether the critical elements and key criteria we propose define the domain and whether the questions sample this domain. We will gather input from as many experts as possible. If there is agreement that the Readiness Model describes the critical CE areas and that the questions sample this domain, we will have evidence for content validity. If not, we will incorporate suggestions and modify the model.

[Figure 2 – Readiness Assessment Results: a diagram plotting the organization's current state (thick shaded line) and desired state (thick bold line). The nine process-and-practices elements (customer focus, process focus, team formation and development, accommodation of teams, management systems, product assurance, agility, leadership commitment, discipline) are rated on Stages 1 through 5; the five technology elements (application tools, communication, coordination, information sharing, integration) are rated Basic, Intermediate, or Advanced.]

A second issue that needs to be resolved is whether the questions are easily understood. In order to establish this we will ask people to fill out the questionnaire and to critique the questions. We want to elicit comments on which questions, terms, or phrases are problematic and to discover suggestions on how to improve the wording of any question. SEI currently employs this tactic even when using their questionnaire for actual assessments.

A third issue concerns performing an item analysis on the questions. An item analysis helps to discover which questions are most effective in measuring an organization's readiness with respect to a critical element. It may also show that some questions add little to the measurement process and should be dropped. The positive aspect of this is a reduction in the number of questions (thus reducing the time to fill out the questionnaire) with no reduction in the effectiveness of the questionnaire. The item analysis can also establish the ease of answering the questions.
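As an illustration of what such an item analysis might involve, the sketch below computes a simple item-total correlation for each question (against the uncorrected total score); items whose correlation with the total is near zero are candidates for removal. Binary 0/1 item scoring and the cutoff value are our assumptions, not part of RACE.

```python
# A sketch of a simple item analysis: correlate each item's score with the
# respondents' total scores. Binary item scoring (Yes = 1, No = 0) and the
# discrimination cutoff below are illustrative assumptions, not RACE rules.
from statistics import mean, pstdev

def item_total_correlations(score_matrix):
    """score_matrix[r][i] is respondent r's score (0 or 1) on item i.
    Returns one Pearson correlation per item against the total score."""
    totals = [sum(row) for row in score_matrix]
    mt, st = mean(totals), pstdev(totals)
    results = []
    for i in range(len(score_matrix[0])):
        item = [row[i] for row in score_matrix]
        mi, si = mean(item), pstdev(item)
        if si == 0 or st == 0:
            results.append(0.0)  # no variance: the item cannot discriminate
            continue
        cov = mean((x - mi) * (t - mt) for x, t in zip(item, totals))
        results.append(cov / (si * st))
    return results

scores = [[1, 1, 0], [1, 0, 0], [0, 1, 1], [1, 1, 1]]  # 4 respondents, 3 items
weak = [i for i, r in enumerate(item_total_correlations(scores)) if r < 0.2]
print("candidate items to drop:", weak)
```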

The fourth issue concerns the reliability of RACE. Reliability refers to consistency or repeatability. For example, a wooden ruler is highly reliable for measuring lengths, but an elastic ruler would not be reliable. We would like to show that the assessment results from RACE questionnaires filled out by the same persons (or similar sample populations) do not vary over short time periods (a few days to a few weeks). Obviously, this will involve repeated administrations of RACE and, because of time constraints, may be difficult to do. While we will pursue this avenue, we will also investigate the use of less powerful but less time-consuming reliability tests, such as scoring halves of the RACE and correlating the results (split-half reliability).
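The split-half procedure mentioned above can be sketched as follows: split the items into two halves, score each half per respondent, correlate the half scores, and step the correlation up with the Spearman-Brown correction to estimate full-length reliability. This is the textbook construction, not a procedure prescribed by the paper; the odd/even split is an arbitrary illustrative choice.

```python
# Sketch of split-half reliability with the Spearman-Brown correction.
# Odd/even item assignment is an illustrative choice of split.
from statistics import mean, pstdev

def pearson(xs, ys):
    mx, my = mean(xs), mean(ys)
    cov = mean((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (pstdev(xs) * pstdev(ys))

def split_half_reliability(score_matrix):
    """score_matrix[r][i] is respondent r's score on item i."""
    half_a = [sum(row[0::2]) for row in score_matrix]  # items 0, 2, 4, ...
    half_b = [sum(row[1::2]) for row in score_matrix]  # items 1, 3, 5, ...
    r_half = pearson(half_a, half_b)
    return 2 * r_half / (1 + r_half)  # Spearman-Brown step-up

scores = [[1, 1, 0, 1], [1, 0, 0, 0], [0, 1, 1, 1], [1, 1, 1, 1]]
print(f"estimated reliability: {split_half_reliability(scores):.2f}")
```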

Finally, we would like to establish the predictive validity of RACE. Predictive validity concerns the ability of RACE to make predictions about the bottom line of an organization. It is important to know the impact of being at any stage of CE maturity or readiness on the profitability of an organization. Predictive validity can be established only after a long-term study of product development within organizations.

5.0 Computer Support for RACE

Eventually, CERC would like to conduct CE readiness assessments via a computer-based tool, for a number of reasons. First, the assessment process can be made more efficient using a computer-based tool, as described below. Second, many people who are unwilling to fill out "another form" are willing to answer the same questions when the questions are presented via a computer terminal. Finally, research has shown that people are more honest when answering a computer-based questionnaire than a paper-based one [11].

Since this computer-supported tool will measure aspects of concurrent engineering readiness, we have dubbed it the "CE Meter". CERC has developed an executable mock-up of the CE Meter using Prolog on a personal computer. The program is menu driven, and the responses to the assessment questions are automatically synthesized and output as the readiness assessment diagram (Figure 2). The program also has a facility for report generation and for inputting supporting documentation.

The final version of the CE Meter will have two components, the RACE Tool and an Electronic CE Implementation Guidebook. The CE Meter will assist in the collection and organization of information for conducting a CE readiness assessment, in the synthesis of the assessment results, and in suggesting tailored CE implementation strategies. The benefits of a CE Meter are as follows:

1. The tool can be used as an aid for consulting with organizations concerning the implementation of CE. It will serve as a knowledge base of information about CE principles and implementation.
2. The CE Meter will reduce the assessment effort and will provide a quick turnaround of the assessment results, including metrics.
3. The CE Meter will facilitate documenting, archiving, and accessing information pertaining to the assessment. The tool will also assist in report generation.

5.1 Concept of operations

Two possible models for the deployment of the CE Meter are shown in Figure 3. In Model A, each user (a member of the group/organization being assessed) either is sent a copy of the CE Meter (most likely on diskette) or is interviewed by the assessment advisor using the CE Meter. In Model B, the CE Meter is a groupware tool that can be accessed over a network so that members of a team can directly input information into the tool during a team meeting. Model B can greatly reduce the time needed to gather RACE data and can directly organize and analyze the results for the assessment coordinator. In fact, Model B may be able to produce the output on the readiness assessment diagram directly.

The Readiness Assessment Tool can be further extended and combined with an electronic CE Implementation Guidebook, as shown in Figure 4. The Guidebook will be similar to the Program Manager's Workstation (PMW), a decision assistance tool for the acquisition process developed by the Computer Sciences Corporation for the Best Manufacturing Practices Program of the US Navy [12].


[Figure 3 – The CE Meter Concept: Model A, a single-user tool in which each member of the multifunctional assessment team uses a copy of the CE Meter (or is interviewed by the assessment advisor using it) and the assessment advisor combines the outputs; and Model B, a groupware CE Meter into which team members input information directly via a computer-based tool, producing the output for the assessment advisor.]

[Figure 4 – The CE Meter: the RACE Tool, a computerized questionnaire for process and technology assessment (five process stages and three technology stages), evaluates the responses and maps the current status (Stage X) and the goal (Stage Y) onto the assessment diagram; the CE Implementation Guidebook then supplies guidelines for moving from Stage X to Stage Y which, together with a needs assessment, vision, and corporate strategy, feed the implementation plan.]


Guidelines for CE implementation can be stored in an electronic form. These guidelines will describe how an organization can transition from one maturity or readiness level to another along any of the critical CE elements. Once the results of the assessment – the current status of an organization (thick shaded lines in Figure 2) – are available, the same diagram can be used to indicate the state the organization would like to be in, or foresees being in, in the near future (thick bold lines in Figure 2). The CE guidelines will be structured so that the steps to be followed to progress across the levels of interest (e.g., Stage 1 to Stage 3 or Basic to Intermediate), along with estimates of the associated resources, are clearly identified. Case histories and examples of prior assessments can also be provided.
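As an illustration only: one way such electronic guidelines could be organized is to key each guideline entry by critical element and by the stage transition it covers. Everything in the sketch below, including the keying scheme, the entry fields, and the placeholder steps and resource estimate, is our assumption rather than content from the Guidebook.

```python
# Hypothetical organization of the electronic CE implementation guidelines:
# entries keyed by (critical element, from_stage, to_stage). All content
# shown is placeholder text, not material from the actual Guidebook.

GUIDELINES = {
    ("Customer Focus", 1, 3): {
        "steps": [
            "document procedures for gathering customer requirements",
            "establish continuous customer participation on teams",
        ],
        "estimated_resources": "placeholder: e.g., 2 facilitators, 6 months",
    },
}

def transition_plan(element, current_stage, goal_stage):
    """Return the guideline entry for moving an element between stages."""
    try:
        return GUIDELINES[(element, current_stage, goal_stage)]
    except KeyError:
        raise LookupError(
            f"no guideline for {element}: stage {current_stage} -> {goal_stage}"
        ) from None

print(transition_plan("Customer Focus", 1, 3)["steps"])
```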

5.2 Requirements for the CE Meter

The preliminary set of requirements that have been determined for the CE Meter are as follows:

• Simple and intuitive - Using the tool should be as simple as completing a questionnaire. Help should be provided. Two types of help are envisaged: an explanation of the terminology, and presentation of examples and case studies.
• Customizable - The tool should be customizable so that the assessment can be tailored to an organization's needs by addressing their issues of concern in greater detail, using their terminology.
• Easily modifiable - The critical elements, key criteria, and questions for the assessment will evolve as the tool is improved; therefore, the tool should be easily modifiable. This is particularly true of the technology assessment area.
• Diagnostic as well as prescriptive capabilities - The tool should not only help determine CE readiness but should also assist in synthesizing remedial measures and a plan of action for CE implementation. The latter is achieved via the CE Guidebook.
• Linkage to results - The tool should provide advice concerning the definition and implementation of metrics and should provide an idea of the quantitative improvements, in terms of the metrics, that may result from implementing the prescriptive measures of the CE Guidebook.
• Documentation - The tool should have the capability to record supporting information elicited during the assessment interviews and the group sessions. The information will be associated with specific parts of the assessment, and this association needs to be maintained.
• Report generation - A written report is desirable at the end of an assessment session. This will include a summary of the information collected and the results of the assessment.
• Group presentation (for Model B) - It is envisaged that the tool will be used in a group setting. Hence, the information in the tool should be visible to a group, and the tool should assist in synthesizing group responses. One way of achieving this is to implement the tool on a portable computer whose screen can be projected onto a room wall.
• Automated RACE scoring (for Model B) - Since Model B would enable all data to be collected at a few meetings, this CE Meter version would store all the data needed for the assessment. Therefore, automated scoring capabilities based on group input should be included.

6.0 Summary

The development and widespread adoption of an objective method for assessing an organization's readiness to implement CE would increase the pace of CE adoption. The instrument proposed by CERC, the Readiness Assessment for Concurrent Engineering (RACE), provides the basis for such a method. RACE is based on a model of CE readiness derived from CERC's experience in the field; existing CE models, e.g., the CALS/CE model [1] and the Mentor Graphics model [2]; SEI's software engineering maturity model concept [3]; and the Malcolm Baldrige Award criteria [6]. At present, CERC is working to improve the validity, reliability, and ease of use of the RACE template and questionnaire. This is being done by gathering input from people knowledgeable about CE and by administering the initial RACE version at organizations that have agreed to serve as test sites. We expect to establish the content validity of the instrument in the near future. Establishing the predictive validity, however, will take at least three to four years.

Some preliminary applications of the RACE questionnaire have already yielded useful results about the ease of understanding the questions. The questionnaire, which solicits responses in four categories – "Yes", "No", "Don't Know", and "Does Not Apply" – is currently under scrutiny. It was found difficult to score, and many respondents have indicated a preference for rating scales. CERC plans to develop a computer-based version of RACE, called the CE Meter, once the validation stage for the paper-based tool is completed. A proof-of-concept CE Meter prototype has been developed, but at present it covers less of the domain than the questionnaire. The CE Meter will be a groupware tool with complete domain coverage, enabling data collection during one or a few meetings. The CE Meter will also automatically analyze the data to provide an assessment of the organization and, using an electronic CE Implementation Guidebook, offer prescriptions for how an organization can improve its CE practices and capabilities.

Acknowledgements

This work has been partially funded by the Defense Advanced Research Projects Agency (DARPA) under grant number MDA972-91-J-1022, "DARPA Initiative in Concurrent Engineering (DICE)". The authors are deeply indebted to Dr. Jack Byrd, Jr., of the Center for Entrepreneurial Studies and Development, Inc., at West Virginia University for his early help in identifying and defining the CE readiness categories.

References

[1] L. R. Linton et al., "First Principles of Concurrent Engineering: A Competitive Strategy for Electronic Product Development", CALS/CE Electronic Task Group, CALS Technical Report 005, 1991.
[2] D. E. Carter and B. S. Baker, Concurrent Engineering: The Product Development Environment for the 1990's, Vol. I, Mentor Graphics Corporation, 1991.
[3] M. C. Paulk et al., "Capability Maturity Model for Software", Technical Report CMU/SEI-91-TR-24, Software Engineering Institute, Carnegie Mellon University, August 1991.
[4] "Best Practices: How to Avoid Surprises in the World's Most Complicated Technical Process – The Transition from Development to Production", Reliability, Maintainability, and Quality Assurance Directorate, Department of the Navy, March 1986.
[5] D. White and J. Patton, "Time-to-Market Profitability", presented at ProjExpo '92, San Jose, CA, November 4-6, 1992.
[6] "1992 Award Criteria", Malcolm Baldrige National Quality Award, National Institute of Standards and Technology, Gaithersburg, MD.
[7] H. M. Karandikar, R. T. Wood, and J. Byrd, Jr., "Process and Technology Readiness Assessment for Implementing Concurrent Engineering", Proceedings of the 2nd Annual International Symposium of the National Council on Systems Engineering, Seattle, WA, July 21-23, 1992, pp. 117-124.
[8] K. J. Cleetus, "Definition of Concurrent Engineering", CERC Technical Report CERC-TR-RN-92-003, Concurrent Engineering Research Center, West Virginia University, Morgantown, WV, May 1992.
[9] "Process Issues in Implementing Concurrent Engineering", CERC Technical Report CERC-TR-RN-93-003, Concurrent Engineering Research Center, West Virginia University, Morgantown, WV, October 1992.
[10] W. S. Humphrey et al., "A Method for Assessing the Software Engineering Capability of Contractors (Preliminary Version)", Technical Report CMU/SEI-87-TR-23, Software Engineering Institute, Carnegie Mellon University, September 1987.
[11] L. Sproull and S. Kiesler, Connections: New Ways of Working in the Networked Organization, MIT Press, Cambridge, MA, 1991.
[12] B. Willoughby and E. Rosanville, "Program Managers Workstation System Description", Computer Sciences Corporation, Systems Group, Falls Church, VA, March 1992.

Appendix A – CE Critical Elements and Key Criteria

The key criteria for each CE critical element, which form a part of the RACE template, are presented in this appendix. A number of these are important aspects of CE, though they are not specific to CE.


Customer Focus
• Interaction with the customers.
• Methodology to capture and evaluate customer requirements.
• Extent of understanding of the customer requirements.
• Constant attention to customer satisfaction.
• Accommodation of new priorities.

Process Focus
• Methodology to deploy customer requirements into the organization.
• Documentation and standardization of processes and metrics.
• Understanding the value chain and linkages with the customer and supplier value chains.
• Identification and control of critical process events and parameters.
• Pursuit of process improvement.

Strategies for team formation and development
• Team formation: representation of relevant life-cycle perspectives.
• Team training: social and decision-making skills.
• Team management and functioning.
• Team performance measures.

Accommodation of teams within the organization
• Team responsibility, accountability, and authority.
• Absence of organizational and infrastructural barriers.

Management systems
• Resource allocation authority and rationale.
• Integrated technical, schedule, and cost management control systems.
• Risk management.

Mechanisms for rapid product assurance
• Rapid prototyping and smart testing.
• Adoption of appropriate application tools (e.g., simulation and modeling tools).
• Adoption of appropriate practices (e.g., robust design).
• Compliance with appropriate standards (e.g., for product data exchange).

Agility
• Ability to respond gracefully to change, e.g., change in the operating environment, change in performance, and change in requirements.
• Corporate memory.
• Reuse of assets.

Leadership commitment to support CE
• Leadership role model.
• Commitment to empowerment.
• Steering committee role.
• Resource allocation for CE implementation.

Discipline
• Willing and purposeful attention to tasks.
• Common methods, tools, processes, and measurements.
• Shared approaches to problems and decisions.
• Facing reality.

Table A.1 – Process Assessment Critical Elements and Key Criteria


Application Tools
• Extent of support for group interaction.
• Support for the notion of multiple disciplines.
• Decision assistance provided by the tools.

Communication Services
• Medium of communication.
• Mechanism of communication.
• Network type and coverage.
• Transparency of network.
• Access to computing resources.

Coordination Services
• Workflow planning: tools used, their effectiveness and flexibility.
• Tracking and monitoring activities: tools used, type and quality of information output.
• Support for conflict recognition and resolution.
• Common visibility of data, decisions, and work status.

Information Sharing Services
• Medium of information storage.
• Information content (product, process, organization data).
• Types of data (text, voice, video, or a combination).
• Accessibility to distributed information stores.

Integration Services
• Implementation of data representation and exchange standards.
• Data translation techniques.
• Openness of the tools.
• User interface consistency.

Table A.2 – Technology Assessment Critical Elements and Key Criteria
