Carnegie Mellon Software Engineering Institute
National Research Council Canada Conseil national de recherches Canada
Pittsburgh, PA 15213-3890
From System Requirements to COTS Evaluation Criteria
Grace A. Lewis
Edwin J. Morris
Presented by: Tricia Oberndorf
© 2004 by Carnegie Mellon University/ National Research Council Canada
Agenda
Motivation
System Requirements to Evaluation Criteria—Simple Transformation Process
• Define Evaluation Requirements
• Define Criteria
• Prioritize Criteria
Summary
Motivation
Problems we have seen with requirements in general:
• excessive or insufficient
• contradictory
• represent design rather than requirement
• unduly preclude use of COTS products
We developed a set of recommendations and techniques to help organizations transform a set of requirements into a set of COTS product evaluation criteria.
System Requirements to Evaluation Criteria
Why a Plain Transformation Doesn't Work
System requirements tend to be too abstract to evaluate products.
System requirements are stated in terms of needs rather than capabilities to satisfy those needs.
System requirements are not obviously quantifiable.
System requirements do not state every expectation placed on COTS products—qualities other than functionality are often overlooked.
A Simple Transformation Process
1. Determine evaluation requirements
2. Define criteria from these requirements
3. Prioritize criteria
Step 1: Determine Evaluation Requirements
[Diagram: functional requirements, the operational environment, architecture/interface constraints, and programmatic constraints (drawn from users, integrators, ...) feed into the evaluation requirements.]
Additional Sources of Evaluation Requirements
Several sources can provide evaluation requirements that are not system-specific:
• product feature checklists
• organizational checklists
• previous evaluations for other systems
• marketplace
Product Feature Checklists
Feature checklists are standard fare for product comparisons; they are
• a kind of specification for a class of products
• often reflective of only common, not unique, features
• as reliable as the source
[Example: a checklist comparing an Internet Server product against Netscape Enterprise on features such as: can require password, supports SSL v2, supports SSL v3, supports S-HTTP, supports PCT, prohibit by domain name, prohibit by IP address.]
Relevant for your context? Why?
Organizational Checklists
Organizations need (want) consistency in IT:
• represent corporate interests
• avoid incomplete criteria
Organizational checklists provide some uniformity/predictability:
• in coverage of corporate needs
• in the overall evaluation process
Categories of Criteria (Level 1):
• Technical Requirements
• Compatibility w/ other COTS
• Adaptability, Flexibility
• Reliability, Maintainability
• System Integration
• System Integrity
• Security
• Vendor Support
• Training
• Documentation
• License Restrictions
Marketplace—Risk-Driven Generation
Feature-driven rather than requirements-driven
• Products are reviewed to compile a list of features.
Focuses on risk rather than fitness
• Features are analyzed to determine the risk associated with the presence or absence of that feature in a product.
• The risk statement is the explicit mapping of a product feature to a system need.
Example: Risk-Driven Generation
Product features are the source of evaluation requirements.
• Feature: Automatic load balancing of servers based on user connects
  Risk: Response time worse than required by users
• Feature: Vendor is foreign-owned
  Risk: None ("buy America" not required)
Classes of Evaluation Requirements
Negotiable Requirements: flexible needs imply 'negotiable' requirements.
• Adjust/augment the requirement to match.
• Adjust/augment product capabilities to fit.
Hard Requirements: fixed needs imply 'hard' requirements.
• Adjust/augment product capabilities to fit.
Example of Classifying Evaluation Requirements
Requirements sorted into the negotiable and hard categories included:
• Amount of productivity gain
• Tracking of overhead
• Flexibility in reporting
• Degree of automation
• Support tools
• Paper-less environment
• Decision support capabilities
Example abstracted from the SEI Financial System evaluation and selection process.
Getting It Right
Errors of inclusion = including non-applicable requirements
• can eliminate suitable COTS products
Errors of exclusion = excluding applicable requirements
• can cause the selection of an unsuitable COTS product
Step 2: Define Criteria
A criterion consists of two elements:
• a capability statement - a clearly measurable statement of capability to satisfy a need
• a quantification method - a means for assessing and assigning a value to the product's level of compliance with the capability
Characteristics of Good Criteria
Well-defined criteria are
• assessable - This follows directly from our definition.
• discriminating - They allow us to distinguish between products.
• non-overlapping - This prevents measurement repetition.
• significant - They are contextually useful.
Example of a Good Criterion
Requirement
• "COTS vendors shall provide extensive product support."
Capability statement
• "COTS vendor support shall include
  - 24x7 help desk
  - On-site installation/training support
  - On-line error reporting
  - …"
Quantification method
• "Provide a product support survey to potential COTS vendors. Verify claims by contacting current product users."
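To make the capability-statement/quantification-method pairing concrete, here is a minimal Python sketch, not taken from the slides: the Criterion class, the REQUIRED_SUPPORT set, and the scoring rule are all hypothetical, loosely modeled on the vendor-support example above.

```python
from dataclasses import dataclass
from typing import Callable, Set

# Hypothetical list of support items the capability statement asks for.
REQUIRED_SUPPORT = {"24x7 help desk", "on-site installation/training", "on-line error reporting"}

@dataclass
class Criterion:
    """A COTS evaluation criterion: what to check, plus how to score compliance."""
    capability: str                 # clearly measurable statement of capability
    quantify: Callable[..., float]  # assigns a value to the product's level of compliance

def score_vendor_support(offered: Set[str]) -> float:
    """Fraction of the required support items a vendor's survey response covers."""
    return len(REQUIRED_SUPPORT & offered) / len(REQUIRED_SUPPORT)

vendor_support = Criterion(
    capability="COTS vendor support includes a 24x7 help desk, on-site "
               "installation/training support, and on-line error reporting",
    quantify=score_vendor_support,
)

# Score one vendor's (made-up) survey response; claims would still be verified with users.
print(vendor_support.quantify({"24x7 help desk", "on-line error reporting"}))  # ~0.67
```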
Tips for Generating Good Criteria
Multiple approaches are available. Two popular ones:
• Criteria decomposition
• Goal Question Metric (GQM)
Criteria Decomposition
Criteria may be difficult to measure in their original form.
• Complex criteria are often decomposed into subordinate (and easier to measure) criteria.
Example: "Internet security" decomposes into sub-criteria such as protect server from intrusion, assure privacy of transaction, and protect system from browser. For the transaction-privacy sub-criterion:
• Capability statement: "Shall support SSL v2/v3 and TLS internet protocols for privacy, authentication, and integrity."
• Quantification method: "Support shall be verified by testbed execution of SSL v2/v3 and TLS v1 capabilities."
Goal Question Metric
GQM provides
• hierarchical decomposition of requirements into capability statements
• a quantification method
In GQM terms:
• The goal is to fulfill a system requirement.
• The question is the capability statement.
• The metric is the quantification method, either by direct measurement or decomposition.
Example: Goal Question Metric
Goal (Requirement)
• minimum impact of learning curve on end-user performance
Questions (Capability Statements)
• Are the text and graphics visible?
• Is there an intuitive interface?
• Are the training materials effective?
Metrics (Measurement Standards)
• Readability of data by a user with aging eyesight
• Reaction of the user to a sample set of common tasks
• Retention of information via a formal exam
Quantification
Quantification is a hard problem:
• Unfamiliarity with products
• Difficulty extracting data in often black-box situations
• Complexity of measuring characteristics, e.g., nonfunctional attributes
However, the goal is to have fair, unbiased results.
Problems with Quantification
Even with the best criteria, two problems can occur:
• Measurement bias or error affects the validity of recommendations.
  - unfair influence from factors outside the product (e.g., differing perceptions, inconsistent measurement)
  - addressed by reducing bias or error
• Differing scales affect our ability to reason.
  - relative scores that do not indicate relative value (e.g., a car twice as fast is not always twice as good)
  - meaningless mathematics ("fair" + 1024 transactions/second = ?)
  - addressed via normalization
Reducing Bias or Error
Qualitative measurements (e.g., poor, fair, good):
• Different perceptions of subjective rating scales lead to inconsistent ratings by multiple evaluators.
• The problem is addressed via strong guidelines for assigning ratings.
Quantitative measurements:
• Measurements can be systematically or randomly biased or invalidated.
• The problem is addressed via consistent test environments, sufficiently accurate test harnesses, and multiple test executions.
Normalization
Normalization converts all data to a consistent scale. It can be accomplished at two points in the process:
• when the criteria are defined: all criteria are phrased in the same way (e.g., degree of "goodness" or "fit")
• after criteria are evaluated: all scores are converted into the same scale
You can use scoring ranges and fitness functions to map to a normalized scale.
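As a hedged illustration of the second option (normalizing after evaluation), the sketch below assumes a five-point qualitative scale and a simple linear fitness function; neither choice is prescribed by the slides.

```python
# Map qualitative ratings and raw quantitative measurements onto a common 0-1 scale.
QUALITATIVE_SCALE = {"poor": 0.0, "fair": 0.25, "good": 0.5, "very good": 0.75, "excellent": 1.0}

def normalize_qualitative(rating: str) -> float:
    """Convert a subjective rating to the common scale."""
    return QUALITATIVE_SCALE[rating.lower()]

def normalize_quantitative(value: float, worst: float, best: float) -> float:
    """Linear fitness function, clamped so values beyond 'best' earn no extra credit."""
    score = (value - worst) / (best - worst)
    return max(0.0, min(1.0, score))

# "fair" + 1024 transactions/second is meaningless mathematics, but on a common
# scale the two measurements can be compared and combined.
usability = normalize_qualitative("fair")                        # 0.25
throughput = normalize_quantitative(1024, worst=200, best=1200)  # ~0.82
```

Once every score lives on the same scale, the priorities from Step 3 can be applied without mixing units.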
Step 3: Prioritize Criteria
Priorities allow us to reason about products that have different strengths and weaknesses. Priorities are composites of different factors:
• How relevant is a criterion?
• How expensive is it to fulfill a criterion deficit?
• What are the risks of not meeting a particular criterion?
Some techniques:
• Unstructured weighting
• Delphi
• AHP pair-wise comparison
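As a sketch of how priorities might be combined with normalized criterion scores, the example below uses a plain weighted sum; the criteria, weights, and scores are invented, and a weighted sum is only one of several possible aggregation schemes.

```python
# Prioritization weights (summing to 1) and normalized criterion scores, all hypothetical.
weights = {"security": 0.40, "vendor support": 0.25, "throughput": 0.35}

product_scores = {
    "Product A": {"security": 0.9, "vendor support": 0.5, "throughput": 0.70},
    "Product B": {"security": 0.6, "vendor support": 0.9, "throughput": 0.85},
}

def weighted_total(scores: dict) -> float:
    """Weighted sum of a product's normalized scores."""
    return sum(weights[criterion] * scores[criterion] for criterion in weights)

for name, scores in product_scores.items():
    print(f"{name}: {weighted_total(scores):.2f}")
# Products with different strengths and weaknesses become comparable on one scale.
```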
Delphi Technique
First round: several people individually provide their weights.
Subsequent rounds:
• Everyone sees the weights (perhaps raw, perhaps summaries) resulting from the previous round.
• All have the opportunity to change their own weights (i.e., to concur or argue for a different value).
Should there be several rounds with no consensus, the group position is determined by averaging.
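A small sketch of how the rounds could be mechanized, assuming numeric weights and treating consensus as a small standard deviation across evaluators; the slides prescribe neither the representation nor the stopping rule.

```python
from statistics import mean, stdev

def delphi_round(weights, tolerance=0.05):
    """Return the consensus weight if evaluators agree closely enough, else None."""
    if stdev(weights) <= tolerance:
        return mean(weights)
    return None  # no consensus: share the results and let evaluators revise

# Three evaluators weighting one criterion over successive rounds (made-up values).
rounds = [
    [0.30, 0.50, 0.10],  # wide spread: everyone sees the results and may revise
    [0.25, 0.45, 0.30],  # converging, but still outside the tolerance
    [0.32, 0.35, 0.33],  # close enough: consensus reached
]
for weights in rounds:
    print(delphi_round(weights))

# If several rounds still produce no consensus, fall back to averaging, as the slide notes.
```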
AHP Pairwise Comparison
The team estimates relative importance:
• Color is 1/2 as important as Speed
• Color is 2 times as important as Security
• Color is 1/3 as important as Warranty
• Speed … Security
• Speed … Warranty
• Security … Warranty
There are six ways to pick pairs from four items.
From the pairwise comparisons, AHP computes weights for the four criteria:
• Color .12, Speed .25, Security .06, Warranty .57
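For illustration, here is a sketch of one standard way to derive AHP weights: the geometric-mean approximation to the principal eigenvector of the pairwise comparison matrix. The matrix uses the three Color ratios shown above and fills the elided Speed/Security/Warranty comparisons with made-up values, so its output will not reproduce the weights on the slide.

```python
import math

criteria = ["Color", "Speed", "Security", "Warranty"]

# a[i][j] = importance of criterion i relative to criterion j (a[j][i] = 1 / a[i][j]).
# Row "Color" follows the slide (1/2 vs. Speed, 2 vs. Security, 1/3 vs. Warranty);
# the remaining comparisons were elided on the slide and are invented here.
a = [
    [1,   1/2, 2,   1/3],
    [2,   1,   4,   1/2],
    [1/2, 1/4, 1,   1/5],
    [3,   2,   5,   1  ],
]

# Approximate AHP weights: geometric mean of each row, normalized to sum to 1.
geo_means = [math.prod(row) ** (1 / len(row)) for row in a]
total = sum(geo_means)
weights = {name: g / total for name, g in zip(criteria, geo_means)}

for name, w in weights.items():
    print(f"{name}: {w:.2f}")
```

With four criteria there are C(4,2) = 6 pairwise comparisons, which matches the six bullets above.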
Summary
Results of the SEI Financial System Experience
Reduced system requirements by two thirds (63 to 20), but it took longer to get there. However…
• Exploring COTS products became easier.
• Evaluation of products became more manageable.
• There was greater flexibility in accommodating alternate products and architectures.
Summary
System requirements are not COTS product evaluation criteria — they are just a starting point. You need to
• Define evaluation requirements
• Define criteria
• Prioritize criteria
This simple process needs to be instantiated with techniques you favor that make sense for the type of system you are building.