University of Southern California
Center for Systems and Software Engineering

Value-Based Software Test Prioritization
Annual Research Review, CSSE-USC
Qi Li, Barry Boehm {qli1, boehm}@usc.edu
March 7, 2012
Outline
• Research Motivation
• Research Method
• Case Studies
• Tool Support
• Conclusion
Research Motivation
• Value-neutral SE methods are increasingly risky [Boehm, 2003]
  – Every requirement, use case, object, test case, and defect is treated as equally important
  – "Earned Value" systems don't track business value
  – System value-domain problems are the chief sources of software project failures
• Testing and inspection resources are expensive and scarce
  – 30%–50% of project effort, even higher for high-reliability projects [Ramler, 2005]
  – Time-to-market pressure [Boehm, Huang, 2005]
• Empirical findings [Bullock 2000; Boehm & Basili 2001]
  – About 20 percent of the features provide 80 percent of the business value
  – About 80 percent of the defects come from 20 percent of the modules
  – …
• Value-based Software Engineering 4+1 theorem [Boehm, 2005]
Value-Based Software Test Prioritization
• What is to be prioritized?
  – Testing items: testing scenarios, testing features, test cases
• How to prioritize?
  – Value-based (business importance, risk, cost)
  – Dependency-aware
• How to measure?
  – Average Percentage of Business Importance Earned (APBIE)
Research Method: Value-Based
• Risk Exposure (RE)
  RE = Prob(Loss) × Size(Loss)
  – where Size(Loss) is the size of loss if the outcome is unsatisfactory, and Prob(Loss) is the probability of an unsatisfactory outcome
• Risk Reduction Leverage (RRL)
  RRL = (RE_before − RE_after) / Cost_of_risk_reduction
  – where RE_before is the RE before initiating the risk reduction effort and RE_after is the RE afterwards
  – RRL is a measure of the cost-benefit ratio of performing a candidate risk reduction or defect removal activity
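To make these two definitions concrete, here is a minimal Python sketch (illustrative only, not from the slides; the probabilities, loss size, and cost are invented) that computes RE and RRL for one candidate testing activity.

# Minimal sketch of the RE and RRL formulas above (illustrative values).

def risk_exposure(prob_loss: float, size_loss: float) -> float:
    """RE = Prob(Loss) * Size(Loss)."""
    return prob_loss * size_loss

def risk_reduction_leverage(re_before: float, re_after: float, cost: float) -> float:
    """RRL = (RE_before - RE_after) / cost of the risk reduction effort."""
    return (re_before - re_after) / cost

# Example: testing drops an item's failure probability from 0.4 to 0.1;
# an unsatisfactory outcome would lose 50 units of business value, and
# the test itself costs 3 units of effort.
re_before = risk_exposure(0.4, 50)                           # 20.0
re_after = risk_exposure(0.1, 50)                            # 5.0
print(risk_reduction_leverage(re_before, re_after, cost=3))  # 5.0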
Research Method: Value-Based
• Value-based prioritization drivers:
  – Size of Loss ← business case analysis, business value, stakeholder prioritization, impact of defect, defect criticality
  – Probability of Loss ← experience base, defect-prone components and performers
  – Together, Size of Loss and Probability of Loss determine a testing item's Risk Exposure
• Testing items are to be ranked by how well they can reduce RE
Research Method: Value-Based
• Combining each testing item's risk reduction with its relative cost gives the priority trigger:
  Priority(item) = RRL(item) = (RE_before − RE_after) / Cost(item)
• This proposed strategy enables testing items to be prioritized in terms of Risk Reduction Leverage (RRL), i.e., testing ROI
• It is expected to improve the lifecycle cost-effectiveness of defect removal techniques
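Here is a sketch of how the trigger could rank items, assuming (as the slide states) that priority is simply RRL computed against each item's relative cost; the TestItem fields, names, and values are hypothetical.

# Sketch of the priority trigger: rank testing items by RRL = delta-RE / cost.
from dataclasses import dataclass

@dataclass
class TestItem:
    name: str
    re_before: float  # risk exposure if the item is left untested
    re_after: float   # residual risk exposure after testing it
    cost: float       # relative cost of executing the item

    @property
    def rrl(self) -> float:
        return (self.re_before - self.re_after) / self.cost

items = [
    TestItem("login", re_before=20.0, re_after=5.0, cost=3.0),
    TestItem("report", re_before=8.0, re_after=2.0, cost=1.0),
    TestItem("search", re_before=12.0, re_after=6.0, cost=4.0),
]

# Higher RRL first: the most risk reduction per unit cost is tested earliest.
for item in sorted(items, key=lambda t: t.rrl, reverse=True):
    print(f"{item.name}: RRL={item.rrl:.2f}")
# report: RRL=6.00, login: RRL=5.00, search: RRL=1.50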
Research Method: Dependency Aware
[Figure: value of software product to organization vs. cost of software product [Boehm, 1981] — operating system, basic application functions, and data management system fall in the investment region; main application functions are high-payoff; secondary and tertiary application functions and user amenities such as animated displays and natural speech input yield diminishing returns.]
Research Method: Dependency Aware
• Dependency:
  – Example: dependencies among the test cases to be executed
  – Solution: a greedy prioritization algorithm (sketched below)
    • Select the not-yet-executed item with the highest RRL
    • Check its dependencies; execute any prerequisites first
• Selection trace from the slide's example dependency graph: 9 → 3 → 9 → 5 → 9 → 4 → 7
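A minimal sketch of the greedy loop, assuming each testing item lists prerequisite items that must run first; the RRL values and dependency graph here are invented, not the slide's example.

# Greedy, dependency-aware prioritization (illustrative sketch): repeatedly
# pick the not-yet-scheduled item with the highest RRL, pulling in any
# unscheduled prerequisites first.

def prioritize(rrl: dict[str, float], deps: dict[str, list[str]]) -> list[str]:
    order: list[str] = []
    scheduled: set[str] = set()

    def schedule(item: str) -> None:
        for dep in deps.get(item, []):  # prerequisites run first
            if dep not in scheduled:
                schedule(dep)
        if item not in scheduled:
            scheduled.add(item)
            order.append(item)

    while len(scheduled) < len(rrl):
        # Greedy step: highest-RRL item among those not yet scheduled.
        best = max((i for i in rrl if i not in scheduled), key=rrl.get)
        schedule(best)
    return order

rrl = {"A": 5.0, "B": 2.0, "C": 8.0, "D": 1.0}
deps = {"C": ["A"], "A": ["D"]}  # C needs A, A needs D
print(prioritize(rrl, deps))     # ['D', 'A', 'C', 'B']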
Research Method: Metrics
• Testing cost-effectiveness: Average Percentage of Business Importance Earned (APBIE)
  – PBIE after the i-th test = (business importance earned by the tests executed so far) / (total business importance)
  – APBIE = the mean of PBIE over all executed tests (see the sketch below)
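A sketch of the metric, assuming the definition given above (PBIE_i is the fraction of total business importance earned after the i-th test; APBIE averages PBIE over the executed tests); the business-importance values are invented.

# APBIE sketch: tests are given in their planned execution order, each with
# the business importance it earns when it passes.

def apbie(business_importance_in_run_order: list[float]) -> float:
    total = sum(business_importance_in_run_order)
    earned = 0.0
    pbie_values = []
    for bi in business_importance_in_run_order:
        earned += bi
        pbie_values.append(earned / total)   # PBIE after this test
    return sum(pbie_values) / len(pbie_values)

# Running the high-importance tests first yields a higher APBIE.
print(apbie([5, 3, 1, 1]))  # value-based order   -> 0.80
print(apbie([1, 1, 3, 5]))  # value-inverse order -> 0.45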
Case Studies Results
• Exercised test prioritization based on Risk Reduction Leverage (RRL) at three granularities:
  – Software testing scenarios to be walked through at Galorath Inc.
  – Software features to be tested in a Chinese company
  – Software test cases to be executed in USC software engineering course projects
• All of them show preliminary positive results
Case Studies Results (Galorath Inc.)
• Prioritized the testing scenarios to be walked through (Galorath Inc., Summer 2011)
[Chart: PBIE (%) vs. the point at which testing stops (8–30 scenarios executed) under three orderings — PBIE-1: value-based, PBIE-2: value-neutral, PBIE-3: value-inverse (worst case). The value-based curve climbs fastest toward 100%; on average, APBIE-1 = 70.99% (value-based), versus 32.10% (value-neutral) and 10.08% (value-inverse).]
• Value-based prioritization can improve the cost-effectiveness of testing
Automated Tool Support (Beta Version)
Tool Demo
Website: http://greenbay.usc.edu/dacs/vbt/testlink/index.php
Future Features
• Establish a traceability matrix between requirement specifications and test cases to automatically obtain test cases' business importance ratings
• Establish a traceability matrix between test cases and defects to automatically predict failure probability from collected historical defect data, incorporating state-of-the-art defect prediction techniques
• Experiment with sensitivity analysis for reasoning about and judging the correctness of the factors' ratings

If you would like to be a beta-version tester, please contact me at [email protected]
Conclusion
• Proposed a real "earned value" system to track the business value of testing and measure testing efficiency in terms of APBIE
• Proposed a systematic strategy for value-based, dependency-aware test processes
• Applied this strategy in a series of empirical studies at different granularities of prioritization
• Elaborated decision criteria for testing priorities per project context, which are helpful for real industry practices
• Implemented an automated tool for application to large-scale industrial projects
Question and Answer