Evolving Process Simulators by Using Validated Learning

Jürgen Münch
Department of Computer Science, University of Helsinki, Helsinki, Finland
Juergen.Muench(at)cs.helsinki.fi

Abstract—Software process simulation has evolved into a mature technology for analyzing the behavior of software processes: methods for systematically developing simulators exist, tools for modeling and execution are available, models for nearly all kinds of processes have been created, and empirical evidence on the accuracy of many models has been gathered. However, software process simulation is still waiting for a breakthrough success. Although simulation is a technology that has been successfully established in many domains, software process simulation is not widely used in software engineering practice. Should we pivot or persevere? This article argues that it is necessary to use a rigorous approach for discovering "customers" of process simulators and finding out what they consider to be valuable. One mechanism for doing this is to apply so-called "validated learning", i.e., an actionable learning approach that identifies what is relevant and what is irrelevant by systematically testing value hypotheses. Doing this promises that simulation efforts can be concentrated on value creation and that wrong avenues can be avoided. In addition, the article sketches prerequisites and lessons learned that need to be considered when applying simulators in practice, as well as upcoming opportunities for making software process simulation a success.

Keywords—software engineering; software process; software process simulation; validated learning; lean startup; customer development

I. INTRODUCTION

Software process modeling is a mechanism that has evolved over many years and has significantly contributed to software engineering research. A multitude of simulation models has been developed, and many simulators have been evaluated in different development environments. Different modeling approaches exist, systematic methods for developing software process simulators have been proposed (such as [2]), and comprehensive collections of software process simulators are available (such as [3]). Software process simulation has been used for a variety of purposes, especially planning, understanding, and improvement. In spite of these indicators of the maturation of software process simulation, it is still not widely used in practice. We argue that there are two main reasons: a) the value of software process simulation is not sufficiently understood and communicated, and b) industrial practice is often not ready to apply process simulation in a beneficial way.

II. VALIDATED LEARNING

The acceptance of software process simulation models in organizations, and the willingness to create or use them, requires that practitioners or decision makers see sufficient value. Different value assumptions have been proposed, for instance that software process simulation supports the improvement of operational efficiency, the reduction of risks, or decision making. However, such value assumptions need to be tested in concrete application scenarios. It is not sufficiently clear who is willing to invest in software process simulation and why. Using a systematic approach to better understand the value of simulators and to experiment with that value can be seen as an interesting research field in itself. One approach that aims at better understanding the value of software process simulation is so-called "validated learning". Validated learning has been described in the Customer Development method and the Lean Startup approach [4]. The underlying idea of validated learning is to conduct experiments with potential "customers" to test value hypotheses. The value hypothesis “tests whether a product or service really delivers value to customers once they are using it” [4]. When identifying the value of simulators, especially the following questions might be relevant: What object should be simulated (e.g., a test process, a resource planning scenario, a Kanban process)? What are the relevant dependent variables of interest (e.g., efficiency of the test process, resource consumption, balance between demand and capability)? Who is interested in the results of the simulation model (e.g., the test engineer, the project manager, the product owner)? For what purpose should the simulation model be applied (such as improvement, planning, project management)? In which contexts should the model be applied (e.g., the quality assurance department of a specific business unit, a specific company, a certain project type)?
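One lightweight way to probe these questions before investing in a full simulator is a throwaway prototype. The following is a minimal sketch, not taken from this article: a toy discrete-time Monte-Carlo simulation of a Kanban process in Python, in which the simulated object is a board with a work-in-progress (WIP) limit and the dependent variables are throughput and lead time. All names and parameter values (`simulate_kanban`, `wip_limit`, `completion_prob`, and so on) are illustrative assumptions.

```python
import random
from statistics import mean

def simulate_kanban(num_days, wip_limit, demand_per_day, completion_prob, seed=42):
    """Toy discrete-time simulation of a Kanban-like process.

    Each day, new work items arrive in the backlog, items are pulled onto
    the board only while the WIP limit permits, and every item on the
    board finishes that day with probability `completion_prob`.
    """
    rng = random.Random(seed)
    backlog = []      # arrival day of each item waiting to enter the board
    in_progress = []  # arrival day of each item currently on the board
    lead_times = []   # (finish day - arrival day + 1) for completed items

    for day in range(num_days):
        backlog.extend([day] * demand_per_day)        # demand arrives
        while backlog and len(in_progress) < wip_limit:
            in_progress.append(backlog.pop(0))        # pull within WIP limit
        still_open = []
        for arrival_day in in_progress:
            if rng.random() < completion_prob:        # item finishes today
                lead_times.append(day - arrival_day + 1)
            else:
                still_open.append(arrival_day)
        in_progress = still_open

    return {
        "throughput_per_day": len(lead_times) / num_days,
        "mean_lead_time_days": mean(lead_times) if lead_times else None,
    }

# Hypothetical experiment: show a stakeholder how two WIP limits trade off
# throughput against lead time under the same demand scenario.
for wip in (3, 10):
    print(wip, simulate_kanban(num_days=200, wip_limit=wip,
                               demand_per_day=2, completion_prob=0.5))
```

Presenting such hypothetical outputs for a few parameter settings to a potential customer is one way to test whether anyone actually cares about these dependent variables before a calibrated simulator is built.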
These aspects can be seen as variables when experimenting with potential value hypotheses of simulation models. In order to conduct such experiments, a so-called minimum viable product (MVP) needs to be created, i.e., a simulator that allows for testing value hypotheses. Ries [4] describes an example of using a minimum viable product: “Instead of […] perfecting our technology, we build a minimum viable product, an early product that is terrible, full of bugs and […] stability problems. Then we ship it to customers way before it’s ready. […] After securing initial customers, we change the product constantly – much too fast by traditional standards – shipping new versions of our products dozens of times […].” The key here is to get information about the value of products by running experiments. A minimum viable product could consist, for instance, of hypothetical results from simulation runs. Such a minimum viable product would not require the development of a simulator in order to find out whether a potential customer sees value in such results. If it turns out after several variations that customers do not see the value of the simulator, it might be time to pivot, i.e., to change course.

Often, it is also important to be able to demonstrate the value of simulators with respect to higher-level goals of an organization, up to business goals. One reason is that the decision makers who decide on investing in buying or building simulators have a higher-level view and need to see the value in the context of their goals. One approach to aligning software process simulation with higher-level goals of an organization in a seamless and quantifiable way is to define simulation goals and use GQM+Strategies [1] for goal alignment.

III. READINESS FOR SIMULATION

Besides testing value hypotheses, the successful application of simulators requires that software organizations are ready to use simulation models. Based on the experience of the author, there is often a gap between the intended, the real, and the perceived software process in organizations. Having a descriptive process model that reflects the real practices of an organization is in most situations an essential prerequisite for creating process simulation models. However, organizations often do not have such a model and instead rely on a prescriptive model that describes, often in a highly generic way, how processes should be enacted. Getting descriptive models that reflect the real company practices requires that organizations have an organized approach to managing software-related processes. This does not mean that all activities need to be defined in full detail by prescriptive processes.
Instead, it is necessary that an organization has a process structure that distinguishes between creative and non-creative activities, does not allow for arbitrary process deviations, and allows for managing software-related processes based on pre- and postconditions. Getting control over the definition, adherence, and evolution of software processes can be seen as an important prerequisite for the successful application of process simulation models. A set of established methods and techniques for defining and managing processes can be found in Münch et al. [2].

Another prerequisite is that companies are aware of their information needs. Often it is not clear what kind of information is needed in order to make decisions or to assess the fulfillment of goals or strategies on the different levels of an organization. One example is that companies often struggle with making accurate commitments to customers, i.e., they need to improve their capability to make more accurate customer commitments. Business objectives such as customer satisfaction or customer loyalty are typically related to this goal. Refining this goal often leads to improving the capability of assessing the progress of a program or project on different organizational levels, and in doing so, other improvement opportunities are typically identified (such as better planning or better dependency management). Although such visibility seems to be highly important, many companies do not know their information needs on the different levels of the organization sufficiently well. If simulation models are to contribute to solving this problem, an analysis of the information needs is necessary. Otherwise, the dependent variables of a simulation model cannot be defined in a value-creating way. Both having accurate process descriptions that reflect the real world and being aware of the information needs are prerequisites for successfully using software process simulation modeling.

IV. PROSPECTS

Finally, there are some prospects that might help to make better use of software process simulation in the future. One example that could support a more widespread adoption of software process simulation is the increasing use of highly structured practices such as Scrum, Scrum of Scrums, and Kanban for co-located and global development. This implies that processes become more uniform across organizations and that detailed activities become more structured (e.g., by defining detailed process steps, by using product backlogs, by limiting work in progress).
As a consequence, simulation models could cover widely used practices, and the customization and calibration of these models could be simplified or even standardized by better understanding the relevant factors.

REFERENCES


[1] V. Basili, J. Heidrich, M. Lindvall, J. Münch, M. Regardie, D. Rombach, C. Seaman, and A. Trendowicz, "Linking Software Development and Business Strategy Through Measurement", IEEE Computer, vol. 43, no. 4, pp. 57-65, 2010.
[2] J. Münch, O. Armbrust, M. Kowalczyk, and M. Soto, Software Process Definition and Management, Springer Berlin Heidelberg, 2012.
[3] R.J. Madachy, Software Process Dynamics, Wiley-IEEE Press, 2008.
[4] E. Ries, The Lean Startup: How Today's Entrepreneurs Use Continuous Innovation to Create Radically Successful Businesses, Crown Publishing Group, 2011.
