A Storytest-Driven Approach to the Migration of Legacy Systems

Fabio Abbattista, Alessandro Bianchi, and Filippo Lanubile
Dipartimento di Informatica, University of Bari
Via Orabona, 4 – 70126 Bari, Italy
{fabio,bianchi,lanubile}@di.uniba.it
Abstract. In this paper we propose an agile approach for the migration of legacy software which combines a user story-based iterative process with automated acceptance testing. The proposed approach, named Storytest-Driven Migration (STDM), requires that acceptance tests are written on both the legacy and target versions of a software system. Because of their relevance, the quality of automated acceptance tests is assured through software inspections. As a proof of concept, we conducted a first migration project of a web application towards both a web application framework and a mobile platform.

Keywords: migration, storytest-driven development, acceptance testing.
1 Introduction

Legacy applications draw on outdated technologies but continue to be used because they serve critical business needs; to allow their use in up-to-date environments, migration processes are often executed [5]. When embarking on migration projects, organizations must be sure that the outputs of the target system will be completely consistent with those of the legacy system. Testing should therefore be a continuous activity during a migration project: up to 80 percent of a migration project's time can be spent testing the target system [3].

Storytest-driven development (STDD) [12], also called executable acceptance test-driven development [9], is an extension of test-driven development (TDD) [2]. While TDD focuses on the unit testing level, STDD starts from the functional acceptance test level. With STDD, developers write user stories first and then, in collaboration with customers, define story tests in a tabular format [11], which can be used as a living specification; for each user story they document what inputs are supplied to the system and what outputs are expected. Once the story test tables have been defined, developers write test fixtures, i.e., the code used to take the inputs from the story test tables, exercise the system under test (SUT), and compare the observed and expected results. Finally, developers write the system code that successfully passes the story tests.
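To illustrate how test fixtures connect story test tables to the system under test, the following is a minimal sketch of a FIT column fixture in Java. It assumes FIT's fit.ColumnFixture base class; the table layout, field names, and the MailFacade stub are hypothetical examples rather than artifacts of the project described later in this paper.

```java
import fit.ColumnFixture;

// Hypothetical fixture backing a story test table such as:
//
//   | LoginFixture                            |
//   | username | password | loginAccepted() |
//   | alice    | secret   | true            |
//   | alice    | wrong    | false           |
//
// FIT fills the public fields from the input columns of each row, then calls
// loginAccepted() and compares the returned value with the expected cell.
public class LoginFixture extends ColumnFixture {

    public String username;   // input column
    public String password;   // input column

    public boolean loginAccepted() {   // expected-output column
        return MailFacade.authenticate(username, password);
    }

    // Minimal stand-in for the system under test so the sketch is self-contained;
    // a real fixture would delegate to the legacy (or migrated) application.
    static class MailFacade {
        static boolean authenticate(String user, String pass) {
            return "alice".equals(user) && "secret".equals(pass);
        }
    }
}
```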
We posit that organizations can benefit from adopting a test-driven approach when they have to migrate legacy applications. In particular, for the migration of legacy applications, we propose a user story-based iterative process driven by acceptance testing. Our approach requires writing automated story tests on both the legacy and target versions of the application. Moreover, because of their relevance, the quality of automated story tests is assured through software inspections. We call the process Storytest-Driven Migration (STDM). As a proof of concept, we conducted a first migration experience with a webmail application.

The remainder of this paper is organized as follows. Section 2 describes the proposed storytest-driven approach to migrating a legacy system. Section 3 describes the first experience we conducted to validate the approach. Finally, Section 4 concludes the paper.
2 Storytest-Driven Migration

Storytest-Driven Migration (STDM) is an iterative process that runs over user stories and terminates when all the user stories have been successfully migrated to the target platform. With STDM, user stories together with their related story tests take the place of traditional requirements documents. User stories describe the features of the legacy application as well as of the migrated application. Story tests are written to be executed on both the legacy and target versions of the application, and they are formally reviewed at the end of each cycle. The process model we propose includes the following four steps, which are executed iteratively (see Fig. 1):
[Figure: the STDM cycle, with roles and activities: customers write user stories; customers and others write story tests; developers add a user story and migrate story tests through test, code, and refactor; inspectors review, with failed reviews looping back for rework.]

Fig. 1. Storytest-Driven Migration process
Write user stories. This first step is aimed at building a descriptive form of the requirements of the system to be migrated. In our context, user stories may also address issues related to the specific platform targeted by the migration.

Write story tests. Once user stories are ready, customers or users, together with testers or developers, define story tests aimed at validating the requirements described by the user stories. Having the tests in place gives a clearer perspective on what has to be achieved, as well as confidence in performing the migration. Unlike test fixtures, which are strictly tied to the software platform, story tests are written just once: after being written for the legacy system, they should be completely or at least partially reusable on the target system.

Migrate story tests. This step includes the three well-known activities of the TDD cycle (test, code, and refactor) with some changes, shown in Fig. 2. In the test step, developers write fixtures for the legacy system and run them against the legacy system (a sketch of such a fixture is given after Fig. 2). In the code step, they migrate the legacy code, run the fixtures written for the legacy system against the migrated system and, if needed, modify or rewrite the fixtures according to the target platform. Finally, in the refactor step, developers clean up the migrated code to make it easier to understand. Only small changes can be applied, because complete refactoring could be too expensive for the purposes of a migration project.

Review. The outputs of the previous steps, i.e., test fixtures and migrated code, are reviewed by an inspection team and reworked if necessary.
[Figure: the Migrate story tests step expanded into Test (add a test, exercise the legacy system as SUT), Code (migrate the legacy code to make the test pass, exercise the migrated system as SUT), and Refactor, with a fail/pass loop between coding and testing.]

Fig. 2. Breakdown of the “Migrate story tests” step
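As an illustration of the test step in Fig. 2, the sketch below shows what a story test fixture for a legacy webmail application might look like with JWebUnit's JUnit 3-style WebTestCase (the framework used in the experience reported in Section 3). The base URL, page path, form field names, and expected text are hypothetical, and method names may differ slightly across JWebUnit versions.

```java
import net.sourceforge.jwebunit.junit.WebTestCase;

// Hypothetical story test for a "log in and see the inbox" user story,
// exercising the legacy webmail application as the system under test.
public class LoginStoryTest extends WebTestCase {

    protected void setUp() throws Exception {
        super.setUp();
        // Base URL of the deployed legacy application (illustrative).
        // Ideally, pointing this at the migrated deployment is the main
        // change needed to reuse the same story test code.
        setBaseUrl("http://localhost:8080/webmail");
    }

    public void testSuccessfulLogin() {
        beginAt("/login");                    // open the login page
        setTextField("username", "alice");    // inputs from the story test table
        setTextField("password", "secret");
        submit();
        assertTextPresent("Inbox");           // expected result from the story test table
    }
}
```

In a setup like this, running the story tests against a target system built with the same web technology amounts mostly to re-pointing the fixture at the migrated deployment, which is consistent with the reuse reported in Section 3.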
3 A First Experience with the STDM Process

The main goals of this first migration experience were the following: (1) to evaluate the feasibility of the STDM process; (2) to assess the degree to which test fixtures written for the legacy system can be reused on the target system; and (3) to evaluate the usefulness of the selected tools for implementing the STDM process. The experience consisted of a double migration of a JSP-based webmail application [6] towards the Apache Wicket framework and the Java Platform, Micro Edition (Java ME). Two final-year students were involved as developers, while the researchers played the role of customers.

We selected XPlanner as a project management tool and Subversion as a source code management system. We also used Google Docs for the first two steps of the STDM process (see Fig. 1). Using Google Docs, the customers produced the document containing the user stories, which was also accessible online to the developers. Then the developers, in collaboration with the customers, wrote the story tests for each user story by creating separate spreadsheets. Before automating the acceptance tests, the developers also created FIT tables with FitNesse, according to the story tests previously defined. To implement the test fixtures, the developers used JWebUnit for the legacy system, JWebUnit again for the Wicket-based target system, and J2MEUnit for the Java ME-based target system.

This first experience allowed us to gather some preliminary feedback, mostly based on direct observation and on comments provided by the developers.

Feasibility of the STDM process. We observed that reviewing test fixtures helps to obtain correctly written test code, which can then be used as a formal specification. In line with our previous study [8], we found that inspections can improve the quality of test code. However, reviewing test fixtures only after the migration of a user story had ended meant that test issues found on the legacy system were often repeated on the target system. The present study also confirms that the most common issues [10] found in the test code mainly affect the traceability and maintainability of test fixtures (see Fig. 3).
[Figure: bar chart of the number of issues per issue type found by reviewers, with separate series for the Wicket-based target system and the Java ME-based target system.]

Fig. 3. Types of issues found by reviewers during inspection
Reusability of test fixtures. We found that the test fixtures written for the legacy system were almost fully reused when migrating to Wicket, whereas for the Java ME migration the target test code had to be developed from scratch. This is because the legacy test code and the target test code for the Wicket-based solution both used the same acceptance testing framework, JWebUnit. By contrast, the developer had to implement tests for the Java ME-based target system with a unit test framework such as J2MEUnit because, at the time of the project, no acceptance testing framework was available for Java ME applications (a sketch of such a unit-level test is given at the end of this section). However, we also found that the developer created test code faster for the Java ME-based target system than for the legacy system: writing test code for the legacy system eased comprehension of the application logic and thus sped up the implementation of the target test code.

Usefulness of tools. We collected both positive and negative observations about the usefulness of the tools we adopted. Google Docs and IBIS, the tool we used to support inspections, were considered effective for their purposes. JWebUnit was found better than FitNesse for automating acceptance testing of the legacy application, but there was a lack of appropriate frameworks for automating the acceptance testing of Java ME applications.
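For reference, the sketch below shows the shape such a unit-level test takes with J2MEUnit, where test suites are assembled by hand because CLDC offers no reflection. The MailClient class and its login method are hypothetical stand-ins for the migrated Java ME client, and the exact J2MEUnit constructor and interface signatures may vary between releases.

```java
import j2meunit.framework.Test;
import j2meunit.framework.TestCase;
import j2meunit.framework.TestMethod;
import j2meunit.framework.TestSuite;

// Hypothetical J2MEUnit test exercising the migrated Java ME mail client.
public class LoginClientTest extends TestCase {

    public LoginClientTest() {
    }

    public LoginClientTest(String name, TestMethod method) {
        super(name, method);
    }

    // CLDC provides no reflection, so each test method is registered explicitly.
    public Test suite() {
        TestSuite suite = new TestSuite();
        suite.addTest(new LoginClientTest("testSuccessfulLogin", new TestMethod() {
            public void run(TestCase tc) {
                ((LoginClientTest) tc).testSuccessfulLogin();
            }
        }));
        return suite;
    }

    public void testSuccessfulLogin() {
        MailClient client = new MailClient();
        assertTrue(client.login("alice", "secret"));
    }

    // Minimal stub standing in for the migrated client so the sketch is
    // self-contained; the real test would call into the Java ME application code.
    static class MailClient {
        boolean login(String user, String pass) {
            return "alice".equals(user) && "secret".equals(pass);
        }
    }
}
```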
4 Discussion and Conclusions

We have presented a user story-based iterative process to migrate legacy applications. The test-driven approach has already been used for software migration [1], [4], [7], [13]. The novelty of our approach with respect to the existing literature mainly concerns two aspects. First, our approach requires writing automated acceptance tests on both the legacy and target versions of the application. In the best case, the same acceptance tests written for the legacy system can also be run on the target system. This helps ensure that the target system preserves the behavior of the legacy system. However, since a migration may also target a different platform or language, test fixtures may not be reusable, and acceptance tests may have to be implemented twice. Even in this worst case, we posit that writing acceptance tests brings at least two benefits: a better understanding of the system to be migrated and a valid starting point for a migration plan. Second, we require that story tests be reviewed by customers, especially if they take the role of specifications in place of requirements documents. A previous study reported results on the benefits of software inspections conducted on automated unit test cases [8].

We conducted a first migration project of a web application towards both a web application framework and a mobile platform. In general, we found that iteratively migrating a legacy system worked well and that a user story can be the right unit of work for quick reviews. The migration from a legacy web platform to a modern web application framework, which shared the same acceptance testing framework, enabled seamless reuse of the story test code initially written for the legacy system. However, we were not able to reuse any story test code when migrating to the mobile platform. This is not a weakness of the proposed approach; rather, it depends both on the deep differences between the legacy and target platforms and on the lack of an acceptance testing framework for Java ME.
As future work, we intend to apply the STDM process to other, more complex legacy systems in order to assess whether an early implementation of acceptance tests for the legacy application adds value. Through a number of experiences in applying the STDM process, we aim to characterize the conditions that may be encountered and to identify patterns to follow during migration projects.

Acknowledgments. This work is partially supported by MiUR-Italy, under grant PRIN 2006 “METAMORPHOS”. We would like to thank Teresa Mallardo for the effort she put into the migration project and Mario Scalas for his valuable comments on a first draft of the paper.
References

1. Andersson, J., Bache, G., Sutton, P.: XP with Acceptance-Test Driven Development: A Rewrite Project for a Resource Optimization System. In: Marchesi, M., Succi, G. (eds.) XP 2003. LNCS, vol. 2675, pp. 180–188. Springer, Heidelberg (2003)
2. Beck, K.: Test Driven Development: By Example. Addison-Wesley, New York (2002)
3. Bisbal, J., Lawless, D., Wu, B., Grimson, J.: Legacy Information Systems: Issues and Directions. IEEE Software 16(15), 103–111 (1999)
4. Bohnet, R., Meszaros, G.: Test-Driven Porting. In: Agile Development Conference (ADC 2005), pp. 259–266. IEEE Computer Society, Los Alamitos (2005)
5. Brodie, M.L., Stonebraker, M.: Migrating Legacy Systems. Morgan Kaufmann, San Francisco (1995)
6. Brugali, D., Torchiano, M.: Software Development: Case Studies in Java. Addison-Wesley, New York (2005)
7. Hennessy, M., Power, J.F.: Ensuring Behavioral Equivalence in Test-Driven Porting. In: Conference of the Center for Advanced Studies on Collaborative Research (CASCON 2006). ACM Press, New York (2006)
8. Lanubile, F., Mallardo, T.: Inspecting Automated Test Code: A Preliminary Study. In: Concas, G., Damiani, E., Scotto, M., Succi, G. (eds.) XP 2007. LNCS, vol. 4536, pp. 115–122. Springer, Heidelberg (2007)
9. Melnik, G., Maurer, F.: Multiple Perspectives on Executable Acceptance Test-Driven Development. In: Concas, G., Damiani, E., Scotto, M., Succi, G. (eds.) XP 2007. LNCS, vol. 4536, pp. 245–249. Springer, Heidelberg (2007)
10. Meszaros, G.: xUnit Test Patterns: Refactoring Test Code. Addison-Wesley, New York (2007)
11. Mugridge, R., Cunningham, W.: Fit for Developing Software: Framework for Integrated Tests. Prentice Hall PTR, Englewood Cliffs (2005)
12. Reppert, T.: Don't Just Break Software, Make Software: How Story-Test-Driven Development Is Changing the Way QA, Customers, and Developers Work. Better Software 6(6), 18–23 (2004)
13. Varma, P., Anand, A., Pazel, D.P., Tibbitts, B.R.: NextGen eXtreme Porting: Structured by Automation. In: ACM Symposium on Applied Computing (SAC 2005), pp. 1511–1517. ACM Press, New York (2005)