Formalizing Agility: An Agile Organization’s Journey toward CMMI Accreditation

Steven W. Baker
DTE Energy – Information Technology Services (ITS), Process and Skills
[email protected]

Abstract
Agile methods are compatible with formal process improvement frameworks. Rather than casting discipline and documentation to the wind, agile methods, when seriously applied, are actually very focused and comprehensive. Likewise, a framework such as the SEI Capability Maturity Model Integration (CMMI) need not be an overwhelming excess of paperwork and bureaucracy; when appropriately implemented, the CMMI encourages and enables significant and sustainable improvements. This paper describes the ongoing journey of DTE Energy’s large IT organization in realizing agility and process improvement in a Fortune 300 corporation. From following waterfall-based approaches to embracing agile methods, and from early attempts with the CMM to a renewed commitment to continuous improvement by adopting the CMMI, it explores cultural and organizational changes that enable real-world process improvements.

1. Corporate Context

DTE Energy is a diversified energy company involved in the development and management of energy-related businesses and services worldwide. Headquartered in Detroit, Michigan, USA, DTE Energy is a Fortune 300 company with approximately US $7 billion in revenues in 2004 and approximately US $21 billion in assets at December 31, 2004. Each business in our diverse enterprise portfolio faces a unique blend of competitive, regulatory, environmental, and financial pressures. Our state and federal regulated utilities address these challenges and operate differently than our entrepreneurial unregulated subsidiaries. This diverse portfolio of regulated and unregulated businesses provides a significant challenge to the organizations delivering IT-based products and services.

2. The ITS Organization

The Information Technology Services (ITS) Organization provides leadership, oversight, delivery, and support on all aspects of information technology across the enterprise. Its 860 employees and contractors, with an annual operating budget of US $132 million, report to the Executive Vice-President – Chief Information Officer. ITS is functionally structured as depicted in Figure 1.

[Figure 1. ITS Functional Structure]

The groups are:

- Information Office (IO) – The IO groups align ITS services with the needs of the business community. As liaisons and advocates, they understand the pressures faced by our businesses and how IT can be leveraged to provide value-added products and services.
- Solution Engineering – With a focus on software-based business solutions, these groups provide the skills, capabilities, tools, and methodologies to innovate, deliver, and support working software.
- Technology Operations – These groups design, build, and operate business and technical infrastructures ranging from cell phones, desktops, and printers to clustered servers, wireless networks, and data centers.
- Governance – In addition to centralized administrative functions and services, these groups oversee, measure, and analyze ITS performance and results.

Together these IT professionals steward and promote technology-based solutions in partnership with our various business units. This report, exploring the ITS Organization’s experiences in continuous process improvement initiatives, begins with an overview of its cultural and methodological roots.
3. An Agile Barrel over the Waterfall
We begin our story with a look at our historical culture at DTE Energy. Rooted in 100+ years of performance and experience, our regulated business units succeeded with a deliberate and measured approach to taking risks. This overarching corporate culture, long proven successful by those commissioned to engineer and maintain electrical and natural gas operations, organically extended to the application of information technology. Like a good Roman, the IT organization adapted to and internalized the pace and processes of the business.

In 1992 the IT group formalized its solution delivery methodology with a tailored System Life Cycle (SLC) based on James Martin’s Information Engineering (IE) approach [1]. Consultants guided our customization of the waterfall methodology built on IE principles, processes, and procedures. The IE approach felt “right” to most of the IT professionals in the organization. It offered a comprehensive framework for the entire software engineering project lifecycle, spanning from commissioning the project to maintaining the production code. The rigorous focus on documentation and high-ceremony signoffs also appealed to our engineering-focused business partners.

By the late 1990s, however, another movement in the software engineering industry was gaining momentum. Pioneered at nearby Chrysler Corporation, an alternative approach to delivering software emerged that would later be christened “eXtreme Programming” [2]. Our CIO, formerly in senior leadership positions with Netscape Communications, Organic Online Inc., Xerox Corporation, and Chrysler Corporation, sought a meaningful opportunity to apply and learn from these nascent agile approaches here at DTE Energy.

Meanwhile, our company began a ground-breaking initiative to transform the enterprise from a traditional regulated utility to a major force in the deregulated market. Like much of the U.S., Michigan pursued utility deregulation, and we embraced the challenge to provide open access in 1998, two years ahead of final legislation. We could have selected an isolated, straightforward project on which to pilot agile methods. But to better position the business for the uncertainties involved with deregulation, and to keep pace with the rapid business changes that inevitably were to come, we chose this business-critical program as the place to pilot and mature our incremental, iterative approach. From the beginning we viewed the Electric Choice program differently from anything we had undertaken before. Not only would it significantly change how our business did business, but it would also dramatically change how our IT organizations delivered technical solutions in support of our business.

4. Accelerating Process Excellence

Our cross-functional Electric Choice Implementation Team successfully met its organizational objectives and legislative challenges. Release after release, we enabled business results with our time-boxed, iterative, priority-driven approach to delivering and enhancing software. Despite its early success, the agile approach remained an oddity – if not an outright threat – to nearly everyone in IT outside of the program. For instance, our Project Management Organization (PMO) advocated a traditional, by-the-book application of project management discipline to best leverage our investment in Project Management Professional (PMP) certifications.

We faced an additional cultural twist with the merger of DTE Energy and the MCN Energy Group, announced in October 1999. This infused another history-rich, regulated IT mindset into the mix. The 100 new DTE Energy employees brought with them their experiences, methodologies, and cultural mindsets to an increasingly diverse ITS Organization.

We recognized the need for greater process clarity and an elevated focus on continuous improvement to rally together, increase our software quality, and improve our organization’s productivity. What better way to improve ourselves than to assemble and roll out a suite of official process flows, best practices from across the industry, and common templates for all?

Some within the organization had exposure to or experience with the Capability Maturity Model (CMM) from the Software Engineering Institute (SEI). As a model it resonated with our comfortable IE mindset. This paper is not intended to be a primer on the CMM (nor the CMMI, discussed later). Many websites, articles, and books focus on the model and how an organization may benefit from its guidance [3,4]. One aspect is worth highlighting, however: the CMM (and its successor, the CMMI) focuses on the goals that must be achieved but not on how to achieve them. It describes the “what” but not the “how.”

Our governance and process organizations, with years of experience enabling and supporting traditional IE projects, approached the strategic objective of continuous improvement by targeting a CMM Level II-caliber maturity. We assembled a cross-functional team that produced a traditional plan to achieve this goal. The plan was comprehensive – its three-year timetable called for dedicated full-time resources, part-time technical working groups (TWiGs), and representational focus groups; new policies, processes, and procedures; standardized artifacts, templates, and a central repository; and a detailed communication and training strategy to inculcate process excellence into the ITS Organization.
Meanwhile, our small but growing community of agile practitioners continued to deliver software that met the needs of the business. During this time we began to mature and formalize a “house blend” of agile methods and techniques that worked in our culture. This visible yet underground movement in agility was often perceived as a fringe or niche area. It seemed counter-intuitive that an organization investing in a growing community of PMPs and striving for CMM Level II-caliber maturity would at the same time embrace a radical movement with its own manifesto!

Our continuous process improvement program was commissioned, staffed, planned, and executed with the best of intentions. The team believed that the ideal way to significantly change our culture was to follow the tried-and-true IE approach that had served IT and our business partners across the enterprise so well in the past. The program had two phases: Project Definition and Project Implementation. The Project Definition phase, completed in September 2000, was composed of three stages: Confirmation of our current state, Target Definition of the desired future environment, and Transition Planning to transform the current environment into the defined target environment.

Hindsight, as they say, is “20-20.” In its three-year run, the team peaked at 15 full-time resources focusing on ten relevant Key Process Areas. They authored ten durable policy statements and dozens of valuable artifacts and usable templates, and shepherded two formal CMM assessments. Yet the program provided less sustainable value than we initially anticipated. The program schedule inevitably slipped. The program team roster fluctuated. The pilot project teams struggled to adopt the new templates. The training sessions were sparsely attended. The focus group lost focus.

In what we later learned was a common mistake, the team interpreted the “what’s” described in various CMM materials too literally. The team strove for perfection in each policy statement and process flow definition. They selected and customized every template they could find. The team planned on a “big bang” delivery while the paradigm rug was being pulled out from under their feet. One source of contention was the methodological rub between the waterfall-based approach used by the process improvement team and the increasingly agile-based approach used by the very practitioners who were targeted for formal process improvement.

But fundamentally, the CMM program was a success. It confirmed our passion for process improvement, raised our understanding of industry frameworks, revealed many of our organizational strengths and challenges, and established a solid foundation that would position us for growth and progress in the not-too-distant future.
5. Why Can’t We Be Friends?

Our CMM initiative was not the only effort underway to increase the quality of our deliverables and to improve our productivity. By 2001 the agile mindset was spreading throughout the solution delivery groups in ITS. Rallying around our track record of delivering business solutions with the deregulation program, our group of agile practitioners expanded, matured, and continued to deliver and support working software of sufficient quality in a timely manner. With an eye towards continuous improvement down the road, we recognized the need to formalize the “current state” of our agile processes. We created a set of brochures, “job aid” and “jump-start” documentation, and a set of templates to form a baseline for growth.

Meanwhile, our technology operations groups such as database design, server engineering, and network and telecom continued to operate traditionally. It took time to negotiate new ways of working together. Likewise, our exploits in agility drew the attention of the CMM program, and despite each group’s best efforts to reconcile, we found ourselves drifting further apart.

Into early 2002, the ITS Organization remained almost exclusively focused on the two regulated utility businesses. But as the enterprise focus on the subsidiaries sharpened, we increased our span of coverage to provide IT support to our unregulated and non-utility businesses. We soon opened a satellite office in Ann Arbor, Michigan, USA, literally across the street from the offices of several unregulated DTE Energy subsidiaries. Dubbed the “Software Factory,” this agile Petri dish successfully applied and extended agile methods in competitive, time-pressured, entrepreneurial environments. From this learning we wrote a brief “hitchhiker’s guide” about our blend of agile methods and held workshops and training sessions for both the Ann Arbor community and the organization-at-large in Detroit.

By late 2002, our various infrastructure groups had become accustomed to, if not downright comfortable with, our incremental and iterative approach. But we continued to struggle to reconcile our agile methodology with the waterfall-inspired process improvement program under its banner of the CMM. It came down to the solution engineering groups versus the governance groups. Relationships, while professional, were strained. Emotions ran high. Spirits were low. Something had to give.

In the spring of 2003, our senior leaders decided enough was enough. The organizational tension between the groups was unhealthy and unsustainable. The groups needed to talk and figure out how to proceed as a team. We needed to play nice together.
6. Enter the SEPG
At the dawn of a potential methodological turf war, we turned to the software engineering industry for ideas and inspiration. A quick web search led us to the SEI’s whitepaper describing the Software Engineering Process Group (SEPG) [5]. We had found a way to merge the tribes!

Leaders from the solution engineering and governance groups met to discuss a foundation for working together. Leveraging industry SEPG concepts, we sought to ensure more than mere representation for each impacted area. We wanted a “round table” where each voice would be heard – and each voice was required to speak. Building on the agile philosophy of team engagement, we encouraged a similar level of active participation from everyone in our new SEPG. Physical attendance alone was insufficient – for a healthy culture of teamwork and a shared approach to continuous improvement, we needed true collaboration [6]. Our SEPG grew to include representation from:

- Project Management Practice – Our matrixed pool of professional IT project managers
- Core Development Practices – Our matrixed pools of developers, testers, and business analysts
- Information Office – Our assigned groups representing the diverse needs of the business community
- Project Management Office (PMO) – Our centralized group overseeing the enterprise project portfolio
- Process Management and Measures (PMM) – Our centralized group of process measurement analysts

Positive organizational change on this scale could not happen overnight. Yet we reminded ourselves of our “big bang” approach to CMM maturity, which by then had been brought to closure. In finding an achievable balance, we proposed a “kinder, gentler” SEPG that embraced the spirit, if not the letter, of SEPG definitions and structures throughout the IT industry. We opted for working sessions scheduled once a week from 11am to 1pm, but each SEPG meeting actually began at 11:30am. We promoted the first half-hour as an opportunity to grab lunches, relax, and eat together; this ice-breaker became a popular way to get to know one another and connect as people.

Through the SEPG, we broke through most of our barriers and assumptions over time. We realized that all of us had been pushing for what we believed was best for ITS and shared a passion for process improvement. Like a group in fierce agreement, we had been pulling the rope differently but in the same direction. By the fall of 2003, the SEPG was a fully functional and empowered decision-making body. We owned the software solution delivery process and began to make lasting progress toward meaningful process improvement.
7. Improving our Process Improvement

ITS was not alone in seeking to improve the quality of our solutions and the productivity of our employees. The company had launched its DTE Energy Operating System in the spring of 2002. This framework of philosophy and technique focused on continuous improvement across the enterprise. By enabling us to streamline processes, eliminate waste, and reduce costs, it encouraged us to raise our performance and foster a culture of change as a way to improve and learn.

Our SEPG continued to grow and mature as a process stewardship and improvement body. As our meetings morphed into focused working sessions, we formalized roles such as the champion, chair, secretary, and facilitator. We managed a feature backlog of the opportunities we uncovered and commissioned small projects to focus on prioritized items. While not all of the lead-lined opportunities turned to gold, we struck a rich vein with our suite of Standards and Guidelines books. Leveraging a consistent structure and “look and feel” (but with distinctively colored covers, such as the pioneering “Green Book” for Project Management), each book carefully described the “musts” and “shoulds” for a given role within our solution delivery community. Our intent with the books was not educational; rather, our audience was IT professionals, familiar with industry trends and best practices, who simply needed to know the local rules and customs – what they must, should, should not, and must not do on our application delivery projects.

In the shadow of the CMM initiative, our focus on process compliance had been downplayed for too long. With a fresh batch of Standards and Guidelines books in hand, in late 2003 we embarked on a communication campaign across our organization with a new message – it was time to tidy up the ship and raise the bar. The project teams had their Standards and Guidelines books and knew what was expected. Each book included a “Certify Your Application” checklist with which the practitioner attested that she/he had read, understood, and applied the material while producing her/his work products. The checklist listed the name of every section, each with two boxes: “Yes” and “N/A.”
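As a purely illustrative aside, the sketch below models such a certification checklist as a simple data structure; the class, enum, and method names are assumptions made for this paper, not our actual DTE Energy templates.

    // Illustrative sketch only: a "Certify Your Application" style checklist in
    // which each Standards and Guidelines section is attested as "Yes" or "N/A".
    // Class, enum, and method names are hypothetical, not DTE Energy artifacts.
    import java.util.LinkedHashMap;
    import java.util.List;
    import java.util.Map;

    public class CertificationChecklist {

        public enum Attestation { YES, NOT_APPLICABLE }

        private final Map<String, Attestation> sections =
                new LinkedHashMap<String, Attestation>();

        // Record the practitioner's attestation for one section of the book.
        public void attest(String sectionName, Attestation value) {
            sections.put(sectionName, value);
        }

        // The work is certified only when every required section has been
        // marked either "Yes" or "N/A" -- no section may be left blank.
        public boolean isCertified(List<String> requiredSections) {
            for (String section : requiredSections) {
                if (!sections.containsKey(section)) {
                    return false;
                }
            }
            return true;
        }
    }

The paper checklist worked the same way: a blank box, like a missing entry above, meant the work product was not yet certified.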
In mid-2004 we introduced a new governance organization to enable (by means of coaching and hands-on mentoring) and ensure (a delicate way to say audit) the quality of our software products. This Quality Management Group (QMG) included Technical Architects and Quality Architects who were assigned to part-time roles on multiple in-flight projects. In just over a year we went from significant internal pressures and turf wars to being positioned to revisit a formal process improvement framework.
8. Preparing the Organization for CMMI

The success of our SEPG rested largely on the shoulders of those directly involved. We knew this was neither sustainable over time nor healthy for our culture. Without losing focus on our goal of continuous improvement, and without letting entropy consume the progress we had made, we needed a sustainable way to keep our organization headed in the right direction.

It had been a while since the letters “CMM” were last whispered in the ITS Organization. We had kept tabs on progress in the industry, of course, and saw the mature advancements in process improvement frameworks. Perhaps it was time to revisit a formalized approach. We began by educating ourselves on the relatively new Capability Maturity Model Integration (CMMI) [7]. To us the model was a reasonable maturation of the CMM, and its enhancements seemed to make sense. It was worth a try. This time we could do it!

Interestingly, as we began to socialize the notion of CMMI within the ITS leadership community, we incited much less internal conflict. With perhaps too little ceremony we had turned the corner from “If it ain’t broke, don’t fix it” to “We can do better.” At some point in the previous year we had reached a balance between the staid and measured waterfall mentality and the pioneering and nimble agile spirit.

In early 2004 we began a more concrete focus on the CMMI by reading books and articles on the model and its application. We put together some informal handouts and quick reference guides for our SEPG that covered:

- A glossary that promoted common CMMI terms
- The five levels, or stages, of the CMMI
- The four CMMI categories and 25 process areas

We spent several SEPG sessions discussing the CMMI and how it might serve as a springboard for our continuous improvement framework. We found that a theoretical discussion of the CMMI was insufficient; we sought to fully internalize the model. Over the summer we performed an informal self-assessment of our solution engineering groups to learn the model and to gauge how we would fare. We assessed our applicable process areas in light of the continuous representation, and advanced our shared understanding of the CMMI framework and its terminology.

In the fall of 2004 the ITS Organization published its “ITS Strategic Plan for 2004-2008.” Covering the waterfront, this document established strategic goals, objectives, and action plans for the entire organization. A dominant theme was continuous improvement, and we were ready to put it in writing – we would strive for CMMI Level III accreditation!

In planning the budget for 2005, we set aside money for full-time and part-time resources to lead the organization to CMMI accreditation. We knew we would be more successful with an experienced tour guide at our side than going it alone. Early in 2005 we formally began our journey by retaining a CMMI Assessor – a “defense attorney,” if you will – who had recently guided a manufacturing company to Level II accreditation. We reiterated our aspiration for accreditation at Level III as a means to realizing a sustainable culture of process improvement. This desire for Level III was rare; in fact, the SEI recommended Level II accreditation first. We felt Level III, with its organization-wide focus rather than project-to-project coverage, would align well with our direction.

As we began our plans, the local chapter of the Project Management Institute (PMI) announced a presentation by an automotive supplier on their recent advancement from CMM Level II to CMMI Level III. Synchronicity is a wonderful thing! With keen interest we attended the dinner meeting and confirmed our assumptions:

This was not going to be easy. While our organization was about one-third their size, we lacked their experience in formal CMMI accreditation. They had been through the drill and knew how it felt.

This was not going to happen overnight. Their plans called for 24 months to step from CMM Level II to CMMI Level III. We sought to advance from an informal Level II-caliber maturity to formal Level III accreditation in 18 to 24 months.

This was not going to stop us. We had support from senior management, a healthy SEPG, an enterprise-wide DTE Energy Operating System focus on continuous improvement, and a resolve to succeed.

With our eyes wide open, we structured our program into a series of time-boxed, incremental, iterative releases. We felt an agile approach – similar to our IE-shaking Electric Choice program – was a prudent way to mature our culture and respond to the challenges ahead. We commissioned a seven-person steering committee to oversee the program. The steering committee, a cross-functional subset of SEPG representatives, met each week to review progress, evaluate alternatives and risks, and decide where and how the program would proceed.

Early in the spring we began our communication effort. We identified a set of target audiences and the messages that each audience needed to hear. (Our road show continues this summer and into next spring.) We had a consistent theme: we seek to foster and mature a healthy, sustainable culture of continuous improvement. Like most professional organizations, we were not “checking the box” for a framed certificate to hang on the wall; we were not seeking plaque buildup.
9. The SCAMPI Shuffle

Our informal self-assessment was, in agile terms, a sufficient “spike” to confirm we were ready, willing, and able to proceed. Yet we had little insight into the formal method – the Standard CMMI Appraisal Method for Process Improvement (SCAMPI) – that our organization would follow for CMMI accreditation. Beginning with the SEI’s SCAMPI website [8], and with the guidance of our internal CMMI Assessor, the steering committee and SEPG came up the learning curve on the method. We incorporated new materials into our communication program and continued the road show.

It was at this point that we recognized the need to shift our focus from the continuous model to the staged model while not losing sight of our “pull ahead” efforts in certain areas. And rather than jump straight to CMMI Level III accreditation, we needed to take a quick swing through a Class B Level II assessment along the way. With renewed determination we completed our Release Zero efforts and commissioned an internal CMMI team with a deliberate plan focused on four milestones:

- Pre-Assessment Complete (April 2005) – An informal SCAMPI assessment, focusing on both Level II and Level III with unofficial ratings, by a four-person internal team with a lead assessor.
- Class B Assessment Complete (December 2005) – A formal SCAMPI assessment, focusing on Level III with unofficial ratings, by an expanded internal team.
- Class B Assessment Complete (March 2006) – A formal SCAMPI assessment, focusing on Level II with official SEI-filed ratings, by the internal team.
- Class A Assessment Complete (June 2006) – A formal SCAMPI assessment for Level III, with official SEI-filed ratings, led by an independent SEI-certified Lead Assessor with assistance from the internal team.

We targeted 17 of the 21 process areas in Level II and Level III. Because we delivered and supported IT solutions internally, we deemed “Supplier Agreement Management” (Level II) and “Integrated Supplier Management” (Level III) out of scope. And without any overt Integrated Product and Process Development (IPPD) initiatives, we identified “Integrated Project Management” and “Organizational Environment for Integration” (both Level III) as out of scope.

The steering committee identified five active software projects for the SCAMPI pre-assessment. We sought representation from a diverse mix of methodologies, technologies, business areas, and levels of project team experience. From each team we asked the project manager, a lead developer, and a business analyst or tester to participate in the interviews.

The assessment team, drawn from the PMO, quality, and measurement groups, performed the pre-assessment over three weeks. In addition to the project teams, they interviewed a diverse representation of our ITS middle management and our senior ITS leadership. When they were finished, the assessment team had:

- Held 11 interviews
- Reviewed five projects
- Assessed 17 process areas
- Assessed 25 specific goals
- Assessed 12 generic goals
- Assessed 78 specific practices
- Gathered 321 data points

The assessment team captured and analyzed the results, and prepared and presented a set of materials tailored to different audiences (the SEPG received more detail than the general leadership population, and so on). We were pleased with what we learned about our organization and also with what we learned about CMMI and SCAMPI. Our culture was further along with some specific goals than we had hoped, and not as far behind with other goals as we had feared.

“Project Monitoring and Control” was solid – perhaps not surprising given our seemingly genetic predisposition to rigorous project oversight and tracking. The assessment confirmed some major opportunities within the “Configuration Management” process area. We had previously focused on establishing source code baselines and tracking mechanisms; this review reaffirmed our need to manage other artifacts in a similar manner. More surprising were our relative strengths in specific Level III process areas. For instance, we had strong results with “Organizational Process Focus,” “Organizational Process Definition,” and “Verification.” Our solution engineering pools and our centralized governance groups positioned the organization well for “Establish Team Composition” and “Govern Team Operations,” specific goals of the “Integrated Teaming” process area.

In his overall observation, our CMMI Assessor wrote, “ITS Solution Engineering is performing at Level II and Level III in multiple Process Areas and is well positioned for maturation in others. Strong commitment to Process Improvement by both the leadership and the technology teams, along with maturing SEPG, QMG, and PMM teams, will all contribute to accelerated Level III performance once process enhancements in Configuration Management, Measurement, and Process Training are implemented.”

By no means were we ready for a full-scale Class A assessment. We had some hard work ahead of us. But we were an organization in transition – and this time we were agile in our philosophy and united in our focus.
10. To Accreditation – and Beyond!

By late May 2005, the SEPG and the CMMI steering committee had leveraged the findings and recommendations and begun to prepare for a SCAMPI Class B assessment. Rather than tackle everything identified by the assessment team at the same time, we focused on a small but fundamental foundation for improvement and growth. This foundation began with Process Management, Configuration Management, Project Planning, and Measurement and Analysis. These core elements of Level II would be heavily leveraged as we strove for Level III.

In the early days of our SEPG we had begun chartering small, short-term Special Interest Groups (SIGs) to analyze options, offer recommendations, and/or produce a discrete set of deliverables. With our renewed focus on CMMI we formalized our understanding of SIGs to better align with the industry definition. We wrapped up our old SIGs (after politely renaming them Temporary Action Groups) and commissioned our first “real” SIG, charged with the ongoing ownership and stewardship of Process Management. Our new SIG followed an agile, time-boxed approach. After the Process Management SIG’s first release, producing a small, “just enough” suite of processes and templates, we chartered the Configuration Management SIG, and we plan to continue the staggered SIG launches with the Project Planning SIG in July 2005.

For the scope of our pre-assessment we had decided to examine a variety of projects to survey our procedural landscape. But for our formal efforts we needed to focus on our core areas of in-house software development. The CMMI model targets an “Organizational Unit” as the subject or scope of the appraisal, and allows for a subset of an organization to be targeted. In early June 2005, the SEPG clarified our scope to focus on active projects expected to exhibit these attributes:

- Adherence to our agile methodology
- Creation of a web-based Java application
- Funded and delivered into production in 2005
- Compliance with published Standards and Guidelines
- Staffed with traditional team composition and roles

These attributes emphasized our strategic direction for methodology, technology, standards, and staffing (a simple scope filter illustrating them is sketched below). The SCAMPI method requires at least three projects to be inspected; we opted to select six for additional learning and improvement opportunities.
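To make the scoping attributes concrete, here is a minimal sketch of such a scope filter; the class and field names are assumptions for this illustration rather than part of our actual tooling.

    // Illustrative sketch only: selecting appraisal-candidate projects that
    // exhibit all of the scoping attributes listed above. Names are hypothetical.
    import java.util.ArrayList;
    import java.util.List;

    public class AppraisalScopeFilter {

        public static class Project {
            String name;
            boolean followsAgileMethodology;
            boolean webBasedJavaApplication;
            boolean inProductionDuring2005;
            boolean meetsStandardsAndGuidelines;
            boolean traditionalTeamComposition;
        }

        // Returns only those active projects that satisfy every attribute.
        public static List<Project> inScope(List<Project> activeProjects) {
            List<Project> selected = new ArrayList<Project>();
            for (Project p : activeProjects) {
                if (p.followsAgileMethodology
                        && p.webBasedJavaApplication
                        && p.inProductionDuring2005
                        && p.meetsStandardsAndGuidelines
                        && p.traditionalTeamComposition) {
                    selected.add(p);
                }
            }
            return selected;
        }
    }

The SCAMPI requirement of at least three in-scope projects then becomes a simple check on the size of the returned list.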
We again refined our communication materials and made the rounds. A new theme was that we anticipated a minimal impact on the projects. This was an important message to reassure our Information Office, which had made commitments to its represented business units.

We expanded our pre-assessment team with additional resources to form a seven-person “bootstrap club.” The group was chartered to (a) educate the selected projects on specifics of the CMMI effort, (b) jump-start our SIGs for both short-term traction and long-term success, and (c) serve as the independent panel for the assessments. Our “bootstrap club” is now receiving in-depth bursts of SCAMPI training and is clarifying its role as subject matter experts, project guides, and impartial assessors.

As we look back over the last six years, and look ahead to the spring of 2006, we have a deeper understanding of CMMI, SCAMPI, and many of the challenges that await us. We are now better positioned to achieve Level III accreditation and enable advances in product quality, process adherence, and developer productivity through continuous improvement. Our earlier CMM initiative had indeed served us well; we have the will to advance from where we are, rather than set our sights too far into the future. Our agile methodologies, proven successful on software projects, have enabled quicker “fail fast” learning and a “just enough” process mindset. Our SEPG has a two-year track record of defining, communicating, training, and managing significant process improvements across our organization. Our Information Office, Solution Engineering, Technology Operations, and Governance groups share a committed relationship of partnering and trust. Our applications are of higher quality thanks to our adherence to published Standards and Guidelines and the oversight of our Quality Management Group. Our CMMI steering committee continues its work under the watchful eye of our designated project manager – this is a multi-release program, after all!

We believe – and now have practical experience – that agile methods are an accelerator to process improvement when applied appropriately and meaningfully. Rather than abandoning discipline and forgoing documentation, we find that agile methods reinforce the need to be deliberate about where we focus our efforts and invest our time. A popular saying in the DTE Energy ITS culture is, “anything that takes time and costs money is scope – and scope is testable.” We view CMMI as a guideline for scoping process improvement, and SCAMPI as a test suite to validate and verify our progress and success. Fundamentally, the CMMI provides a comprehensive framework of “what’s” – and agile methods provide an empowering philosophy of “how’s.”
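In that spirit, the brief sketch below treats one process expectation as an executable check; the rule it asserts (every work product names an owner) and all identifiers are hypothetical, chosen only to illustrate the “scope is testable” idea.

    // Illustrative sketch only: expressing a process expectation as a testable
    // assertion. The rule and names are hypothetical examples, not DTE Energy
    // standards; JUnit is assumed here purely for illustration.
    import static org.junit.Assert.assertNotNull;
    import org.junit.Test;

    public class ProcessScopeTest {

        static class WorkProduct {
            final String name;
            final String owner;   // null when no owner has been recorded

            WorkProduct(String name, String owner) {
                this.name = name;
                this.owner = owner;
            }
        }

        @Test
        public void everyWorkProductHasAnOwner() {
            WorkProduct[] products = {
                new WorkProduct("Project Plan", "Project Manager"),
                new WorkProduct("Release Backlog", "Lead Developer")
            };
            for (WorkProduct product : products) {
                assertNotNull("Unowned work product: " + product.name, product.owner);
            }
        }
    }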
The final chapters of this story have yet to be written. With our clarity and experience we are poised to realize Level III accreditation (the “means”) and significant and sustainable process improvements (the “ends”).
11. References

[1] James Martin, Information Engineering: Introduction, Prentice Hall, ISBN 013464462X, February 1991.
[2] Kent Beck with Cynthia Andres, Extreme Programming Explained: Embrace Change (Second Edition), Addison-Wesley, ISBN 0321278658, November 2004.
[3] Software Engineering Institute, Carnegie Mellon University, The Capability Maturity Model: Guidelines for Improving the Software Process, Addison-Wesley, ISBN 0201546647, June 1995.
[4] Wikipedia, Capability Maturity Model, http://en.wikipedia.org/wiki/Capability_Maturity_Model
[5] Priscilla Fowler and Stanley Rifkin, Software Engineering Process Group Guide, http://www.sei.cmu.edu/publications/documents/90.reports/90.tr.024.html
[6] Steven W. Baker and Steven B. Ambrose, “Measuring Engagement and Predicting Project Performance: The TEAM Score,” Cutter IT Journal, Vol. 18, No. 7, July 2005, pp. 5-11.
[7] Software Engineering Institute, Carnegie Mellon University, CMMI Web Site, http://www.sei.cmu.edu/cmmi/
[8] Software Engineering Institute, Carnegie Mellon University, Standard CMMI Appraisal Method for Process Improvement (SCAMPI), Version 1.1: Method Definition Document, http://www.sei.cmu.edu/publications/documents/01.reports/01hb001.html

About the Author

Steven W. Baker is a software methodologist at DTE Energy. Leveraging his extensive background in both agile and traditional solution delivery methodologies, he leads and enables the brewing of DTE Energy’s “house blend” of agile and adaptive methods. As a charter member of DTE Energy’s Software Engineering Process Group (SEPG), he oversees measurable improvements across the board.

Mr. Baker began recreational programming in 5th grade, locking in his career path at an early age. He has worked as a software engineer with Ford Motor Company, delivering ISO 9001, QS-9000, and ISO 14001 quality and process improvements and applications. Prior to that, he was a lead developer with Parke-Davis Pharmaceuticals and, earlier still, a systems operator with SecureData Corporation.

Mr. Baker holds an M.S. in software engineering and a B.S. in computer science from Oakland University. He is a practicing Project Management Professional (PMP) and Certified Computing Professional (CCP), owns two precocious Labrador retrievers, and still finds time to enjoy theoretical physics, write and record original music, and predict outcomes of Law and Order reruns with his fiancée, Nicole.

Mr. Baker can be reached at [email protected].