CMMI Accelerated
December 2010
SPECIAL REPORT
CMU/SEI-2010-SR-032
http://www.sei.cmu.edu
The ideas and findings in this report should not be construed as an official DoD position. It is published in the
interest of scientific and technical information exchange.
This work is sponsored by the U.S. Department of Defense. The Software Engineering Institute is a federally
funded research and development center sponsored by the U.S. Department of Defense.
NO WARRANTY
Use of any trademarks in this report is not intended in any way to infringe on the rights of the trademark holder.
Internal use. Permission to reproduce this document and to prepare derivative works from this document for
internal use is granted, provided the copyright and “No Warranty” statements are included with all reproductions
and derivative works.
External use. This document may be reproduced in its entirety, without modification, and freely distributed in
written or electronic form without requesting formal permission. Permission is required for any other external
and/or commercial use. Requests for permission should be directed to the Software Engineering Institute at
permission@sei.cmu.edu.
This work was created in the performance of Federal Government Contract Number FA8721-05-C-0003 with
Carnegie Mellon University for the operation of the Software Engineering Institute, a federally funded research
and development center. The Government of the United States has a royalty-free government-purpose license to
use, duplicate, or disclose the work, in whole or in part and in any manner, and to have or permit others to do so,
for government purposes pursuant to the copyright license under the clause at 252.227-7013.
For information about SEI publications, please visit the library on the SEI website (www.sei.cmu.edu/library).
Table of Contents
Acknowledgments
Executive Summary
Abstract
2 Using AIM
2.1 Securing and Maintaining Executive Sponsorship
2.2 Characterizing Current and Future Capability and Performance
2.3 Identifying, Training, and Launching Pilot Projects
2.4 Identifying, Training, and Launching the PG
2.5 Evaluating Pilot Projects and Planning Further Rollout
2.6 Building a Culture of Excellence and Continuous Improvement
Appendix A: Activity Charts for Selected Processes
References/Bibliography
Acknowledgments
The authors wish to acknowledge the following people for their early and critical work in using
TSP to implement first the CMM for Software and then CMMI-DEV, for enabling and
championing this work at critical junctures, and for innovating and validating many of the
methods described in this report: the extended process improvement teams at the NAVAIR SSAs
for AV-8B, P-3C, and E-2C, especially Jeff Schwalb and Lisa Pracchia; David Webb and the
many other pioneers at Hill Air Force Base; the talented people at Advanced Information Systems
and Quarksoft; Jenna Fleshman and Jason Huibregtse and the team of CGI Federal Inc. Software
Engineering Division; Dr. Fernando Jaimes Pastrana and Rafael Salazar Chávez of Tecnológico
de Monterrey; Oscar Mondragon Campos of the Tecnológico de Monterrey SIE Center, and Pedro
Beltran Ortiz and Edgar Fernández Rodríguez of SILAC Ingeniería de Software; and our friend
and former teammate Noopur Davis of Davis Systems. To you and to the many unnamed others in
these and other organizations who have worked both harder and smarter than us in this area, we
say thank you.
Executive Summary
This report describes the Accelerated Improvement Method (AIM), which implements CMMI
practices rapidly, reliably, and with high performance. AIM combines the Team Software Process
(TSP) and tailored SCAMPI appraisals with elements of Six Sigma and other techniques to
achieve typical project productivity gains of 30% while reducing delivered defect rates by 80%
and conforming to CMMI maturity level 3 practices, nominally within 18 months for small-to-medium-sized organizations. AIM starts up quickly with known costs and proceeds project-by-
project until the entire organization has been transformed. This document provides guidance to
AIM implementers but also provides background information for executives, line managers, and
other affected parties. It is a companion to the Guide for SCAMPI Appraisals: Accelerated
Improvement Method (AIM) [Miluk 2010].
Abstract
This report is a description of and aid for implementing the Accelerated Improvement Method
(AIM), and is a companion to the Guide for SCAMPI Appraisals: Accelerated Improvement
Method (AIM). The intended audience is anyone responsible for implementing CMMI using the
Team Software Process (TSP), Six Sigma, and other methodologies—management sponsors and
champions, line and support management directly affected by such changes, process group leads
and members responsible for implementing such changes, and the team leaders and developers
enacting such new methods in concert and combination with their existing practices. This guide is
not exhaustive; rather it is a starting point on the road to using CMMI and related technologies to
help organizations achieve business objectives using world-class process management techniques.
Structure and How To Use This Document
Section 1 of this report provides a context for and a general description of AIM and its
components. Section 2 describes a series of overlapping role-based “execution threads” for AIM
usage; the roles are those typically found in an organization implementing AIM. Section 3
provides specific guidance to process and development groups on AIM implementation issues that
they are likely to face. Four appendices are included. Appendix A contains a series of process
flow diagrams tying together specific roles and activities for certain process elements of TSP+,
the main implementation component of AIM. Appendix B contains a few important process
elements in TSP+. Appendix C contains sample templates from the GQ(I)M paradigm that is
central to Goal Driven Measurement, as described in a course taught at the Software Engineering
Institute [1], and in several SEI publications [Park 1996, Goethert 2004]. Appendix D is a series of
Process Improvement Proposals (PIPs) that were used to identify gaps in CMMI coverage in
previous versions of TSP.
The recommended reading by audience is as follows:
• for management sponsors and champions, all of Section 1 and at least Sections 2.1, 2.2, and
2.6
• for line and support management, all of Sections 1 and 2
• for most process group leads and members, the entire report excluding Appendix D, with
special attention to Section 2 and parts of Section 3 and Appendices A, B, and C as needed,
in addition to the Guide for SCAMPI Appraisals: Accelerated Improvement Method (AIM)
[Miluk 2010]
• for development team leaders and members, Sections 1 and 2, plus specific parts of Section 3
and Appendices A, B, and C as needed
• in addition to the above, for process groups and development teams with prior TSP
experience, Appendix D.
[1] Implementing Goal-Driven Measurement, http://www.sei.cmu.edu/training/p06.cfm.
1 Introduction to the Accelerated Improvement Method (AIM)
This report describes the Accelerated Improvement Method (AIM), a rapid deployment of high-
performance CMMI practices—the Team Software Process (TSP), tailored SCAMPI appraisals, and
elements of Six Sigma—as the core technologies of an approach that addresses maturity levels 2 and
3, and provides significant support for the higher maturity levels. This approach builds upon field
experience by SEI staff, client organizations, and others that have recognized the potential in using
these technologies together. CMMI provides an organizational “what-to-do” viewpoint, while the
other practices bring organizational, project team, and individual “how-to-do-it” viewpoints as well as
critical feedback, with the following features.
• The approach allows more rapid implementation than CMMI norms, hence “Accelerated
Improvement Method,” or “AIM.” The nominal AIM timeframe for a small-to-medium-sized
development organization to achieve CMMI maturity level 3 is 18 months, or less than half the
time normally attributed to the IDEAL-based improvement approach.
• The AIM approach achieves excellent results in terms of measurable project performance
improvements, beginning with the first project. Predictable schedule and costs with a 30%
improvement in productivity and 80% fewer delivered defects are common results. Initial pilot
projects can begin within weeks of a decision to proceed.
• CMMI implementation proceeds on a project-by-project basis, rather than using a maturity level
(ML) or process area (PA) approach, although certain groupings of PAs are naturally addressed
by AIM. The project-by-project approach assures constrained, identifiable, and incremental
costs, with measurable results that justify those costs.
• AIM provides a path to sustain and improve upon excellent—and in some cases, world-class—
results, while building the internal capability to support the new way of working.
AIM thus addresses the ongoing debate in the CMMI community of performance vs. compliance.
Does an organization implementing CMMI target the achievement of a particular CMMI maturity
level? Or does the organization instead use CMMI as a guide for improving performance in terms of
critical business measures such as cost, schedule, and quality? AIM recognizes that framing the
debate as an either-or proposition creates a false, perhaps even dangerous, choice, and instead affirms
that an organization can and must do both to gain maximum advantage from any such technology
investment.
However one chooses to frame this debate, and whatever method one might employ, CMMI
implementations generally face multiple related issues that influence the choices that organizations
make when implementing CMMI. How much will this improvement effort cost? How long will it
take? How much better will the organization’s performance be once a certain CMMI maturity level is
achieved? What is this change in performance worth to the business? What must happen to sustain
these changes once they are made? AIM provides answers to these questions based on customer
experience, not on academic projections.
Many CMMI improvement efforts rely to some extent on the IDEAL change model [McFeeley 1996].
IDEAL dates back at least to the early 1990s when it was associated with implementing the original
CMM for Software, or SW-CMM [Paulk 1994]. Newcomers to CMMI-based improvement inevitably
learn IDEAL principles since it is the one change management approach mentioned in the standard
SEI Introduction to CMMI training. But this also implies that newcomers almost inevitably accept the
historical assumptions and limitations built into many early IDEAL implementations, especially the
inherited staged-model mindset of the SW-CMM.
Assuming that SEI data is reflective of early improvement approaches, the semi-annual CMMI
Maturity Profile exposes some of those limitations. One such historical limitation is that CMMI
(staged) implementation almost necessarily proceeds maturity level by maturity level, and associated
with that procession is an average of one to two years elapsed time per level [SEI 2010]. The
improvements over the earlier Software Maturity Profile based on the SW-CMM, most notably a reduction in the time from ML1 to ML2 (from a 19-month average over 175 instances to a 4.5-month average over just 6 instances), are probably due to the fact that most companies starting on this path no longer bother to pay for an appraisal that they expect to show ML1, as evidenced by the very small number of “official” ML1-to-ML2 transitions.
Paradoxically some of IDEAL’s limitations are due to the fact that it is a generic model for
improvement and change. IDEAL makes few (if any) assumptions either about where an organization
begins or about what methods it uses. As such, even the best of implementations can require a lot of
time to gather data, survey a situation, set goals, and build a plan of attack. Lacking specific guidance
on methods, or when the “one-to-two-years-per-level” mindset colors planning assumptions, an
improvement team such as a process group (PG) can consume many months acquiring or developing
this information. In contrast, the AIM approach uses known, trainable methods with known
performance characteristics, which seem to work in a wide variety of development environments.
AIM is composed of five key elements that work together to accomplish its goals:
• a rapid deployment strategy
• CMMI
• the Team Software Process (TSP)
• tailored SCAMPI appraisals
• Six Sigma techniques
These five elements blend together to produce superior results, beginning from the first pilot project through deployment across the organization.
1.2.1 Rapid Deployment
1. Use proven methods with known performance characteristics (TSP, CMMI, SCAMPI, and Six
Sigma).
2. Characterize current performance and gather information on existing process assets.
3. Train developers and their direct managers quickly in TSP methods on a project-by-project basis.
In parallel, train PG and other involved personnel in TSP and CMMI as appropriate.
4. Identify pilot projects and launch each project as a TSP team as soon as training for the team is
complete. One of the early teams trained and launched this way, although preferably not the first,
is the organization’s process group (PG).
5. Gather data on both performance and conformance (the latter through a tailored SCAMPI B
appraisal), and make adjustments as each new project is trained and launched.
6. Use data and experience from early pilots to plan and implement broader adoption within the
organization. Introduce Six Sigma training and techniques as appropriate.
7. If necessary, confirm CMMI compliance with a SCAMPI A appraisal.
Even the smallest organizations can take advantage of early projects to expose potential internal
experts to the various technologies. Building internal capability is crucial to the long-term viability of
these techniques in any organization.
1.2.2 CMMI
CMMI, including its predecessor CMM for Software, is probably the most widely used family of
improvement models for the development of software and software-intensive systems. As a model of
best practices, CMMI describes characteristics of such practices, and though examples of
implementation techniques are numerous, by design the model itself avoids any recommendations of
how one should design and implement such practices. The intent of AIM is to provide just such a set
of recommendations, though not an exclusive set.
For example, CMMI does not specify an appraisal method, stating only that one should “appraise the
organization’s processes periodically and as needed to maintain an understanding of their strengths
and weaknesses” (OPF-SP1.2 CMMI-DEV V1.2). SCAMPI appraisals are most commonly used for
formal appraisal in this regard. Many others—some adapted from SCAMPI or other techniques, some
home-grown—are used both formally and informally, and often in conjunction with SCAMPIs of one
flavor or another, in order to implement the intent of the model. AIM specifies the formal use of
tailored SCAMPI appraisals in several modes, but other techniques can and should be used to
supplement such activities.
AIM’s scope with respect to CMMI is limited and specific: CMMI-DEV V.1.2 maturity level 3,
excluding Supplier Agreement Management (SAM). However, this scope does not imply that one
cannot implement SAM or higher maturity processes within AIM, only that AIM practices make no
specific claim that they implement any of that PA’s practices.
This also does not imply that AIM might somehow become obsolete with the next version of CMMI-
DEV, or even that, for example, one should avoid attempts to implement CMMI-SVC practices using
AIM. As of this writing, CMMI V.1.3 is in its final stages of development, with minimal changes at
maturity levels 2 and 3. Furthermore, the “constellation” architecture of the CMMI models specifies
16 of the 22 PAs in CMMI-DEV as “core PAs” that are common to CMMI-DEV, CMMI-SVC
(CMMI for Services), and CMMI-ACQ (CMMI for Acquisition), 12 of those at ML3. All 12 core PAs
are within the scope of AIM, as well as the structure provided by all 12 generic practices at capability
levels 2 and 3 that are common to all PAs, perhaps making AIM a good starting point to address not
only CMMI-DEV (the current focus) but CMMI-SVC and CMMI-ACQ as well.
1.2.3 TSP
As implied by its name, the Team Software Process component of AIM deals directly with the most
generic units of implementation—project teams and their members. In some sense, the most
fundamental objective of CMMI is to improve the performance of project teams by changing the
behavior of those teams’ individual members. Beginning with training in the Personal Software
Process (PSP) and carrying over into actual development projects, TSP team members implement a
task-oriented framework of processes, with management and measurement mechanisms that provide
ongoing performance feedback to both the individual and the team. This approach enables rapid and
profound changes in behavior, as reflected by measured performance improvements.
PSP and TSP were designed to guide not only software development, but any structured intellectual
activity. For example, TSP has been adapted for use in the development of video games, for which a
typical team includes game designers, graphic designers, writers, artists, and musicians, who together outnumber the software developers [Bala 2007]. PSP and TSP principles have also been used to create
most of the training, presentation, and print materials used to spread TSP knowledge. As of this
writing, TSP pilot projects exist for services applications and for development teams using SEI
architecture methods. All of these are in addition to the expected uses for TSP in more traditional
software development venues such as banking and finance, industrial and embedded control systems,
shrink-wrapped software, and IT applications.
TSP has been in use for over 10 years with excellent results [Sasao 2010, Nichols 2009, Wall 2007,
Davis 2004]. Capers Jones recently identified TSP/PSP as one of the top development methods in use today across small, medium, and large systems; in fact, it is the only method that ranked either first or second in all project size categories [Jones 2009]. TSP’s built-in performance measurement
framework is used continuously to verify implementation and to identify opportunities for further
improvement.
1.2.4 SCAMPI
As mentioned above, SCAMPI appraisals are a key component of AIM implementation. If necessary,
an early SCAMPI B/C appraisal may be used to survey organizational practices. At intermediate
stages, SCAMPI B/C is used to verify implementation of other AIM practices, to identify both
potential weaknesses to be corrected and unique adaptations that have the potential to improve the
common practice, and also to verify that, where AIM and organizational practices have combined,
those combinations are CMMI-compliant.
To verify the attainment of a particular CMMI maturity level, the only official option is the SCAMPI
A appraisal led by an SEI-certified Lead Appraiser. The Guide for SCAMPI Appraisals: Accelerated
Improvement Method (AIM) provides a practice-by-practice breakdown of expected AIM artifacts
through CMMI maturity level 3 [Miluk 2010]. These artifacts are the expected results of TSP
implementation, without consideration of any other valuable practices that might exist on a team or in
an organization. Obviously, these expectations should be adjusted based on an organization’s
particular implementation, as well as the results of any intermediate SCAMPI B/C appraisals.
1.2.5 Six Sigma
Six Sigma methods have a documented history of use in CMMI-based improvement, especially for the
attainment of higher maturity levels [EB 2008, Habib 2008, Siviy 2008, Siviy 2005]. However, the
use of these methods in AIM is not to address CMMI maturity levels 4 and 5, but rather to provide a
framework for evaluating the often voluminous data provided by TSP teams, and to then take action
on the opportunities presented by that data.
The Six Sigma concentration on the Voice of the Customer (VoC) is usually the most immediately
useful method within the AIM framework. In addition to the obvious use in Requirements
Development (RD) and Validation (VAL) within CMMI, VoC is a valuable tool for eliciting
stakeholder needs in preparation for a TSP launch. Beyond that, Six Sigma deployment typically
proceeds after a significant number of teams have completed cycles and projects in an organization. Both DMAIC (Define, Measure, Analyze, Improve, Control) and DFSS (Design for Six Sigma) methods are usable within AIM. The general recommendation is to begin with DMAIC
unless an experienced Six Sigma Black Belt is available to lead DFSS efforts.
While the five major AIM components must work together and be balanced properly for any particular
organization, it is fair to say that TSP implementation is the key to AIM effectiveness. By themselves,
TSP practices address a large majority of CMMI specific practices. In addition, there are five
significant distinguishing features of the TSP which, while consistent with CMMI practices, go
significantly beyond the requirements of the model. The total effect of these practices is to approach
the CMMI ideal of the right functionality delivered defect-free, on time, and on budget. These TSP
features are:
• a foundation of individual skills built through Personal Software Process (PSP) training
• a comprehensive measurement framework
• self-directed teams that plan and negotiate their own work, anchored by the TSP launch
• a designated, independent team coach
• an explicit focus on managing quality throughout the life cycle
These features reflect the view that software-intensive systems are produced by knowledge work. As
Watts Humphrey observed, the primary rule for managing knowledge work is that managers can’t
manage it; the workers must manage themselves. In order to manage their own work, software-
intensive development teams must be motivated properly, make their own plans, negotiate their own
commitments, track these plans and commitments, and manage their own quality. These five features
of TSP are essential for its use by and integration into AIM. Therefore, it is essential that those
implementing AIM understand something of these features.
It is fair to say that TSP, the Team Software Process, would not exist without PSP, the Personal
Software Process. PSP development preceded that of TSP by several years [Humphrey 2000]. But
more than that, both in philosophy and in operational detail, PSP training and practices are
foundational to TSP.
PSP, as the full name implies, teaches process skills at the individual level. PSP is taught as a series of progressively more comprehensive operational development processes, beginning with a “plain vanilla” framework usable by any competent software developer, with basic measurements of time and defects. Developers write a series of small programs, gathering data along the way on their own performance, adding size measurement and estimation practices, test and design documentation, personal reviews, and personal planning (a partial list). Each of the PSP practices is used, relied upon, or built upon in the TSP.
The TSP measurement framework is the first obvious way in which TSP builds upon PSP skills.
Strictly speaking, TSP measurement adds no new data to the set specified by PSP (time on task,
defects, product size, and task completion date). However, dates are not used in any substantive way
in the PSP training, whereas, of course, they are critical to most real-world projects.
These basic data requirements pre-date the PSP, finding their source in the early days of SEI
measurement research [Carleton 1992]. Unique to TSP, these data originate with individual
developers in real-time. The data then aggregate at the team level, and are used at least weekly for the
purposes of project and quality management, both by the project team collectively and by the team
members individually. Thus, the measurement framework provides enormous leverage to project
teams to manage their own work and their own projects, and to the teams, projects, and organization
as a whole to analyze and improve process performance.
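To make this concrete, the following minimal sketch (in Python) shows how the base measures might be recorded by individuals and rolled up for a team's weekly meeting. The field and function names here are ours for illustration, not those of any TSP support tool.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional
from collections import defaultdict

@dataclass
class LogEntry:
    """One developer's record against one task. The TSP base measures
    are time on task, defects, product size, and task completion date
    (size is tracked per product and omitted here for brevity)."""
    developer: str
    task: str
    phase: str                # e.g., "design", "code review", "unit test"
    minutes: int              # time on task
    defects_removed: int
    completed_on: Optional[date] = None   # set when the task finishes

def weekly_team_rollup(entries):
    """Aggregate individual data to the team level, as a team might
    review it in its weekly meeting."""
    totals = defaultdict(lambda: {"minutes": 0, "defects": 0})
    for e in entries:
        totals[e.phase]["minutes"] += e.minutes
        totals[e.phase]["defects"] += e.defects_removed
    return dict(totals)
```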
The questions about what data to report to what layer of management, and how often to report, are, of
course, questions that must be answered by any CMMI effort. In TSP, however, this becomes more a
question of how to store, select, and summarize data, rather than the more usual case of having to
figure out what data to collect and how to collect it in the first place. Another SEI technology, Goal-
Driven Measurement (GDM) [Park 1996], which is also a descendant of the original SEI measurement
research, may be useful within this context as well, by providing documented coverage of CMMI
Measurement and Analysis (MA) practices [Goethert 2004] and a convenient way to summarize TSP
measurement usage within an organization (see Appendix C).
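As an illustration of the GQ(I)M idea, a goal-question-indicator derivation might be captured as below; the goal, questions, and indicator are invented for this sketch, and Appendix C contains the actual templates.

```python
# A hypothetical goal-question-indicator chain in the GQ(I)M spirit:
# a measurement goal is refined into questions, and each question into
# indicators computable from data the TSP teams already collect.
gqim_example = {
    "goal": "Understand whether delivered quality is improving release over release",
    "questions": [
        "How many defects are found in system test per release?",
        "How large is each release?",
    ],
    "indicator": "system-test defects per KLOC, trended by release",
    "data_required": ["defect logs with phase found", "product size (KLOC)"],
}
```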
TSP teams are self-directed, but this does not mean that there is no identified leader of the team. The
TSP team leader is designated by and responsible to the management chain, just as in more traditional
authority structures. However, TSP is based on the idea that because software is knowledge work,
such knowledge work can be managed effectively only by the people who are doing the work. The
structure of the TSP team thus is not the traditional one with a team leader exerting a command-and-
control management style, but rather one that recommends a more egalitarian approach with the team
leader as a first-among-equals directing and coordinating the team.
Self-directed TSP teams rely on their PSP skills and the TSP measurement framework to plan and
track their own work, usually more effectively than with “traditional” team leadership. The most
visible vehicle for planning is the TSP launch, an intense series of meetings lasting from one to five
days, depending on the size and number of team members and project teams involved, as well as on
the scope and duration of the project.
Figure 1: TSP Launch Meetings (Source: [Humphrey 2010])
During a launch, a TSP team chooses team roles, defines its own objectives for achieving management goals, identifies work products, defines a technical strategy, selects or defines a development process, and builds both an overall plan for the project and a detailed near-term plan (down to the level of several weekly tasks for the individuals on the team). The TSP team also identifies and evaluates risks to the plan, prepares a briefing report for management, and then presents the plan to management for negotiation and approval. After the launch, the team generates weekly reports of the data that they collect using the measurement framework (both individually and aggregated as a team), and uses the data to track progress and risks and to identify and respond to issues proactively, including replanning as necessary. All of these activities are recognizable practices within various CMMI Project Management PAs.
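One common TSP progress measure is earned value, sketched minimally below; the tasks and hours are hypothetical, and actual TSP tools compute this automatically from the team's plan.

```python
# Minimal earned-value sketch: a task earns its planned value only
# when complete, so the reported percentage is conservative.
tasks = [  # (task, planned_hours, complete?)
    ("parse configuration", 10, True),
    ("data access layer",   25, True),
    ("report engine",       40, False),
    ("unit test harness",   15, False),
]
planned_total = sum(hours for _, hours, _ in tasks)
earned = sum(hours for _, hours, done in tasks if done)
print(f"earned value: {earned / planned_total:.0%} of plan complete")  # 39%
```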
While effective TSP team leaders are often effective coaches to their teams, the TSP defines a separate
role for a designated coach. The TSP coach is critical to the success of the self-directed team in the launch, periodic relaunches, and execution of the project. The role of the coach is perhaps best explained by analogy to sports teams. A coach typically does not take the field, but rather focuses—especially before and after the game—on the skills and processes used by the team and its members. While it was originally thought that the need for an independent coach would diminish as a team leader gained experience with TSP methods, in practice, launches and relaunches (at a minimum) seem to demand an independent process expert role that cannot be filled effectively by the team leader, especially for larger teams. One notes, however, that an effective team leader seems to be able to function, at least in routine situations, as a coach outside of the launch environment, allowing scarce TSP coaches to extend their circle of influence more widely.
Both CMMI and TSP have their roots in the manufacturing quality movement. The fundamental idea that rework is waste that can be measured and intelligently reduced goes back at least as far as Dr. Walter Shewhart [Shewhart 1980] (originally published in 1931), and carries on through the work of W. Edwards Deming [Deming 1982], Philip Crosby [Crosby 1980], and many others. The idea was used by Watts Humphrey and his colleagues at IBM to address software quality issues; he brought this with him when he founded the SEI’s Software Engineering Process Management (SEPM) program in 1986 and led the development of the original CMM for Software. The formulation of Total Quality Management (TQM) [Deming 1982] and Six Sigma practices [Motorola 2010]—other systems of practice derived from manufacturing quality ideas—was also underway in this same general period.
The systematic identification and elimination of waste underlies the fact that “faster, better, cheaper”
is not so much a choice of alternatives as a recipe for improvement. In other words, while one might
be able to choose “faster” or “better” or “cheaper” at the expense of one or both of the others, there is
almost always a way to choose “better” that is also “faster” and “cheaper.” Significantly in software,
the ways that are at once better, faster, and cheaper can be known and implemented only by those who
actually perform the work in question—that is, by the knowledge workers themselves. This idea is
fundamental to AIM methods and results.
When TSP teams produce overall and detailed plans in a launch, they don’t just plan what to do, they
also plan how well they are going to do it—when they are likely to inject defects and how many, when
they are likely to discover and remove those defects, and how many defects will leak through the final
stages of integration and testing, and ultimately will be delivered to the customer. This planning and
its subsequent execution is not a utopian exercise dedicated to some impossible ideal of perfection, but
rather a hard-nosed business practice that recognizes that the largest variations, and therefore the
largest controllable costs and some of the biggest project risks, are embedded in the issue of quality.
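A toy version of such a quality plan appears below; the injection counts and phase yields are invented for illustration and should not be read as TSP benchmarks.

```python
# Predict delivered defects from planned injection counts and the
# expected yield (fraction of extant defects removed) of each filter.
plan = [
    ("design",        "inject", 40),
    ("design review", "yield",  0.70),
    ("code",          "inject", 100),
    ("code review",   "yield",  0.70),
    ("unit test",     "yield",  0.50),
    ("system test",   "yield",  0.50),
]
remaining = 0.0
for phase, kind, value in plan:
    if kind == "inject":
        remaining += value
    else:
        remaining *= 1.0 - value
    print(f"after {phase:13s}: {remaining:5.1f} defects remain")
# With these assumptions, about 8 of 140 injected defects (~6%) would
# escape to the customer; the plan shows where improving yields pays off.
```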
The theme of quality management is woven throughout the various threads of the CMMI-TSP-Six
Sigma tapestry. In PSP training, every defect found is cataloged and later analyzed for use in building
and updating review checklists. The defect density in test and the amount of time spent finding and
fixing defects in test (i.e., rework) are primary indicators of both product and process quality. This
idea is carried through intact into the TSP measurement framework, and, as mentioned previously, is a
primary focus of self-directed team planning, execution, and coaching throughout the development
life cycle.
When enough data accumulates across projects to define relationships between effort, schedule, and
quality, Six Sigma methods analyze those relationships by looking for opportunities to make process
changes that will improve performance across all projects. These changes can take the form of
eliminating common issues with current process execution (for example, by formalizing the different
perspectives to be covered in a design inspection) or by modeling or piloting the use of a new tool or
method (which can be as trivial as specifying the use of a hitherto unused feature in an existing
development environment, or as substantial as specifying a method for architectural design).
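As a sketch of the kind of cross-project analysis meant here, one might ask whether projects that spend more review time per KLOC find fewer defects in test; all data below are invented for the example.

```python
# Pearson correlation between review effort and test defect density
# across five hypothetical projects.
review_hrs_per_kloc   = [2.0, 3.5, 1.0, 4.0, 2.5]
test_defects_per_kloc = [9.0, 5.0, 14.0, 4.0, 8.0]

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson(review_hrs_per_kloc, test_defects_per_kloc)
print(f"r = {r:.2f}")  # about -0.98: more review time, fewer test defects
```

A real DMAIC effort would of course go well beyond a single correlation, but the point is that the data to support such analyses already exist in the TSP measurement framework.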
Many CMMI implementation efforts attempt to move an entire organization up the maturity ladder
level by level, or perhaps PA by PA in slightly more enlightened instances. Part of the rationale for
this “lock-step” approach is that moving one part of an organization very far above another can result
in organizational dysfunction, for example, by imposing different operational requirements upon otherwise similar projects, or by having closely cooperating projects working in dissimilar ways. The
TSP approach, as depicted in Figure 2, is to move entire projects to new, more efficient behaviors
representing significantly higher maturity levels very rapidly.
Figure 2: The TSP Rollout Cycle (select the team(s); train managers and developers; launch and coach; refine and evaluate the approach; repeat)
The experience of TSP projects in organizations at all maturity levels is that the potential issues
related to this approach are easily managed, for four reasons. First, pilot projects by definition are
trying something new in the organization, and clear communications concerning what is being
attempted and why typically result in a willingness to accommodate necessary changes, at least on a
trial basis. Second, because TSP and now AIM offer an operational system of practices, the critical
internal interfaces required for smooth project operation are well-defined. A third and related reason is
that TSP mechanisms are also defined for multiple-team projects, so that dealing with interface issues
in such cases does not require case-by-case definition. Finally, because TSP is a defined and measured
operational system, each project serves as an example of model behavior, not to mention potentially
exemplary results.
The project team focus is the key factor enabling the rapid deployment that is a hallmark of AIM. A
significant majority of CMMI practices are project-level practices. In AIM, many of the practices that
apply to organizational activities are implemented by the PG team. Thus, by training and launching
team-by-team as quickly as training resources and project cycles permit, and by including the PG as
one of those teams, maximum diffusion of AIM methods is achieved in minimum time.
2 Using AIM
Many who wish to use the AIM technologies only want to know one thing—what to do. This section
describes six overlapping execution threads that generally begin in the order presented, then run in
parallel with one another over the course of months and years as an organization adopts these methods
and adapts them for use in its unique circumstances. Although each thread is described separately
here, various threads constantly interact during the course of AIM introduction and ongoing execution,
as will become apparent in the following descriptions.
2.1 Securing and Maintaining Executive Sponsorship
One way in which AIM is very like other improvement initiatives is in its need for executive
sponsorship. Sponsorship demonstrates explicit interest on the part of senior management to create
and sustain an organization that exhibits the high-performance characteristics of TSP teams using
methods that will be recognized as CMMI-compatible by a competent SCAMPI appraisal team. For
AIM, serious interest is signaled by attendance at one-day seminars such as Implementing CMMI for
High Performance [2] or the TSP Executive Seminar [3].
Each seminar conveys some of the information in the present document as it builds the case for
implementing CMMI according to these recommendations. When presented at a customer site, a
seminar is often coupled with a half-day planning session that begins the process of formulating
specific, measurable goals for the AIM efforts, identifying any relevant baseline information in the
organization, and identifying likely candidates for pilot projects and the PG.
However, active participation in such a seminar or funding a proposal to implement AIM is ultimately
not enough to demonstrate management’s commitment to these methods. It is their ongoing
involvement in these efforts that ultimately signals to the organization, “This is how we do things
now.” Sustaining sponsorship can be seen in at least two ways: thoughtful policy statements (both
official and informal), and regular reviews. It is no accident that each of these are reflected in CMMI
generic practices GP 2.1 “Establish an Organizational Policy” and GP 2.10 “Review Status with
Higher Level Management,” which apply to every process area in the CMMI.
The AIM recommendation is to start with a very broad policy statement from management that
indicates the general direction in which the organization is heading over the next several months and
years with respect to disciplined practices in the areas of the CMMI process categories—project
management, process management, engineering, and support activities. As the pilot projects and PG
work begin to instantiate AIM practices within the organization and these results are reviewed with
senior management, more specific policies tailored to the organization’s best results are crafted,
usually by the PG, but always with strong input from project personnel. In other words, the people
most affected by the policies (guiding their use of these methods) will write those policies, while
regular reviews of the results from usage experience will explain and justify the policies for senior
management to approve and then advocate across the organization.
[2] Implementing CMMI for High Performance, an Executive Seminar, http://www.sei.cmu.edu/training/p22b.cfm.
[3] TSP Executive Strategy Seminar, http://www.sei.cmu.edu/training/p22.cfm.
2.2 Characterizing Current and Future Capability and Performance
When formulating specific performance goals for almost any endeavor, the question usually arises: Compared to what? “Increase productivity by 50%,” “Reduce delivered defects by 50%,” and “Decrease cycle time by 20%”—all performance goals—presume that one knows currently how productive a project or organization is, how many defects are currently delivered, and what the cycle
time currently is, respectively. Even “Achieve CMMI Maturity Level 2 by date X” (a process
capability goal) implies that, if measured by a SCAMPI A, an organization is currently at ML 1.
As implied above, both project performance and process capability are valid targets for improvement.
AIM takes the view that these are not just compatible but complementary, at least when performance
is pursued using the known-capable methods recommended in this report. How well the organization
and typical projects perform currently should inform any such goals and also act as a starting point for
realistic improvement efforts. While many software-intensive organizations cannot put a number on
productivity or delivered defects or cycle times, that does not imply that such numbers do not exist; it
may mean simply that they need to be retrieved from their current hiding places.
It may be that performance and capability can be assumed at the start and validated as the effort
progresses. For example, it was a significant advance for the CMMI community to recognize that
there is little value in doing a full SCAMPI A in an organization that is clearly at maturity level 1,
especially when the same starting-point information could be gathered with a less formal, and less
costly, SCAMPI B. When doing SCAMPIs as part of an AIM effort, it often makes sense to delay
even the formality of a SCAMPI B until sometime after pilot projects have started, and then to include in the appraisal projects both inside and outside the current scope of AIM. This contrast provides
clear context for the organization’s pre-AIM capabilities.
Characterizing performance can be more difficult. Postmortems that gather and begin to summarize
and analyze individual project performance are a standard feature of AIM projects, but knowing or
even making an intelligent guess as to where the organization was prior to the AIM effort can be
challenging. Perhaps the development part of the organization doesn’t have its productivity or
delivered defect numbers, but the testing part of the organization is likely to have some relevant
information from which one might be able to derive a useful performance baseline, however narrow
that might be. Basic accounting information such as timecards and actual budget and schedule results
can be used to estimate, for example, the percentage of time and effort typically spent in formal testing
compared to the overall project. Even qualitative instruments such as surveys can be used, especially
before and after pilot projects, to help assess performance improvements from the adoption of AIM
techniques.
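As a hypothetical illustration of the timecard approach (all numbers invented), even coarse charge-code data can yield a usable baseline of effort spent in formal test:

```python
# Derive the share of project effort spent in formal test from
# timecard charge categories.
hours = {
    "requirements": 400,
    "design":       700,
    "code":        1500,
    "formal test": 1100,
    "management":   300,
}
total = sum(hours.values())
print(f"formal test consumed {hours['formal test'] / total:.1%} of effort")  # 27.5%
```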
If an organization desires or requires CMMI validation recognized by the wider world, SCAMPI A is
the only official method. As stated previously, the AIM target is CMMI maturity level 3, but the
implementing organization must understand the implications of preparing for and successfully
executing a SCAMPI A, as well as the limitations of AIM “out of the box.” The most apparent path to
this is a series of SCAMPI B and C activities, both formal and informal, and of varying scope, tailored
to assessing how well the organization has implemented AIM with respect to CMMI, including a
search for problems with current practices and opportunities for improvement.
SCAMPI A requires an SEI-certified SCAMPI Lead Appraiser (LA), and the selection of the LA is an
important step in this process, regardless of whether the organization is using AIM or some other
approach to implementing CMMI-compatible practices. The LA and the organization should be on the
same page from the beginning, including letting the LA know up front how the organization is
approaching CMMI implementation. The LA may or may not lead the SCAMPI B/C activities,
depending on a host of factors including cost and scheduling, but the data gathered and the lessons
learned in these activities usually inform and guide SCAMPI A preparations. The Guide for SCAMPI
Appraisals: Accelerated Improvement Method (AIM) [Miluk 2010] provides CMMI-referenced
information on “standard” AIM practices that should prove useful to Lead Appraisers, appraisal team
members, PG members, and other personnel involved in the effort.
2.3 Identifying, Training, and Launching Pilot Projects
The execution thread of identifying, training, and launching pilots is perhaps the best understood
thread, in that it has been used for over a decade for TSP introduction, and has extensive description
elsewhere [Humphrey 2011]. The key challenge is identifying pilot projects that are both likely to
succeed and also likely to be seen as valid examples for the rest of the organization to follow. The
number, kind, and scope of pilot projects vary according to the organization’s size, project mix, the range of project sizes and types, typical durations, and the type and number of development staff.
Once pilot projects have been identified, they can proceed rapidly. Training for the various roles
involved in development projects is well-defined and can be taught in parallel, although there is a
preference to deliver Leading a Development Team first to make managers and team leaders aware of
the training that their personnel will soon receive.
Table 1: Role-Based Training

Course                           Target Audience                         Duration
Leading a Development Team [4]   Team leaders, line managers             3 days
PSP Fundamentals* [5]            Software developers                     5 days
TSP Team Member Training [6]     All other development team personnel    3 days

*PSP Advanced [7] is highly recommended for software developers once they have had some experience applying the skills and principles taught in PSP Fundamentals.
Project teams should be trained together whenever possible, and project launches should follow
closely after the training. One should note that preparation and the actual launch events typically
involve managers above the first level, often up to the senior executive level, who explain the
organization’s goals and reasons for AIM implementation as well as the goals and particulars of each project being launched. This is an intended overlap between the Executive Sponsorship thread and the Training and Launching threads. Regular reviews with sponsoring management must also become a regular AIM feature (and part of the Continuous Improvement thread), at least quarterly; monthly reviews are common, and bi-weekly or even weekly reviews have occurred, especially during pilot projects.
[4] Leading a Development Team, http://www.sei.cmu.edu/training/p17b.cfm.
[5] Personal Software Process (PSP) Fundamentals, http://www.sei.cmu.edu/training/p18b.cfm.
[6] TSP Team Member Training, http://www.sei.cmu.edu/training/p16b.cfm.
[7] Personal Software Process (PSP) Advanced, http://www.sei.cmu.edu/training/p19b.cfm.
The process scripts TOPS and TOPS7 in Appendix B provide the best summary of activities
surrounding the preparation, launch, and execution of AIM development teams.
2.4 Identifying, Training, and Launching the PG
The minimum initial training for the PG is Introduction to CMMI-DEV and either PSP Fundamentals
or TSP Team Member Training. Highly recommended is Implementing Goal-Driven Measurement.
Depending on the size and needs of the organization and the preferences of individual members, some
members of the PG may further develop their CMMI expertise. Some may choose additional
coursework leading to becoming certified to teach the Introduction to CMMI [8] class or even to become a SCAMPI Lead Appraiser or SCAMPI B/C Team Leader, while others may choose to follow the path to become PSP Instructors or TSP Coaches.
Planning and developing the target organization’s internal capability to deliver training and coaching services, at the appropriate scope, is a paramount responsibility of the PG under AIM, and maps directly to the strategic training needs under the CMMI process area Organizational Training (OT), as well as tactical needs identified by projects (see PP SP 2.5 and GP 2.5 for relevant PAs). Typical initial PG goals include addressing the organization’s strategic training needs, including
AIM-related skills. Having some level of internal TSP coaching capability, for example, is a hallmark
of success for most organizations, so either hiring or identifying and developing internal coaches is a
critical activity. Internal resources also help to speed the rate of implementation through the
organization.
Another major PG responsibility is the establishment and maintenance of organizational process assets
as described in the CMMI process area Organizational Process Definition (OPD). Obviously, the AIM
process assets become part of this definition; however, even the most chaotic of organizations
typically have some core processes that can rightly be identified as good practices that should be
preserved and built upon. In fact, the AIM approach to the Engineering process category (see below in
Section 3.11) specifies that the organization’s existing development practices are treated as the
starting point in those areas. The initial pilot projects in any organization will typically identify and
perhaps formalize these practices, which should in turn simplify the PG’s task of capturing these
practices and related execution data for use by other projects.
[8] Introduction to CMMI for Development v1.3, http://www.sei.cmu.edu/training/p91.cfm.
The final major responsibility for the PG relates to the CMMI process area Organizational Process
Focus (OPF), which deals with identifying strengths, weaknesses, and improvement opportunities for
the organization’s process capabilities, planning and implementing process improvements, and
deploying process assets across the organization, while incorporating the lessons learned by the
projects. In essence, this puts the PG in the position of coordinating all of the threads of AIM
execution on an ongoing basis. A general description of PG startup and ongoing execution can be
found in process scripts POPS and POPS7 in Appendix B.
2.5 Evaluating Pilot Projects and Planning Further Rollout
Certainly after the presumed successful use of AIM methods on pilot projects, and sometimes even
earlier (especially on larger projects), an organization may decide to adopt AIM across the
organization, or to proceed to a second round of pilot projects reflecting a broader range of projects in
the organization. In the smallest organizations, broad adoption may be an accomplished fact because
the entire organization has already been trained and is using the methods, or it may simply be a matter
of training the rest of the personnel and launching one or two more teams. Larger organizations face the problems of sequencing ongoing training requirements and of deciding when and how to launch the remaining teams with TSP techniques. And all organizations have the ongoing issue of providing the skilled TSP and CMMI expertise necessary to maintain the AIM implementation and eventually to broaden the scope of data analysis, possibly with Six Sigma techniques.
All AIM projects, and even cycles within those projects as depicted in Figure 3, end with postmortem
(PM) events. These postmortem events characterize project performance for use by the team doing the next cycle or project, and feed into the organization’s long-term memory for use by future teams and possibly by the PG or another group for further analysis. A general description of a TSP cycle
can be found in process script CYCLE in Appendix B. While fulfilling particular CMMI practices
(notably GP 3.2 for many PAs), an important part of the PM is gathering PIPs (Process Improvement
Proposals) that reflect problems and observations made by development teams in the course of doing
their work. These can range from complaints on defect tracking that can lead to a simple standard for
using an existing field in an existing tool, to a suggestion to develop an entire new process discipline
such as formal architecture methods in order to avoid entire sets of problems encountered by large-
scoped projects.
Figure 3: The TSP Development Cycle Structure
Planning and executing a broad rollout of AIM methods, especially in a larger organization, can fall either to the existing PG or to a related team of outside resources that have the capacity, for example, to train a large number of personnel quickly and then to launch those people as TSP teams, using process assets that may or may not have been updated by pilot project teams. Even in the case of outside resources, the PG should be involved with every step of the process, if only because they will have to live with the results. The PG can also warn of potential pitfalls that only organization insiders would recognize, and help to navigate around them. In the best case, the PG will be heavily involved in training and launching new teams, especially in preparing and advising them on the use of the organization’s standard process assets.
2.6 Building a Culture of Excellence and Continuous Improvement
Ideally, there comes a time when the entire existing organization has been trained, mechanisms have been implemented to train new developers and managers, projects new and old contribute to the organization’s process assets, the organization has been officially appraised at CMMI ML3, and management talks the talk and walks the walk. While this is a significant achievement, AIM can and should be used to do more. If an organization says “we’re good enough” then, in some sense, a business opportunity may be passed up, and a potentially significant risk may be forming.
The business opportunity is in the potential of the organization to go far beyond its previous performance, based on detailed quantitative knowledge of its own capabilities. This could mean using Six Sigma methods to achieve a high CMMI maturity level, or it could mean simply using those methods to focus narrowly on the aspects of project execution that matter most to the business. Is there opportunity to “lean” the process to achieve significantly faster throughput, and therefore, significantly higher productivity? Is there an opportunity to reduce industry-best defect densities to
levels that are truly world-class, and thereby differentiate a product in an extraordinary way? Only an
organization with hard data that relates their processes to their performances can answer these
questions adequately, and then execute based on the answers.
On the opposite side of the business opportunity is the risk. If an organization decides not to pursue
improvement and wants current performance “merely” to be sustained, what signal does that send to
employees and customers alike? In a dynamic and competitive world, is it even possible to maintain
performance when new competitors, new technologies, and new circumstances constantly redefine
where the bright line of excellence is drawn? The AIM philosophy is that continuous improvement is
not an option, because the world in which we live moves so quickly that one must move quickly ahead
or be left behind.
Building a culture of excellence is the most difficult and also the most important long-term execution
thread in AIM. Consider what can happen in its absence: A key executive leaves, the replacement
focuses on short-term gains, a middle manager nearing retirement decides to go with the flow, a team
leader loses focus because her boss doesn’t seem to care, her team loses focus because its leader is
distracted. If people revert to even some of their former behaviors, much of the performance
improvement and business benefits of implementing AIM can be quickly lost. It is the job of all
parties—sponsoring executives, middle and line managers, team leaders, the PG, the developers—not
just to do their jobs, but also to demand excellence of themselves and their co-workers. This job
demands, in turn, not just executing the existing AIM methods, but adapting and improving the
methods for better performance on the next project as compared to the current one. CMMI can provide
a framework, TSP can provide implementation excellence, SCAMPI can verify compliance, Six
Sigma methods can help with analysis of the data and the formulation of new goals and process
changes to achieve them—but the motivation to use these methods to find new potential
improvements in cycle time or delivered defects or project predictability beyond what was achieved
just last year comes from the people using the methods.
3 Specific Guidance for AIM Implementers
This section provides specific information for those charged with implementing AIM practices. Like
the blind men describing the various parts of an elephant, in some cases there is a strong conceptual
thread that is obviously carried from one topic to another, while in others the only common thread is
that the description is of another aspect of AIM. The following topics need not be implemented or
even read in the given order; however, the authors recommend reading them all through once and then
taking action on the ones that apply, considering specific information from the appendices, and from
other relevant sources. This is generally the work of the PG in any organization, possibly with the use
of expert resources from within or outside of the organization.
The Guide for SCAMPI Appraisals: Accelerated Improvement Method (AIM) [Miluk 2010] is a
necessary companion document. AIM exists because of the many documented instances of TSP being
used to implement CMMI (and previously, SW-CMM) practices, most often in concert with other
“known good” practices like Six Sigma or Goal Driven Measurement or locally grown procedures
[McHale 2004, Saint-Armand 2007, Seshagiri 2009]. The basic combination of a proven starting point
of operational practices and a deliberately general set of best-practice descriptions allows broad
choices for implementers. Having the clear, documented links between the two provides both an
introduction to the main implementation vehicle in CMMI terms, and a “worked example” of high-
performance CMMI practices and artifacts.
It is important to work with certified experts in PSP, TSP, CMMI and SCAMPI, and Six Sigma when
implementing AIM. While the ultimate goal of AIM is for an organization to achieve self-sufficiency,
every success story that the authors can point to involves the proper training, coaching, consulting,
and other expertise in preparing, coaching, executing, and evaluating the recommended practices that
make up AIM. In other words, while there are known cases where the path to expertise and excellence
has been shortened, there are no examples where the path can be short-cut. Many of the failures lack
one or more of these apparently critical requirements for success, and all of them rely on the use of
qualified experts.
Perhaps, as time passes, a target organization will customize and adapt AIM practices until what
began as "AIM" is almost unrecognizable, and the organization's own people will have become the
experts. However, that progression seems somewhat risky, if not downright unlikely,
if the starting point is very far from the basics of the component technologies. The path described in
this report involves both a known start and a known end. One of the basic tenets of TSP applies here,
namely, that the fastest and cheapest way to implement is usually to do it right in the first place.
AIM identifies three common use cases for combining CMMI and TSP technologies.
1. CMMI has some significant implementation in the organization, and TSP is being introduced.
2. TSP has some significant implementation in the organization, and CMMI is being introduced
more explicitly.
3. The organization has no significant implementation of or experience with either CMMI or TSP,
and therefore, both must be introduced.
The expectation is that it should be obvious to any organization that decides to use AIM which use
case (UC) applies in the current instance. AIM is designed generally for the third use case, an
organization without CMMI or TSP experience, with the first two UCs having already implemented
perhaps the only workable “shortcut” to the desired end.
Each use case presents its own unique profile of problems and opportunities, and in fact the three
mentioned above represent an entire class of usages across a spectrum. For example, the CMMI-extant
organization (UC1) can be at any CMMI maturity level, ML1 to ML5, and the precise details of the
existing practices strongly influence how TSP introduction and adaptation should proceed. TSP-extant
organizations (UC2) may have implemented such discipline across all development projects or just a
few projects, and may or may not have their own TSP coaches on staff. Even UC3 organizations that
have no formal CMMI or TSP programs in place may possess other existing good practices like Six
Sigma or Scrum, and even home-grown methods must be recognized and incorporated into the new
AIM practices. For all use cases, regardless of experience with CMMI and TSP, the size and culture of
the organization and the average and extreme project durations must also be accommodated during
implementation. These are just a few of many relevant factors.
One important activity under any use case based on an existing implementation of either TSP or
CMMI is the verification of existing practices. One may assume from conversation and observation
that, for example, an organization has significant experience and expertise in TSP. However, the
quality of the TSP implementation, which can only be known by a detailed assessment of that
implementation, can make huge differences in the necessary course of action going forward. If TSP
implementation is in some way superficial or incomplete, or simply dates to a much earlier version of
TSP than the current one, planning and execution of AIM may be best served by first “upgrading” to
the current version, with particular emphasis on any uncovered shortcomings in the prior
implementation.
TSP was formally extended for AIM as a result of a gap analysis, conducted at the SEI in 2008-2009
in the mode of a SCAMPI C appraisal. As a result of that analysis, a series of Process Improvement
Proposals (PIPs) was generated to address specific gaps or groups of gaps. For example, the lack of
policy statements in TSP resulted in a gap for GP 2.1 of every PA in the scope of the analysis. The
original observations noting these gaps, as well as revised observations that point to the present
document (for example, for policy guidance), are recorded in the Guide for SCAMPI Appraisals:
Accelerated Improvement Method (AIM) [Miluk 2010]. Copies of the original PIPs are available in
Appendix D.
In many cases, the PIPs were written with reference to the basic process elements that make up TSP:
scripts that describe processes at an expert level, checklists that guide critical procedures or capture
critical information or both, forms that capture process data, guidelines that provide additional
information on specific topics, and specifications that describe things like reports, documents, or roles.
TSP also calls for the creation of local standards such as a coding or design standard, defect
classification standard, or documentation standard that are, in some sense, another kind of
specification. Another process element is the tool set one uses to implement one or more aspects of the
process, preferably in an automated, integrated way; however, such automation usually enhances or
takes the place of some step or steps in a script, or captures data specified for a given form or
according to a given standard. A list of the TSP process assets that were the target of the gap analysis
is available in the Guide for SCAMPI Appraisals: Accelerated Improvement Method (AIM) [Miluk
2010].
The new version of TSP that resulted from these observations and PIPs is popularly called “TSP+,”
with the official designation “TSP+ 2010.09.” This version is available only to SEI Partners for TSP.
The popular name derives from the idea that this version contains everything that was present in
previous released and unreleased versions of TSP (such as TSPm, which can be used to address the
IPPD aspects of the CMMI), plus the added and modified process elements necessary to address
certain practices or groups of practices in the CMMI. In several cases where an existing process element was
modified in accordance with CMMI guidance, the change that resulted also had the effect of
addressing a known problem or issue with TSP implementation.
The general approach to implementing any AIM practice is to first recognize that there may be
existing practices within the organization that are perfectly adequate as they stand, or that may be
partially workable in conjunction with AIM practices. A primary purpose of the pilot projects is to
highlight such existing practices and then to determine how to integrate AIM practices effectively
with them. Furthermore, it is not primarily the AIM experts who make such determinations; rather, it
is the pilot teams themselves, including the PG and guided by the experts, who do.
The bias in AIM implementation is that the working processes belong to the team; therefore, the team
must decide what existing practices should be used within the AIM framework for the management,
measurement, and execution of their work. Such practices may be used as they are, may require
modification for use within AIM, or may actually be replaced by a relevant AIM practice. If a practice
is, in fact, not measuring up, the AIM measurement framework usually demonstrates that fact in a
literal sense. For example, if an inspection process exists and the team wishes to continue to use it
instead of the TSP inspection process, the team plans and tracks such inspections just like any other
piece of work. However, if that inspection process doesn’t work very well in terms of defect yield or
defects found per unit time or some other relevant measure of effectiveness, the team will have
objective data that not only justifies a change, but demands one.
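To make that concrete, the following is a minimal sketch (in Python, which AIM does not prescribe) of the kind of comparison a team might run against its own inspection data; the records, field names, and the 70 percent yield threshold are illustrative assumptions, not AIM requirements.

    # Illustrative only: evaluating an existing inspection process with
    # TSP-style measures. Records and thresholds are hypothetical.
    inspections = [
        # defects found in the inspection, defects that escaped to later
        # phases, and total inspector-hours spent
        {"component": "parser",    "found": 12, "escaped": 3, "hours": 4.0},
        {"component": "scheduler", "found": 5,  "escaped": 5, "hours": 1.5},
    ]

    for insp in inspections:
        total = insp["found"] + insp["escaped"]
        yield_pct = 100.0 * insp["found"] / total if total else 0.0
        rate = insp["found"] / insp["hours"]  # defects found per hour
        # 70% yield is a common TSP rule of thumb, used here only as a
        # talking point, not a pass/fail gate.
        verdict = "revisit process" if yield_pct < 70.0 else "adequate"
        print(f"{insp['component']}: yield={yield_pct:.0f}%, "
              f"{rate:.1f} defects/hour -> {verdict}")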
Very often, a new AIM practice must learn to co-exist with existing practices. A fairly common
example is that of existing project management practices. A medium-to-large-sized organization often
sets up some kind of project management office (PMO) to ensure that adequate, consistent planning
and tracking efforts are applied to every project. AIM teams create and update their plans using the
TSP launch/relaunch processes, and track progress at the weekly team meetings. Someone, usually the
planning manager or team leader, must then provide the right information in the right format in the
right timeframe to the PMO [Chick 2006]. This is usually a matter of selecting the proper subset of
existing data needed by the PMO, since AIM teams typically generate far more detailed data than the
typical PMO requires.
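As a sketch of that "proper subset" idea, the following shows, under assumed field names and invented data (since the PMO's format is organization specific), how detailed TSP task data might be reduced to a typical PMO-level status summary.

    # Hypothetical reduction of detailed TSP task data to a PMO summary.
    tasks = [
        {"name": "design review", "plan_hours": 6.0,  "actual_hours": 7.0, "done": True},
        {"name": "unit test",     "plan_hours": 10.0, "actual_hours": 4.0, "done": False},
        {"name": "integration",   "plan_hours": 8.0,  "actual_hours": 0.0, "done": False},
    ]

    total_plan = sum(t["plan_hours"] for t in tasks)
    # TSP-style earned value: a task earns its planned share only when complete.
    earned_pct = 100.0 * sum(t["plan_hours"] for t in tasks if t["done"]) / total_plan

    pmo_summary = {
        "percent_complete": round(earned_pct, 1),
        "hours_planned": total_plan,
        "hours_actual": sum(t["actual_hours"] for t in tasks),
    }
    print(pmo_summary)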
Many of the remaining topics in this section cover CMMI PAs that should be approached in just this
way: look for overlaps and potential conflicts between existing organizational practices and AIM
practices, and use pilot project experience and expertise to determine how these practices will adapt
and co-exist going forward. Even after full implementation of AIM methods is achieved, this is a good
approach to take when implementing process changes. Attempting across-the-board implementation of
new practices in all but the smallest (one- and two-team) organizations does not usually end well.
Major changes to a project team’s process should be made during the project team’s launch or
relaunch as part of their project execution cycle (see script CYCLE in Appendix B). Forcing
organizational process changes in mid-cycle can be extremely disruptive to a project’s productivity.
Introducing such changes during a team's natural planning phase lets the team anticipate the process
changes and account for them in the commitments it negotiates with management and other relevant
stakeholders.
On the surface, there should be few issues surrounding implementation of the Measurement and
Analysis (MA) PA in CMMI in an AIM-based effort. Measurement is a hallmark of the TSP. Direct
process measurements at the most fundamental level—the individual developer—are introduced and
trained intensively in the PSP and demanded daily on a task-by-task basis of TSP team members, and
certain fundamental analysis of this data occurs weekly on every TSP project. Few existing
organizations have even attempted measurements at this detailed level. Despite this depth, coverage
of the specific practices (SPs) of this PA was initially imperfect, due largely to the lack of
explicitly documented measurement objectives, the subject of MA SP 1.1. (See PIP MA-1 in
Appendix D.)
Fortunately, there are at least two non-exclusive methods available to deal with this issue. First, Goal
Driven Measurement training provides a template that captures the information needed for the Goal-
Question-Indicator-Metric, or GQ(I)M, paradigm [Park 1996]. In fact, the indicator templates capture
much more data than is actually needed for the narrow purpose of meeting MA SP 1.1, as explained in
Applications of the Indicator Template for Measurement and Analysis [Goethert 2004]. The additional
information also provides an organizational guide for many activities of the PG with respect to the
measurement repository (OPD SP 1.4) and the many related specific and generic practices, especially
planning for and monitoring project data (PP SP 1.3 and PMC SP 1.4), and using and contributing to
organizational process assets (IPM SPs 1.2 and 1.6 and the corresponding GPs 3.1 and 3.2, for any
activities planned and tracked by TSP teams).
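To illustrate the kind of information such a template captures, a pared-down sketch follows; the actual template described in [Goethert 2004] captures considerably more, and the example objective and responsibilities here are invented.

    from dataclasses import dataclass, field

    @dataclass
    class IndicatorTemplate:
        """A pared-down sketch of a GQ(I)M indicator template; the real
        template captures considerably more information."""
        objective: str                               # measurement objective (MA SP 1.1)
        question: str                                # question the indicator answers
        indicator: str                               # description of the display
        inputs: list = field(default_factory=list)   # base measures needed
        algorithm: str = ""                          # how inputs become the indicator
        responsibility: str = ""                     # who collects, who reports

    schedule = IndicatorTemplate(
        objective="Deliver releases within 10% of committed dates",
        question="Is earned value tracking the plan week over week?",
        indicator="Cumulative planned-value vs. earned-value line chart",
        inputs=["planned value by week", "earned value by week"],
        algorithm="Plot cumulative PV and EV; flag any gap exceeding 10%",
        responsibility="Planning manager collects; team leader reports",
    )
    print(schedule.objective)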
A second, more narrowly focused method, used with good results on an AIM pilot project [Fleshman
2010], is to address the issue of measurement objectives in a policy statement. This method has the
advantage of making such a policy specific and actionable, while also addressing MA GP 2.1. As
implied above, the two methods can co-exist on any given implementation. This is one of the many
choices for implementation to be made in AIM, usually by the PG.
Finally, the basic TSP data and project-level analysis should be the starting point for MA in an
organization, not the end point. The indicator templates provide a known path to a wider world of
effective organizational use of measurement. This becomes more explicit at higher levels of CMMI
maturity (outside the current scope of AIM); however, almost any thoughtful application of Six
Sigma methods creates actions and artifacts that will show well during a SCAMPI appraisal and, more
importantly, provide value to the organization.
In a broader sense, the way in which AIM addresses MA is indicative of the general AIM philosophy
and implementation strategy. The core of the implementation, proven in the field for over a decade,
begins with PSP training and TSP implementation. The SCAMPI C observations in the Guide for
SCAMPI Appraisals: Accelerated Improvement Method (AIM) [Miluk 2010] show a potential
shortcoming, and a solution is provided based on the existing practices; in this instance, the gap
for MA SP 1.1 is filled by GQ(I)M. By using the CMMI specific and generic practices and the
accompanying observations as a guide, an alternative and perfectly valid solution can be formulated
to address not just this particular gap, but also the MA GP 2.1 gap. The SCAMPI appraisal that
verifies a particular method or methods as compliant with the model does not distinguish between
AIM and locally developed methods.
The subject of organizational policy arises throughout AIM implementation, not only because there is
a generic practice for policy (GP 2.1) applicable to every PA in the CMMI, but also because policy
links management's intentions to the usage of the AIM process assets and the data that such usage
creates. The general AIM recommendation is that the policies related to AIM reflect the stages of AIM
introduction and an evolving understanding of how AIM practices are being used in the organization.
Therefore, policy statements must be revisited with reasonable frequency, probably every six months
or so, during the first few years of AIM usage.
For example, an initial policy could take the form of a memo from the executive sponsor, stating an
intention to pilot AIM usage within the organization, stating the overall goals for AIM
implementation, and naming a responsible manager and/or group. Six months later, when pilot projects
are well underway or finishing, when the PG is readying to launch, and when the organization
understands how to integrate AIM practices with existing project management practices, quality
assurance practices, and parts of an engineering life cycle, related individual policy statements
might be issued that reflect this experience in a more sophisticated set of policies. In another six
months, after one or two cycles of PG effort and a broadened set of development projects under the
revised policies, the PG might draft a small but comprehensive set of five to ten new policies that
match the organization's goals to the project management, process management, engineering
management, and various support activities that represent the new standard of expected behavior.
Organizational policy under AIM should evolve as organizational practice evolves, and the
organization is best served if it develops and adapts such policy internally. AIM therefore offers no
generic policy statements. Rather, as the organization pilots, uses, and adapts AIM methods, it shapes
policy statements to direct the expected “new normal” behaviors.
In an early step in the TSP launch process that AIM teams use to plan their work, the team in question
must decide upon an overall strategy for their work. While there are many available options and many
decisions to be made, the AIM framework requires an initial strategy that is compatible with the
CMMI model, the SCAMPI appraisal method, and the TSP operational practices.
The two most obvious implementation strategies, built into CMMI itself, reflect the staged and
continuous representations of the model. However, both of these approaches are most often attempted
(perhaps after appropriate pilot activities) across an organization. AIM follows a project-by-project
approach, but this should not imply that the structure of the CMMI is ignored. Rather, the PG must
utilize both the structure of the model and the nature of AIM methods, in conjunction with any local
practices, to craft a reasonable strategy.
Inherent in the project-by-project approach of AIM is a strong emphasis on the project management
PAs as a group. TSP covers these reasonably well [McHale 2004, Davis 2002]. Concurrent with this is
a strong emphasis on the activities being planned and tracked, which for the noted references,
corresponds to the engineering PAs, especially at ML3. The largest concentration of “gaps” is
clustered in the ML3 process management PAs (OPF, OPD, OT). The AIM approach of launching the
PG like any other team in the organization exercises the project management strengths again, but
shifts the focus from engineering development to process development, thus addressing many gaps in
an organized way.
The PG models the desired behavior of teams within AIM in several ways. First, the PG is trained,
launched, managed, and coached just like any other development team in the organization. Second,
the nature of PG work forces an adaptation of TSP methods that serves as an exemplar to encourage
other teams to adapt and thus truly own their team processes. A process development process, for
example, is provided as a starting point for any such activities, along with unique additional team roles
that serve the unique purposes of the PG. Some of these are reflected in the Process Definition and
the Organizational and Project Training charts in Appendix A.
Coaching the PG represents a challenge, partly because such a team is unique in an organization, but
also because of the breadth and depth of knowledge needed by the coach. Ideally the coach would be
both a qualified TSP coach and deeply experienced (if not formally qualified) in CMMI-based
improvement and appraisal. However, as of this writing, there are few individuals worldwide who
possess both skill sets. More likely, a partnership between a TSP coach and a CMMI expert will be the
normal minimum expert team. The Guide for SCAMPI Appraisals: Accelerated Improvement Method
(AIM) [Miluk 2010] and the current document are intended to help bridge that particular gap.
The PG coach should recognize that almost nothing the PG does affects only its own work. The
connections from the “target” PAs of the PG—OPF, OPD, and OT—correspond explicitly to generic
practices—GPs 2.5, 3.1 and 3.2—which span potentially all engineering activities of development
teams, not to mention the many related specific practices in the project management PAs. In addition,
some PGs take on parts of one or more of the support PA functions as a matter of efficiency, further
strengthening their connections across projects via the corresponding GPs (see below). Finally, the PG
must recognize that its behavior will be viewed by the rest of the organization as a model for
development teams. It therefore becomes even more important for the PG to function effectively—and
in some sense publicly—as an AIM team, since a valid measure of their success will be in how
effectively the other development teams function in the new environment.
The CMMI support PAs covered by AIM include Measurement and Analysis (MA, addressed above),
Process and Product Quality Assurance (PPQA), Configuration Management (CM), and Decision
Analysis and Resolution (DAR). During the initial gap analysis that formed the starting point for the
Guide for SCAMPI Appraisals: Accelerated Improvement Method (AIM) [Miluk 2010], PIPs were
generated to address gaps found for each of these areas. These PIPs are included in Appendix D as an
aid to organizations executing UC2, namely implementing AIM on an existing base of TSP practices,
since this information should help to highlight changes in their baseline of TSP practices.
For both PPQA and CM, the PIPs reflect the fact that the project-oriented TSP processes basically
acknowledged the existence of and interactions with these PA practices, but did not attempt to directly
implement them. For example, CM is acknowledged by the designation of the TSP Support Manager
role as having principal CM duties for the team, and with direct CM items being planned during the
TSP launch, especially launch meeting 3. Quality assurance activities are spread across several roles
and are distinguished at least partially by process QA (TSP coach role, TSP team leader role, process
and quality manager roles) and product QA (TSP team leader, test manager role, and other roles
depending upon the products in question).
However, when applying SCAMPI-like tests for the existence and adequacy of process artifacts, it
became apparent that some activities that TSP considered optional became, if not required, at least
very strongly suggested in order to leave an artifact trail for a SCAMPI appraisal team. Thus, the
answer to the question of whether TSP team role activities related to these practices should be
included in a team member’s individual workbook changes from “it’s optional” or “it’s up to the
individual” to “yes, this work should be planned and tracked just like any other.” By planning and
tracking these tasks in individual workbooks, an artifact trail for specific and generic practices is
created as the work is done.
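A minimal sketch of what this means in workbook terms follows; the task names, phases, and hours are invented for illustration. Role activities are simply planned entries alongside development tasks, so the workbook itself becomes the artifact trail.

    # Illustrative only: role-manager activities planned and tracked like
    # any other task in an individual workbook.
    workbook = []

    def plan_task(name, phase, plan_hours, role_task=False):
        workbook.append({"name": name, "phase": phase,
                         "plan_hours": plan_hours, "actual_hours": 0.0,
                         "role_task": role_task})

    plan_task("implement parser module", "CODE", 12.0)
    plan_task("weekly CM status check", "SUPPORT", 1.0, role_task=True)
    plan_task("process compliance review", "SUPPORT", 0.5, role_task=True)

    role_hours = sum(t["plan_hours"] for t in workbook if t["role_task"])
    print(f"{role_hours} hours of role work planned, tracked, and auditable")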
Configuration Management (CM) provides the best example of how AIM and existing practices
should co-exist. It is almost unthinkable that any modern software development organization has no
existing configuration management practice. Excellent CM tools are available commercially and
freely in open source, and rudimentary CM is built into many modern development environments.
However, it is almost impossible for any tool to fully implement even one specific practice in any PA,
let alone the seven within CM, without mindful direction from skilled practitioners.
The Configuration Management chart in Appendix A shows the formal AIM solution, a solution that
goes in a different direction than the one specified in PIP CM-1. The AIM solution includes primary
interactions between someone wishing to update or place an artifact under configuration control (most
typically a product owner) and the support manager role, with additional interactions with the design
manager and process manager roles, and the CCB (Configuration Control Board, an entity defined in
the TSP launch if not already in existence). Certain TSP+ process assets referred to in the chart
provide specific guidance and capture necessary information. If a team appropriately and consistently
executes the actions on this chart, along with the other actions specified in the support manager role
description, the TSP launch scripts, and well-composed development scripts, compliance with CM
practices is virtually assured.
However, AIM provides a purely paper-based solution, while real-world CM for almost any project is
going to use one or more of the aforementioned automated tools, and possibly well-established
procedures for their use. The AIM recommendation is not to throw out the existing CM, but rather to
understand the requirements of missing or inadequate CMMI practices and what is provided by the
relevant AIM specifics, to use that understanding along with whatever AIM assets make sense under
the circumstances, and then to devise a solution that adds value for the team and organization while
fully conforming to CMMI. The least-effort solution is that existing CM practices suffice both for the
team’s and organization’s purposes, and for CMMI—perfectly acceptable under AIM. In another case,
some element of CMMI is lacking in the existing implementation, and between the AIM
recommendation and the actual text of the CMMI, some intelligent selection or adaptation of AIM
parts will fit the need and provide benefit to the effort.
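One simple form such an adaptation can take is a periodic reconciliation between the tool's baseline and the team's configuration item log, in the spirit of form LOGCI and script CMAUDIT. The sketch below uses invented item names, with set arithmetic standing in for whatever query the actual tool supports.

    # Hypothetical reconciliation of an existing version-control baseline
    # against the team's configuration item log.
    vcs_baseline = {"srs.doc", "design.md", "parser.c", "test_plan.doc"}
    logci_items  = {"srs.doc", "design.md", "parser.c", "coding_standard.doc"}

    missing_from_log = vcs_baseline - logci_items  # in the tool, not logged
    missing_from_vcs = logci_items - vcs_baseline  # logged, not tool-controlled

    for item in sorted(missing_from_log):
        print(f"add to configuration item log: {item}")
    for item in sorted(missing_from_vcs):
        print(f"place under version control: {item}")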
This is the preferred implementation mode for AIM practices beyond the basics of TSP introduction
and SCAMPI appraisals: understand what already exists, understand the intent of the model and what
is missing or inadequate, understand what AIM can provide, and then formulate and verify a solution
that provides value to the project team and to the organization.
3.8.2 Process and Product Quality Assurance (PPQA) and GP 2.9
The subject of QA (quality assurance) is often a sensitive one, especially with development teams. No
activity in an organization seems more prone to abuse, whether by a mindless check-the-box mentality
or by a punitive, audit-like approach. In fact, a common anxious question in PSP and TSP training is,
“What is management going to do with this data?”
The AIM approach to quality assurance is multi-layered and somewhat diffuse, and with proper
safeguards in place, should co-exist smoothly with existing QA practices and allay any concerns about
management’s use of detailed personal data. Rather than implementing the somewhat obvious
suggestion in PIP QA-1, AIM emphasizes the use of existing roles and processes in a manner fully
consistent with their intended use, to ensure that processes are consistently executed in order to deliver
excellent products.
As mentioned previously, process quality assurance and product quality assurance are addressed
separately but in similar manners. Both begin with the individual team member who executes a
defined, measured process to produce a product to a specified standard. Parts of that process typically
specify personal reviews by the developer, team inspections, and some level of testing by the
developer—product checks. The process manager role monitors compliance with the plan—process
checks.
The team leader, who is ultimately responsible to management for the product, and the TSP coach,
who is ultimately responsible for the process, each have their own overlapping checks. The team
leader runs the weekly data-driven team meetings, sees how the product development is progressing,
and hears reports from the process, quality, and test managers. The TSP coach, following a coaching
plan developed during the launch, sits in on early weekly meetings and occasionally may do so later in
the project, and reviews workbooks on a periodic basis. On request from a role manager or the team
leader, or based on the workbook review, the coach may conduct a TSP checkpoint, with or without a
formal report to management, in order to identify any issues with gathering data or following the
agreed-to processes. The coach may also do specific role-based coaching for the team leader and team
members in the use of a tool or specific processes, or with respect to executing a particular team role.
Significant artifacts for a SCAMPI appraisal include role descriptions for the TSP coach, team leader,
team members, and other relevant roles; workbooks that document and track the coaching plan and the
various team roles; weekly meeting minutes that record the role manager reports and any QA issues;
and the TSP coach’s checkpoint reports.
This, then, is the basic model for QA in AIM: three tiers of overlapping checks, first by the individual
team member against his or her own process and plan, second by the team roles, and finally by the
team leader and TSP coach. Non-compliances can be reported at any level and escalated as necessary;
however, the emphasis and preference is that non-compliances are corrected immediately by the
people involved. If a process cannot be followed consistently, it is changed. If a tool is being used
improperly, remedial instruction is provided immediately. If a team is generally unable to follow its
plan, usually this is because the plan is outdated or based on false assumptions, in which case a replan
or relaunch is necessary.
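A sketch of what the second-tier process check might look like in data terms follows; the defined phases and workbook contents are invented for illustration. The process manager compares each component's executed phases against the team's defined process and flags omissions for immediate correction.

    # Illustrative process-fidelity check of the kind a process manager or
    # TSP coach (during a checkpoint) might run against workbook data.
    DEFINED_PHASES = ["design", "design review", "code", "code review", "unit test"]

    components = {
        "parser":    ["design", "design review", "code", "code review", "unit test"],
        "scheduler": ["design", "code", "unit test"],  # reviews were skipped
    }

    for name, executed in components.items():
        skipped = [p for p in DEFINED_PHASES if p not in executed]
        if skipped:
            # Either a non-compliance to correct now, or evidence that the
            # defined process itself needs to change.
            print(f"{name}: skipped {', '.join(skipped)}")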
This approach is also consistent with a more traditional approach to quality assurance. The typical
AIM team creates detailed records of their activities as a matter of course and on a daily basis, and
early organizations implementing CMMI (and before it, SW-CMM) certainly took this traditional
approach utilizing existing QA personnel and procedures [Wall 2007]. However, this did not change
or diminish the approach described above. AIM teams take responsibility for the quality of what they
deliver, and recognize explicitly that such quality relates directly to the processes that they use, and to
the fidelity and discipline with which they execute those processes.
Decision Analysis and Resolution (DAR) is a somewhat odd process area. Unlike the other support
PAs, it has no obvious relation to a generic practice, and it lives alone in the support category at ML3. Yet in some
ways, it is the most useful and ubiquitous PA of all. Consider that an organization just starting out on a
process improvement effort must, rather obviously, decide what improvement model or models to use
as a guide, and what methods to use to implement those models. This seems like an important enough
decision to establish evaluation criteria and methods, identify and evaluate alternatives, and then select
from among those alternatives how to proceed. Most of DAR can be executed before the effort
formally begins.
In fact, this pattern repeats over and over again in a process improvement setting, a project
management setting, or a development setting. Any decision important enough in terms of money,
personnel, strategic direction, or any number of business or technical considerations can benefit from
a formal decision analysis process consistent with DAR, and many such processes are possible. TSP+ provides a
script and accompanying form to guide a fairly generic DAR capability, but many different decision
analysis disciplines may (and perhaps should) exist in any organization. Sometimes an entire project is
launched that recognizably fits the DAR specific practices, for example, to choose between two
competing high-speed communications products for a business-critical real-time transaction-
processing system.
The organization’s PG and its TSP coach should be aware of the DAR script and use it as appropriate,
while keeping in mind that it is one minimally acceptable example of formal decision analysis. When
building project plans for development teams and for the PG, decision points can often be identified
that would benefit from the use of DAR, while incidentally creating the artifacts of interest to a
SCAMPI team. Tool decisions, project strategy decisions, architectural decisions, AIM rollout
decisions—all of these are opportunities to exercise the DAR script and benefit from the formal
analysis.
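While the TSP+ DAR script and form are not reproduced here, most DAR-consistent analyses share the same basic shape: documented criteria with weights, scored alternatives, and a recorded selection. Below is a minimal sketch with invented criteria, weights, and scores.

    # A generic weighted-criteria evaluation; criteria, weights, and scores
    # are invented for illustration.
    criteria = {"throughput": 0.4, "latency": 0.3, "vendor support": 0.2, "cost": 0.1}

    alternatives = {
        "Product A": {"throughput": 8, "latency": 6, "vendor support": 9, "cost": 5},
        "Product B": {"throughput": 7, "latency": 9, "vendor support": 6, "cost": 8},
    }

    def weighted_score(scores):
        return sum(weight * scores[name] for name, weight in criteria.items())

    for product, scores in sorted(alternatives.items(),
                                  key=lambda kv: weighted_score(kv[1]),
                                  reverse=True):
        print(f"{product}: {weighted_score(scores):.2f}")
    # The criteria, scores, and selected alternative would then be recorded
    # (e.g., on the TSP+ DAR form) as the artifact of the decision.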
PIP PP-3 in Appendix D points out a paper weakness in the previous version of TSP, namely that
there was no consistent documented method of planning the involvement of identified stakeholders.
The phrase “paper weakness” is used because, in practice, stakeholder involvement was typically a
strength of TSP because the transparency of its project status simplified the involvement of relevant
parties at critical points in the project. However, “typically” does not mean “always,” and almost
every active TSP coach can name an instance when some relevant stakeholder slipped through the
planning cracks, which caused a problem later on.
The Stakeholder Involvement chart in Appendix A describes the AIM approach to identifying and
planning interactions with important stakeholders by linking their involvement with specific TSP+
process elements (form RSIM, Relevant Stakeholder Involvement Matrix) to specific TSP+ team roles
(form SRAM, Stakeholder Role Assignment Matrix). (A complete list of TSP+ process elements is
provided in the Guide for SCAMPI Appraisals: Accelerated Improvement Method (AIM) [Miluk
2010].) This reflects a common way that TSP teams have dealt with their stakeholders in the past, but
as with most of AIM, this is not intended as the exclusive method of dealing with this issue. Many
robust project management disciplines, formal and informal, explicitly recognize the importance of
stakeholder identification and involvement.
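The essential logic of the two forms can be sketched as follows; the data structures and entries are guesses made for illustration, since the actual forms are part of the TSP+ asset set.

    # Hypothetical sketch of the RSIM/SRAM linkage: the RSIM records which
    # process elements involve which stakeholder roles, and the SRAM maps
    # those roles to named individuals.
    rsim = {  # process element -> stakeholder roles that must be involved
        "LAU9 management review": ["senior management", "customer representative"],
        "requirements inspection": ["customer representative", "test department"],
    }
    sram = {  # stakeholder role -> assigned individual
        "senior management": "R. Diaz",
        "customer representative": "K. Osei",
    }

    for element, roles in rsim.items():
        for role in roles:
            person = sram.get(role)
            if person is None:
                print(f"UNASSIGNED: {role} needed for {element}")
            else:
                print(f"{element}: {role} -> {person}")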
Generic practices (GPs) are, in some sense, part of the skeletal structure of the CMMI.
Every PA includes them, and they form a logical, repeatable pattern that reflects an organizing
philosophy underlying CMMI. Whether the activities to be performed are engineering, project or
process management, or support, those activities should be assigned to properly trained and equipped
personnel, be planned and tracked based on practices that have worked before, have important
artifacts placed under appropriate configuration control, be subjected to objective scrutiny, be
summarized for and reviewed by senior management, and then be made available for analysis and
possible future use across the organization.
PSP training and TSP teams show their true power in this regard. The basics of using defined
processes that are planned, performed, measured, and tracked are fundamental to working on a TSP
team, and all of these are evident in the artifacts that such teams naturally produce. Obviously, this
reflects well when a SCAMPI appraisal team looks at such a team’s activities as measured against the
yardstick of CMMI. However, the team is not working in this fashion in order to look good in the
SCAMPI, it works this way because it gets the job done right the first time.
The issue of implementing engineering practices, especially in the light of a SCAMPI appraisal, is in
some ways the easiest for AIM to address, and in some ways the hardest. It is easiest in the sense that,
as will be explained below, there is relatively little in the way of standard AIM process assets to
understand and implement. But, it is hardest in the sense that the full implementation of “engineering
practices” as specified in the PAs for Requirements Management (REQM—5 SPs), Requirements
Development (RD—10 SPs), Technical Solution (TS—8 SPs), Product Integration (PI—9 SPs),
Verification (VER—8 SPs), and Validation (VAL—5 SPs), plus the associated generic practices,
could represent the largest amount of implementation effort by an organization using AIM.
PIPs ENGR-1, ENGR-2, ENGR-3, and REQM-1 (shown in Appendix D) describe the issues with the
engineering-related TSP+ scripts. The process scripts that cover an idealized engineering life cycle—a
CYCLE script (see Appendix B) that points to a DEV script for new development and a MAINT script for
maintenance work, each of which in turn points to several other subordinate scripts for particular parts
of each life cycle—are fairly high level and generic, and eventually refer to artifacts such as an
Engineering Requirements Specification (ERS), System Requirements Specification (SRS), or
Software Design Specification (SDS). These artifacts have no existence in TSP outside of these
scripts—no templates, no specifications, no exemplars, only names. The assumption is that these are
placeholder scripts and artifacts that would be supplanted by an organization’s existing engineering
life cycles, including templates, exemplars, and specifications as needed.
In most cases, this is exactly how the engineering scripts are used, with the given scripts
occasionally serving as checklists to ensure that the team's local engineering processes do not miss
any items deemed important enough to be mentioned by AIM. However, when
evaluating the baseline TSP or even TSP+ for SCAMPI C purposes, the “paper only” evaluation came
up short [Miluk 2010].
In deciding how to address these gaps, there is no intention within AIM to direct engineers,
developers, and the teams that they work on. Rather, it is part of the team’s job, as in “standard” TSP,
to define the way that it works—which can mean simply using the organization’s existing practices
within the TSP management and measurement framework—and then capturing those definitions. The
PG provides support as needed and then collects such definitions for possible inclusion in the
organization’s process asset library, along with the associated estimated and actual performance
measurements.
So, while a large majority of process management, project management, and support CMMI practices
are addressed directly by AIM, there are no shortcuts to implementing engineering practices in any
given organization. The development team, the PG, the TSP coach, and the CMMI expert must work
together, often with additional parties from within the organization. This is especially critical in larger
organizations. As a general rule, the larger an organization becomes, the less likely it is that
development teams or even the PG will have the entire scope of engineering practice within their
direct control. Marketing is likely to be heavily involved in requirements elicitation and validation. A
project management office may own critical pieces of the requirements management process. A test
department may own the later parts of the life cycle. In these cases, the relevant parts of the life cycle
guided by CMMI specific practices must be addressed by more conventional means; however, the
development team is well-advised to exercise the stakeholder involvement mechanisms described
above to engage other groups and individuals constructively.
The good news here is that competent development organizations, almost by definition, will already
be performing a significant number of the 45 CMMI engineering specific practices. AIM training and
methods provide a generic practice framework that is known to work with a wide range of methods,
from agile practices of all descriptions to the most demanding aspects of formal architecture,
implementation, and security methods. The first pilot projects are usually critical for characterizing
engineering practices, since they provide the disciplined framework that focuses on producing the
artifacts and activity trail that SCAMPI appraisals look for.
For organizations with a pre-existing TSP implementation, even the best of such implementations
warrants a careful look at the differences in TSP+. These begin with launch preparations, which have become
more extensive and explicit, in part to ensure that CMMI-visible practices move from the category of
“implicit/usually done” to “explicit/always done.” The experienced TSP coach, team leader, and
development team should pay particular attention to the updated PREPL checklist and more recent
PREPT script. It is also advised that the TSP coach and the team’s process manager take time to study
the changes made to existing TSP scripts, forms, specifications, etc. as depicted in the activity charts
shown in Appendix A.
In a similar vein, TSP role-based activities whose planning and tracking were formerly left to the
discretion of teams and individuals (formally or informally) must now all be captured
formally. The original role manager responsibilities, with only a few updates, are essential to the
performance of many CMMI specific and generic practices, as indicated throughout the Guide for
SCAMPI Appraisals: Accelerated Improvement Method (AIM) [Miluk 2010]. This also puts added
emphasis on the roles of the TSP coach and team leader and, in the case of the TSP coach, introduces
a coach role specification. Thus TSP+ corrects a historical irony, namely the lack of a TSP role
description for the coach. The TSP coach role is described in an entire book [Humphrey 2006] and in
the five-day SEI course TSP Coach Training (http://www.sei.cmu.edu/training/p21.cfm), but previously
had no specification within TSP.
Six Sigma has a long pedigree in manufacturing, longer than CMMI has been used for software and
systems engineering (even if one counts the CMM for Software years as part of the CMMI era).
Training programs for the various colors of Six Sigma belts are widely available and vary widely in
content, but are nevertheless grounded in the same basic statistical and procedural techniques
incorporated into CMMI, especially at the higher maturity levels. Tens of thousands of practitioners
have been trained in these techniques and have used them with extraordinary results.
Yet, very little specific guidance is given here for the use of Six Sigma methods beyond the most
general suggestions of timing and expertise. The reasons for this are likely obvious to Six Sigma
Black Belts and other experts, who know only too well that the proper application of the methods
requires a sophisticated understanding of the particulars of the organizations, the projects, the
processes, and the data available. By the time such an expert knows enough about such things to
guide intelligently, that person needs no further general instruction from this document. However, a
few more words to others involved in such an effort may be useful.
As discussed below, Six Sigma provides a likely bridge to CMMI high maturity once the basic goals
of AIM—achieving high performance in business terms with solid CMMI ML3 conformance—are
met. Even before that point, however, some Six Sigma techniques can be of great value in achieving those goals. For example,
Voice of the Customer techniques can be used to good effect for at least the Requirements
Development and Validation practices. This is, of course, only one example. Consult the local Six
Sigma Black Belt for further guidance.
As mentioned previously, the targeted CMMI scope of AIM excludes maturity levels 4 and 5,
collectively and commonly referred to as “high maturity.” The reasons for this exclusion are
historical and practical. Historically, while Six Sigma methods have been used successfully to move
organizations to high maturity and enhance the operations of those already at ML5 [Stoddard 2008],
the majority of experiences of TSP-using organizations that have achieved ML4 or better did so with
the CMM for Software (SW-CMM) as a reference model. This document relies heavily on the
experiences of TSP users for useful information. The experience base for CMMI high maturity using
TSP is simply insufficient.
Even assuming that the information was available, practical considerations argue against including
high maturity guidance here. CMMI high maturity requires that baselines and models specific to the
data and processes of a particular organization are accumulated and developed. It seems likely that
effective guidance for such activities will be quite lengthy, and in light of the discussion above
concerning Six Sigma’s use, possibly unnecessary.
However, it also seems clear that the detailed data provided by TSP teams provide a more solid
foundation for building the baselines and rich statistical models for CMMI high maturity practices
than measurement regimens built only on team- or project-level data. For example, individual time
and defect data are used to update personal review checklists using only the most basic statistical
techniques. Not only is such data already an invaluable business and professional asset, but the potential
of using such data, aggregated across and between project teams and properly segregated and analyzed
using Six Sigma and other techniques, seems enormous.
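As a hint of what that aggregation might look like (well short of a real high-maturity baseline, and with invented numbers), per-developer review yields could be pooled into a candidate organizational baseline with naive control limits that Six Sigma methods would then refine:

    # Illustrative aggregation of per-developer review yields into a candidate
    # organizational baseline; a real baseline would require careful
    # segmentation of the data before any limits are trusted.
    import statistics

    review_yields = [0.72, 0.65, 0.81, 0.58, 0.77, 0.69]

    mean = statistics.mean(review_yields)
    sd = statistics.stdev(review_yields)
    print(f"baseline yield: {mean:.2f} +/- {sd:.2f}")
    # Naive 3-sigma limits of the sort a control chart would refine:
    print(f"limits: [{max(0.0, mean - 3*sd):.2f}, {min(1.0, mean + 3*sd):.2f}]")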
Appendix A: Activity Charts for Selected Processes
The following activity charts are a useful aid to understanding several of the new process capabilities
within TSP+. Note that they do not cover all of the new capabilities.
The charts generally show how affected roles (listed at the top of each chart) use a script, form, or
checklist to perform the intended activities. The activities have been designed to add value to team
activities while complying with relevant CMMI practices.
Table 2: Selected TSP+ Activity Flows

Configuration Management (CM)
  Description: Two charts documenting the intended flow of configuration management activities
  Affected TSP roles: Support manager; team member/product owner; quality and process managers; CCB (Configuration Control Board, team level)

CM—Configuration Change Requests
  Description: Documents the flow for formal change requests
  Affected TSP roles: Requestor; support manager; affected project team; CCB; affected product owner or role manager

Stakeholder Involvement
  Description: Documents the usage of the new stakeholder involvement mechanisms
  Affected TSP roles: Team members; team leader; process group; team management; process manager

Process Definition
  Description: Four charts documenting the many interactions between the various roles involved in formal process definition
  Affected TSP roles: Team members; process and support managers; process group; management

Organizational and Project Training
  Description: Four charts showing the many interactions between the various roles involved in planning and tracking training activities
  Affected TSP roles: Team members; team leader and/or management; training manager (new role on the process group); process group; training attendees and instructors

Periodic Review of Training Matrix
  Description: An additional chart for training activities, closing the loop on the training activities
  Affected TSP roles: Team members; team leader and/or management; training manager (new role on the process group); process group; training attendees and instructors

Decision Analysis and Resolution
  Description: One way to perform DAR, certainly not the only way
  Affected TSP roles: Team or team member; decision owner; decision participants and stakeholders; process manager
[Activity charts: Configuration Management (two charts; the original flowchart graphics are not reproduced). Roles: team member/product owner or team, support manager (expanded role), quality manager, process manager (expanded role), and team-level Configuration Control Board (CCB). The first chart flows from launch preparation (checklist PREPT) through launch meetings 6 and 9 (scripts LAU6 and LAU9, or the Management Briefing Guidelines, with updates to form CIBPS) to deciding whether an item meets the criteria for configuration control, obtaining product and process approvals, and creating the Configuration Identification Release form (form CIR), with rework on disapproval. The second chart covers storing the configuration item; updating forms CCR, LOGCCR, LOGCI, and CIBPS for any implemented configuration change requests; storing all forms; and the CM audit (script CMAUDIT).]

[Activity chart: CM—Configuration Change Requests (graphic not reproduced; see Table 2 for the roles involved).]
[Activity chart: Stakeholder Involvement (original flowchart graphic not reproduced). Roles: team member or team, team leader, process group, management, and planning manager. The flow runs from launch preparation (script and checklist PREPT) through updates to the Relevant Stakeholder Involvement Matrix (form RSIM), which the team leader reviews with the team, process group, and management for completeness; mapping of individuals to the identified roles using the project's Stakeholder Role Assignment Matrix (form SRAM); team review of both forms in launch meeting 8 (script LAU8); management and process group review and approval of the RSIM and SRAM in launch meeting 9 (script LAU9); and storage of the approved RSIM and SRAM in the project notebook.]
[Activity charts: Process Definition (four charts per Table 2; the original flowchart graphics are only partially recoverable). Roles: team members, process and support managers, process group, and management. The recoverable flow covers updating the Standard Process Deviation Summary (form SUMPD); documenting the process elements affected by each process deviation, including the rationale for the deviation and the evaluation criteria for approval; planning to obtain process deviation approvals (form SPDR) and to develop, modify, or obtain each missing process element (form CIBPS); preparing for launch meeting 9 during launch meeting 8 (script LAU8, in accordance with the TSP Management Briefing Guidelines); and presenting the proposed changes to the PSSP in launch meeting 9 (script LAU9, using form SUMPD), where management and the process group lead initially agree to the proposed changes, pending formal process deviation approvals.]
[Activity charts: Organizational and Project Training (New Specification TRN) (four charts per Table 2; the original flowchart graphics are only partially recoverable). Roles: team members, team leader and/or management, training manager (a new role on the process group), process group, and training attendees and instructors. The recoverable flows cover checking whether each team member has met all training requirements; handling claims of equivalent prior training and requests to satisfy a requirement through on-the-job, computer-based, or self-directed training, each with an approve/deny decision; updating the Team Member Training Log (form LOGTRNM) and notifying members of any unsatisfied training requirements; sending the Training Survey form (form TRNSUR) to all attendees after training; collecting the completed surveys into the Training Survey Summary form (form SUMTRNS) for review by the instructor, the process group, and management as input to future training improvements; storing the SUMTRNS, TRNSURs, and associated TRNSI forms; and updating the Training Request Log (form LOGTRNR).]
[Activity chart: Organizational and Project Training—Periodic Review of Training Matrix (original flowchart graphic not reproduced). The training manager must periodically review the Training Matrix (form TRNM) with teams, projects, the process group, and management for completeness and approval. If changes are made to form TRNM, all individual Team Member Training Logs (form LOGTRNM) are updated to reflect them, and team members are notified of any unsatisfied training requirements so that all requirements can be satisfied.]
Appendix B: TSP+ Scripts for Process Operations and Team Operations
The high-level scripts guiding the process team and the other development teams in an organization
are included here to aid TSP coaches and TSP-trained process groups to understand their roles in the
overall AIM implementation process.
TSP+ Process Operations—Script POPS

General: This script assumes a TSP/CMMI implementation. The use of Lean Six Sigma methods and tools is optional.

1. Contact
- Determine how the TSP and CMMI can address management's concerns.
- Talk with working-level engineers and managers to build interest.

2. Awareness
- Expose senior management to the opportunity.
- Describe the potential benefits and the operating-level interest.
- Provide credible references.

3. Obtain Sponsorship
- Hold a one-day executive seminar for the senior managers and executives, which includes a CMMI overview.
- Hold a half-day planning workshop to identify initial areas for TSP trial use.
- Identify a manager to be responsible for the transition plan and execution.

4. Develop Transition Plan
The responsible manager produces a plan for TSP trial use and
- arranges for qualified training and coaching support
- schedules managers and team members for training
- schedules initial TSP launches, checkpoints, and relaunches

5. TSP Trial Use
The organization conducts trial TSP projects (script TOPS) and
- trains the engineers and managers
- launches the teams and regularly reviews performance
- identifies internal candidates to be initial PSP instructors and TSP coaches

6. Evaluation
The organization assesses team and TSP performance, decides to proceed with the process improvement initiative, and
- identifies a manager to lead the long-term improvement effort and become the process group team lead
- allocates initial resources
- issues a policy describing the process improvement initiative and its importance to the business
- defines management responsibilities

7. Adoption: Process Group Formation
The process group team lead, with the help of the TSP coach (script POPS7),
- produces a process improvement plan proposal
- reviews this plan proposal with management and gets their approval
- recruits a staff and trains that staff in CMMI, PSP, and TSP
- launches the process group team to plan and execute the process improvement plan proposal

8. Adoption: Institutionalize TSP
Working with project management, the process manager
- develops a TSP introduction plan and schedule for each project team
- assists projects in launching and running TSP projects (script TOPS)

9. Continuing Improvement
Using the project's needs as a guide and reviewing the Process Group Roles and Responsibilities specification, set priorities and build and execute a plan (script CYCLE) based on organizational business objectives for
- defining team and organizational processes
- establishing and maintaining a process asset library
- providing continuing training and coaching guidance
- obtaining needed tools and methods
- providing tool and method training and support
- assessing the organization annually to identify further improvement needs
Apply Lean Six Sigma methods and tools to improve current process performance.

10. Continuing Review
Management annually reviews the organization's software operations and
- analyzes cost and benefit data
- obtains customer, manager, and engineer feedback on improvement results
- adjusts the improvement program to address identified problems and capitalize on new improvement opportunities during Meeting 1 of the process group's (re)launches (script CYCLE)
TSP+ Process Group Formation—Script POPS7

Purpose: To guide the process group (PG) team lead in planning for and staffing a process improvement group

Entry Criteria
- The organization has initiated a software process improvement program.
- Management has issued a process improvement policy statement and named a responsible manager.
- Initial process improvement resources have been allocated.

1. Define Responsibilities
The PG team lead, with help from the TSP coach,
- documents the principal responsibilities of the job and proposed group (specification Process Group (PG) Roles and Responsibilities)
- reviews PG roles and responsibilities with existing staff groups (configuration management, quality assurance, and test, for example)
- reviews PG roles and responsibilities with development department managers
- revises PG roles and responsibilities based on the review results
- reviews PG roles and responsibilities with senior management for approval

2. Develop Improvement Plan
The TSP coach works with the PG team lead in developing a proposal for the process improvement work, covering
- tasks to be performed
- training, support, and assistance to be provided
- resources needed, both full and part time
- proposed recruiting schedule
- proposed task schedule

3. Obtain Plan Approval
The PG team lead
- reviews the proposal with existing staff groups (configuration management, quality assurance, and test, for example)
- reviews the proposal with development department managers
- obtains agreement from the development and staff groups to provide the needed part-time process improvement resources
- revises the proposal based on the review results
- reviews the proposal with senior management for approval

4. Recruit Initial Staff
The PG team lead recruits the process staff and
- obtains a core staff of process experts (trained and experienced if possible)
- utilizes part-time support from the development groups where planned
- recruits experienced professionals from internal development groups where possible
- maintains a mix of engineering, process, technology, and management skills

5. Train Initial Staff
Using internal skills where possible, the PG team lead trains the process staff in the key process improvement technologies:
- the Capability Maturity Model (CMMI, P-CMM, and so forth)
- the Personal Software Process (PSP)
- the Team Software Process (TSP)
- Lean Six Sigma methods and tools for subsequent improvement analysis
- the principal tools and methods used in the organization

6. Process Group Team Launch
The process group launches in order to plan and execute the approved process improvement proposal (script LAU).

Exit Criteria: The PG has successfully launched.
TSP+ Team Operations—Script TOPS
Purpose To guide managers, teams, and engineers in introducing and using the TSP process.
Entry Criteria—Trial Use - Senior management has participated in the TSP executive strategy seminar and
planning workshop and supports TSP introduction.
- An initial TSP trial program has been approved.
Entry Criteria—TSP To move beyond trial use and start broad TSP introduction
Adoption
- TSP trial use has been successful.
- A TSP adoption plan has been developed and approved.
- Initial PSP instructors and TSP coaches have been selected and are scheduled for
training.
Entry Criteria—TSPm Initial Use
To launch a TSPm multi-team or distributed multi-team
- TSP trial use has been successful.
- The organization has at least one authorized TSP coach on its staff and enough
additional coaches are to be trained to support team operation.
General To use the TOPS script, one or more of the entry criteria must be satisfied.
1 Team Formation and Training
For each TSP or TSPm team, a project is identified and the staff has been trained.
- All managers on each project or in its management chain are TSP trained before the
launch.
- All project software professionals are PSP trained before the launch.
- All other project professionals are trained in the personal process before the launch.
2 Launch Preparation - Prepare to launch each TSP or TSPm team (checklist PREPL).
- For each TSPm multi-team or distributed multi-team, also follow scripts PREP and
PREPW during launch preparation.
3 TSP Cycle Follow script CYCLE until project conclusion in order to guide the team in:
- launching the project
- executing the team’s detailed planning
- undergoing a checkpoint (script CHECKPOINT)
- conducting a phase, cycle, or project postmortem
- preparing for subsequent relaunches (if needed)
4 Multi-Team (TSPm) Project Operations
For multi-teams only, follow script TOPS4 concurrently for each sub-team following the TSP cycle (script CYCLE) in order to manage the sub-team interdependencies and overall project.
Exit Criteria - Project completed with team and team member plan and actual data
- Project data filed in the project notebook (specification NOTEBOOK)
- Final project report prepared and presented to management
TSP+ Project Operations—Script TOPS4
Purpose To guide managers, team leaders, and team members in managing a TSPm project
1 The Leadership Team Launch
Following the project (re)launch and under the guidance of a qualified TSP coach, the leadership team holds a one-day launch (script LTL).
- reviews the organization’s and this project’s goals
- develops the leadership team’s management strategy
- defines the role manager teams’ goals and responsibilities
- allocates tasks and responsibilities among the leadership team members
2 Leadership Team Operations
Following the leadership team launch, the leadership team
- manages the project and each sub-team in performing its work
- holds weekly meetings to review project status and issues (script WEEKL)
- provides guidance to the role manager team launches (script RTL)
- meets at least monthly with each role manager team (script WEEKLR)
- regularly reports to senior management and the customer on project status and
progress (specification STATUS)
3 Sub-team Operations
Following the team and sub-team (re)launches, each sub-team follows its defined process and detailed plan in doing its work.
4 Role Manager Team Launches
Soon after the team (re)launch and under the guidance of a TSP coach, each role manager team holds a two-day launch (script RTL).
- the leadership team defines its goals for the role manager team
- the role managers establish their strategy and plan to meet these goals
- the role managers allocate tasks and responsibilities among team members
- the role managers prepare and review their plan with the leadership team
5 Quarterly or Monthly Management Reviews
The leadership team and senior management periodically
- review project status, progress, and projections
- assess the team for quality level performance
- assess the team for expert level performance
- identify issues and problems and assign responsibilities
6 Periodic Customer Reviews
The leadership team regularly reviews status and progress with the customer.
- planned versus actual performance
- outstanding issues and problems, actions planned, and assistance needed
Exit Criteria - Project or project phase completed with team and team member plan and actual data
filed in the project notebook (specification NOTEBOOK).
- Final project report prepared and presented to management
TSP Cycle—Script CYCLE
Purpose - To guide teams through the use of a defined and structured process, with repeatable
and measurable steps, which provides rapid feedback on the quality of the product
and progress towards completion
- To guide teams in the establishment of a shared understanding of the work and how it
is to be done, which includes a common understanding of the team goals, team
member roles, product or components to be produced, available resources and
existing constraints, and measures of success.
- To provide the mechanisms required in order for a team to practice self-management
Entry Criteria - All team members have been adequately trained in the use of PSP and TSP.
- All team members and the team leaders have been identified and allocated to the
project.
- A qualified TSP coach is available to guide and coach the team through the TSP
Cycle.
General - Depending on the size and needs of the project, a TSP Cycle can range from a period
of a few weeks to a few months.
- Depending on a project’s overall duration and needs, the team may choose to use
phases, cycles, or both in determining when it needs to conduct a (re)launch. A phase
represents a part of the development lifecycle such as the implementation phase, and
a cycle represents the time between planning horizons. A phase can encompass
several cycles, just as a cycle can encompass several phases.
1 Team (Re)Launch (LAU/LAUm or REL/RELm)
- During the launch, the team learns from management what it is supposed to do, makes a plan for doing the desired work, and then reviews the plan with management. The two desired outcomes of the launch are an approved team plan for producing a particular product, both the overall project plan and a detailed next-phase plan, and a jelled, self-directed team.
- During a relaunch (script REL or RELm), the team members update their overall plan
and develop a new next-phase plan based on what they have done since the initial
launch or the prior relaunch. The team has already committed to management what it
intends to do and, if that commitment is unchanged, the members do not need to
repeat the management meetings. However, if the project has changed in any
significant way (such as changes to the product requirements, the team membership,
project schedule, project scope, etc.), then the relaunch should be regarded as a new
project launch and all of the meetings (script LAU or LAUm) and activities should be
held.
2 Plan Execution - The team executes the cycle plan created during the (re)launch, making updates or
changes to the plan as necessary.
- The team uses scripts DEV and MAINT to guide it in developing, maintaining, and enhancing software-intensive products.
- The team meets weekly (script WEEK) to ensure that all team members understand
current project status and know what to do next.
- The team leader conducts periodic management and customer status meetings (script
STATUS).
Checkpoint
About a month into the TSP cycle or halfway through the cycle, whichever comes first, the TSP coach leads the team through a checkpoint (script CHECKPOINT).
3 Cycle or Project Postmortem
- The cycle postmortem is held before any subsequent launch or relaunch and includes only the data on the work completed during the earlier project phases or cycles. The focus of these postmortems is to evaluate interim project status and calibrate planning parameters to revise goals and improve performance in subsequent cycles (see script PM).
- The project postmortem is conducted at the end of the project and includes the full
product or project data. Organizational process baseline data may be updated at this
time (see script PM).
Exit Criteria - A completed high-quality product
- A project summary report (see specification SUMMARY)
- PIPs for all identified process improvements
Appendix C: Goal-Question-(Indicator)-Metric Examples
The following GQ(I)M templates are by no means the minimum set necessary to fully understand and implement measurement under AIM; they are simply included here as examples of what can be done.
The first indicator is for TSP-style earned value management, while the second one gives a more
traditional view of earned value at the organization level. A reasonably complete set of indicators
would likely include indicators for time-on-task, planned vs. actual quality in several dimensions, and
the TSP Quality Profile Indicator.
Date: 6-Sep-2005
Indicator Name/Title: Earned Value
OBJECTIVE
To determine current schedule status of a project and to
estimate a likely completion date.
QUESTIONS
Where is the project with respect to its current and original
schedule? When is the project likely to finish?
VISUAL DISPLAY
[Chart: cumulative earned value (0 to 100) plotted against weeks, from 8/16/2004 through 2/14/2005]
PERSPECTIVE
Project team
Team leader
Project manager
Program manager
INPUTS
List all the data elements in the production of the indicator. Precisely define the data element used or point to where the definition can be found.
DATA COLLECTION
How Manual
DATA REPORTING
Team leader
Project manager
DATA STORAGE
ALGORITHM Individual earned values computed as:
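A minimal sketch of one standard TSP-style computation, offered as an illustration rather than as the template’s definitive algorithm: each task’s planned value is its percentage share of the total planned hours, and a task earns that value in full only when it completes. The names and data below are illustrative.

```python
from datetime import date, timedelta

def planned_values(planned_hours):
    """Each task's planned value (PV) as a percentage of the total plan."""
    total = sum(planned_hours)
    return [100.0 * h / total for h in planned_hours]

def earned_value(planned_hours, completed):
    """Cumulative EV: a task earns its full PV only on completion,
    regardless of how long it actually took."""
    pv = planned_values(planned_hours)
    return sum(v for v, done in zip(pv, completed) if done)

def projected_finish(start, weeks_elapsed, ev_to_date):
    """Project a completion date from the average EV earned per week."""
    if ev_to_date == 0:
        return None                      # no basis for a projection yet
    rate = ev_to_date / weeks_elapsed    # EV percentage points per week
    return start + timedelta(weeks=100.0 / rate)

# Illustrative data: four tasks planned at 10, 20, 30, and 40 hours;
# the first two are complete eight weeks into the cycle.
hours = [10, 20, 30, 40]
done = [True, True, False, False]
ev = earned_value(hours, done)           # 30.0 (10% + 20%)
print(ev, projected_finish(date(2004, 8, 16), 8, ev))
```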
ASSUMPTION None
FEEDBACK GUIDELINES
Date: 6-Sep-2005
Indicator Name/Title: Earned Value Management (Cost and Schedule)
OBJECTIVE
To monitor contract performance for contracts that use Earned Value
Management (EVM). This indicator will track the Cost Performance Index
(CPI) and the Schedule Performance Index (SPI) in relation to the target
values.
QUESTIONS
Are the CPI and the SPI within their target areas?
VISUAL DISPLAY
[Chart: CPI and SPI tracked by reporting period, with the target area marked]
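For reference, a sketch of the standard EVM index definitions this indicator tracks; the dollar figures below are illustrative only.

```python
def cpi(bcwp, acwp):
    """Cost Performance Index: budgeted cost of work performed (BCWP)
    over actual cost of work performed (ACWP); < 1.0 means over cost."""
    return bcwp / acwp

def spi(bcwp, bcws):
    """Schedule Performance Index: BCWP over budgeted cost of work
    scheduled (BCWS); < 1.0 means behind schedule."""
    return bcwp / bcws

# Illustrative figures: $90K of work performed, $100K spent, $95K scheduled.
print(cpi(90.0, 100.0), spi(90.0, 95.0))   # 0.90 and ~0.947
```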
PERSPECTIVE Project manager
INPUTS
List all the data elements in the production of the indicator. Precisely define the data element used or point to where the definition can be found.
DATA COLLECTION
How ?
By Whom Specify who will collect the data (an individual, office, etc.).
Forms Reference any standard forms for data collection (if applicable) and
provide information about where to obtain them.
DATA REPORTING
Responsibility for Reporting Indicate who has responsibility for reporting this data.
By/To Whom Indicate who will do the reporting and to whom the report is going to.
This may be an individual or an organizational entity.
How Often Specify how often the data will be reported (daily, weekly, monthly, as
required, etc.).
DATA STORAGE
How Indicate the storage media, procedures, and tools for configuration
control.
ASSUMPTION Identify any assumptions about the organization, its processes, life-cycle
models, and so on, that are important conditions for collecting and using
this indicator.
ANALYSIS Specify what type of analysis can be done with the information.
INTERPRETATION Describe what different values of the indicator mean. Make it clear how
the indicator answers the “Questions” sections above. Provide any
important cautions about how the data could be misinterpreted and
measures to take to avoid misinterpretation.
PROBING QUESTIONS List questions that delve into the possible reasons for the value of an
indicator, whether performance is meeting expectations or whether
appropriate action is being taken.
EVOLUTION Specify how the indicator can be improved over time, especially as more
historical data accumulates (e.g., by comparison of projects using new
processes, tools, environments with a baseline; using baseline data to
establish control limits around some anticipated value based on project
characteristics).
Appendix D: Process Improvement Proposals (PIPs)
As part of the original effort to identify gaps in a previous version of TSP, Process Improvement
Proposals (PIPs), a standard process element in TSP, were used to capture areas for potential
improvements. These PIPs are included here mainly for the benefit of those organizations that already
have significant TSP implementation in place and wish to use AIM concepts in a more formal CMMI
implementation (UC2 noted above).
Note that not all suggestions to change TSP were implemented in the way suggested by the PIP, or
even acted upon in a way that implements a new or modified process element that would fill the gap.
For example, the ENGR-X PIPs did not generate massive changes and additions to TSP process
elements because the general philosophy of TSP is to be non-directive as to which development
methods are used. The ENGR-X PIPs are addressed in part, as suggested in ENGR-3, by this
document, as well as by many specific but relatively straightforward modifications and additions in
TSP+ that reflect the advice of experienced TSP coaches.
PIP # Filename Candidate Modified (new) TSP Process Elements
PP-2 PIP PP-2.doc LAU3
PP-3 PIP PP-3.doc Stakeholders PIP—potentially affects PREPL/PREPR, LAU3, LAU8,
roles, WEEK, stakeholder matrix (new)
Process review PIP Process Review.doc Remove references to Process Review Meetings.
QA-1 PIP QA-1.doc TSP QA Plan (new)
REQ-1 PIP REQM-1.doc REQ, ANA, SUMS, TASK, or new guidance
RSKM-1 PIP RSKM-1.doc LAU7
ROLE-1 PIP ROLE-1.doc Team role descriptions
TSP Process Improvement Proposal—Form PIP
Name Noopur Davis / James McHale Date 3/20/2008
e-mail nd@sei.cmu.edu / jdm@sei.cmu.edu Organization SEI
Project TSP Initiative Launch/Phase Project Mgmt.—ML2/3
Improvement Description
Briefly describe the improvement you suggest.
Ref. all PAs GP2.1 TSP should include policy guidelines/templates/samples that state that projects follow the TSP and
other process assets as defined in the OSSP. These policies should be specifically tailored to the organization as part of
broad transition, after piloting, to reflect the pilot project transition experience.
As organizations start adopting the TSP across the board, policies about its use would enforce the organization’s commitment and contribute to a sense of “that is how we do things here.” They would also increase CMMI conformance.
When completed and reviewed, submit to the Process Manager and keep a copy.
Do not write below this line.
PIP Control # Accepted
Received Returned
Evaluated Deferred
Effort involved Date done
Author notified
Reasons
TSP Process Improvement Proposal—Form PIP
Name James McHale Date 3/20/08
e-mail jdm@sei.cmu.edu Organization SEI
Project TSP AIM Launch/Phase Project Mgmt—ML3
Improvement Description
Briefly describe the improvement you suggest.
Ref. all PAs GP2.9. Make the TSP Checkpoint Process assets part of the standard TSP download (i.e., available to all
TSP coaches) in order to address GP2.9 which states “Objectively evaluate adherence of the <x> process against its
process description, standards, and procedures, and address noncompliance.” This is a fairly good partial description of
the TSP coach role.
Add TSP Checkpoint Process to standard TSP package. Probably not a complete solution to the general QA issue, but this is a process asset that should be generally available for partners and coaches, especially as many coaches push specific elements of a checkpoint down to team roles, e.g., process and quality managers.
When completed and reviewed, submit to the Process Manager and keep a copy.
Do not write below this line.
PIP Control # Accepted
Received Returned
Evaluated Deferred
Effort involved Date done
Author notified
Reasons
TSP Process Improvement Proposal—Form PIP
Name James McHale / Gene Miluk Date 7-May-08
e-mail jdm@sei.cmu.edu / gem@sei.cmu.edu Organization SEI
Project TSP-CMMI AIM Launch/Phase CMMI ML2/3
Improvement Description
Briefly describe the improvement you suggest.
Ref. GP 2.10—“Review the activities, status, and results of the <x> process with higher level management and resolve issues.” Bring the Quarterly Review Checklist in Winning with Software Appendix D (a version of this already exists in TSP for Multi-Teams as ‘Checklist REVIEW.doc’) into the standard TSP distribution.
Note: This should work well in conjunction with PIP OPF-2, running the EPG as a TSP team.
Under existing TSP Introduction activities, management does not necessarily receive any overview of the process
improvement effort as a whole. The intent is to give management a comprehensive review of product, process, and
improvement concerns, which should improve the quality of the implementation, reduce risk in general, and drive a
quicker implementation.
When completed and reviewed, submit to the Process Manager and keep a copy.
Do not write below this line.
PIP Control # Accepted
Received Returned
Evaluated Deferred
Effort involved Date done
Author notified
Reasons
TSP Process Improvement Proposal—Form PIP
Name James McHale Date 3/6/08
e-mail jdm@sei.cmu.edu Organization SEI
Project TSP AIM Launch/Phase Project Mgmt.—ML2/3
Improvement Description
Briefly describe the improvement you suggest.
Ref. GP 2.6 and GP 3.2 (PP, PMC, IPM, RSKM PAs), IPM SP 1.6—Call out project management artifacts separately in a standard TSP configuration management plan (to be defined). Include the aspects of project data management (e.g., project NOTEBOOK). Provide for capture of new and tailored project processes and data as organizational process assets.
Possibly PREPL/PREPR checklists (e.g., to create an online project NOTEBOOK), LAUPM (to populate the NOTEBOOK
initially), script WEEK (to store/manage weekly data ‘appropriately’), script PM (to store project summary data). Add
script SCM and other process assets from Introduction to Team Software Process, App. B to the standard TSP release.
Improve consistency in actually keeping the project NOTEBOOK data (in whatever form it may take) up to date, reduce
the risk that something is inadvertently omitted from it, and improve CMMI conformance. For project process assets not
captured above (e.g., PIPs, new and tailored project processes), some level of configuration management must be
maintained for project use, and this should facilitate submission into the organization process asset library.
When completed and reviewed, submit to the Process Manager and keep a copy.
Do not write below this line.
PIP Control # Accepted
Received Returned
Evaluated Deferred
Effort involved Date done
Author notified
Reasons
TSP Process Improvement Proposal—Form PIP
Name James McHale / Gene Miluk Date 17-Jun-08
e-mail jdm@sei.cmu.edu / gem@sei.cmu.edu Organization SEI
Project TSP-CMMI AIM Launch/Phase Engineering PAs
Improvement Description
Briefly describe the improvement you suggest.
REQM SP1.3—“Manage changes to the requirements as they evolve during the project.”
REQM SP1.4—“Maintain bi-directional traceability among the requirements and work products.”
REQM SP1.5—“Identify inconsistencies between the project plans and work products and the requirements.”
Traceability is specified through several scripts (REQ or ANA, HLD, IMP) but the traceability is only one-way, upward.
The traceability is recorded in one of several documents—Software Requirements Specification (SRS), Engineering
Requirements Specification (ERS), Software Design Specification (SDS), or a component plan.
There are no templates or examples provided for the SRS, ERS, and SDS, and only an implied one (the PSP 2.1 or 3.0
Plan Summary) for a component plan.
Several possible implementations: specify in the Customer Interface Manager role specification; specify traceability in
both directions in scripts REQ, ANA, HLD, IMP; provide a traceability matrix template or a tool requirement for bi-
directional traceability; provide implementation examples or other implementation guidance for traceability; possibly
others. Note that these are not necessarily mutually exclusive.
Improves quality of Customer Interface role execution by leaving less to chance, could reduce cycle time dramatically
where changing requirements eat up trace time, and reduces risk of missing a necessary work product change when
requirements change.
When completed and reviewed, submit to the Process Manager and keep a copy.
Do not write below this line.
PIP Control # Accepted
Received Returned
Evaluated Deferred
Effort involved Date done
Author notified
Reasons
TSP Process Improvement Proposal—Form PIP
Name James McHale / Gene Miluk Date 7 July 2008
e-mail jdm@sei.cmu.edu / gem@sei.cmu.edu Organization SEI
Project TSP-CMMI AIM Launch/Phase Engineering PAs
Improvement Description
Briefly describe the improvement you suggest.
Ref. RD SP 1.1, SP 1.2, SP 2.3; TS SP 2.3; PI SP 2.1, SP 2.2, SP 3.1, SP 3.3—All of these CMMI specific practices deal
with interfaces in some way.
Direction in TSP scripts REQ, HLD, and IMP is extremely high-level and generally has no other documentation,
examples, or templates to fall back on. (Note: There is fairly explicit direction at the lowest implementation level in script
IMP6.)
Role manager specifications do not call out interfaces as a specific concern or responsibility, with the possible exception
of the Customer Interface Manager (not necessarily the kind of interface referred to in CMMI).
Potential modifications to the role specifications for the Customer Interface, Design, Implementation, and Test Managers.
Potential modifications to scripts REQ/ANA, HLD, and IMP (although IMP might be okay since IMP6 goes into relevant
detail).
Some level of operational guidance is advisable in the implementation guidelines for TSP-CMMI AIM.
Quality of TSP implementation and CMMI conformance should improve, as well as the quality of the product, if product
interfaces are properly specified, designed, implemented, and tested.
Attention to interfaces early on during TSP implementation should reduce CMMI implementation cycle time while
reducing technical risk both for development projects and for the TSP-CMMI implementation project.
When completed and reviewed, submit to the Process Manager and keep a copy.
Do not write below this line.
PIP Control # Accepted
Received Returned
Evaluated Deferred
Effort involved Date done
Author notified
Reasons
TSP Process Improvement Proposal—Form PIP
Name James McHale / Gene Miluk Date 7 July 2008
e-mail jdm@sei.cmu.edu / gem@sei.cmu.edu Organization SEI
Project TSP-CMMI AIM Launch/Phase Engineering PAs
Improvement Description
Briefly describe the improvement you suggest.
Ref. all 15 SGs and 45 SPs in the Engineering PAs (REQM, RD, TS, PI, VER, VAL)—TSP scripts that directly address these CMMI goals and practices (DEV, MAINT, REQ, ANA, HLD, IMP, IMP6) are in general high-level and rarely implemented closely by target organizations. (The lone exception is script INS, which is often implemented and satisfies a significant portion of the VER process area.)
See also PIPs ENGR-1 and ENGR-2 for examples of major groups of practices that are not well addressed for SCAMPI
purposes.
Scripts DEV, MAINT, REQ, ANA, HLD, IMP, and possibly IMP6 or their functional equivalent must be developed,
adapted, or otherwise instantiated at both the organizational and team levels. One possible solution is a new operational
guidance document, combined with relatively minor enhancements to the existing scripts and role manager
specifications, that provides explicit guidelines for involving working developers in defining, documenting, and changing
their own engineering process descriptions.
When completed and reviewed, submit to the Process Manager and keep a copy.
Do not write below this line.
PIP Control # Accepted
Received Returned
Evaluated Deferred
Effort involved Date done
Author notified
Reasons
TSP Process Improvement Proposal—Form PIP
Name James McHale Date 3/6/2008
e-mail jdm@sei.cmu.edu Organization SEI
Project TSP AIM Launch/Phase Project Mgmt—ML3
Improvement Description
Briefly describe the improvement you suggest.
Ref. IPM SP 1.2—Modify LAU4 step 5 to reference relevant organizational historical data (if available) for estimation
purposes.
LAU4
Improve the quality of planning estimates, reduce the risks associated with possibly ignoring relevant organizational data,
and improve CMMI conformance.
When completed and reviewed, submit to the Process Manager and keep a copy.
Do not write below this line.
PIP Control # Accepted
Received Returned
Evaluated Deferred
Effort involved Date done
Author notified
Reasons
TSP Process Improvement Proposal—Form PIP
Name James McHale Date 3/6/2008
e-mail jdm@sei.cmu.edu Organization SEI
Project TSP AIM Launch/Phase Project Mgmt—ML3
Improvement Description
Briefly describe the improvement you suggest.
Ref. IPM SP 2.2—Identify critical internal and external dependencies explicitly in LAU3 step 4 (strategy) and possibly
somewhere in LAU4 or LAU6.
Improve the quality of the project plan and reduce project risk by having a consistent place to deal with critical
dependencies, and improve CMMI conformance.
When completed and reviewed, submit to the Process Manager and keep a copy.
Do not write below this line.
PIP Control # Accepted
Received Returned
Evaluated Deferred
Effort involved Date done
Author notified
Reasons
TSP Process Improvement Proposal—Form PIP
Name James McHale / Gene Miluk Date 14-Apr-08
e-mail jdm@sei.cmu.edu / gem@sei.cmu.edu Organization SEI
Project TSP AIM Launch/Phase Support PAs—ML2
Improvement Description
Briefly describe the improvement you suggest.
Ref. SP1.1—“Establish and maintain measurement objectives that are derived from identified information needs and
objectives.”
Ref. SP1.2—“Specify measures to address the measurement objectives.”
While the measurement objectives of the TSP are well-known and discussed extensively in the literature, there is no
central location within the TSP artifacts where the objectives are made explicit; therefore there is no explicit link between
the standard TSP measures and those objectives.
One way to implement might be the Indicator Template (new to TSP, although examples were created for the “Jump-Starting” class) as taught in the SEMA class “Implementing Goal-Driven Measurement” (this is an implementation of the GQ(I)M paradigm). One nuance of this is that there can be one set for ML2 implementations (e.g., earned value charts for a single project), and an additional set for ML3 implementations (e.g., showing CPI and SPI for multiple projects). Note: there might be an opportunity here to specify standard ML4/ML5 indicators for TSP that fulfill the requirement for process performance baselines.
Improved communications between the TSP team and management, and improved CMMI conformance.
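As a sketch of the linkage such a template makes explicit, the following shows an objective-question-indicator-measure record for the earned value indicator of Appendix C; the field names and helper function are illustrative assumptions, not part of TSP or the SEMA course materials.

```python
# Illustrative indicator record; field names are assumptions only.
ev_indicator = {
    "objective": "Determine current schedule status and estimate completion.",
    "questions": ["Where is the project with respect to its schedule?",
                  "When is the project likely to finish?"],
    "indicator": "Earned Value chart (cumulative EV by week; see Appendix C)",
    "measures":  ["planned task hours", "task completion dates"],
}

def measures_for(objective_fragment, indicators):
    """Trace from a measurement objective to the base measures behind it."""
    return [m for ind in indicators
            if objective_fragment.lower() in ind["objective"].lower()
            for m in ind["measures"]]

print(measures_for("schedule", [ev_indicator]))
```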
When completed and reviewed, submit to the Process Manager and keep a copy.
Do not write below this line.
PIP Control # Accepted
Received Returned
Evaluated Deferred
Effort involved Date done
Author notified
Reasons
TSP Process Improvement Proposal—Form PIP
Name James McHale / Gene Miluk Date 1-May-08
e-mail jdm@sei.cmu.edu / gem@sei.cmu.edu Organization SEI
Project TSP-CMMI AIM Launch/Phase ML2—Proc. Mgmt & Supp
Improvement Description
Briefly describe the improvement you suggest.
Ref. OPD SG1—“A set of organizational process assets is established and maintained.” Included in this should be a set
of standard processes (SP1.1), life-cycle models approved for use in the organization (SP1.2), tailoring criteria (SP1.3), a
measurement repository (SP1.4), a process asset library (SP1.5), and work environment standards (SP1.6) which
includes things like PC specifications (hardware and software), facilities requirements, etc. Most of this already exists in
some form in the standard TSP process assets, plus some specific items in the current PSP and TSP books by Watts
Humphrey or in the TSP-MT (multi-team) process extension. A few additional items should be created, e.g., see OPD-2
regarding tailoring criteria and OPD-3 regarding a measurement repository. A baseline Organization Standard Set of
Processes (OSSP) should include all of this as well and provide guidance for local additions and extensions, and be
collected under an Organizational Process Notebook.
An annotated listing of TSP process elements, including extensions not currently part of the TSP baseline, should be
created as a guide or table of contents as to what is available. For an example, see Section 5 of CMU/SEI-2004-TR-014
Mapping TSP to CMMI (essentially an extended, annotated version of the index already in TSP) [McHale 2004]. This list
could be updated and extended, and the Organizational Process Notebook then built around this.
By extending and documenting the “official” TSP process assets in this way, all aspects of CMMI implementation using
TSP as the central implementation mechanism are improved—the quality of the results, reduced cycle time in achieving
those results, and reduced risk in achieving those results.
When completed and reviewed, submit to the Process Manager and keep a copy.
Do not write below this line.
PIP Control # Accepted
Received Returned
Evaluated Deferred
Effort involved Date done
Author notified
Reasons
TSP Process Improvement Proposal—Form PIP
Name James McHale / Gene Miluk Date 1-May-08
e-mail jdm@sei.cmu.edu / gem@sei.cmu.edu Organization SEI
Project TSP-CMMI AIM Launch/Phase ML2—Proc. Mgmt & Supp
Improvement Description
Briefly describe the improvement you suggest.
Ref. OPD SP1.3—“Establish and maintain the tailoring criteria and guidelines for the organization’s set of standard
processes.” Minimal guidelines for tailoring launch preparation materials, launch scripts, role descriptions, engineering
process scripts, and other TSP process assets should be created for review and use by a TSP team during launches and
relaunches. The most likely places to reference the tailoring criteria are in launch preparation materials, the process manager role description, and especially in LAU3 for use when the team is defining its work processes.
The tailoring guidelines might also include a waiver process, including the ability to easily try a brand new process.
Tailoring guidelines should ensure that, if CMMI conformance is important in the organization, the tailored process is still
CMMI-conformant (e.g., by having the EPG review changes).
LAU3 (step 6 most likely), Process Manager Role Description, Launch Preparation Packages for the Team Leader and Team Members. It may make sense to include an example of a tailored launch script (e.g., for launching an EPG).
Creates a standard organization for process assets that should enable better and more obvious conformance with CMMI ML3 requirements and much faster startup for CMMI efforts, and therefore reduces the risk of non-conformance issues.
When completed and reviewed, submit to the Process Manager and keep a copy.
Do not write below this line.
PIP Control # Accepted
Received Returned
Evaluated Deferred
Effort involved Date done
Author notified
Reasons
TSP Process Improvement Proposal—Form PIP
Name James McHale / Gene Miluk Date 2-May-08
e-mail jdm@sei.cmu.edu / gem@sei.cmu.edu Organization SEI
Project TSP-CMMI AIM Launch/Phase ML2—Proc. Mgmt & Supp
Improvement Description
Briefly describe the improvement you suggest.
Ref. OPD SP1.4—“Establish and maintain the organization’s measurement repository.” The standard PM script specifies
what analyses to perform at a very high level, but has no detail on how to perform the analyses or what format the
resulting data should follow. The risk is that each project will do it differently, making summaries, comparisons, and other analyses at the organizational level difficult or impossible. TSP should therefore specify at least
a default format for analyses and results, allowing the starting definition of the organization’s measurement repository to
be the collection of weekly consolidated workbooks plus the PM results.
Note: There is strong interaction also with GP3.2, “Collect work products, measures, measurement results, and improvement information derived from planning and performing the <X> process to support the future use and improvement of the organization’s processes and process assets.” While weekly consolidations from all projects are a good foundation, having the PM results as well supports the purpose of GP3.2 much more strongly and obviously, supporting future use and improvement.
Note: There is also obvious interaction with PIP MA-1 and all of the practices in Measurement and Analysis, and
possibly with TSP Certification efforts.
Script PM modifications, possibly to the extent of providing an example minimum output, or even providing a default
standard format for results that lends itself to cross-project comparison and analysis.
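As a hedged illustration of what such a default format might look like, a minimal sketch follows; the field names are assumptions for illustration, not a TSP-defined standard.

```python
import csv
import os

# Assumed minimal record format for postmortem (PM) results; the field
# names are illustrative, not a TSP-defined standard.
PM_FIELDS = ["project", "cycle", "planned_hours", "actual_hours",
             "planned_size", "actual_size", "defects_removed"]

def append_pm_record(repository_path, record):
    """Append one project's PM summary to the organization's repository,
    writing the header row only when the file is first created."""
    new_file = not os.path.exists(repository_path)
    with open(repository_path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=PM_FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow({k: record.get(k, "") for k in PM_FIELDS})

def effort_variance(record):
    """One example of the cross-project analysis a uniform format enables."""
    return (record["actual_hours"] - record["planned_hours"]) / record["planned_hours"]

# Illustrative usage: one project's PM summary appended to the repository.
append_pm_record("org_pm_repository.csv",
                 {"project": "A", "cycle": 1,
                  "planned_hours": 400, "actual_hours": 440})
```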
Much improved uniformity of PM results, making both target organization and SEI analysis better, quicker, and less expensive. It also reduces the risk of problems in OPD evaluations during a SCAMPI.
When completed and reviewed, submit to the Process Manager and keep a copy.
Do not write below this line.
PIP Control # Accepted
Received Returned
Evaluated Deferred
Effort involved Date done
Author notified
Reasons
TSP Process Improvement Proposal—Form PIP
Name James McHale / Gene Miluk Date 1-May-08
e-mail jdm@sei.cmu.edu / gem@sei.cmu.edu Organization SEI
Project TSP-CMMI AIM Launch/Phase ML2—Proc. Mgmt & Supp
Improvement Description
Briefly describe the improvement you suggest.
Ref. OPF SG3 “The organizational process assets are deployed across the organization and process-related
experiences are incorporated into the organizational process assets.” The existing TSP Introduction Strategy from
Winning with Software App. F should be updated (and separately published as an SEI technical note or part of a
technical report?) to include suggested CMMI training, appropriate classes of SCAMPI appraisals and other evaluations
(e.g., TSP Organizational Certification) [Humphrey 2011].
Ref. OPF SP1.1 “Establish and maintain the description of the process needs and objectives for the organization.” There
is no standard way to document and update the organization’s process needs and objectives. At a minimum, examples
of good process objectives should be provided as part of AIM.
Faster, more consistent, and persistent implementation of TSP in an organization, and reduced risk of poor implementation of CMMI practices not previously covered by standard TSP.
When completed and reviewed, submit to the Process Manager and keep a copy.
Do not write below this line.
PIP Control # Accepted
Received Returned
Evaluated Deferred
Effort involved Date done
Author notified
Reasons
TSP Process Improvement Proposal—Form PIP
Name James McHale / Gene Miluk Date 1-May-08
e-mail jdm@sei.cmu.edu / gem@sei.cmu.edu Organization SEI
Project TSP-CMMI AIM Launch/Phase ML2—Proc. Mgmt & Supp
Improvement Description
Briefly describe the improvement you suggest.
Ref. OPF SG2—“Process actions that address improvements to the organization’s processes and process assets are
planned and implemented.” Also ref. OPF/OPD/OT GPs 2.2 (“Plan the process”), 2.3 (“Provide resources”), 2.4 (“Assign
responsibility”), 2.5 (“Train people”), 2.6 (“Manage configurations”), 2.7 (“Identify and involve stakeholders”), and GP 2.8
(“Monitor and control the process”), and probably others.
Provide guidance to train, launch, and manage the EPG or equivalent as a TSP team. This should include standard
LAU1 guidance to present the organization’s process needs and objectives (see PIP OPF-1) using TSP as the backbone
of such an effort. The scope of the effort should include directly addressing OPF, OPD, and OT at a minimum, possibly
extending to QA and/or CM, in addition to the ‘standard’ TSP focus on development teams which normally would provide
full CMMI coverage to Project Management and Engineering PAs.
Scripts POPS, POPS7, and POPS9 provide a good starting point, although these should be updated and possibly
extended. (For instance, CMMI should be referenced as the base model, not CMM.) Additional launch preparation
materials and role descriptions for the organizational process manager and the process group should be provided.
Scripts TOPS and TOPS4 might also be included as guidance.
Rapid and persistent implementation of TSP, and reduced risk of poor choices for CMMI implementation.
When completed and reviewed, submit to the Process Manager and keep a copy.
Do not write below this line.
PIP Control # Accepted
Received Returned
Evaluated Deferred
Effort involved Date done
Author notified
Reasons
TSP Process Improvement Proposal—Form PIP
Name James McHale / Gene Miluk Date 2-May-08
e-mail jdm@sei.cmu.edu / gem@sei.cmu.edu Organization SEI
Project TSP-CMMI AIM Launch/Phase ML3—Process Mgmt. PAs
Improvement Description
Briefly describe the improvement you suggest.
Ref. OT SG 1—“A training capability, which supports the organization’s management and technical roles, is established
and maintained” and SG2 “Training necessary for individuals to perform their roles effectively is provided.” The specific
practices collectively provide the relevant guidance. As a default, training is initially provided by an outside agent such as
the SEI per the TSP-CMMI AIM Introduction Strategy (see OPF-1) and then, per that Strategy, transitioned to the
organization’s EPG, if only as an agent for securing outside training resources. Thus OT concerns become an ongoing
part of the EPG’s responsibilities as it operates as a TSP team.
Note: Ref. OT SP1.2—“Determine which training needs are the responsibility of the organization and which will be left to
the individual project or support group.” This practice in particular may be a good candidate for a DAR instantiation (e.g.,
document criteria and evaluation methods for making this determination, and then recording results accordingly as such
decisions are made on an ongoing basis).
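A minimal sketch of how such a DAR instantiation might document criteria and record an evaluation; the criteria, weights, and scores are illustrative assumptions, not taken from CMMI or TSP materials.

```python
# Hypothetical DAR-style weighted-criteria evaluation for deciding whether a
# training need is an organizational or a project responsibility.
# Criteria, weights, and scores below are illustrative only.
criteria = {"breadth of need": 0.4, "cost of delivery": 0.3, "urgency": 0.3}

alternatives = {
    "organization provides": {"breadth of need": 5, "cost of delivery": 3, "urgency": 3},
    "project provides":      {"breadth of need": 2, "cost of delivery": 4, "urgency": 5},
}

def score(alt):
    """Weighted score; the documented result becomes the DAR record."""
    return sum(w * alternatives[alt][c] for c, w in criteria.items())

best = max(alternatives, key=score)
print({a: round(score(a), 2) for a in alternatives}, "->", best)
```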
TSP-CMMI AIM Introduction Strategy, specifically those parts dealing with planning, delivering, and evaluating training in
PSP, TSP, and CMMI (by default) and expanding to cover all organizational training needs. Implementation of this PIP
must address all relevant artifacts expected by the SPs.
Elevates the training needs, capabilities, and outcomes of the organization early in the Introduction Strategy, which
should help to ensure a quicker buildup in internal capability and more of a quality focus earlier. Should also somewhat
reduce the risk of poor CMMI implementation choices being made.
When completed and reviewed, submit to the Process Manager and keep a copy.
Do not write below this line.
PIP Control # Accepted
Received Returned
Evaluated Deferred
Effort involved Date done
Author notified
Reasons
TSP Process Improvement Proposal—Form PIP
Name James McHale Date 3/6/2008
e-mail jdm@sei.cmu.edu Organization SEI
Project TSP AIM Launch/Phase Project Mgmt.—ML2
Improvement Description
Briefly describe the improvement you suggest.
Ref. PMC GP 2.2—Add an explicit line or bullet item in LAU8 or LAUPM for the team leader and team to establish a set
time or schedule for weekly meetings.
LAU8 or LAUPM
Improves the fidelity and uniformity of TSP implementation by ensuring that the weekly meetings are scheduled during
the launch. Currently this is left to the coach to check with the team sometime during the launch or afterward to ensure
that this happens.
When completed and reviewed, submit to the Process Manager and keep a copy.
Do not write below this line.
PIP Control # Accepted
Received Returned
Evaluated Deferred
Effort involved Date done
Author notified
Reasons
TSP Process Improvement Proposal—Form PIP
Name James McHale Date 3/6/2008
e-mail jdm@sei.cmu.edu Organization SEI
Project TSP AIM Launch/Phase Project Mgmt.—ML2
Improvement Description
Briefly describe the improvement you suggest.
Ref. PP SP2.3—The SP says “Plan for the management of project data,” which is done on every project but for which the TSP provides no specific planning guidance. The project NOTEBOOK in TSP is supposed to contain this data. Suggest adding guidance in PREPL/PREPR to set up the NOTEBOOK.
Ref. PMC SP1.4—SP says “Monitor the management of project data against the project plan.” Probably a good specific
checklist item for the process manager role.
See also PIP CM-1.
More consistent setup and maintenance of the project NOTEBOOK, and improved CMMI compliance.
When completed and reviewed, submit to the Process Manager and keep a copy.
Do not write below this line.
PIP Control # Accepted
Received Returned
Evaluated Deferred
Effort involved Date done
Author notified
Reasons
TSP Process Improvement Proposal—Form PIP
Name Noopur Davis / James McHale Date 11/14/16
e-mail nd@sei.cmu.edu / jdm@sei.cmu.edu Organization SEI
Project TSP Initiative Launch/Phase Project Mgmt.—ML2
Improvement Description
Briefly describe the improvement you suggest.
Ref. PP SP2.5, all GP 2.5s esp. in Engineering PAs—SP says “Plan for knowledge and skills needed to perform the
project.” Somewhere in LAU3, the team should plan for training needs for at least the near-term plan. “Conventional”
coaching guidance says to identify training needs as part of the support plan (step 8) thereby recording them on INV and
planning in LAU4, but LAU3 does not actually say this.
Script LAU3
Explicitly planning for training will improve the quality of the end-product by improving the quality of process execution,
and improve CMMI conformance.
When completed and reviewed, submit to the Process Manager and keep a copy.
Do not write below this line.
PIP Control # Accepted
Received Returned
Evaluated Deferred
Effort involved Date done
Author notified
Reasons
TSP Process Improvement Proposal—Form PIP
Name James McHale Date 3/6/08
e-mail jdm@sei.cmu.edu Organization SEI
Project TSP AIM Launch/Phase Project Mgmt—ML2
Improvement Description
Briefly describe the improvement you suggest.
Ref. PP SP2.6—“Plan the involvement of identified stakeholders.” Add an item in the PREPL/PREPR checklist to
develop a stakeholder involvement matrix. The purpose of this matrix would be to try to identify all the stakeholders
before the launch, and invite the appropriate ones to meetings 1 and 9. This matrix could also be used later (LAU3,
LAU8, and/or WEEK) to determine who needs what status from the project.
PREPL & PREPR checklists, possibly Launch Preparation Guidelines, possibly LAU3, LAU8, and/or WEEK
Better outcomes for launch meetings 1 and especially 9, better communication with other relevant stakeholders, and
reduced risk of omitting a relevant stakeholder; also improved CMMI conformance.
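A minimal sketch of what a stakeholder involvement matrix might capture; the stakeholders, events, and involvement codes are illustrative assumptions.

```python
# Hypothetical stakeholder involvement matrix: rows are stakeholders,
# columns are project events; cells record the expected involvement.
# Names and events below are illustrative only.
INVOLVEMENT = {"A": "attends", "R": "receives status", "-": "not involved"}

matrix = {
    "Senior management": {"Meeting 1": "A", "Meeting 9": "A", "WEEK": "R"},
    "Customer":          {"Meeting 1": "A", "Meeting 9": "A", "WEEK": "-"},
    "QA group":          {"Meeting 1": "-", "Meeting 9": "A", "WEEK": "R"},
}

def who_needs(event, level):
    """Who needs what from a given event, e.g., status recipients for WEEK."""
    return [s for s, events in matrix.items() if events.get(event) == level]

print(who_needs("WEEK", "R"))   # ['Senior management', 'QA group']
```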
When completed and reviewed, submit to the Process Manager and keep a copy.
Do not write below this line.
PIP Control # Accepted
Received Returned
Evaluated Deferred
Effort involved Date done
Author notified
Reasons
TSP Process Improvement Proposal—Form PIP
Name Tim Chick Date 5/20/2008
e-mail tchick@sei.cmu.edu Organization TSP
Project Launch/Phase
Improvement Description
Briefly describe the improvement you suggest.
All references to the “Process Review Meeting” should be removed from the TSP material as it is undefined
and some of the envisioned material is already covered in other PSP/TSP courses.
When completed and reviewed, submit to the Process Manager and keep a copy.
Do not write below this line.
PIP Control # Accepted
Received Returned
Evaluated Deferred
Effort involved Date done
Author notified
Reasons
TSP Process Improvement Proposal—Form PIP
Name James McHale Date 3/6/08
e-mail jdm@sei.cmu.edu Organization SEI
Project TSP AIM Launch/Phase Project Mgmt—ML2
Improvement Description
Briefly describe the improvement you suggest.
Ref. PP GP2.9 (and other GP2.9s)—A separate TSP QA plan could address many QA issues between TSP and CMMI.
Currently there is no standard guidance.
See also PIP ALL-2.
TSP QA Plan (new), possibly including a TSP Coach role description that emphasizes preparation and training as well as quality assurance responsibilities, especially process quality assurance.
Provide standard QA guidance to the team leader, the team, the organization, and the coach, while improving CMMI
conformance.
When completed and reviewed, submit to the Process Manager and keep a copy.
Do not write below this line.
PIP Control # Accepted
Received Returned
Evaluated Deferred
Effort involved Date done
Author notified
Reasons
TSP Process Improvement Proposal—Form PIP
Name James McHale / Gene Miluk Date 14-Apr-08
e-mail jdm@sei.cmu.edu / gem@sei.cmu.edu Organization SEI
Project TSP AIM Launch/Phase Engineering PAs—ML2
Improvement Description
Briefly describe the improvement you suggest.
Ref. SP1.1—“Develop an understanding with the requirements providers on the meaning of the requirements.” Scripts
REQ and ANA point to market study results, impact analyses, ERS (Engineering Requirements Specification), and SRS
(System Requirements Specification) but there is no specification for any of these within TSP. Also, while one may
assume that SUMS should reflect an understanding of the meaning of requirements (presumably through the conceptual
design), there is no explicit requirement in TSP for this.
Ref. SP1.4—“Maintain bi-directional traceability among the requirements and work products.” A good SUMS will have a
traceable thread to tasks in individual TASK plans, and vice versa. However, there is no explicit requirement or direction
in the TSP for making this so.
New process elements needed, or appropriate places found for the following:
1. Create some sort of minimal specification for documents that reflect an explicit understanding of requirements (e.g.,
through market studies, impact analyses, an ERS, and/or an SRS). Note: any implementation should allow for
“requirements” to be interpreted fairly broadly, e.g., “requirements” could be “contractual requirements” or it could be
“everything we understand that the customer wants.”
2. Consider specifying some sort of numbering scheme that a.) captures the decomposition of requirements through multiple levels (like the part number on SUMS) and might be implemented via automation; b.) links requirements to requirements/specification documents to design documents to code. Some of this might be embedded in the solution to #1. Note: “bi-directional traceability” should be broadly defined to include use cases, architectural descriptions, detailed designs, code, tests, internal and user documentation, etc. The question to be answered is “Is everything required reflected in the code (and intermediate products), and is everything in the code implementing something that is truly required?”
Should improve requirements quality on a TSP team and enhance requirements traceability, while improving CMMI implementation.
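A minimal sketch of the bi-directional check described above, using illustrative identifiers rather than TSP-defined artifacts: downward, every requirement should trace to at least one work product; upward, every work product should trace back to some requirement.

```python
# Hypothetical trace links: requirement IDs to the work products that
# implement them. IDs and product names are illustrative only.
forward = {
    "REQ-1.1": ["SDS-3.2", "mod_login.c", "test_login"],
    "REQ-1.2": ["SDS-3.3", "mod_audit.c", "test_audit"],
    "REQ-2.1": [],                      # required but not yet implemented
}
work_products = ["SDS-3.2", "SDS-3.3", "mod_login.c", "mod_audit.c",
                 "mod_debug.c",         # implements nothing that is required
                 "test_login", "test_audit"]

# Downward check: is everything required reflected in the work products?
unimplemented = [r for r, wps in forward.items() if not wps]

# Upward check: does everything in the product trace to a requirement?
traced = {wp for wps in forward.values() for wp in wps}
orphans = [wp for wp in work_products if wp not in traced]

print(unimplemented)   # ['REQ-2.1']
print(orphans)         # ['mod_debug.c']
```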
When completed and reviewed, submit to the Process Manager and keep a copy.
Do not write below this line.
PIP Control # Accepted
Received Returned
Evaluated Deferred
Effort involved Date done
Author notified
Reasons
TSP Process Improvement Proposal—Form PIP
Name James McHale Date 3/5/2008
e-mail jdm@sei.cmu.edu Organization SEI
Project TSP AIM Launch/Phase Project Mgmt—ML3
Improvement Description
Briefly describe the improvement you suggest.
Ref. RSKM SP1.1—Add a recommendation in LAU7 and possibly the Launch Preparation Guidelines to reference CMU/SEI-93-TR-6, “Taxonomy-Based Risk Identification,” in order to a.) support richer brainstorming of risks and b.) comply with the referenced SP, which reads “Determine risk sources and categories.” See especially p. A-2 of the TR, Figure A-1, Taxonomy of Software Development Risks.
LAU7, possibly the Launch Preparation Guidelines for the Team Leader and Team Members
Should make execution of LAU7 more consistent and comprehensive while improving CMMI conformance.
When completed and reviewed, submit to the Process Manager and keep a copy.
Do not write below this line.
PIP Control # Accepted
Received Returned
Evaluated Deferred
Effort involved Date done
Author notified
Reasons
TSP Process Improvement Proposal—Form PIP
Name James McHale Date 3/6/2008
e-mail jdm@sei.cmu.edu Organization SEI
Project TSP AIM Launch/Phase Project Mgmt.—ML2/3
Improvement Description
Briefly describe the improvement you suggest.
Many SPs and GPs in CMMI are performed by TSP roles. However, guidance is thin for planning these (one line in LAU3), and the role manager descriptions often trigger long discussions. Many coaches already have their own ‘private’ cache of role manager guidance (e.g., scripts and/or checklists). This guidance should be standardized (e.g., a checklist for every team role that would match up with TASK list items and the planned time to execute the role responsibilities).
Team role descriptions, including especially the addition of a sample checklist for each standard role
More consistent planning and performance of TSP roles, shortened discussions in and after the launch, and improved
CMMI conformance for many SPs/GPs (e.g., PP SP2.3, PMC SP1.4, GP2.2, GP2.3, GP2.4, GP2.6, GP2.8, GP2.9)
When completed and reviewed, submit to the Process Manager and keep a copy.
Do not write below this line.
PIP Control # Accepted
Received Returned
Evaluated Deferred
Effort involved Date done
Author notified
Reasons
References/Bibliography
[Bala 2007]
Bala, Karthik & Bala, Guha. Game On! An Industry’s Journey. Proceedings of the TSP 2007
Symposium. www.sei.cmu.edu/tspsymposium/2010/proceedings.cfm
[Carleton 1992]
Carleton, Anita D.; Park, Robert E.; Goethert, Wolfhart B.; Florac, William A.; Bailey, Elizabeth
K.; & Pfleeger, Shari Lawrence. Software Measurement for DoD Systems: Recommendations for
Initial Core Measures (CMU/SEI-92-TR-019). Software Engineering Institute, Carnegie Mellon
University, 1992. www.sei.cmu.edu/library/abstracts/reports/92tr019.cfm
[Chick 2006]
Chick, Timothy A. “Using TSP with a Multi-Disciplined Project Management System.”
Crosstalk, March 2006. www.crosstalkonline.org/storage/issue-archives/2006/200603/200603-
Chick.pdf
[CMMI 2010]
CMMI Product Team. CMMI for Development Version 1.3 (CMMI-DEV v1.3) (CMU/SEI-2010-
TR-033). Software Engineering Institute, Carnegie Mellon University, 2010.
www.sei.cmu.edu/library/abstracts/reports/10tr033.cfm
[Crosby 1980]
Crosby, Philip B. Quality is Free. Mentor, 1980 (ISBN 978-0451622471).
[Davis 2002]
Davis, Noopur; & McHale, James. Relating the Team Software Process (TSP) to the Capability
Maturity Model for Software (SW-CMM) (CMU/SEI-2002-TR-008). Software Engineering
Institute, Carnegie Mellon University, 2002.
www.sei.cmu.edu/library/abstracts/reports/02tr008.cfm
[Davis 2003]
Davis, Noopur; & Mullaney, Julia. The Team Software Process (TSP) in Practice: A Summary of
Recent Results (CMU/SEI-2003-TR-014). Software Engineering Institute, Carnegie Mellon
University, 2003. www.sei.cmu.edu/library/abstracts/reports/03tr014.cfm
[Deming 1982]
Deming, W. Edwards. Out of the Crisis. MIT Press, 1982 (ISBN 0-911379-01-0).
[EB 2008]
Executive Briefing: Technology Management Resource for Business Leaders. Pairing CMMI
and Six Sigma for Optimal Results, July 2008. www.executivebrief.com/cmmi/cmmi-six-sigma-
pairing-results/
[Fleshman 2010]
Fleshman, Jenna & Huibregtse, Jason. Integrating Software Development and CMMI Using TSP.
Proceedings of the TSP 2010 Symposium. www.sei.cmu.edu/tspsymposium/past-
proceedings/2010/IntegratingSWDevandCMMIusingTSP.pdf
[Goethert 2004]
Goethert, Wolfhart & Siviy, Jeannine. Applications of the Indicator Template for Measurement
and Analysis (CMU/SEI-2004-TN-024). Software Engineering Institute, Carnegie Mellon
University, 2004. www.sei.cmu.edu/library/abstracts/reports/04tn024.cfm
[Habib 2008]
Habib, M.; Ahmed, S.; Rehmat, A.; Khan, M. J.; & Shamail, S. Blending Six Sigma and CMMI—An Approach to Accelerate Process Improvement in SMEs. Multitopic Conference,
2008, INMIC 2008. IEEE International, 23-24 December 2008.
http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=4777768&tag=1
[Humphrey 1995]
Humphrey, Watts S. A Discipline for Software Engineering. Addison Wesley, 1995 (ISBN 0-201-
54610-8).
[Humphrey 2000]
Humphrey, Watts S. The Team Software Process (TSP) (CMU/SEI-2000-TR-023). Software
Engineering Institute, Carnegie Mellon University, 2000.
www.sei.cmu.edu/library/abstracts/reports/00tr023.cfm
[Humphrey 2005]
Humphrey, Watts S. PSP: A Self-Improvement Process for Software Engineers. Addison Wesley,
2005 (ISBN 0-321-30549-3). www.sei.cmu.edu/library/abstracts/books/0321305493.cfm
[Humphrey 2006]
Humphrey, Watts S. TSP: Coaching Development Teams. Addison Wesley, 2006 (ISBN 978-
0201731132). www.sei.cmu.edu/library/abstracts/books/201731134.cfm
[Humphrey 2010]
Humphrey, Watts S.; Chick, Timothy A.; Nichols, William; & Pomeroy-Huff, Marsha. The Team
Software Process Body of Knowledge (TSP BOK) (CMU/SEI-2010-TR-020). Software
Engineering Institute, Carnegie Mellon University, 2010.
www.sei.cmu.edu/library/abstracts/reports/10tr020.cfm
[Humphrey 2011]
Humphrey, Watts S. & Over, James W. Leadership, Teamwork, and Trust: Building a
Competitive Software Capability. Addison Wesley Professional, 2011 (ISBN 978-0321624505).
[Jones 2009]
Jones, Capers. Software Engineering Best Practices: Lessons from Successful Projects in the Top
Companies. McGraw-Hill, 2009 (ISBN 978-0-07-162161-8).
[McFeeley 1996]
McFeeley, Bob. IDEAL: A User’s Guide for Software Process Improvement (CMU/SEI-96-HB-
001). Software Engineering Institute, Carnegie Mellon University, 1996.
www.sei.cmu.edu/library/abstracts/reports/96hb001.cfm
[McHale 2004]
McHale, James; Wall, Daniel S.; Humphrey, Watts; & Konrad, Mike. Mapping TSP to CMMI
(CMU/SEI-2004-TR-014). Software Engineering Institute, Carnegie Mellon University, 2004.
www.sei.cmu.edu/library/abstracts/reports/04tr014.cfm
[Miluk 2010]
Miluk, Eugene; Chick, Timothy A.; & McHale, James. Guide for SCAMPI Appraisals:
Accelerated Improvement Method (AIM) (CMU/SEI-2010-SR-021). Software Engineering
Institute, Carnegie Mellon University, 2010.
www.sei.cmu.edu/library/abstracts/reports/10sr021.cfm
[Motorola 2010]
Motorola University. What is Six Sigma? Last accessed Jan. 2001.
www.motorola.com/web/Business/_Moto_University/_Documents/_Static_Files/What_is_SixSig
ma.pdf
[Nichols 2009]
Nichols, William R. & Salazar, Rafael. Deploying TSP on a National Scale: An Experience
Report from Pilot Projects in Mexico (CMU/SEI-2009-TR-011). Software Engineering Institute,
Carnegie Mellon University, 2009. www.sei.cmu.edu/library/abstracts/reports/09tr011.cfm
[Park 1996]
Park, Robert; Goethert, Wolfhart; & Florac, William. Goal-Driven Software Measurement: A
Guidebook (CMU/SEI-96-HB-002). Software Engineering Institute, Carnegie Mellon University,
1996. www.sei.cmu.edu/abstracts/reports/96hb002.cfm
[Paulk 1994]
Paulk, Mark C.; Weber, Charles V.; & Chrissis, Mary Beth. The Capability Maturity Model:
Guidelines for Improving the Software Process. Addison Wesley, 1994 (ISBN 0-201-54664-7).
[Saint-Armand 2007]
Saint-Armand, David C. & Hodgins, Bradley. Results of the Software Process Improvement of the
Early Adopters in NAVAIR 4.0. Naval Air Warfare Center Weapons Division, 2007.
www.dtic.mil/cgi-bin/GetTRDoc?Location=U2&doc=GetTRDoc.pdf&AD=ADA482027
[Sasao 2010]
Sasao, Shigeru; Nichols, William; & McCurley, James. Using TSP Data to Evaluate Your Project
Performance (CMU/SEI-2010-TR-038). Software Engineering Institute, Carnegie Mellon
University, 2010. www.sei.cmu.edu/library/abstracts/reports/10tr038.cfm
[SCAMPI 2006a]
SCAMPI Upgrade Team. Standard CMMI Appraisal Method for Process Improvement (SCAMPI)
A, Version 1.2 Method Definition Document (CMU/SEI-2006-HB-002). Software Engineering
Institute, Carnegie Mellon University, 2006.
www.sei.cmu.edu/library/abstracts/reports/06hb002.cfm
[SCAMPI 2006b]
SCAMPI Upgrade Team. Appraisal Requirements for CMMI, Version 1.2 (ARC v1.2) (CMU/SEI-
2006-TR-011). Software Engineering Institute, Carnegie Mellon University, 2006.
www.sei.cmu.edu/library/abstracts/reports/06tr011.cfm
[SEI 2010]
Software Engineering Institute. CMMI for Development SCAMPI Class A Appraisal Results, 2009
Update. March 2010.
www.sei.cmu.edu/cmmi/casestudies/profiles/pdfs/upload/2010MarCMMI.pdf
[Seshagiri 2009]
Seshagiri, Girish. The Benefits of CMMI: Case Study of a Small Business CMMI Level 5
Organization. CMMI 9th Technology and User Group Conference, Executive Panel, 17 November
2009. www.dtic.mil/ndia/2009CMMI/ExecPanelSeshagiri.pdf
[Shewhart 1980]
Shewhart, Walter. Economic Control of Quality of Manufactured Product. American Society for
Quality, 1980 (ISBN 978-0873890762).
[Siviy 2005]
Siviy, Jeannine; Penn, M. Lynn; & Harper, Erin. Relationships Between CMMI and Six Sigma
(CMU/SEI-2005-TR-005). Software Engineering Institute, Carnegie Mellon University, 2005.
www.sei.cmu.edu/library/abstracts/reports/05tr005.cfm
[Siviy 2008]
Siviy, Jeannine M.; Stoddard, Robert W.; & Penn, M. Lynn. CMMI and Six Sigma. Addison-
Wesley, 2008 (ISBN 978-0321516084).
[Stoddard 2008]
Stoddard, Robert W.; Goldenson, Dennis; Zubrow, David; & Harper, Erin. CMMI High Maturity
Measurement and Analysis Workshop Report: March 2008 (CMU/SEI-2008-TN-027). Software
Engineering Institute, Carnegie Mellon University, 2008.
www.sei.cmu.edu/library/abstracts/reports/08tn027.cfm
[Wall 2007]
Wall, Daniel; McHale, James; & Pomeroy-Huff, Marsha. Case Study: Accelerating Process
Improvement by Integrating the TSP and CMMI (CMU/SEI-2007-TR-013). Software
Engineering Institute, Carnegie Mellon University, 2007.
www.sei.cmu.edu/library/abstracts/reports/07tr013.cfm
REPORT DOCUMENTATION PAGE (Form Approved, OMB No. 0704-0188)
1. Agency Use Only: (Leave Blank)
2. Report Date: December 2010
3. Report Type and Dates Covered: Final
4. Title and Subtitle: Implementation Guidance for the Accelerated Improvement Method (AIM)
5. Funding Numbers: FA8721-05-C-0003
6. Author(s): James McHale, Timothy A. Chick, & Eugene Miluk
7. Performing Organization Name(s) and Address(es): Software Engineering Institute, Carnegie Mellon University, Pittsburgh, PA 15213
8. Performing Organization Report Number: CMU/SEI-2010-SR-032
9. Sponsoring/Monitoring Agency Name(s) and Address(es): HQ ESC/XPK, 5 Eglin Street, Hanscom AFB, MA 01731-2116
10. Sponsoring/Monitoring Agency Report Number:
11. Supplementary Notes:
17. Security Classification of Report: Unclassified
18. Security Classification of This Page: Unclassified
19. Security Classification of Abstract: Unclassified
20. Limitation of Abstract: UL
NSN 7540-01-280-5500 — Standard Form 298 (Rev. 2-89), Prescribed by ANSI Std. Z39-18, 298-102