CMMI Level 5 and the Team Software Process

Transcription

David R. Webb, 309th Software Maintenance Group
Dr. Gene Miluk, Software Engineering Institute
Jim Van Buren, The Charles Stark Draper Laboratory

In July 2006, the 309th Software Maintenance Group (309th SMXG) at Hill Air Force Base, Utah, was appraised at Capability Maturity Model Integration (CMMI) Level 5. One focus project had been using the Team Software Process (TSP) since 2001. TSP is generally considered a Level 5 process; however, during the preparation for the assessment, it became obvious to the team that even the stringent process and data analysis requirements of the TSP did not completely address CMMI requirements for several process areas (PAs). The TSP team successfully addressed these issues by adapting their process scripts, measures, and forms in ways that may be applicable to other TSP teams.

(Team Software Process, Personal Software Process, PSP, TSP, and SCAMPI are service marks of Carnegie Mellon University. Capability Maturity Model and CMM are registered in the U.S. Patent and Trademark Office by Carnegie Mellon University.)

In July 2006, the 309th SMXG was appraised at CMMI Level 5. One of the 309th's focus projects, the Ground Theater Air Control System (GTACS) project, had been using the TSP since 2001. The team had achieved a four-fold increase in productivity during that time, had released zero defects since the TSP was adopted, and had been internally assessed at a high maturity by the group's quality assurance team. GTACS team members felt confident they could meet the rigors of a CMMI assessment and achieve their group's goal of Level 5.

Watts Humphrey, who is widely acknowledged as the founder of the Capability Maturity Model (CMM) approach to improvement and who later created the Personal Software Process (PSP) and TSP, has noted that one of the intents of PSP and TSP is to be an operational process enactment of CMM Level 5 processes at the personal and project levels, respectively [1]. CMM, and later the CMMI, were always meant to provide a description of the contents of a mature process, leaving the implementer with the task of defining and enacting these mature processes. Thus, CMM and CMMI are descriptive, not prescriptive, models. The TSP goal of being an operational Level 5 process implies that a team practicing TSP out of the box should be very close to being Level 5.

The 309th is a large organization of nearly 800 employees, both civil service and contractors. The group comprises five squadrons, each with a different focus or product line. 309th management and the Software Engineering Process Group (SEPG) set group policy and define a group-level process and metrics framework. Each squadron applies the group-level process to its technical domain. Projects like GTACS must therefore ensure their detailed project processes are consistent with their squadron's process and with group-level guidance. The GTACS project is also divided into several sub-teams, all managed as one project. The GTACS software team, which performs most of the GTACS-assigned technical efforts, uses TSP to support its work. A separate Configuration Management (CM) team provides CM services. The project's customer, the GTACS Program Office, retains systems engineering responsibility and authority.
This diverse organizational structure is important because several of the CMMI issues that needed to be addressed were clearly the responsibility of these other entities and were not GTACS TSP team issues, other than alignment and coordination.

Assessment Timeline

In order to prepare for the assessment, the 309th SMXG conducted a series of Standard CMMI Appraisal Method for Process Improvement (SCAMPI) assessments, which included the GTACS team. There are three kinds of SCAMPI assessments: A, B, and C. The SCAMPI A assessment is the final review during which a CMMI level can be determined. SCAMPI Bs and Cs are less rigorous and are intended to prepare the team for the full SCAMPI A. The 309th SMXG used SCAMPI Bs to ensure compliance with the model and value added to the enterprise. In general, the SCAMPI B teams were told to aggressively identify risks to a successful SCAMPI A appraisal. When the SCAMPI B teams identified a process weakness, they assigned a high, medium, or low risk rating based on the seriousness of the noted weakness.

From the perspective of the TSP team, there were four types of weaknesses: non-team, process, artifact, and document. The non-team weaknesses were those that were the responsibility of a team other than the TSP team, such as the group's SEPG or the GTACS CM team. Examples include policy changes or changes to the CM process. Process weaknesses indicate that the team had no process in place. An artifact weakness meant the assessment team found insufficient artifacts to pass the assessment. A document weakness meant the team's process documentation needed to be updated.

The initial SCAMPI B for the GTACS focus project was held about one year before the SCAMPI A final assessment and identified 86 weaknesses. A summary of the counts and types of these weaknesses is found in Table 1.

Table 1: SCAMPI B1 Noted Weaknesses. Counts by risk level: 19 high, 67 medium, 0 low*, 86 total; of these, 16 were process risks (1 high, 15 medium). *Low risks were not categorized in the first SCAMPI B.

Not all weaknesses were project focused. Some were organizational and some were squadron focused. Of the project-focused risks, many were the responsibility of either overarching project management (e.g., data management and stakeholder involvement plans) or the CM group. The remaining issues were the responsibility of the TSP team. Most issues were focused within the Decision Analysis and Resolution (DAR) and Causal Analysis and Resolution (CAR) PAs. The specifics of each of these are discussed in the PA sections below.

Based on the results of this initial SCAMPI B, the team continued its project work. The major focus was on executing the team's CAR process and addressing the documentation and process framework issues. Significantly, the team did not devote any special resources to the CMMI preparatory effort. Preparatory work was done by the team, led by the team's process manager (a standard TSP role), as part of normal work duties. About four months into this effort, the 309th realized that DAR could not be solely addressed at the organizational level, and a new process requirement for DAR implementation was pushed down to the project level. The team's TSP coach developed a draft process script and team training was conducted. No opportunity to execute the DAR process occurred before the second SCAMPI B.

The weaknesses and risks identified by the second SCAMPI B are shown in Table 2. It is important to note that the assessment team for the second SCAMPI B was different than the first, and that this team chose to identify areas for improvement in the low-risk areas, whereas the first team did not. These new results gave the GTACS team a different and more thorough understanding of the remaining weaknesses.

Table 2: SCAMPI B2 Noted Weaknesses. Counts by risk level and weakness type (non-team, process, artifact, and document); the principal groupings are summarized in the text below.

Of the weaknesses noted, there were three groupings: DAR (seven High Artifact, three Medium Artifact, and one Low Document); Organizational Process Performance (OPP) (13 High Non-Team, one Medium Non-Team, and one Low Non-Team); and Training (one High Artifact, two High Non-Team, 12 Medium Document, one Low Artifact, and one Low Non-Team). The other weaknesses noted were scattered throughout the model. Of these, the most significant for the purposes of this article were the seven Medium Process weaknesses, which reflected the fact that the team had a process gap. Within these seven weaknesses there were three process gaps: 1) a lack of traceability matrices in the team's engineering work packages, 2) a missing checklist item in the team's high-level design inspection checklist, and 3) the team's implementation of statistical process control (SPC) to monitor selected subprocesses. Of these, only the SPC issue required a major change in the team's practices; it is discussed in detail below. The team's approach to requirements traceability had previously been to include traceability information in the textual requirements, design, and test descriptions and to validate traceability via an inspection checklist item. It was straightforward to modify the engineering work package template to include the traceability tables, as the sketch below illustrates. The missing item in the team's high-level design inspection checklist was added, although it had not caused the team issues in the past.
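To illustrate the kind of check the added traceability tables make possible, here is a minimal sketch in Python of a bidirectional coverage check. The requirement, design, and test identifiers and the table layout are hypothetical; they are not the GTACS work package template.

```python
# Hypothetical illustration of a requirements traceability check of the kind the
# engineering work package tables support. Identifiers and structure are invented
# for this sketch; they are not the GTACS team's actual data.

# Each requirement maps to the design elements and test cases that cover it.
traceability = {
    "REQ-001": {"design": ["HLD-3.1"], "tests": ["TC-101", "TC-102"]},
    "REQ-002": {"design": ["HLD-3.2", "HLD-4.1"], "tests": ["TC-110"]},
    "REQ-003": {"design": [], "tests": []},  # an uncovered requirement
}

def untraced(matrix):
    """Return requirements missing either design or test coverage."""
    return [req for req, links in matrix.items()
            if not links["design"] or not links["tests"]]

if __name__ == "__main__":
    gaps = untraced(traceability)
    if gaps:
        print("Requirements lacking full traceability:", ", ".join(gaps))
    else:
        print("All requirements trace to design and test.")
```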
The Software Engineering Institute (SEI) has already performed a theoretical mapping of TSP to CMMI and determined that DAR is partially addressed by the TSP, OPP is supported, Quantitative Process Management (QPM) is 90 percent directly addressed, and CAR is 60 percent directly addressed [2]. As the GTACS team set about to shore up these weaknesses, they determined that these assessments were generally accurate; they also came up with creative ways to update the TSP to completely address all of these PAs.

The PAs

In addition to the weaknesses previously described, there were also minor weaknesses in requirements management and risk management, and two QPM issues. Since the initial preparation for DAR had been only at the group level, there was no DAR process or practice in place for the project. The team's previous process improvement discussions, during their TSP post-mortems, had not produced the artifacts necessary to meet CAR requirements: the TSP post-mortem process and the PSP Process Improvement Proposal (PIP) process do not require the quantitative analysis that CAR, with its link to QPM, does. The team had not formalized its requirements management process, and its documented risk management process was not consistent with the TSP risk management process. The QPM risks were labeled as medium risks and related to a lack of thresholds and control limits.

DAR

One of the innovations the team came up with was in their approach to the Level 3 requirement for decision analysis and resolution. Initially, GTACS addressed its DAR requirements by adopting the organization's DAR processes and forms, and organizational DAR training was held for the team. GTACS created a draft operational process in the form of a TSP script. The DAR script was then used by the team to analyze three different types of issues: product design, tool selection, and process. The final DAR process was then updated and included in the team's standard process (see Figure 1).

The SEI's report on TSP and CMMI identified all six DAR-specific practices as partially implemented and identified various launch meetings as points where DAR activities are implemented. We believe this "partially implemented" characterization underestimates the risk and resulting effort that TSP teams will face to meet DAR CMMI requirements. A better characterization is that TSP is consistent with the DAR philosophy but is nowhere near sufficient. DAR is, at its heart, a systems engineering sub-process for making and documenting formal decisions. In some ways it is as critical to the systems engineering culture as inspections are to software engineering or personal reviews are to the PSP/TSP approach. CMMI has elevated DAR from a practice to a full-fledged PA, and although TSP is consistent with DAR, TSP is insufficient to pass a CMMI assessment. A procedure like that in Figure 1 is required to produce proper and meaningful DAR artifacts.

A TSP team must also be trained in the application of DAR. Depending on the background of the team members, this training may involve getting software engineers to think like systems engineers. For the GTACS team, this was surprisingly difficult. While a DAR process like that detailed in Figure 1 may appear straightforward and obvious, software engineers may question its applicability. For years we have observed good systems engineers following processes like this to make and document their systems designs and design tradeoffs. By contrast, it has been significantly more difficult to get purely software engineers to document their design reasoning with the same rigor. It is, however, a basic engineering practice that can be easily learned. Our experience with the GTACS team confirmed this observation: software engineers are unfamiliar with systems engineering techniques for formal decision making and documentation, but they can be easily trained to use these techniques.

Figure 1: The GTACS Team's DAR Process Script

Purpose: To guide the team in making formal decisions.

Entry Criteria: Either
- A critical measurement exceeds the thresholds defined in the GTACS DAR threshold matrix, or
- A critical decision needing a formal analysis is identified. Critical decisions are ones that have a potential impact on the project or project team. Issues with multiple alternative approaches and multiple evaluation criteria are particularly well suited for formal analysis.
This procedure may also be used to make and document other decisions.

Step 1:
- A Point of Contact (POC) is assigned. The POC may be self-assigned if the POC is responsible for the critical decision; otherwise, the team lead assigns the POC.
- The team that will perform the DAR analysis and selection activities (the DAR team) is assigned.
- The POC completes the Entry section of the MXDE Decision Analysis and Resolution Coversheet (section I).
- A working directory is created to hold the DAR artifacts.
- An action item is created in the Project Notebook to track the status of the DAR.
- The approval signatures required for this DAR are determined. For DARs initiated because a critical measurement exceeds the thresholds defined in the GTACS DAR threshold matrix, the approval signatures are documented in the Stakeholder Involvement Plan (SIP). For other DARs, the GTACS Technical Program Manager is the approval authority.

Step 2:
- The POC identifies stakeholders for this DAR activity. These include: those who provide the alternatives, risks, and historical data; the DAR team; and those who will implement the decision the DAR results in.

Step 3:
- The DAR team obtains input from the stakeholders: alternative approaches (there is no limit to the number of alternative approaches identified), evaluation criteria and relative weighting, and key risks.

Step 4:
- The DAR team determines the evaluation criteria and relative weighting after considering the input from all stakeholders.
- The DAR team reviews the evaluation criteria with the stakeholders before finalizing the criteria.

Step 5, Selection Method:
- The DAR team determines the ranking and scoring method. Suggested ranking and scoring methods are found in the DAR Tools document. The DAR team must agree on a scoring method and scoring range, and have a common understanding of what the scores represent.
- The selected approach is documented on the MXDE Decision Analysis and Resolution Coversheet (section II).

Step 6, Rank Each Approach:
- For each alternative, the DAR team assigns a score to each decision criterion, employing the ranking and scoring method previously selected.
- The total weighted score for each alternative is determined.

Step 7, Make a Decision:
- The DAR team makes a decision and reviews it with the stakeholders, making changes if necessary.
- The stakeholder review is captured on the MXDE Decision Analysis and Resolution Coversheet (section III).
- The final decision is captured on the MXDE Decision Analysis and Resolution Coversheet (section IV).
- The effort expended on this DAR is captured on the MXDE Decision Analysis and Resolution Coversheet (section IV).
- Approval signatures are obtained and recorded on the MXDE Decision Analysis and Resolution Coversheet (section IV).

Step 8, Post-Mortem:
- DAR lessons learned are captured in the DAR notes.
- All DAR documents, including the completed MXDE Decision Analysis and Resolution Coversheet and the scoring and analysis worksheets, are captured and archived per the GTACS Data Management Plan (DMP).
- CM is notified that the DAR is complete and that the DAR artifacts can be archived to the GTACS data management repository.

Exit Criteria:
- The MXDE Decision Analysis and Resolution Coversheet is completely filled out.
- The artifacts produced during the DAR activities have been archived in accordance with the GTACS DMP.
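Steps 5 and 6 of the script reduce to an ordinary weighted decision matrix: each alternative is scored against each weighted criterion and the weighted scores are totaled. The Python sketch below shows that arithmetic with made-up criteria, weights, and alternatives; the article does not prescribe a particular scoring scale or tool.

```python
# Minimal weighted-decision-matrix arithmetic, as in steps 5-6 of a DAR-style
# analysis. All criteria, weights, alternatives, and scores are hypothetical.

criteria = {          # criterion -> relative weight (agreed with stakeholders)
    "cost": 0.4,
    "schedule_risk": 0.35,
    "maintainability": 0.25,
}

# alternative -> criterion -> score on an agreed scale (here 1 = worst, 5 = best)
scores = {
    "Tool A": {"cost": 4, "schedule_risk": 3, "maintainability": 2},
    "Tool B": {"cost": 2, "schedule_risk": 5, "maintainability": 4},
    "Tool C": {"cost": 3, "schedule_risk": 4, "maintainability": 5},
}

def weighted_totals(criteria, scores):
    """Return each alternative's total weighted score."""
    return {
        alt: sum(criteria[c] * s for c, s in per_criterion.items())
        for alt, per_criterion in scores.items()
    }

if __name__ == "__main__":
    totals = weighted_totals(criteria, scores)
    for alt, total in sorted(totals.items(), key=lambda kv: kv[1], reverse=True):
        print(f"{alt}: {total:.2f}")
    # The highest total is a candidate decision; the DAR team still reviews it
    # with stakeholders before it is finalized (step 7).
```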
QPM and OPP

One contentious area surrounding CMMI high-maturity appraisals and organizations is the definition and operationalization of Maturity Level 4: Quantitatively Managed. The formative book on CMMI, CMMI: Guidelines for Process Integration and Product Improvement, describes Maturity Level 4 as follows [3]:

    Maturity Level 4: Quantitatively Managed. At maturity level 4, the organization and projects establish quantitative objectives for quality and process performance and use them as criteria in managing processes. Quantitative objectives are based on the needs of the customer, end users, organization, and process implementers. Quality and process performance is understood in statistical terms and is managed throughout the life of the processes.

    For selected subprocesses, detailed measures of process performance are collected and statistically analyzed. Quality and process performance measures are incorporated into the organization's measurement repository to support fact-based decision making. Special causes of process variation are identified and, where appropriate, the sources of special causes are corrected to prevent future occurrences.

    A critical distinction between maturity levels 3 and 4 is the predictability of process performance. At maturity level 4, the performance of processes is controlled using statistical and other quantitative techniques, and is quantitatively predictable.
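The SPC practice the team had to strengthen for QPM, monitoring selected subprocesses against thresholds and control limits, can be illustrated with a small calculation. The sketch below computes individuals (XmR) control-chart limits in Python; the choice of chart, the code review rate measure, and the sample data are assumptions for illustration, not the GTACS team's actual implementation.

```python
# Sketch of individuals (XmR) control limits for a subprocess measure, e.g. code
# review rate in LOC/hour. The chart choice and the sample data are illustrative
# assumptions, not the GTACS team's actual implementation.

def xmr_limits(observations):
    """Return the center line and natural process limits for an XmR chart."""
    n = len(observations)
    if n < 2:
        raise ValueError("Need at least two observations")
    center = sum(observations) / n
    moving_ranges = [abs(b - a) for a, b in zip(observations, observations[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    # 2.66 is the standard XmR constant (3 / d2, with d2 = 1.128 for spans of 2)
    return {
        "center": center,
        "lcl": center - 2.66 * mr_bar,
        "ucl": center + 2.66 * mr_bar,
    }

if __name__ == "__main__":
    review_rates = [182, 205, 197, 240, 188, 210, 175, 199]  # hypothetical LOC/hour
    limits = xmr_limits(review_rates)
    print(limits)
    signals = [x for x in review_rates
               if not limits["lcl"] <= x <= limits["ucl"]]
    # Points outside the limits indicate special-cause variation worth investigating.
    print("Signals:", signals)
```

Limits like these are one way to supply the thresholds and control limits that the second SCAMPI B flagged as missing; points falling outside them signal special-cause variation that can feed a CAR-style causal analysis.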
