Tuesday, February 24, 2009

Managing Capability and Maturity

Managing Capability and Maturity: Part 2

By: David Norfolk, Practice Leader - Development, Bloor Research
Published: 24th February 2009
Copyright Bloor Research © 2009

In part one of this article, we talked about informal maturity management and why it might be a good thing. Now we want to talk about "the real thing" (CMMI: Capability Maturity Model Integration, from the Software Engineering Institute at Carnegie Mellon) and why it might be worth the effort. But first, a warning: CMMI is not about gaining a shiny medal and boasting about it. As one CMMI ML5 (Maturity Level 5) company once put it to me (more or less): "our marketing department can claim all sorts of wonderful things and point to real case studies that back up their claims. CMMI is all about not fooling ourselves, making sure that we can actually deliver on our marketroids' claims".

And, another thing to remember. It's generally accepted "in the trade" that if you are at CMMI Maturity Level 1 (ML1: managing, possibly successfully, by the seat of your pants and relying on the right people being around because you're not sure exactly what they do), there's no point in looking at CMMI ML5 as a target. It is simply beyond your comprehension. Aim for CMMI ML2 or 3. This means knowing what you've got, where it runs, what it does and who looks after it; and also involves identifying a few "good practices" that are being used and provably deliver something useful to the business, and sharing them with everyone in the organisation. Now, as a goal, that might sound reasonable, and even achievable.

That said, we thought we'd talk to a CMMI ML5 organisation: Aricent (its appraisal record is here—you should always check a CMMI ML claim, to see what it really means) and attempt to convey a glimpse of the vision. If it all seems too airy-fairy, by all means go back to the previous paragraph and stop reading there.

We talked to R. Sathyavageeswaran, an Assistant Vice President for Quality at Aricent, which is one of the largest privately-held companies in Silicon Valley, focussed on providing strategic innovation, technology and outsourcing services to the communications industry.

Bloor: How is CMMI assessment used—for external marketing or internal process?

RS: Aricent has had a long and robust Quality journey, in which we have aligned our Quality Management Systems with the requirements of various international standards and frameworks to meet our business needs.

To give a quick summary, we started off our Quality journey by getting certified to ISO 9001 in 1996, CMM Level 4 in the year 2000, BS7799 Information Security standards in the year 2002, CMMI v1.1 Level 5 in the year 2003, TL9000 (Telecom specific ISO 9001 standards) in the year 2005, ISO 27001 Information Security standards in the year 2007 and now, CMMI v1.2 Level 5 in the year 2008. These frameworks and standards help us in ensuring a very high level of quality in all our deliveries. Alignment to these international standards and frameworks has been primarily to improve our internal processes, while letting the customers benchmark our services with known international standards.

Bloor: OK, so what is the impact of CMMI on your front-line developers?

RS: During our journey towards CMMI assessment, we have been able to implement many process improvements, which have improved schedule/effort compliance and the quality of our deliveries. Some major improvements demonstrated during assessment are given below:

- Implementation of Monte-Carlo Simulation based Process Performance Models to identify sub-processes that have an impact on the project outcome and help the project team manage the projects better. This has helped us to improve our estimations and re-estimations.
- Implementation of a suite of prediction tools to predict defects, schedule, SLA and effort. These include:
  - Test End Date Prediction using the Exponentially Weighted Moving Average (EWMA) technique (a paper on this technique was selected for presentation in a Software Testing Conference held in Bangalore (India), where it was well received—see abstract here).
  - Use of a Rayleigh distribution to predict defects during and after the delivery of the product.
  - Reliability assessment using the Gompertz technique.
  - Use of non-homogeneous Poisson Process (NHPP) based tools to predict post-release defects.
  - Use of Monte-Carlo simulation based prediction of effort and SLAs (for maintenance-type projects).
- Use of Moving Range Control Charts (XmR charts) for tracking critical sub-processes identified at the time of estimations.
- Implementation of a comprehensive Health Index to track the health of projects under maintenance. This has helped us to improve our SLA compliance significantly (by approx 24%) and obtain a corresponding improvement in our Customer Satisfaction Index by 15% during the same period. [The Customer Satisfaction Index is on a scale of +5 to -5 and measures satisfaction levels derived from formal feedback in customer surveys.] This was also selected for presentation in the Global TL9000 conference held in Denver, USA (Sep 2008), where it was well received, and we got invitations to present in other forums as well. Our marketing team even covered this through a press note.
- Use of basic and advanced statistical techniques to baseline performance, with probabilistic modelling capability.
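Aricent's actual prediction models are, of course, proprietary; but to make the EWMA-based test end date idea concrete, here is a minimal hypothetical sketch. It assumes the input is a series of daily test-case completion counts: the daily run rate is smoothed with an EWMA, and the remaining work is extrapolated at that smoothed rate. The function name, data and smoothing factor are illustrative, not Aricent's.

```python
# Hypothetical sketch of EWMA-based test end date prediction.
# Assumption: we smooth the daily test-case completion rate and
# project how many days of testing remain at that smoothed rate.

def ewma_end_date_estimate(daily_completed, total_cases, alpha=0.3):
    """Estimate the number of test days remaining.

    daily_completed: test cases closed on each past day, oldest first
    total_cases:     total test cases planned for the cycle
    alpha:           EWMA smoothing factor (0 < alpha <= 1)
    """
    # Exponentially weighted moving average of the daily run rate:
    # recent days count for more, but history is never discarded.
    rate = daily_completed[0]
    for x in daily_completed[1:]:
        rate = alpha * x + (1 - alpha) * rate

    remaining = total_cases - sum(daily_completed)
    if remaining <= 0:
        return 0.0
    return remaining / rate  # projected days left at the smoothed rate

# A team slowing down as the hard test cases are reached:
days_left = ewma_end_date_estimate([40, 35, 30, 28, 25], total_cases=300)
```

The point of the EWMA (rather than a simple average) is that it tracks a decaying run rate, so the end-date forecast worsens early, while there is still time to act.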
Bloor: Hmm... with our interest in testing, we're rather impressed to find someone putting defect prediction into practice. We also like seeing someone placing numbers against the benefits delivered.

RS: In addition, the following process improvements were successfully demonstrated in many instances during the assessment:

- Tracking review effectiveness using a 2x2 matrix across different phases of projects.
- Implementation of Klocwork, a commercially-available static analysis tool, in our Products Business Unit, resulting in a reduction in memory leaks in our products.
- A suite of tools to plan, identify, and track risks in projects. This includes Risk Profiling and Risk Identification Tracking through intranet tools, and maintenance of a Risk Database.
- Process improvements in the areas of QMS (Quality Management System) integration, Testing Effectiveness, Test Automation, Internal Auditing, etc.
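The review-effectiveness tracking mentioned above is not described in detail, but a common interpretation of such a matrix is phase containment: for each phase, how many defects the phase's review caught versus how many escaped to later phases. Purely as an illustrative sketch, under that assumption (the data and function names are hypothetical):

```python
# Hypothetical sketch of per-phase review effectiveness (phase
# containment). Assumption: "effectiveness" means the fraction of a
# phase's defects caught by that phase's review rather than escaping
# to a later phase. The counts below are invented for illustration.

def review_effectiveness(caught, escaped):
    """Fraction of a phase's defects caught by its own review."""
    total = caught + escaped
    return caught / total if total else 0.0

# (caught in this phase's review, escaped to a later phase)
phases = {
    "design": (18, 6),
    "coding": (40, 10),
}
effectiveness = {p: review_effectiveness(c, e) for p, (c, e) in phases.items()}
```

Tracking this ratio per phase shows where reviews are weakest, which is where extra review effort pays off most.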
All of the above have had a deep impact on our employees. Here's a list of a few benefits:

- For Project Managers: The improved focus on Quantitative Project Management helps project managers manage projects better, right from planning to closure. The statistical process control mechanisms and process performance models that we introduced as part of our journey to CMMI help them predict quality and schedule, thereby reducing negative surprises.
- For Developers and Testers: The focus on a well-defined, documented system, along with training delivery mechanisms, helps developers and testers ramp up quickly on projects. As part of the journey, we enhanced our process assurance on both upstream and downstream life cycle phases, and this has resulted in improved quality of deliveries and improved customer satisfaction, which has a very positive impact on team morale and engagement.
- For the Business team: It has helped the business team predict customer satisfaction and retention (based on the various Quality-Cost-Delivery metrics) and also helped in putting in place internal controls on costs, efficiencies, defect rates, etc.
In addition, as a robust measurement framework is available, all stakeholders in the project are able to see the relevant metrics and take actions at the appropriate time.

Bloor: So, exploring the measured business value delivered a bit further, what is the impact of CMMI ML5 on business outcomes and how will this be measured?

RS: During this journey, we have seen significant improvements in our ability to meet our plans, quality of deliveries, SLA compliance and most importantly, on our Customer Satisfaction Index. These are business critical parameters and we would continue to track these, along with productivity improvement and repeat business from customers.

Bloor: Ah, we like measuring Customer Satisfaction formally. OK, so what next—reassessment? When?

RS: We plan to expand the scope of the assessment to include more business units and more geographical locations. We will track the CMMI model continuously and upgrade our Quality systems to meet any changes that the SEI may incorporate into the model and plan to go for a re-assessment in the next 24–36 months.

Bloor: But, surely, there must be some downsides to all this?

RS: There are no major downsides in a model-based improvement journey. However, as the company goes through inorganic growth via acquisitions of companies that may not have a mature quality system, we would like to keep a close watch on the integration of processes, to ensure that the processes in these acquired entities are enhanced to meet the business needs.

The other possible area to consider would be the suitability of CMMI for really short-duration or low-effort projects. In such cases, it is better to adapt the full-blown CMMI requirements to the needs of the project, which the CMMI model allows anyway through project-level tailoring.

Bloor: Yes, that last is a good point; a degree of flexibility is, indeed, built into CMMI. Thank you.

So, there you have it. Does this all sound useful? Well, no matter. You can start on the maturity journey without CMMI. Get rid of the "blame culture". Start managing your assets. Take a look at ITIL v3. Identify "good practice" and make sure everyone gets to use it. Start measuring business outcomes in a way that makes sense to the business users (who, ultimately, pay your salaries) and relate your budget requests and project reviews to these metrics. And, please, analyse every delivered project to see if you can learn things that will help you do it better next time.

Then revisit CMMI and see if there might be something in it after all. It does seem to be succeeding, and not just with US Defence suppliers—see here for a review of the formal "class A" CMMI appraisal results the SEI knows about. They make interesting reading and may dispel a few misconceptions.
