Wednesday, 10 November 2010

When Measurement Programmes go Viral...

I think it’s fair to say that only a foolish manager or leader would try to run an organisation, department or even a project on subjective judgment alone. Gut feelings and intuition should not be ignored, but they need to be backed up and supported by facts, and often the best facts are quantitative. I have come across plenty of managers who follow the ostrich tendency: they genuinely believe they understand how their organisations perform, yet reject any kind of data collection. Luckily they are in the minority, and they aren’t the focus of this post.

If you ever attend a course, seminar or conference presentation on implementing a metrics programme in an organisation, you will usually hear several recurring themes:

  1. Metrics collection is only worthwhile if you use the numbers to take action
  2. The cost of data collection must not outweigh the value of the data (a back-of-the-envelope illustration follows this list)
  3. Data quality is paramount – poor-quality data is usually worse than useless
  4. Start by collecting the most useful data, aligned with your objectives, and build your programme from there
  5. Continuously review the data you collect and get rid of measures that no longer add value
  6. Don’t use the numbers to beat up your people
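
To put some rough numbers on theme 2 – purely illustrative ones – it is worth sketching the cost side before mandating anything. A minimal Python sketch, where every figure is a hypothetical placeholder for your own:

    # Back-of-the-envelope cost of manual metrics collection.
    # Every figure below is hypothetical - substitute your own.
    project_managers = 40    # people submitting data each period
    hours_per_week = 2.0     # manual data-entry effort per person
    loaded_rate = 60.0       # fully loaded cost per hour
    working_weeks = 46       # working weeks per year

    annual_hours = project_managers * hours_per_week * working_weeks
    annual_cost = annual_hours * loaded_rate

    print(f"{annual_hours:,.0f} hours/year, costing {annual_cost:,.0f}")
    # -> 3,680 hours/year, costing 220,800

If the actions taken from the data are not plausibly worth that sum, theme 2 says the programme is already in trouble.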


The Measurement and Analysis process area of CMMI is one of the easiest PAs in the model to understand. In the main, it is written in jargon-free English, and it follows both a logical and a chronological path. In short, it ain’t rocket science.

Yet in almost every organisation I’ve worked in, it brings fear to the hearts of the process teams, fills project managers with dread, and often has so-called metrics subject matter experts rubbing their hands with glee at the thought of the power they will be able to wield. In some organisations, some managers will delight in the prospect of new ways to command and control their workforce, whilst others will react with complete apathy (“we’ve seen it all before and it won’t work”), and still others will share the same fear as their staff.

So why do so many organisations get it so badly wrong?

A few organisations strike a good balance with their measurement programmes, but too many fall into one extreme or the other. Some simply pay lip service to the requirements and do just enough to believe they’ll get through an appraisal. In this piece, though, I want to look at the organisations that go to the opposite extreme and overwhelm themselves with useless, redundant and time-consuming measurement activities – again with the simple objective of meeting the needs of an appraisal rather than focusing on the real needs of the business. These are the organisations where measurement programmes have gone viral.

Imagine the scenario: an organisation has set a goal of achieving CMMI Level 3 within 24 months (I know – it’s a bad goal, but we all know it happens). The SEPG and steering committee agree that they need to ramp up the measurement programme – not because of any genuine business need, but because of an arbitrary perception that they don’t have enough going on to achieve Level 3. A naïve but ambitious PMO manager, rather than a measurement or business expert (or even a software engineer), is appointed to head up the programme.

Within weeks, a deluge of new measures is mandated, and the already overextended project managers now have the burden of collecting and submitting each and every one through a system of manual data entry sheets, to yet more arbitrary timescales. No explanation is offered of how the data will be used or how it will benefit the organisation. Of course, none is necessary: collecting data is a “good thing” and a CMMI requirement, and therefore it must follow that more is better. A team is put together to generate a set of internal dashboards built from complex Excel spreadsheets and macros, and a compliance team is set up to monitor the whole activity and ensure there are no missing values. The spreadsheets are published to senior managers and a series of compulsory review meetings is established. Each and every aspect of each and every project is examined, and corrective actions are created to bring deviants back into line. The CMMI requirements are thereby addressed and the outcome of the forthcoming appraisal is in the bag. Everyone is happy (except the PMs, but they don’t count).


With this success story behind them, there is only one direction for the measurement programme to take – the accumulation of more data, the development of bigger and better dashboards, and the total domination of the PMO across the enterprise. Yup – we’ve gone viral.

Of course, the reality is that behind the scenes the project teams are making up the numbers to comply with the data collection process – submitting something is better than getting into trouble for failing to submit anything at all. Managers largely ignore the data, because they are overwhelmed by it and don’t really understand what all the numbers mean anyway. Dashboard reviews become repeats of the other management review meetings, and no real or useful analysis can be performed at any organisational level because the data is just one humongous, homogeneous blob.

In many cases the organisation will achieve its goal of reaching Level 3, because it has done “enough” to get away with it, but everyone knows it’s a bit of a sham. Emphasis on measurement falls away over the next six months (along with all the other non-institutionalised processes), and the pre-appraisal status quo is restored – until the next appraisal in three years’ time, when the frenzy starts all over again.

So what can you do to inoculate yourself against this behaviour?

  1. Look at the M&A process area in detail – nowhere does it tell you that a certain number of measures must be in place to operate at any specific CMMI level
  2. Align a few key measures with your specific business objectives, and focus on getting good-quality data. Involve business leaders in selecting the measures, so that the measures address the problems that actually cause them pain
  3. Look for measures that will help projects and project teams, not hinder them, and don’t overburden projects with demands for duplicate data or data that can be found elsewhere
  4. Provide explanations of how to interpret the data. If you can’t do this, you have no right to demand the data in the first place. Remember, too, that different groups of people have different objectives, so do not take a one-size-fits-all approach
  5. Perform appropriate analysis and publish your findings, along with recommendations of corrective actions that can, should or must be taken
  6. Review your measures on a regular basis and throw out those that are redundant or no longer add value to the organisation. Review the cost of data collection at the same time
  7. Get reactions from the “shop floor”. If your metrics programme is causing your people pain, it’s probably doing something wrong and needs fixing
  8. Align the expectations of managers and information providers – in other words, perform some serious stakeholder analysis and relationship management
  9. Instead of worrying blindly about compliance in providing data, ask why the data might not be forthcoming; and instead of analysing missing data, examine outliers and significant deviations from expected values (a sketch of this follows the list)
  10. Stop treating CMMI as an objective and focus on doing the right thing for the business, with CMMI as a reference model (one of many, if possible) to guide you
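
To make point 9 concrete, here is a minimal Python sketch. The project names, the schedule-variance figures and the two-sigma threshold are all hypothetical – the point is that the interesting conversations are about the outlier and the silence, not about filling in every cell:

    # A sketch of point 9: note missing data, but spend the analysis
    # effort on values that deviate significantly from the norm.
    # All figures are made up; the 2-sigma threshold is illustrative.
    from statistics import mean, stdev

    # Hypothetical schedule-variance figures (%) reported by projects;
    # None means the project submitted nothing this period.
    reported = {"P01": 4.0, "P02": -2.5, "P03": 5.0, "P04": None,
                "P05": 3.5, "P06": 41.0, "P07": -1.0, "P08": 2.0}

    values = [v for v in reported.values() if v is not None]
    mu, sigma = mean(values), stdev(values)

    for project, variance in reported.items():
        if variance is None:
            print(f"{project}: no data - ask why, don't just log a violation")
        elif abs(variance - mu) > 2 * sigma:
            print(f"{project}: {variance}% deviates sharply - worth a conversation")

In practice you would want something more robust than a simple mean and standard deviation (a big outlier inflates both), but even this crude filter points attention in a more useful direction than a missing-values compliance report.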


2 comments:

1. I'm all for valid, useful data. But I'd place the emphasis on "valid" and "useful", and de-emphasise "data". By this I mean: go and seek out information on what's really happening and "how things really work around here" (go to the gemba), and use THAT knowledge as the basis for decisions.

   The few times I'd use numeric data (i.e. metrics) are to help the poor human brain see trends, such as with SPC control charts, burn-down charts, etc., and otherwise to make opaque facts visible and accessible.
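
   For example, the heart of a control chart is just a centre line and control limits that make unusual points jump out. A rough sketch with made-up weekly defect counts (a real chart would derive its limits more carefully, e.g. from moving ranges):

       # Crude control-chart logic: flag points outside mean +/- 3 sigma.
       # The defect counts below are invented for illustration.
       from statistics import mean, stdev

       defects = [7, 5, 8, 6, 9, 7, 6, 22, 8, 7, 5, 6]  # weekly counts

       centre = mean(defects)              # centre line
       spread = 3 * stdev(defects)
       ucl = centre + spread               # upper control limit
       lcl = max(0.0, centre - spread)     # lower limit, floored at zero

       for week, count in enumerate(defects, start=1):
           flag = "  <-- out of control" if not (lcl <= count <= ucl) else ""
           print(f"week {week:2d}: {count}{flag}")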

   HTH

   - Bob
