Wednesday 10 November 2010

When Measurement Programmes go Viral...

I think it’s a fair comment to say that only a foolish manager or leader would try to run an organisation, department or even a project based on subjective judgment alone. Gut feelings and intuition should not be ignored, but they need to be backed up and supported by facts, and often the best facts are quantitative. I have come across plenty of managers who follow the ostrich tendency and genuinely believe that they understand how their organisations perform but reject any type of data collection. Luckily they are in a minority, and they aren’t the focus of this post.

If ever you attend a course, seminar or conference presentation on the implementation of a metrics programme in an organisation there are usually several repeating themes:

  1. Metrics collection is only worthwhile if you use the numbers to take actions
  2. The cost of data collection must not outweigh the value of the data
  3. Data quality is paramount – poor quality data is usually worse than useless
  4. Start off by collecting the most useful data, based on your objectives, and build your programme on that basis
  5. Continuously review the data you collect and get rid of those measures which no longer add value
  6. Don’t use the numbers to beat up your people


    The Measurement and Analysis process area of CMMI is one of the easiest PAs in the model to understand. In the main, it is written in jargon-free English and it follows both a logical and a chronological path. In short, it ain’t rocket science.

    Yet in almost every organisation I’ve worked in it brings fear to the heart of the process teams, fills project managers with dread and often has so-called metrics subject matter experts rubbing their hands with glee at the thought of the power they will be able to wield. In some organisations some managers will also delight in the prospect of new ways to command and control their workforce whilst others will react with complete apathy (“we’ve seen it all before and it won’t work”), and still others will share the same fear as their staff.

    So why do so many organisations get it so badly wrong?

    There are a few places that get a good balance with their measurement programmes, but too many fall into one extreme or the other. Some organisations simply pay lip service to the requirements and do just enough to think they’ll get through an appraisal. But in this piece I want to look at the organisations that go to the opposite extreme and overwhelm the organisation with useless, redundant and time consuming measurement activities – again with the simple objective of meeting the needs of an appraisal rather than focusing on the real needs of the business. These are the organisations where measurement programmes have gone viral.

    Imagine the scenario: an organisation has set a goal of achieving CMMI Level 3 within 24 months (I know – it’s a bad goal, but we all know it happens). The SEPG and steering committee agree that they need to ramp up the measurement programme because, based on arbitrary perception rather than any genuine business need, they believe they don’t have enough going on to achieve Level 3. A naïve but ambitious PMO manager, as opposed to a management or business expert (or even a software engineer), is appointed to head up the programme.

    Within weeks, a deluge of new measures is mandated and the already overextended project managers now have the burden of collecting and submitting each and every new measure through a system of manual data entry sheets within yet more arbitrary timescales. No explanations are available as to how the data is to be used or how it will benefit the organisation. Of course, none are necessary because collecting data is a “good thing” and a CMMI requirement, and therefore it must follow that more is better. A team is put together to generate a set of internal dashboards which are built using complex Excel spreadsheets and macros, and a compliance team is set up to monitor the whole activity to ensure that there are no missing values. The spreadsheets are published to senior managers and a series of compulsory review meetings is established. Each and every aspect of each and every project is examined and corrective actions are created to bring deviants back in line. CMMI requirements are therefore addressed and the outcome of the forthcoming appraisal is in the bag. Everyone is happy (except the PMs, but they don’t count).


    With this success story behind them, there is only one place to go for the measurement programme – the accumulation of more data, the development of bigger and better dashboards, and the total domination of the PMO across the enterprise. Yup – we’ve gone viral.

    Of course the reality is that behind the scenes, the project teams are making up the numbers to comply with the data collection process – putting something in is better than getting in trouble for failing to submit anything at all. Managers largely ignore the data, because they are overwhelmed by it, and don’t really understand what all the numbers mean anyway. Dashboard reviews become repeats of the other management review meetings, and no real or useful analysis can be performed at any organisational level because the data is just one humongous and homogenous blob.

    In many cases the organisation will achieve its goal of reaching Level 3 because they’ve done “enough” to get away with it, but everyone knows that it’s a bit of a sham. Emphasis on measurement falls away over the next six months (along with all the other non-institutionalised processes), and the pre-appraisal status quo is restored, until the next appraisal in three years’ time when the frenzy will start all over again.

    So what can you do to inoculate yourself against this behaviour?

    1. Look at the M&A Process Area in detail – nowhere does it tell you that a certain number of measures should be in place to be operating at any specific CMMI Level
    2. Align a few key measures against your specific business objectives, and focus on getting good quality data. Involve business leaders in the selection of the measures so that they can get nearer to the solutions to the problems that cause them pain
    3. Look for measures that will help projects and project teams not hinder them, and don’t overburden projects with demands for duplicate data or data that can be found elsewhere
    4. Provide explanations of how to interpret the data. If you can’t do this, you have no right to demand the data in the first place. You also need to remember that different groups of people will have different objectives, so do not take a one-size-fits-all approach
    5. Perform appropriate analysis and publish findings along with recommendations of corrective actions that can, should or must be taken
    6. Review your measures on a regular basis and throw out redundant ones or ones that don’t add value to the organisation. Review the cost of data collection at the same time.
    7. Get reactions from the “shop floor”. If your metrics programme is causing your people pain, it’s probably doing something wrong and it needs fixing
    8. Align the expectations of managers and information providers – in other words, perform some serious stakeholder analysis and relationship management
    9. Instead of worrying blindly about compliance in providing data, concern yourself with why the data may not be being provided, and instead of analysing missing data, examine outliers and significant deviations from expected values
    10. Stop treating CMMI as an objective and focus on doing the right thing for the business, with CMMI as a (one of many if possible) reference model to guide you
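On point 9, the machinery needed to look at outliers instead of chasing missing cells is trivial. Here is a rough sketch in Python (the project names and effort-variance figures are entirely made up) using the median and median absolute deviation, so that one rogue value doesn’t skew the baseline it is judged against:

```python
from statistics import median

def find_outliers(submissions, threshold=3.0):
    """Flag metric values that sit far from the median, measured in
    units of the median absolute deviation (a robust stand-in for sigma)."""
    values = [v for v in submissions.values() if v is not None]
    med = median(values)
    mad = median(abs(v - med) for v in values) or 1.0  # guard against zero spread
    return {project: value
            for project, value in submissions.items()
            if value is not None and abs(value - med) / mad > threshold}

# Hypothetical monthly effort-variance figures (%); None = not submitted.
effort_variance = {
    "Project A": 4.0, "Project B": 5.5, "Project C": 3.8,
    "Project D": 52.0,   # the one worth a conversation
    "Project E": None,   # missing - ask why, don't punish
    "Project F": 4.6,
}

print(find_outliers(effort_variance))   # flags Project D only
```

The point of the sketch is the shift in attitude: Project E’s missing value triggers a conversation, not a compliance action, while Project D’s deviation is where the analysis effort actually goes.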

    Thursday 15 July 2010

    New starters: What a Waste...?

    It never ceases to amaze me how poor organisations are at taking on new staff. I don't mean the recruitment process per se (don't get me started on that one), but the set of physical activities that need to take place in order to get a new employee (whether full or part time, permanent or contractor) up and running and able to contribute to the organisation in as profitable a manner as possible. I'm sure everyone has experienced the frustration of getting to work on the first morning of a new job, and finding a set of obstacles in your path.

    Examples include:
    • finding that the people supposed to meet and greet you aren't available
    • there's nowhere for you to sit
    • no computer is available
    • security passes and building access are not set-up
    • network access is not configured
    • you weren't expected for another week…
    Even if most of these elements are in place, very often the first few weeks are spent sitting around reading piles of mind numbing documents, meeting huge numbers of complete strangers who you'll probably never deal with again (and whose names you instantly forget), figuring out how to get an outside line on your phone, having lunch on your own, and generally waiting for something to happen so that you can start to feel useful.

    In my experience IT companies or departments are usually the worst places to start working in, generally for most of the technical issues listed above. It also seems to be the case that the bigger the organisation, the longer it takes. Here, the problem is often exacerbated by the fact that the HR, facilities management and corporate purchasing departments have been outsourced, increasing the timelines, the number of communication lines and the number of issues needing resolution.

    Given that the recruitment process often takes several weeks - if not months, even for critical hires, there is really very little excuse for not having everything ready for the new starter on day one, or at worst day two. Sometimes signatures are required on contractual, legal and security documents and photographs need to be taken for ID cards, but at least time could be allocated on day one for these activities to take place.

    If an employee is to be productive he or she generally needs somewhere to work and some tools to work with. Computers can be pre-ordered if not already available. These machines can be pre-configured according to the new starter's role. Network access can be pre-arranged and appropriate shares to project data repositories and corporate tools allocated in advance of the start date.
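None of this requires heavyweight tooling. A simple sketch (Python; the task names and lead times are invented for illustration) of working backwards from the start date to find when each preparation task needs to kick off:

```python
from datetime import date, timedelta

# Hypothetical lead times (calendar days before day one) for each task.
LEAD_TIMES = {
    "order laptop": 10,
    "configure machine for role": 5,
    "create network account": 5,
    "grant project repository access": 3,
    "issue building pass": 2,
    "assign desk": 2,
}

def kickoff_dates(start_date):
    """Latest date each preparation task should begin so that
    everything is ready on the new starter's first day."""
    return {task: start_date - timedelta(days=lead)
            for task, lead in LEAD_TIMES.items()}

# Print the checklist in the order the tasks need to start.
for task, begin_by in sorted(kickoff_dates(date(2010, 12, 1)).items(),
                             key=lambda item: item[1]):
    print(f"{begin_by}  {task}")
```

Even a back-of-an-envelope version of this, pinned to the HR calendar, would avoid most of the day-one fiascos described above.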

    But for some reason, organisations seem to be quite content to have their new, sometimes very costly, resources hanging around twiddling their thumbs, pretending to look busy, and trying not to feel guilty about something completely out of their control. In these days of cost cutting and the rise of "lean management" it is a complete mystery to me why organisations are quite happy to waste tens of thousands of pounds, dollars or euros by having keen and eager employees loitering with intent to become productive.
    

    Wednesday 9 June 2010

    Criteria for Creating Project Artefacts

    If I take a look at the people I follow on Twitter (currently about 1,100), they fall roughly into several main categories, listed here in no specific order:
    • Process and Quality Experts (including CMMI, ITIL and ISO specialists)
    • Apple technical experts (including magazines and developers)
    • Apple fanboys and girls
    • Business leaders (non-IT)
    • Musicians and other "celebrities"
    • Journalists
    • IT professionals
    Out of all the IT professionals, nearly all are self-proclaimed Agile practitioners, managers or gurus. Why do I mention this? Mainly because I don't believe that this reflects the real state of the larger world, where I suspect the Agile sector is much smaller than the more traditional development approaches. Even within the Agile sector, I suspect only a tiny percentage of people are "doing Agile properly". Of those who do, most seem to spend most of their time on Twitter, so I'm not sure how much they really practise what they preach!

    By and large businesses still run fairly old-fashioned IT programmes and projects, and very often these projects follow quality management systems that are somewhat over-engineered and overblown. Personally, I think I fit somewhere between the two camps – I don't count myself as an Agile practitioner, although I was part of the DSDM movement many years ago, and I have an aversion to heavyweight development and management processes and methods.

    Even with heavyweight process sets, most standards encourage tailoring of the process and the project documentation so that they align with the specific requirements of the project. In other words, you do what is necessary and ignore what is irrelevant. Quality and process 'experts' are on hand to help the project or programme manager make these 'difficult' decisions, and to help the project team members create the project artefacts that are deemed to be required.

    Which brings me on to the real gist of this post, namely that there is something wrong with this approach. If a project manager doesn't know what a specific project artefact or process is all about or cannot make a decision about whether or not a particular artefact should be created or process followed then there should be some serious questions about whether he or she should be managing a project in the first place.

    This isn't aimed solely at project managers, but anyone involved in the planning and execution of a project, including developers, testers and business analysts.

    For me, there are really only three criteria which should be used to judge whether a project artefact should be created or not.
    • Does it add value?
    • Will the project be at risk if it is not created?
    • Do you know why you are creating it? ("because I've been told to" is not a valid answer)
    If the answer to the last question is no, then the next question should be "Am I the right person to be doing this job?"




    Thursday 20 May 2010

    8 Quality Principles that everyone should adhere to

    At various times in my career to date I've held the role of Quality Manager or Quality Leader either at project, programme or organisational levels. Over the years I've developed a set of eight simple Quality Principles that are easy to understand and more importantly, easy to achieve by everyone in the organisation. In fact these aren't so much Quality Principles as common sense principles which often fly out of the window in times of stress. However, if you follow these principles, and communicate them across the team, department or enterprise you may find that some of the things that cause the problems in the first place will start to disappear. Print them out and post them on your noticeboards, issue all members of staff with a laminated copy to keep on their desks. And make sure that you follow up on them by ensuring that at the very least you yourself adhere to them whatever pressure you find yourself under.

    Quality is a collective responsibility
    • Everyone must be actively involved
    • Your actions and what you produce reflect on all of us
    Honour your commitments
    • Create a realistic estimate prior to making the commitment and document your assumptions
    • Don’t commit to something you know you can’t achieve
    • Provide an early warning if you cannot fulfil a commitment you made
    Right first time, every time
    • If you don’t have time to do it right initially, when will you have time to fix it ?
    • Perfection is not a contractual requirement
    A second opinion is not optional
    • All contractually required deliverables that you create must be reviewed by a person other than yourself
    • Allow time for review and rework in your estimates
    If in doubt…ask
    • No question is stupid if it helps you to do your job, as well as helping others…
    • …but use your common sense
    If you see something wrong (or know of something better) …tell someone
    • All processes can be improved
    • We can’t fix things we don’t know about
    Plan your meetings
    • Make sure you have a clear purpose, a published agenda and the right participants
    • Record critical decisions and actions in meeting minutes
    Keep It Simple, Stupid (KISS)
    • Life is complicated enough – don’t make it any more so

    Thursday 15 April 2010

    The SPI Manifesto - What's It All About ?

    It's election time in the UK, and this week the major parties have all released their manifestos outlining their policies and plans for the next five years should they be elected into government. Earlier this year the SPI Manifesto was published; the work of a group of people who attended a workshop in late 2009 in conjunction with the EuroSPI Conference in Spain. The publication of the manifesto appears to have polarised the SPI community into staunch supporters and those who are rather more sceptical about it. For myself, the manifesto certainly raises more questions than it answers, and in this entry I'll try to explain why. However this is not going to be an in depth analysis of the manifesto - although that may come later!

    The Facts

    The SPI Manifesto is a 17-page pamphlet which is structured as three values and ten supporting principles. The three values are concerned with People, Business and Change, and each consists of a context and problem statement, a section explaining the value and a number of "hints and examples". The values are statements of what the authors truly believe. Four principles support the People value, three support the Business value and a further three support the Change value, and they are explained as principles that the authors trust to support the values. Each principle consists of an explanation and an example. The front page summarises the values and principles, followed by an explanation of how the manifesto came into existence and what to use it for, and the final page is given over to the presenters, authors and reviewers.

    Manifesto or Manifest ?

    The dictionary definition of a manifesto is
    "a public declaration of policy and aims, especially one issued before an election by a political party or candidate" 
    The word originates from the 1644 Italian word "manifesto", which means "a public declaration explaining past actions and announcing the motive for forthcoming ones". My first issue is trying to figure out what this document actually is. It is titled the "SPI Manifesto", but in the description of what it is, it suddenly becomes a manifest, which, when used as a noun, is defined as a customs document listing the contents put on a plane or ship. Clearly this isn't one of those! Semantics aside, this is the first of numerous inconsistencies which percolate through the document.

    Three Unanswered Questions

    Despite having read through the document several times and having been in discussions with various knowledgeable colleagues I cannot find the answers to three fundamental questions. Quite simply these are:
    • What is the real purpose of the SPI Manifesto ?
    • Who are the intended audiences ?
    • Why was it thought necessary to create such a manifesto in the first place ?
    A basic principle of process improvement teaches us that we undertake a change or improvement activity in order to fulfil a requirement or to meet a need. A second basic principle is to identify the stakeholders impacted by the change or improvement. However, the SPI Manifesto, at no point that I can see, addresses either of these principles. And without these critical issues being addressed, the manifesto is left in a state of limbo. In a real-life business environment, such a document would never see the light of day without having a requirement to meet or a defined target audience.


    An Academic Exercise ?

    Given the failure of the manifesto to address the three questions I posed above, can we surmise that the undefined purpose of the document was purely to complete an academic exercise to see whether such a document could be created? Without wishing to demean the contributors, when I cross-referenced the list on the back page of the manifesto, I wasn't surprised to find that the vast majority appeared to be academics rather than practitioners. There is a certain irony in this because the first value regarding People talks about the failure of ivory towers in the drive for successful SPI!

    Methinks there may even have been an unstated desire to create something akin to the Agile Manifesto simply because nothing existed in the SPI space. Unfortunately it's probably 20 years too late! It is probably fair to say that most of what is written in the SPI Manifesto is available in standard industry texts on process improvement and change management. It may not be as concise, but it is often written in a more appealing way, better explained and, almost always, with a specified target audience in mind.


    Missing the Real Target
    When you look around at the attendees of SPI conferences, seminars and SPIN groups, it doesn't take long to realise that most of the attendees are either SPI or Quality consultants or people suddenly faced with the prospect of leading or participating in process improvement initiatives for the first time.

    Rarely do you see the CIO, CEO or CFO of an organisation, unless they are sponsors or keynote speakers at the event. In fact it is very rare for any decision-making executives to turn out to these events. Clearly, they are busy people and cannot take four days out to attend a conference. But these are the very people who we, as an SPI community, should be addressing. If I were a C-Level executive and this came to my attention I'm fairly certain my reaction would be along the lines of "So What?".

    The trouble is, even as an experienced consultant and practitioner, my initial reaction to the SPI Manifesto is also pretty much "So What?"...

    Monday 22 March 2010

    The Trouble(s) with Software Process Improvement

    Fifteen or more years of software process improvement efforts have not led to the remarkable changes that people like Watts Humphrey may have envisaged when he wrote "Managing the Software Process". The reality is that, despite our efforts, software development projects continue either to fail completely or to miss their intended budget, time and quality objectives. Even high maturity organisations deliver poor quality software, and the return on investment from quality and other process improvement activities remains low. That’s not to say that SPI has failed and we should give up on it, but if we continue to perform the same actions and fall into the same traps we can expect the same relatively poor results. Even companies that were early adopters of CMM and CMMI and have reached the highest levels of maturity now find themselves struggling to retain their status and suffering the mediocre results of more naïve organisations. Some of the reasons for failure are:
    • Insufficient management sponsorship, ownership and responsibility - SPI is viewed as a technical activity and management directives are delegated to groups without sufficient authority to “Just go and do it”
    • Software Process Groups lack the discipline and management skills required to undertake improvement projects, with the apparent effect of causing a “Do as I say, not as I do” environment
    • Expectations of return are grossly over-exaggerated in an attempt to secure funding, and funding is removed before any real benefits are realised
    • Supplier management is poor, and outsourcing suppliers often mismatched against the organisation’s values and beliefs, especially with respect to quality
    • Focus on CMMI, maturity levels and specific goals rather than doing what is best and right for an organisation using diverse methods, tools and techniques
    • An unmanaged approach to change leading to employee resistance at all levels of the organisation
    • Failure to adopt a holistic approach across the organisation, with associated confusion, duplication of effort, and critical activities falling between the cracks
    • Decision makers are informed about SPI activities rather than consulted and involved
    • Waves of initiatives roll on without pausing to understand the effects of the previous initiatives and often undoing the good that has gone before
    • Improvement plans are based on perception rather than fact - data is not used to verify that the right areas are being targeted
    • Quality and process teams undermining their own efforts by failing to add genuine value to the organisation and its business
    If you pick up any book on process improvement you will see that these and other issues like them have been identified and understood for many years, yet some or all of them still abound in any organisation currently running SPI initiatives. Given that those same books often provide the solutions on how to avoid the problems, I can only assume that there must be something more fundamental going on which is causing the software industry to fail to rise to the challenge, namely the people problems I described in my last post on the 7 Deadly Sins of Process Improvement (or Change Management). Next time I'll describe the first of these - Arrogance.

    Thursday 25 February 2010

    7 Deadly Sins of Process Improvement (or Change Management)

    No-one ever said that Process Improvement was easy, but there’s no reason why we have to make it quite so hard. By understanding some basic principles it is possible to give ourselves a fighting chance of success. Gerald Weinberg famously said: “No matter what the problem is, it's always a people problem”, so it might make a bit of sense to start looking at some fundamental people problems which are often responsible for thwarting process improvement initiatives or indeed any other kind of organisational change programme.

    Typically, when we look at the dos and don’ts of Process Improvement or Change Management we focus on tasks, activities and actions which are recognised as good practice. “Run the initiative as a project”, “Obtain visible and effective sponsorship”, and “Communicate, communicate, communicate” are some of the more popular concepts. But as Peter Leeson suggested at last year’s SEPG conference in Prague, despite understanding what good practices and techniques we should be using, we are still largely failing in our Process Improvement programmes even after 15 or more years.

    One of the key issues is that while Process Improvement experts understand the problems, most of the people that they have to deal with don’t, so we need to step back and understand what makes these people tick, and how we need to approach them so that they better understand their roles and can adapt their behaviours to enable change to occur more smoothly.

    This posting takes a different approach to the traditional dos and don’ts by looking at key behaviours of people which are often the real reasons why improvement and change programmes fail. If we can understand what dysfunctional behaviours to look for, how to spot them in ourselves and others, and how to take steps to prevent them from interfering in our process improvement efforts, we may be able to eliminate some of the problems which continue to plague us.

    The 7 Deadly Sins described here are not the original Biblical sins which don’t translate too well in an SPI context, but are useful monikers to help describe the dysfunctional behaviour we are trying to eliminate. It’s important to appreciate that these terms, which could be considered somewhat emotive, are associated with behaviours, not individuals, although I’m sure we would all be able to recognise these traits in some of the people we have to deal with.
    1. Arrogance - typified by Ivory Tower and Not Invented Here Syndromes, but also failure to use data rather than perceptions and refusing to take internal or external advice. Most likely to affect change agents and teams
    2. Inertia - lack of momentum, analysis paralysis, fear of the unknown. Often associated with weak or ineffective leadership
    3. Ineptitude - using the wrong people with the wrong skills. May be as simple as not defining roles and responsibilities carefully enough, but sometimes due to leadership appointing the wrong people
    4. Impatience - dealing with unrealistic expectations. Specifically associated with executives who want instant results, but can affect all areas of the organisation
    5. Carelessness - failing to focus on the details. Poor planning and poor implementation are often to blame, the devil is in the details
    6. Ignorance - not knowing what you don’t know. Aligned to the attributes associated with arrogance, but often a common cause of resistance in the end user community
    7. Extravagance - getting carried away. Process Experts, especially from a technical background are just as likely as programmers to gold plate solutions. Senior Management may also play a part in demanding more than the organisation can absorb
    We'll examine these sins in more detail in future postings.

    Friday 19 February 2010

    Getting Value from the Quality Department (Part 4)

    In the last three posts I've been looking at some of the issues facing Quality Teams in an Application Development and Maintenance environment, and some of the ways that these teams and their organisations can get more value from quality related activities.

    In this final part I want to consider more problem areas, management and measurement. These two areas are crucially interwoven as we shall see later in the post.

    Quality activities in any organisation are always going to be at risk if they take place but no-one takes any notice of them. Unfortunately, in many of the organisations that I've been involved with, quality is all too often considered as a necessary evil, and the exploits of the quality team are left to percolate in the background.

    Executives and middle managers only get interested when something nasty hits the fan. These situations generate knee-jerk reactions such as a review of quality activities (usually too localised), a commitment to prevention rather than cure, or policy word changes (but without enforcement), but these tend to be short-lived and ineffective actions which fail to address the real problems, in the same way that a sticking plaster cannot fix a ruptured artery.

    In almost every case where I have seen little real management commitment to quality, it turns out that managers have no realistic or measurable objectives set around quality. There are often collective objectives like "Maintain ISO 9000 compliance" or "Achieve level 3 of CMMI by quarter 3 next year" but these are fairly meaningless at the best of times. They are also Boolean objectives; "Achieved" or "Not Achieved".

    More useful quality related objectives might be "Improve resolution time of quality issues by 20%" or "Participate in 50% of quality incident reviews". Of course, this makes the assumption that quality reviews take place and quality issues are identified, but crucially they put the onus of responsibility onto individual managers and bring them into direct contact with quality activities. Failure to participate will have an impact on their bonus or salary review.
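The nice thing about an objective like "Improve resolution time of quality issues by 20%" is that progress against it can be checked with almost no effort. A rough sketch (Python, with invented figures), using the median so that a single pathological issue doesn't distort the picture:

```python
from statistics import median

def improvement(baseline_days, current_days):
    """Percentage reduction in the median resolution time of
    quality issues, relative to the baseline period."""
    base, now = median(baseline_days), median(current_days)
    return 100.0 * (base - now) / base

# Hypothetical resolution times (days) for two review periods.
last_quarter = [12, 9, 15, 30, 11, 14]
this_quarter = [8, 10, 7, 22, 9, 12]

print(f"{improvement(last_quarter, this_quarter):.0f}% improvement")
```

Whether the target is met stops being a matter of opinion, and the manager who owns the objective has a number to stand behind (or explain).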

    Of course, an organisation that has a quality department almost certainly collects lots of data. The trouble is that this is often all that happens. Data collection is of no value unless the business actually does something with the data, and when I talk of the business in this context, I'm referring to the decision makers, not just the data collection team.

    In many cases the data collected is worthless even if anyone wanted to use it, because it doesn't actually address any direct business requirements, or because the quality of the data is so poor that it is of no value. Historically, data collection, analysis and data-based decision making have been seen as a good thing. Sadly, many organisations collect data that they think they need to collect, without understanding what it is to be used for, who it is to be used by, or how it is going to be used. Vast amounts of time are spent providing numbers because the system says you must. Often the same numbers are demanded by different people, in different formats and at different times.

    At one company I worked for we had three time recording systems, one electronic and two paper based (all of which required predicted clock-in and out times as well as actuals). To my knowledge only one of these was actually used to determine anything of any importance (namely overtime pay!), but that was the way things were done.

    Whether they like it or not, executives and managers need good, diverse data to make good decisions, yet an extraordinary number of managers remain consciously or subconsciously oblivious to this fact. The real problem is that too many managers believe the only data of any importance is financial data, because they are measured on their ability to manage P&Ls or to meet their financial targets. What they fail to understand is that financial data alone is useless for getting to the root cause of problems and resolving them. For that, they need other information which can then be set in the context of the financial data to better understand why there are issues and what their causes are.

    So how are these two apparently unrelated issues of management and measurement connected, and what do they have to do with the quality department? In too many organisations I've seen potentially useful quality data wasted because of a lack of imagination on the part of both the quality team and management.

    Data is presented in drab and meaningless charts whose only real purpose is to demonstrate that the quality team is doing stuff. At the same time, managers fail to ask the questions that would help them understand how quality data can improve their business.

    Quality managers need to initiate the dialogue with management and coach them into understanding how data can work for them. Think of different ways to present the data, and think of useful things to say about it. Quality data taken out of context is meaningless: for example, present audit or defect data alongside financial data to highlight potential correlations.

    Encourage managers to ask the difficult questions about your improvement or quality programmes, and be prepared to lift yourselves out of the status quo. Only then will management begin to sit up and take notice.



    Monday 11 January 2010

    Getting Value from the Quality Department (Part 3)

    In the past couple of posts I've tried to set out my stall and ask how organisations can get the best out of their quality departments. It's a double-sided question, because it's primarily aimed at quality staff themselves: how can you add value to the organisation through your activities?

    My overriding observation about quality teams I've worked with over the years is that they tend to be rather one-dimensional. Their focus is often to 'audit' projects and teams, and they carry out this task with demonic resolve, like ferrets in a rabbit warren.

    The audit schedule is defined at the start of the year so that all projects get audited at least once, preferably at different stages of the lifecycle. The auditors then spend their time trying to get access to the project teams, rebuffed by the project manager at every opportunity because of delivery priorities. When they finally get into the project, they run through their quality checklists, ticking and crossing the boxes and passing a verdict on how well the project has complied with the quality management standards in play. The quality manager logs the findings and creates a pile of PowerPoint slides which are presented to senior management at the end of the month. The management team asks some polite, semi-probing questions, accepts the answers provided, agrees some actions, and the cycle continues.

    This kind of activity may be in keeping with the spirit of the old 1987 ISO 9000 standard, but it does little to add value to the organisation and nothing to enhance the reputation and standing of the quality team. Organisations and quality teams need to move radically away from this Dilbert-like approach to quality, and the battle lines between quality staff and project teams must be dismantled. So what to do?

    1. Take the Initiative - It's vital that quality teams take the initiative and adopt a more collaborative approach to quality, working with project teams and management to establish what is really important to them. Doing things the way you've always done them doesn't make them right or useful in a modern environment. Engage with your quality lead and offer suggestions on how to make your activities more valuable

    2. Become Less Specialised - I've worked with many quality staff who have no first-hand experience of the domain in which they operate. They audit programme and project managers but have never been either. I remember being at a European quality meeting some years ago where I was the only person in the room with any management experience. Without first-hand understanding of the problems faced by the people you deal with, you cannot realistically expect them to take you or your suggestions seriously. Quality staff need to stop thinking of themselves purely as quality staff and gain stronger exposure to the roles and functions they audit

    3. End the Checklist Mentality - The best auditors use a checklist to guide them and to help them remember key areas for investigation. If they find a particular area of concern during the audit, they will throw away the checklist and pursue that matter more rigorously. (In order to do this, they have already followed step 2 above!) Ticking boxes is fine for a pure compliance audit, but this type of audit does little to help improve an organisation; rather, it serves as a stick to berate delinquent areas of the organisation

    4. Proffer Solutions - I've seen so many audit reports which highlight deficiencies and non-conformances but do nothing to help the subject of the audit understand or fix the problem. Some non-conformances make little sense to fix anyway, because the moment has passed or the cost is too high relative to the return. Auditors will be taken far more seriously if they can explain why something is a non-conformance and, more crucially, why it may cause a problem later in the lifecycle, in which case they should be able to proffer a solution, or better still, help the subject derive their own. In this way the auditor becomes more of a coach and a collaborator and less of a witch hunter.

    Many quality staff and auditors are hard-working and passionate individuals who are prevented from achieving greater things because the system itself is broken and quality is seen as a necessary evil rather than a value-adding activity. This piece is not intended as a criticism of those individuals, but as a challenge to those who have the power to make things better: everyone who works in the quality field, and everyone else who doesn't think they do.

    Next time I'll focus on management responsibilities and the challenges they need to rise to.