What is it about government IT projects that makes them fail so spectacularly and so regularly? Only this week Joe Harley, the CIO at the DWP, was lamenting that the failure rate is 70%.
Speaking at the Government IT summit, he said that for the government’s IT spend of £14bn, “7,000 primary schools every year could be constructed”, about 600,000 nurses could be employed or more than three million state pensions paid.
I love that the number is still £14 billion – I calculated it at £13 billion or so some 4 years ago and wondered how much it would climb once the NHS IT and the Home Office ID cards got going. Still, I digress. I love the other numbers too – 600,000 nurses at £23,333 a year, 7,000 primary schools at £2,000,000 or 3,000,000 pensions at £4,666 per year. Eye-catching numbers, if disturbingly round.
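Just to check the arithmetic behind those disturbingly round numbers, here is a quick sketch; the per-unit costs are simply back-calculated from the £14bn figure, not official prices:

```python
# Back-calculate the per-unit costs implied by the £14bn IT spend.
# These are the article's figures divided out, not official numbers.
it_spend = 14_000_000_000  # annual government IT spend, £

schools = 7_000
nurses = 600_000
pensions = 3_000_000

print(f"Per school:  £{it_spend / schools:,.0f}")    # £2,000,000
print(f"Per nurse:   £{it_spend / nurses:,.0f}")     # £23,333
print(f"Per pension: £{it_spend / pensions:,.0f}")   # £4,667
```

The pension figure lands at £4,667 when rounded rather than the quoted £4,666, which suggests the original was truncated rather than rounded.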
Joe notes that the £14 billion isn’t all project-related, of course; there is a bundle spent on desktops, at anywhere from £700/unit to £2,400/unit (and, I imagine, far higher costs depending on who loads what into each unit price – one of the problems of comparing such numbers, like the “cost of a website”, is that different departments count costs in different ways).
On the project front, Joe isn’t entirely clear what “failure” actually means but he did add:
Projects fail for a variety of reasons but sometimes have predictable weaknesses such as inadequate requirements. Even if a project or programme is three months late it’s doubtful “you could call this a success”.
The “inadequate requirements” point is interesting. With OGC Gate Reviews either about to be made public or public already, I’m sure we’re going to see a lot more on that. The OGC is already facing criticism for passing too many projects with status “green” that turn out later to be outright failures (otherwise known as the “how could you not spot that the horse they were putting together had 3 humps?” attack), and senior civil service management will likely take flak for ignoring too many “red” reports (known as the “what do they know about horses? ours needs those 3 humps for extra ballast” defence).
On the current track record, I think most of us would be pleased if something was only 3 months late, depending on what happened because of the lateness. Obviously, a missile intercepting an incoming nuclear warhead wouldn’t be much use if fired 3 months late; and an IT system designed to ship patient data around wouldn’t do much good for the patients who passed away during the 3 months it wasn’t there – but, in general, a 3 month delay wouldn’t be terrible.
I’m intrigued by the 30% success rate figure. If success has just 3 dimensions – meets requirements, on time and on budget – then I’d venture that no government or business entity in the world has a hit rate much greater than that. Filter out the very short-term projects – the ones that are changes to existing code bases, or where time isn’t costed to the project – and establish, say, a minimum spend of £10 million (small by government standards, moderate by corporate standards): how many companies would stand up and present a success rate much greater? Of course, the deck is loaded here. The Chaos Chronicles, published by the Standish Group, tell us the real story:
These numbers are a significant improvement over the previous survey conducted in 1995. That survey indicated that 80% of IT projects failed. In 2003 this number shrank to 66%. Although fewer projects fail nowadays, the general trend is that more projects are delivered late; statistically this is at 82%, up from 62% in 1995. However, there is also good news: the average cost overrun is currently at 43%, down from 180% in 1995.
The true genius lies in the solution
The most effective way to avoid cost and schedule overruns is to get better at making software cost estimates. Software has become more complex and increased in size, which makes estimation more challenging; however, several studies (e.g. Standish Group, Capers Jones) have shown that by using software cost estimation techniques alone, the probability of completing a project successfully doubles. Estimating the schedule, cost, and resources needed for the project is paramount for project success.
That is to say that if you doubled your estimate of time to complete, increased your budget by 75% and cut your requirements in half (all at the same time), you’d be absolutely certain to be successful. Sadly, data more current than 2003 doesn’t seem to be available on the web (indeed, the Standish Group website lists data only from 1994, and they want $5,000 for membership or $500 for just one quarterly report).
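The tongue-in-cheek prescription above amounts to padding your commitments by the observed average overruns; a sketch of that logic (the overrun rates here are illustrative defaults, loosely echoing the figures quoted, not anyone’s official methodology):

```python
# Pad a naive estimate by historical average overruns, so the padded
# figure becomes the commitment. The default rates are illustrative.
def padded_estimate(time_months, budget, scope_features,
                    time_overrun=1.0, cost_overrun=0.75, scope_cut=0.5):
    return (time_months * (1 + time_overrun),   # double the schedule
            budget * (1 + cost_overrun),        # budget up 75%
            scope_features * scope_cut)         # halve the requirements

# A 12-month, £1m, 100-feature plan becomes the "certain success" plan:
print(padded_estimate(12, 1_000_000, 100))  # (24.0, 1750000.0, 50.0)
```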
I’ve done more than a few presentations on the same topic, usually involving a story about wheels falling off wagons and using whatever was current from the press at the time, whether it was problems at Sainsbury, Nike, Cigna, the Inland Revenue, the Department for Work and Pensions or wherever. Sometimes I ended with this quote, from, I think, an anonymous source:
“A carelessly planned project takes three times longer to complete than expected
A carefully planned project takes only twice as long”
This week I’ve been playing* Microsoft’s new Xbox 360 game, Halo 3. The full game isn’t out until the end of September, but Microsoft’s developer, Bungie, has thoughtfully released a beta version. I have no idea how many people in total are accessing this beta, but earlier this evening over 50,000 were online playing at the same time as me (not, fortunately, all in the same game universe as me). One way into the beta was to buy a specially flagged copy of another game, Crackdown, for about £40. Some people treated this as buying the game and getting the beta for nothing; others complained about having to pay £40 for a ticket to the beta (which only lasts three weeks or so) and planned to throw away what turned out – not surprisingly, as the game was written by the author of the original Grand Theft Auto – to be a very good game.
When the beta kicked off last Wednesday, those who had bought Crackdown waited anxiously for it to begin at 1pm on the dot. Come that time, no one could access it. For hours afterwards, access was denied, and the Interweb forums filled with vitriol, bile and every known variety of swearword, all helpfully – on bungie.net at least – replaced by -blam-. The user population was not happy. Bungie acknowledged that there was a problem after an hour or so and then stayed quiet for ten hours or more. In fact, all of the companies involved stayed quiet. Only when it was fixed did they announce the problem had gone away and, gracefully, extend the beta by 4 days to make up for the half day or so lost. A good recovery (and isn’t the recovery from a problem so much more important than the problem itself?).
There are a few points here:
1) Betas are meant to be buggy. That’s why there’s a beta: to iron out the problems. There aren’t, usually, as far as I know anyway, problems actually starting the beta, but you’d certainly expect the game itself to be buggy (and so far it seems pretty stable, though there are certainly bugs).
2) Staying quiet once you know there are problems, perhaps known as the “Gordon Brown on Raiding Pensions” defence, isn’t a good tactic. It angers people more.
3) When you do find problems, addressing them in the right way is critical. Just like when you go to a restaurant and they make you wait 60 minutes or more for the main course: if they bring you a glass of fine red wine, you wait patiently; if they don’t, you fume quietly in the corner and don’t leave any service charge.
So what would happen if government ran beta tests of its major programmes? After all, we’re seeing them with the cashless society (being beta tested in September in Canary Wharf) and the Downing Street petitions website continues to be in beta, despite long since proving its mettle with the road pricing petition. In the corporate world, half of the things you can do with Google seem to have been in beta for months or years (hasn’t Gmail been in beta for years now, after all?).
But we haven’t ever seen one started, or even proposed, for, say, tax credits, or business taxation policies, or, shall we say, ID cards.
So how about we run some beta tests on government programmes, using these steps:
– Carry out limited beta tests, backed by sufficient – but not too much – IT to prove that the theory and practice work. It doesn’t have to do everything on day one.
– Try out 2, 3 or even 5 different versions of the beta – i.e. different policies. One would have to be a control group: just like in medical trials, you need to be able to test against something that you understand and that isn’t moving.
– Invite a limited population – somewhere from 0.1% to 5% of the total target market. With 30 million employees in the UK, a PAYE beta at 0.1% would be 30,000 people – plenty of a challenge.
– Make the aims of the beta, the kind of people involved in it, and the details of how it’s going, including the final conclusions, entirely public.
– Carry out the beta test for long enough to see how it worked through a full cycle. If that was PAYE, i.e. employee taxation through the payroll, that could be as much as a year.
– Start development of the wider programme only when all the tests have been evaluated against the original aims. This works for drug trials and it works for peer review of scientific papers, so it should work for government IT.
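The cohort arithmetic in the third step above is simple enough to sketch; the 30 million employee base is the figure used in the text, and the sampling rates are the proposed 0.1%–5% band:

```python
# Size of a PAYE beta cohort at various sampling rates.
# 30 million employees is the figure used in the text.
employees = 30_000_000

for pct in (0.1, 1.0, 5.0):
    cohort = int(employees * pct / 100)
    print(f"{pct}% beta -> {cohort:,} people")
```

Even the smallest cohort is 30,000 people – already a bigger user base than many government systems ever face in live running, which is rather the point.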
I know, however, that the moment the first one of these was started, the business and technology press would erupt. The opposition parties would also have a field day, whichever set they were. The obstacles raised would include:
– What a waste of time. If you’re not sure of your policy, why not get sure in the first place and then start the work?
– How dare you give these other people competitive advantage. Some of these new tax beta tests would doubtless involve companies paying less tax so that the impact on their profits, investment profile, employee count and so on could be assessed. Instead, all of these kinds of tests are carried out on complex models of the economy built, doubtless, in Excel.
– What about those people who are suffering without access to the beta? This is a variant of the “how dare you give the people in Ward A the drug that might cure them of their cancer, but not the folks in Ward B” – at the outset, of course, you don’t know which is going to work best.
– What if the beta fails? How do you get people off the new process and back onto the old one without disadvantaging them?
I could go on for ten more pages about the list of objections that would be raised. You’ll note also that I am proposing beta testing both policy and IT systems – addressing the “inadequate requirements” point in the beta test is surely essential – and so opening myself up to double criticism. The Halo 3 guys are only testing their IT – and they know that Halo 3 is just Halo 3, they don’t have to test out if they need to make it more like Half Life 2 or more like Gran Turismo or whatever.
Nope, there would be many who would block the idea of beta testing – let’s call them the beta blockers (appropriately, a class of drugs that slows your heart rate and therefore, doubtless, your speed of delivery). And that’s probably a shame. If the success rate really is only 30%, then savaging the costs by 20% and working on “inadequate requirements” at a purely technical level isn’t going to solve the problem. Not for the UK government, not for foreign governments and not for corporations around the world, who appear to suffer from exactly the same problem.
* Yes, I play video games. I’ve been playing since Space Invaders in, what, 1978. Then all the way from a 256-byte (yes, byte) MK14 through the Apple II, the ZX81 and on to the Atari ST, the BBC Micro, the Xbox and now the Xbox 360.