Why "armies" won’t work; and why it might

A couple of weeks ago I speculated about whether governments, in general, could be persuaded to use tools that are already available (such as Facebook and Wikipedia) to replace their Intranets, or to act as a publishing zone for all FoI requests and so on. I asked “how many armies does an e-government need?”

I think this idea has legs – especially for those governments that haven’t yet got very far in putting things online (and so haven’t proliferated the hundreds of websites and thousands of intranets that are common amongst both long-time online governments and, for that matter, corporations of all shapes and sizes). It’s possible, I think, that the number of intranets in a typical company is a multiple – positive, and greater than 1 – of the number of employees divided by the number of countries it operates in. So a big company with over 330,000 employees that operates in, say, 100 countries might have at least 3,300 intranets and probably more, unless it has adopted a process of consolidation (and more on that another day). Anyway, I digress.
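To make that back-of-envelope sum concrete, here’s a minimal sketch of the estimate – purely illustrative, and the multiplier and the 330,000-staff/100-country figures are just the example numbers above, not real data:

```python
# Back-of-envelope sketch of the intranet estimate above.
# Purely illustrative: the multiplier and figures are assumptions, not real data.

def estimate_intranets(employees: int, countries: int, multiplier: float = 1.0) -> int:
    """Guess at least (multiplier * employees / countries) intranets, multiplier >= 1."""
    return round(multiplier * employees / countries)

# The hypothetical big company from the paragraph above: 330,000 staff in 100 countries.
print(estimate_intranets(330_000, 100))  # -> 3300, "and probably more"
```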

There are plenty of reasons why this idea wouldn’t work, but here are the top few that I can think of:

  1. Data Scares. Governments around the world have to be waiting for the other “data shoe” to drop. Having seen recent UK and US problems, the idea that data should sit anywhere outside the largest possible number of firewalls would scare anyone in government witless. Notwithstanding, of course, that most data seems to have been lost through process weaknesses (unencrypted laptops being stolen, data being cut to CDs and then lost in the post, USB sticks being lent to friends with viruses on their PCs, and who knows what else). This fear will pass, of course. Processes will be bolted down and data will stop leaving government agencies or departments while everyone figures out how to make it more secure. The news stories will fade for a while – but I’m sure they will come back once everyone has become sufficiently relaxed for the process breakdowns to recur. After all, are the recent sub-prime issues, folded into their associated SIVs, any different from Enron’s off-balance-sheet vehicles?
  2. Platform Proliferation. Over the last 10 years or more, governments the world over have acquired, developed or inherited hundreds of individual, lightly differentiated “platforms”. You name it, they’ve got it. One of everything ever made and more than a few that they wish were never made. The last thing anyone needs is yet another platform, or YAP.
  3. Internet Fear. It sounds weird to say it, but governments are generally afraid of their staff having access to the Internet, and of what it might bring. When Windows was first introduced to government – 1998 or 1999, I think – there was a great debate about whether to remove “Solitaire” from all desktops, because it was felt that it would be detrimental to productivity. In the end it was kept, the winning argument being that it would help train staff in how to use a mouse. I make no comment on that debate, but the debate about whether the Internet will hurt productivity rages on. Many UK departments that allow it put significant restrictions on the sites that can be visited (I remember not being able to access Chinese news sources or, indeed, any news article with the word “terrorist” in it), or restrict it to only certain members of staff (at senior levels). Those that do not allow it sometimes provide a local wireless LAN that staff using their own laptops can access (God forbid that they access it using departmental laptops – think of the security risk!) and, in some cases, put one or two PCs aside; although the last I saw, those were still working with 58.8 modems. There are, though, a few [enlightened] departments that allow unrestricted access – recognising that it’s become part of how people live their lives, and that if they’re going to allow people lunch breaks, smoking breaks or flexi-time/work from home, then putting the Internet in the office is only a logical extension.
  4. Process Control. In some ways related to the Data Scares point but, in fact, around long before those became news. When we first developed a content management tool in OeE, we spent a fair amount of time thinking about a departmental customer who wanted to use the same tool for both Intranet and Internet publishing. The theory was that most documents going onto the Internet site would also be on the Intranet (see Internet Fear for why that means they’d need to be published to both), so all that was needed was a simple “publish to Intranet/publish to Internet” radio button or toggle switch (a sketch of the kind of toggle I mean follows this list). Our sense – and this was 2001 or 2002 – was that departmental processes weren’t mature enough to manage that toggle, i.e. there was a great risk that information would be published on the Internet when it wasn’t intended to be. Other reasons included a theory that, pretty soon, everyone in departments would have Internet access, so there wouldn’t be a need to dual publish; and that Intranets weren’t really about content at all – they were about applications (phone directories, payroll information, holiday bookings, expenses claims and so on) – and our content management tool couldn’t evolve to handle the infinite variation in those. I still think that lack of maturity is a big issue – and I see this point as the one that would cause the most concern in governments. Not every document needs to be published, nor should be published in real time; that’s why we have 30-year rules. Deciding what does and doesn’t go out is a tricky job that, some would argue, is best kept difficult through a lack of automation rather than click-to-publish technology. The fact that many documents escape into the wild anyway perhaps undermines that argument – given that everything that isn’t about me, the individual (where “me” means all of us), is subject to FoI, then perhaps the sooner it’s out there the better (high-grade protectively marked information excluded). Gordon Brown’s recent need to publish documents about his thinking on how to deal with inheritance tax perhaps exemplifies this. It isn’t as simple as that, of course – the process for what goes to the web and what doesn’t is, I’m sure, not a radio button toggle issue.
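As a purely illustrative aside, here is a minimal sketch of the kind of publish-target toggle described above, and of the sort of process gate whose absence worried us. The names and the approval flag are my own assumptions for the sketch, not how the OeE tool actually worked:

```python
from dataclasses import dataclass
from enum import Enum


class Target(Enum):
    INTRANET = "intranet"
    INTERNET = "internet"


@dataclass
class Document:
    title: str
    # The process step a bare toggle cannot enforce on its own.
    approved_for_internet: bool = False


def publish(doc: Document, target: Target) -> str:
    # A bare radio button would happily send anything anywhere; a gate like
    # this is the kind of maturity check the toggle alone lacked.
    if target is Target.INTERNET and not doc.approved_for_internet:
        raise PermissionError(f"'{doc.title}' has not been cleared for Internet publication")
    return f"published '{doc.title}' to the {target.value}"


# Dual publishing of the same document, as the requirement envisaged.
memo = Document("Policy briefing", approved_for_internet=True)
print(publish(memo, Target.INTRANET))
print(publish(memo, Target.INTERNET))
```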

But there are also some reasons why it might work:

  1. Refresh time. It’s already 5 years since most departments put their shiny new content management systems together; or, in many cases, evaluated a variety of products, found one that sort-of-seemed to work and went with it, only to find that it wasn’t really quite what they thought they wanted. With those 5 years gone, many departments will be thinking of a refresh; they’ll be [rightly] under pressure to migrate much of their content to direct.gov and [not sure rightly yet, waiting to be convinced] businesslink.gov, and they’ll be thinking about what’s left (policy and special interest items?). Next on the list will be the intranet(s), which probably haven’t had a refresh for quite a lot longer and are starting to sag under the weight of many tens of thousands of pages of content that hasn’t quite been managed the way it ought to have been (see Gerry McGovern for inspiration).
  2. Tool and Talent availability. Truth be told – and this is no surprise to anyone – governments have learned a lot from getting into application development: they don’t like it and don’t want to do much more of it. They found that being out on the edge, developing applications from scratch using whatever the latest technique is (whether that be client-server, object-oriented, .NET, Java, RAD, XP or whatever), wasn’t that much fun. Making use of things that are already out there – already built and in use – comes as a safety net for some. Of course, they still worry about security, reliability and deployment risk – and their IT partners will worry about their margins on the deal – but in the end everyone will come to the conclusion that if it’s already there, and there’s a pool of developers out on the wide web who are going to carry on updating it, then why shouldn’t government be there too? So they will look at Facebook, probably think about the risks of going with a sole provider (and one that plans to make money from advertising), and wonder whether that’s the right move; instead, perhaps they’ll turn to Ning which, whilst still technically a sole provider, isn’t quite the same kind of thing as Facebook – it’s a tool of tools, perhaps. Further in Ning’s favour is its recent link to OpenSocial – as long as no one is too scared of Google and what it might become, privacy concerns and all.
  3. Rationalisation. With so many platforms in government, it might be really quite attractive for a few departments – probably driven by some smart, forward-thinking local authorities – to get together and exploit something that is already there. The cost will be low, the risk low (we’re not talking about putting taxpayer or NHS data out there), implementation times will be short, and early versions could be thrown up and tested by a few people. You know, some folks in the private sector might even offer up their existing platforms for government to exploit – it will help them add members and attract further people on the path to critical mass. There’s no reason why these applications couldn’t be hosted inside the government firewalls and exploited by everyone with a secure connection. Pretty quickly, I’d venture, you could build a government-wide Intranet, LinkedIn, Facebook and Wikipedia, accessible to any department with a connection to the secure network; and available even to those who don’t have access to the public network.
  4. Updateability. If that is even an -ility. One of the great problems corporates and government departments have with their Intranets (and particularly their contact lists, disaster-recovery contact details and so on) is that people don’t keep them up to date. I’m sent endless reminders that it’s been 3 months since my last update, but updating the details on a BlackBerry is near impossible and I rarely log in. Other users will claim to be too busy doing other things and won’t get round to it. But Facebook et al. attract their users for dozens of minutes a week or month. What if government stole a bit of that time and made its own Intranet applications interesting enough to be used for that kind of time – but productively, for the benefit of government?
