The 10 Year Strategy

In May 2008, on this blog, I wrote about Chateau Palmer (a fine Bordeaux wine) and, specifically, about how making wine forces a long term strategy – vines take years before they produce a yield that is worth bottling (my friends in the business say that the way to make a small fortune in wine is to start with a large one), more years can go by before the wine in the bottle is drunk by most consumers, and yet, every year the process repeats (with some variation, much of it caused by the weather). It’s definitely a long game.

I wondered what would happen if you could only make decisions about your IT investment every 10 years, and then made a couple of predictions.  I said:
Cloud computing – This is going to be increasingly talked about until you can’t remember when people didn’t talk about it and then, finally, people are going to do it. [If you read only this bit then perhaps I am a visionary strategist; if you read the whole of it, I got most of the rest wrong]
Application rationalisation – Taken across a single country’s government as a whole, the total number of applications will be a frightening number, as will the total cost to support them all. There are several layers of consolidation, ranging from declaring “end of life” for small systems and cutting their budgets to zero (and waiting for them to wither and die – this might take eons) to a more strategic, let’s use only one platform (SAP, Oracle etc) from here on in and migrate everything to that single platform (this too could take eons)
It feels, 11 years on, that we are still talking about cloud computing and that, whilst many are doing it, we are a long way from all in.  And the same for application rationalisation – many have rationalised, but key systems are still creaking, supported by an ever decreasing number of specialists, and handling workloads far beyond their original design principles.
Did we devise a strategy and stick to it? Or did we bend with the wind, changing year to year, rewriting as new people came and went? Perhaps we focused on business as usual and forgot the big levers of change?

10 Years After 10 Years After

Strictly speaking, this is a little more than 10 years after the 10 year mark.  In late 2005,  Public Sector Forums asked me to do a review of the first 10 years of e-government; in May 2006, I published that same review on this blog.  It’s now time, I think, to look at what has happened in the 10 years (or more) since that piece, reviewing, particularly, digital government as opposed to e-government.

Here’s a quick recap of the original “10 years of e-government” piece, pulling out the key points from each of the posts that made up the full piece:

Part 1 – Let’s get it all online

At the Labour Party conference in 1997, the Prime Minister had announced his plans for ‘simple government’ with a short paragraph in his first conference speech since taking charge of the country: 
“We will publish a White Paper in the new year for what we call Simple Government, to cut the bureaucracy of Government and improve its service. We are setting a target that within five years, one quarter of dealings with Government can be done by a member of the public electronically through their television, telephone or computer.”
Some time later he went further:
“I am determined that Government should play its part, so I am bringing forward our target for getting all Government services online, from 2008 to 2005”

It’s easy to pick holes in a strategy (or perhaps the absence of one) that’s resulted in more than 4,000 individual websites, dozens of inconsistent and incompatible services and a level of take-up that, for the most popular services, is perhaps 25% at best.
After all, in a world where most people have 10-12 sites they visit regularly, it’s unlikely even one of those would be a government site – most interactions with government are, at best, annual and so there’s little incentive to store a list of government sites you might visit. As the count of government websites rose inexorably – from 1,600 in mid-2002 to 2,500 a year later and nearly 4,000 by mid-2005 – citizen interest in all but a few moved in the opposite direction.
Over 80% of the cost of any given website was spent on technology – content management tools, web server software, servers themselves – as technology buyers and their business unit partners became easy pickings for salesmen with two-car families to support. Too often, design meant flashy graphics, complicated pages, too much information on a page and confusing navigation.
Accessibility meant, simply, the site wasn’t.
In short, services were supply-led by the government, not demand-led by the consumer. But where was the demand? Was the demand even there? Should it be up to the citizen to scream for the services they want and, if they did, would they – as Henry Ford claimed before producing the Model T – just want ‘faster horses’, or more of the same they’d always had performed a little quicker? 
We have government for government, not government for the citizen. With so many services available, you’d perhaps think that usage should be higher. Early on, the argument was often made (I believe I made it too) that it wasn’t worth going online just to do one service – the overhead was too high – and that we needed to have a full range of services on offer – ones that could be used weekly and monthly as well as annually. That way, people would get used to dealing online with government and we’d have a shot at passing the ‘neighbour test’ (i.e. no service will get truly high usage until people are willing to tell their neighbour that they used, say, ‘that new tax credits service online’ and got their money in 4 days flat, encouraging their friends to do likewise).
A new plan
 • Rationalise massively the number of government websites. In a 2002 April Fool email sent widely around government, I announced the e-Envoy’s department had seized control of government’s domain name registry and routed all website URLs to UKonline.gov.uk and was in the process of moving all content to that same site. Many people reading the mail a few days later applauded the initiative. Something similar is needed. The only reason to have a website is if someone else isn’t already doing it. Even if someone isn’t, there’s rarely a need for a new site and a new brand for every new idea.
• Engage forcefully with the private sector. The banks, building societies, pension and insurance companies need to tie their services into those offered by government. Want a pension forecast? Why go to government – what you really want to know is how much you will need to live on when you’re 65 (67?) and how you’ll put that much money away in time. Government can’t and won’t tell you that. Similarly, authentication services need to be provided that can be used across both public and private sectors – speeding the registration process in either direction. With Tesco more trusted than government, why shouldn’t it work this way? The Government Gateway, with over 7 million registered users, has much to offer the private sector – and they, in turn, could accelerate the usage of hardware tokens for authentication (to rid us of the problems of phishing) and so on.
• Open up every service. The folks at mySociety, Public Whip and TheyWorkForYou.com have shown what can be done by a small, dedicated (in the sense of passionate) team. No-one should ever need to visit the absurdly difficult-to-use Hansard site when it’s much easier through the services these folks have created. Incentives for small third parties to offer services should be created.
• Build services based on what people need to do. We know every year there are some 38 million tax discs issued for cars and that nearly everyone shows up at a post office with a tax disc, insurance form and MOT. For years, people in government have been talking about insurance companies issuing discs – but it still hasn’t happened. Bring together disparate services that have the same basic data requirements – tax credits and child benefit, housing benefit and council tax benefit etc.
• Increase the use of intermediaries. For the 45% of people who aren’t using the Internet and aren’t likely to any time soon, web-enabled services are so much hocus pocus. There needs to be a drive to take services to where people use them. Andrew Pinder, the former e-Envoy, used to talk about kiosks in pubs. He may have been speaking half in jest, but he probably wasn’t wrong. If that’s where people in a small village in Shropshire are to be found (and with Post Offices diminishing, it’s probably the only place to get access to the locals), that’s where the services need to be available. Government needs to be in the wholesale market if it’s to be efficient – there are far smarter, more fleet of foot retail providers that can deliver the individual transactions.
• Clean up the data. One of the reasons why government is probably afraid to join up services is that they know the data held on any given citizen is wildly out of date or just plain wrong. Joining up services would expose this. When I first took the business plan for the Government Gateway to a minister outside the Cabinet Office, this problem was quickly identified and seen as a huge impediment to progress.

More to come.

The Billion Pound G-Cloud

Sometime in the next few weeks, spend through the G-Cloud framework will cross £1 billion.  Yep, a cool billion.  A billion here and a billion there and pretty soon you’re talking real money.

Does that mean G-Cloud has been successful?  Has it achieved what it was set up for?  Has it broken the mould?  I guess we could say this is a story in four lots.

Well, that depends:

1) The Trend

Let’s start with this chart showing the monthly spend since inception.

It shows 400-fold growth since day one, but spend looks pretty flat over the last year or so, despite that peak 3 months ago. Given that this framework had a standing start, for both customers and suppliers, it looks pretty good.  It took time for potential customers (and suppliers) to get their heads round it.  Some still haven’t. And perhaps that’s why things seem to have stalled?

Total spend to date is a little over £903m.  At roughly £40m a month (based on the November figures), £1bn should be reached before the end of February, maybe sooner. And then the bollard budget might swing into action and we’ll see a year end boost (contrary to the principles of pay as you go cloud services though that would be).
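As a sanity check on that arithmetic, here’s a minimal sketch – the £903m and £40m/month figures are the ones quoted above; everything else is assumption:

```python
# Back-of-the-envelope: when does cumulative G-Cloud spend cross £1bn?
# Figures are those quoted in the post; this is a sketch, not official data.
spend_to_date = 903   # £m, total spend to date
run_rate = 40         # £m per month, based on the November figures

months_to_billion = (1000 - spend_to_date) / run_rate
print(f"{months_to_billion:.1f} months to £1bn")
# ~2.4 months from the November figures, i.e. £1bn around late January/February
```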

Government no longer publishes total IT spend figures but, in the past, it’s been estimated to be somewhere between £10bn and £16bn per year.  G-Cloud’s annual spend, then, is a tiny part of that overall spend.  G-Cloud fans have, though, suggested that £1 spent on G-Cloud is equivalent to £10 or even £50 spent the old way – that may be the case for hosting costs; it certainly isn’t the case for Lot 4 costs (though I am quite sure there has been some reduction in rates simply from the real innovation that G-Cloud brought – transparency on prices).

2) The Overall Composition

Up until 18 months ago, I used to publish regular analysis showing where G-Cloud spend was going.  The headline observation then was that some 80% was being spent in Lot 4 – Specialist Cloud Services or, perhaps more accurately, Specialist Consultancy Services.  To date, of our £903m, some £715m, or 79%, has been spent through Lot 4 (the red bars on the chart above).  That’s a lot of cloud consultancy.

(post updated 19th Jan 2016 with the above graph to show more clearly the percentage that is spent on Lot 4).

With all that spent on cloud consultancy, surely we would see an increase in spend in the other lots?  Lot 4 was created to give customers a vehicle to buy expertise that would explain to them how to migrate from their stale, high capital, high cost legacy services to sleek, shiny, pay as you go cloud services.

Well, maybe.  Spend on IaaS (the blue bars), or Lot 1, is hovering around £4m-£5m a month, though has increased substantially from the early days.  Let’s call it £60m/year at the current run rate (we’re at £47m now) – if it hits that number it will be double the spend last year, good growth for sure, and that IaaS spend has helped create some new businesses from scratch.  But they probably aren’t coining it just yet.

Perhaps the Crown Hosting Service has, ummm, stolen the crown and taken all of the easy business.  Government apparently spends £1.6bn per year on hosting, with £700m of that on facilities and infrastructure, and the CHS was predicted to save some £530m of that once it was running (that looks to be a saving through the end of 2017/18 rather than an annual saving).  But CHS is not designed for cloud hosting, it’s designed for legacy systems – call it the Marie Celeste, or the Ship of the Doomed.  You send your legacy apps there and never have to move them again – though, ideally, you migrate them to cloud at some point. We had a similar idea to CHS back in 2002, called True North; it ended badly.

A more positive way to look at this is that Government’s hosting costs would have increased if G-Cloud wasn’t there – so the £47m spent this year would actually have been £470m or £2.35bn if the money had been spent the old way.  There is no way of knowing, of course – it could be that much of this money is being spent on servers that are idling because people spin them up but don’t spin them down; it could be that more projects are underway at the same time than was previously possible because the cost of hosting is so much lower.

But really, G-Cloud is all about Lot 4.  A persistent and consistent 80% of the monthly spend is going on people, not on servers, software or platforms.  PaaS may well be People As A Service as far as Lot 4 is concerned.

3) Lot 4 Specifically

Let’s narrow Lot 4 down to this year only, so that we are not looking at old data.  We have £356m of spend to look at, 80% of which is made by central government.  There’s a roughly 50/50 split between small and large companies – though I suspect one or two previously small companies have now become very much larger since G-Cloud arrived (though on these revenues, they have not yet become “large”).

If we knew which projects that spend had been committed to, we would soon know what kind of cloud work government was doing, right?

Sadly, £160m is recorded against “Project Null”.  Let’s hope it’s successful; there’s a lot of cash riding on it not becoming void too.

Here are the Top 10 Lot 4 spenders (for this calendar year to date only):

 And the Top 10 suppliers:


Cloud companies?  Well, possibly.  Or perhaps, more likely, companies with available (and, obviously, agile) resource for development projects that might, or might not, be deployed to the cloud.  It’s also possible that all of these companies are breaking down the legacy systems into components that can be deployed into the cloud starting as soon as this new financial year; we will soon see if that’s the case.

To help understand what is most likely, here’s another way of looking at the same data.  This plots the length of an engagement (along the X-axis) against the total spend (Y-axis) and shows a dot with the customer and supplier name.
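For anyone who wants to rebuild that scatter from the published spend data, a minimal pandas/matplotlib sketch follows. The filename and column names (“Customer”, “Supplier”, “Month”, “Spend”) are my assumptions about the CSV layout, not the actual Cabinet Office schema:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Load the raw G-Cloud spend data (hypothetical filename and columns).
df = pd.read_csv("gcloud_spend.csv", parse_dates=["Month"])

# One row per customer/supplier pair: engagement length and total spend.
pairs = df.groupby(["Customer", "Supplier"]).agg(
    first=("Month", "min"), last=("Month", "max"), total=("Spend", "sum"))
pairs["months"] = ((pairs["last"] - pairs["first"]).dt.days / 30.44).round()

# Scatter of engagement length (X) against total spend (Y).
plt.scatter(pairs["months"], pairs["total"])
plt.xlabel("Engagement length (months)")
plt.ylabel("Total spend (£)")
plt.title("G-Cloud engagements: duration vs spend")
plt.show()
```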

A cloud-related contract under G-Cloud might be expected to be short and sharp – a few months, perhaps, to understand the need, develop the strategy and then ready it for implementation.  With G-Cloud contracts lasting a maximum of two years, you might expect to see no relationship last longer than twenty-four months.

But there are some big contracts here that appear to have been running for far longer than twenty-four months.  And, whilst it’s very clear that G-Cloud has enabled far greater access to SME capability than any previous framework, there are some old familiar names here.
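Spotting those long-runners programmatically is straightforward with the same data – a sketch, reusing the hypothetical “pairs” table from the snippet above:

```python
# Flag customer/supplier relationships apparently running beyond the
# 24-month maximum of a single G-Cloud call-off ("pairs" comes from
# the earlier sketch; the threshold is the framework's contract cap).
long_runners = pairs[pairs["months"] > 24].sort_values("total", ascending=False)
print(long_runners[["months", "total"]].head(10))
```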

4) Conclusions

G-Cloud without Lot 4 would look far less impressive, even if the spend it is replacing was 10x higher.  It’s clear that we need:

– Transparency. What is the Lot 4 spend going to?

– Telegraphing of need.  What will government entities come to market for over the next 6-12 months?

– Targets.  The old target was that 50% of new IT spend would be on cloud.  Little has been said about that in a long time.  Little has, in fact, been said about plans.  What are the new targets?

Most of those points are not new – I’ve said them before, for instance in a previous post about G-Cloud as a Hobby and also here about how to take G-Cloud Further Forward.

In short, Lot 4 needs to be looked at hard – and government needs to get serious about the opportunity that this framework (which broke new ground at inception but has been allowed to fester somewhat) presents for restructuring how IT is delivered.

Acknowledgements

I’m indebted, as ever, to Dan Harrison for taking the raw G-Cloud data and producing these far simpler to follow graphs and tables.  I maintain that GDS should long ago have hired him to do their data analysis.  I’m all for open data, but without presentation, the consequences of the data go unremarked.

Mind The Gaps – Nothing New Under The Sun

As we start 2015, a year when several big contracts are approaching their end dates and replacement solutions will need to be in place, here’s a presentation I gave a couple of times last year looking at the challenges of breaking up traditional, single prime IT contracts into potentially lots of smaller, shorter contracts:

G-Cloud By The Numbers (To End June 2014)

With Dan’s Tableau version of the G-Cloud spend data, interested folks need never download the CSV file provided by Cabinet Office ever again.  Cabinet Office should subcontract all of their open data publication work to him.

The headlines for G-Cloud spend to the end of June 2014 are:

– No news on the split between lots.  80% of spend continues to be in Lot 4, Specialist Cloud Services

– 50% of the spend is with 10 customers, 80% is with 38 customers (a sketch of how such concentration numbers fall out of the raw data follows this list)

– Spend in June was the lowest since February 2014.  I suspect that is still an artefact of the boost from year-end budget clearouts (and perhaps some effort to move spend out of Lot 4 onto other frameworks)

– 24 suppliers have 50% of the spend, 72 have 80%.  Relatively concentrated customer spend is being spread across a wider group of suppliers.  That can only be a good thing

– 5 suppliers have invoiced less than £1,000. 34 less than £10,000

– 10 customers have spent less than £1,000, 122 less than £10,000.  How that squares with the bullet immediately above, I’m not sure

– 524 customers (up from 489 last month) have now used the framework, commissioning 342 suppliers.  80% of the spend is from central government (unsurprising, perhaps, given the top 3 customers – HO, MoJ, CO – account for 31% of the spend)

– 36 customers have spent more than £1m.  56 suppliers have billed more than £1m (up from 51).  This time next year, Rodney, we’ll be millionaires.

– Top spending customers stay the same but there’s a change in the top 3 suppliers (BJSS, Methods stay the same and Equal Experts squeaks in above IBM to claim the 3rd spot)
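For what it’s worth, concentration figures like “50% of the spend is with 10 customers” take only a few lines to derive – a sketch, under the same assumed CSV layout as the earlier snippets:

```python
# Cumulative-share calculation behind "N customers account for X% of spend".
# Sketch; assumes the same hypothetical columns ("Customer", "Spend") as above.
by_customer = df.groupby("Customer")["Spend"].sum().sort_values(ascending=False)
cum_share = by_customer.cumsum() / by_customer.sum()
print("Customers covering 50% of spend:", int((cum_share < 0.50).sum()) + 1)
print("Customers covering 80% of spend:", int((cum_share < 0.80).sum()) + 1)
```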

One point I will venture, though it’s not terribly well researched, is that once a customer starts spending money with G-Cloud, they are more likely to continue than not.  And once a supplier starts seeing revenue, they are more likely to continue to see it than not.  So effort on the first sale is likely to be rewarded with continued business.
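One rough way to test that hunch against the monthly data – again a sketch over the same assumed layout: for each month, check what share of that month’s customers show up again in any later month.

```python
# Crude repeat-business measure: of customers spending in a given month,
# how many spend again later? (Same hypothetical CSV columns as above.)
monthly = df.groupby("Month")["Customer"].apply(set).sort_index()
for i, (month, customers) in enumerate(monthly.items()):
    later = set().union(*monthly.iloc[i + 1:]) if i + 1 < len(monthly) else set()
    if later:
        returned = len(customers & later) / len(customers)
        print(f"{month:%Y-%m}: {returned:.0%} of customers purchased again later")
```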

Taking G-Cloud Further Forward

A recent blog post from the G-Cloud team talks about how they plan to take the framework forward. I don’t think it goes quite far enough, so here are my thoughts on taking it even further forward.

Starting with that G-Cloud post:

It’s noted that “research carried out by the 6 Degree Group suggests that nearly 90 percent of local authorities have not heard of G-Cloud”.  This statement is made in the context of the potential buyer count being 30,000 strong.  Some, like David Moss, have confused this and concluded that 27,000 buyers don’t know about G-Cloud.  I don’t read it that way – but it’s hard to say what it does mean.  A hunt for the “6 Degree Group”, presumably twice as good as the 3 Degrees, finds one obvious candidate (actually the 6 Degrees Group), but they make no mention of any research on their blog or their news page (and I can’t find them in the list of suppliers who have won business via G-Cloud).  Still, 90% of local authorities not knowing about G-Cloud is, if the question was asked properly and to the right people (and therein lies the problem with such research), not good.  It might mean that 450 or 900 or 1,350 buyers (depending on whether there are 1, 2 or 3 potential buyers of cloud services in each local authority) don’t know about the framework.  How we get to 30,000 potential buyers I don’t know – but if there is such a number, perhaps it’s a good place to look at potential efficiencies in purchasing.

[Update: I’ve been provided with the 30,000 – find them here: http://gps.cabinetoffice.gov.uk/sites/default/files/attachments/2013-04-15%20Customer%20URN%20List.xlsx. It includes every army regiment (SASaaS?), every school and thousands of local organisations.  So a theoretical buyer list but not a practical buyer list. I think it better to focus on the likely buyers. G-Cloud is a business – GPS gets 1% on every deal.  That needs to be spent on promoting to those most likely to use it]

[Second update: I’ve been passed a further insight into the research: http://www.itproportal.com/2013/12/20/g-cloud-uptake-low-among-uk-councils-and-local-authorities/?utm_term=&utm_medium=twitter&utm_campaign=testitppcampaign&utm_source=rss&utm_content=  – the summary from this is that 87% of councils are not currently buying through G-Cloud and 76% did not know what the G-Cloud [framework] could be used for]

Later, we read: “But one of the most effective ways of spreading the word about G-Cloud is not by us talking about it, but for others to hear from their peers who have successfully used G-Cloud. There are many positive stories to tell, and we will be publishing some of the experiences of buyers across the public sector in the coming months.”  True, of course.  Except if people haven’t heard of G-Cloud they won’t be looking on the G-Cloud blog for stories about how great the framework is.  Perhaps another route to further efficiencies is to look at the vast number of frameworks that exist today (particularly in local government and the NHS) and start killing them off so that purchases are concentrated in the few that really have the potential to drive cost savings allied with better service delivery.

And then: “We are working with various trade bodies and organisations to continue to ensure we attract the best and most innovative suppliers from across the UK.”  G-Cloud’s problem today isn’t, as far as we can tell, a lack of innovative suppliers – it’s a lack of purchasing through it.  In other words, a lack of demand.  True, novel services may attract buyers, but most government entities are still in the “toe in the water” stage of cloud, experimenting with a little IaaS, some PaaS and, based on the G-Cloud numbers, quite a lot of SaaS (some £15m in the latest figures, or about 16% of total spend versus only 4% for IaaS and 1% for PaaS).

On the services themselves, we are told that “We are carrying out a systematic review of all services and have, so far, deleted around 100 that do not qualify.”  I can only applaud that, though I suspect the real number to delete may be in the 1000s, not the 100s.  It’s a difficult balance – the idea of G-Cloud is to attract more and more suppliers with more and more services, but buyers only want sensible, viable services that exist and are proven to work.  It’s not like iTunes, where it only takes one person to download an app and rate it 1* because it doesn’t work/keeps crashing/doesn’t synchronise, suggesting to other potential buyers that they steer clear – the vast majority of G-Cloud services have had no takers at all, and even those that have lack any feedback on how it went (I know that this was one of the top goals of the original team but that they were hampered by “the rules”).

There’s danger ahead too: “Security accreditation is required for all services that will hold information assessed at Business Impact Level profiles 11x/22x, 33x and above. But of course, with the new security protection markings that are being introduced on 1 April, that will change. We will be publishing clear guidance on how this will affect accreditation of G-Cloud suppliers and services soon.”  It’s mid-February and the new guidelines are just 7 weeks away.  That doesn’t give suppliers long to plan for, or make, any changes that are needed (the good news here being that government will likely take even longer to plan for, and make, such changes at their end).  This is, as CESG people have said to me, a generational change – it’s going to take a while, but that doesn’t mean that we should let it drift.

Worryingly: “we’re excited to be looking at how a new and improved CloudStore can act as a single space for public sector buyers to find what they need on all digital frameworks.”  I don’t know that a new store is needed; I believe that we’re already on the third reworking – would a fourth help?  As far as I can tell, the current store is based on Magento which, from all accounts and reviews online, is a very powerful tool that, in the right hands, can do pretty much whatever you want from a buying and selling standpoint.  I believe a large part of the problem is in the data in the store – searching for relatively straightforward keywords often returns a surprising answer – try it yourself: type in some popular supplier names or some services that you might want to buy.  Adding in more frameworks (especially where they can overlap as PSN and G-Cloud do in several areas) will more than likely confuse the story – I know that Amazon manages it effortlessly across a zillion products but it seems unlikely that government can implement it any time soon (wait – they could just use Amazon). I would rather see the time, and money, spent getting a set of products that were accurately described and that could be found using a series of canned searches based on what buyers were interested in.
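To make the “canned searches” idea concrete, here’s a tiny illustrative sketch – every name, category and field in it is hypothetical, and nothing reflects the store’s actual data model:

```python
# Hypothetical "canned search": map common buyer intents onto pre-vetted
# filters instead of free-text search. All names and fields are invented.
CANNED_SEARCHES = {
    "email in the cloud": {"lot": "SaaS", "keywords": ["email", "collaboration"]},
    "virtual machines":   {"lot": "IaaS", "keywords": ["compute", "virtual machine"]},
    "migration support":  {"lot": "Specialist Cloud Services", "keywords": ["migration"]},
}

def canned_search(catalogue: list[dict], intent: str) -> list[dict]:
    """Return catalogue entries matching a pre-defined buyer intent (sketch)."""
    spec = CANNED_SEARCHES[intent]
    return [entry for entry in catalogue
            if entry["lot"] == spec["lot"]
            and any(k in entry["description"].lower() for k in spec["keywords"])]
```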

So, let’s ramp up the PR and education (for buyers), upgrade the assurance process that ensures that suppliers are presenting products that are truly relevant, massively clean up the data in the existing store, get rid of duplicate and no longer competitive buying routes (so that government can aggregate for best value), make sure that buyers know more about what services are real and what they can do, don’t rebuild the damn cloud store again …

… What else?

Well, the Skyscape+14 letter is not a terrible place to start, though I don’t agree with everything suggested.  G-Cloud could and should:

– Provide a mechanism for services to work together.  In the single prime contract era, which is coming to an end, this didn’t matter – one of the oligopoly would be tasked to buy something for its departmental customer and would make sure all of the bits fitted together and that it was supported in the existing contract (or an adjunct).  In a multiple supplier world where the customer will, more often than not, act as the integrator, both customer and supplier are going to need ways to make this all work together.   The knee bone may be connected to the thigh bone, but that doesn’t mean that your email service in the cloud is going to connect via your PSN network to your active directory so that you can do everything on your iPad.

– Publish what customers across government are looking at both in advance and as it occurs, not as data but as information.  Show what proof of concept work is underway (as this will give a sense of what production services might be wanted), highlight what components are going to be in demand when big contracts come to an end, illustrate what customers are exploring in their detailed strategies (not the vague ones that are published online).  SMEs building for the public sector will not be able to build speculatively – so either the government customer has to buy exactly what the private sector customer is buying (which means that there can be no special requirements, no security rules that are different from what is already there and no assurance regime that is above and beyond what a major retailer or utility might want), or there needs to be a clear pipeline of what is wanted.  Whilst Chris Chant used to say that M&S didn’t need to ask people walking down the street how many shirts they would buy if they were to open a store in the area, government isn’t yet buying shirts as a service – they are buying services that are designed and secured to government rules (with the coming of Official, that may all be about to change – but we don’t know yet because, see above, the guidance isn’t available).

– Look at real cases of what customers want to do – let’s say that a customer wants to put a very high performing Oracle RAC instance in the cloud – and ensure that there is a way for that to be bought.  It will likely require changes to business models and to terms and conditions, but despite the valiant efforts of GDS there is not yet a switch away from such heavyweight software as Oracle databases.  The challenge (one of many) that government has, in this case, is that it has massive amounts of legacy capability that is not portable, is not horizontally scalable and that cannot be easily moved – Crown Hosting may be a solution to this, if it can be made to work in a reasonable timeframe and if the cost of migration can be minimised.

– I struggle with the suggestion to make contracts three years instead of two.  This is a smokescreen, it’s not what is making buyers nervous really, it’s just that they haven’t tried transition.  So let’s try some – let’s fire up e-mail in the cloud for a major department and move it 6 months from now.  Until it’s practiced, no one will know how easy (or incredibly difficult) it is.  The key is not to copy and paste virtual machines, but to move the gigabytes of data that goes with it.  This will prove where PSN is really working (I suspect that there are more problems than anyone has yet admitted to), demonstrate how new capabilities have been designed (and prove whether the pointy things have been set up properly as we used to say – that is, does the design rely on fixed IP address ranges or DNS routing that is hardcoded or whatever).  This won’t work for legacy – that should be moved once and once only to the Crown Hosting Service or some other capability (though recognise that lots of new systems will still need to talk to services there).  There’s a lot riding on CHS happening – it will be an interesting year for that programme.

The ICT contracts for a dozen major departments/government entities are up in the next couple of years – contract values in the tens of billions (old money) will be re-procured.   Cloud services, via G-Cloud, will form an essential pillar of that re-procurement process, because they are the most likely way to extract the cost savings that are needed.  In some cases cloud will be bought because the purchasing decision will be left too late to do it any other way than via a framework (unless the “compelling reason” for extension clause kicks in) but in most cases because the G-Cloud framework absolutely provides the best route to an educated, passionate supplier community who want to disrupt how ICT is done in Government today.  We owe them an opportunity to make that happen.  The G-Cloud team needs more resources to make it so – they are, in my view, the poor relation of other initiatives in GDS today.  That, too, needs to change.

Am I Being Official? Or Just Too Sensitive? Changes in Protective Marking.

From April 2nd – no fools these folks – government’s approach to security classifications will change.  For what seems like decades, the cognoscenti have bandied around acronyms like IL2 and IL3, with real insiders going as far as to talk about IL2-2-4 and IL3-3-4. There are at least seven levels of classification (IL0 through IL6, and some might argue that there are even eight levels, with “nuclear” trumping all else; there could be more if you accept that each of the three numbers in something like IL2-2-4 – confidentiality, integrity and availability – could, in theory, be changed separately). No more.  We venture into the next financial year with a streamlined, simplified structure of only three classifications. THREE!

Or do we?

The aim was to make things easier – strip away the bureaucracy and process that had grown up around protective marking, stop people over-classifying data (which makes it harder to share, both inside and outside of government) and introduce a set of controls that, beyond the technical security measures, actually ask something of the user – that is, ask them to take care of data entrusted to them.

In the new approach, some 96% of data falls into a new category, called “OFFICIAL” – I’m not shouting, they are. A further 2% would be labelled as “SECRET” and the remainder “TOP SECRET”.  Those familiar with the old approach will quickly see that OFFICIAL seems to encompass everything from IL0 to IL4 – from open Internet to Confidential (I’m not going to keep shouting, promise), though CESG and the Government Security Secretariat have naturally resisted mapping old to new.
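To put that reading in concrete terms, here’s an informal sketch of the apparent old-to-new mapping – emphatically my inference from the percentages above, not anything CESG has published:

```python
# Informal old-to-new mapping, inferred from the announced percentages.
# NOT official guidance - CESG deliberately declined to map old to new.
def apparent_new_marking(impact_level: int) -> str:
    """Map an old Impact Level (IL0-IL6) to the marking it seems to land in."""
    if 0 <= impact_level <= 4:   # open internet up through Confidential
        return "OFFICIAL"        # ~96% of data
    if impact_level == 5:
        return "SECRET"          # ~2%
    if impact_level == 6:
        return "TOP SECRET"      # the remainder
    raise ValueError("Impact Levels run from IL0 to IL6")
```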

That really is a quite stunning change.  Or it could be.

Such a radical change isn’t easy to pull off – the fact that there have been at least two years of work behind the scenes to get it this far suggests that.  Inevitably, there have been some fudges along the way.  Official isn’t really a single broad classification.  It also includes “Official Sensitive”, which is data that only those who “need to know” should be able to access.   There are no additional technical controls placed on that data – that is, you don’t have to put it behind yet another firewall – there are only procedural controls (which might range – I’m guessing – from checking distribution lists to filters on outgoing email, perhaps).

There is, though, another classification in Official which doesn’t yet, to my knowledge, have a name.   Some data that used to be Confidential will probably fall into this section.  So perhaps we can call it Official Confidential? Ok, just kidding.

So what was going to be a streamlining to three simple tiers, where almost everyone you’ve ever met in government would spend most of their working lives creating and reading only Official data, is now looking like five tiers.  Still an improvement, but not quite as sweeping as hoped for.

The more interesting challenges are probably yet to come – and will be seen in the wild only after April.  They include:

– Can Central Government now buy an off-the-shelf device (phone, laptop, tablet etc) and turn on all of the “security widgets” that are in the baseline operating system and meet the requirements of Official?

– Can Central Government adopt a cloud service more easily? The Cloud Security Principles would suggest not.

– If you need to be cleared to “SC” to access a departmental e-mail system which operated at Restricted (IL3) in the past and if “SC” allows you occasional access to Secret information, what is the new clearance level?

– If emails that were marked Restricted could never be forwarded outside of the government’s own network (the GSI), what odds would you place on very large amounts of data being classified as “Official Sensitive” and a procedural restriction being applied that prevents that data traversing the Internet?

– If, as anecdotal evidence suggests, an IL3 solution costs roughly 25% more than an IL2 solution, will IT costs automatically fall or will inertia mean costs stay the same as solutions continue to be specified exactly as before?

– Will the use of networks within government quickly fall to lowest common denominator – the Internet with some add-ons – on the basis that there needs to be some security but not as much as had been required before?

– If the entry ticket to the accreditation process was a comprehensive and well thought through “RMADS” (Risk Management and Accreditation Document Set) – largely the domain of experts who handed their secrets down through mysterious writings and hidden symbols – what does accreditation look like under the new, simpler markings?

It seems most likely that the changes to protective marking will result in little change over the next year, or even two years.  Changes to existing contracts will take too long to process for too little return. New contracts will be framed in the new terms but the biggest contracts, with the potential for the largest effects, are still some way from expiry.  And the Cloud Security Principles will need much rework to encourage departments to take advantage of what is already routine for corporations. 

If the market is going to rise to the challenge of meeting demand – if we are to see commodity products made available at low cost that still meet government requirements – then the requirements need to be spelled out.  The new markings launch in just over two months.  What is the market supposed to provide come 2nd April?

None of this is aimed at taking away what has been achieved with the thinking and the policy work to date – it’s aimed at calling out just how hard it is going to be to change an approach that is as much part of daily life in HM Government as waking up, getting dressed and coming to work.