When is a site not a site – When it’s ITsafe

Normally I probably wouldn’t be criticising any effort to raise awareness of the need for properly protecting your home computer. After all, I banged on for long enough about government taking a role in this when I was at the Office of the e-Envoy.

The folks at NISCC (pleasingly pronounced “Nicey”) have responded to the challenge with a little site, ITsafe. An odd choice of name – I guess it could have been ITsecure, SecureIT, SafeIT or any one of 1001 others, but it will do the job if enough people link to it (at which point the name of the site is irrelevant). The NISCC folks, on whom I relied more than a few times when at OeE, are a clever and capable bunch who also handle the UNIRAS site – a kind of tech-heavy version of ITsafe.

Another odd thing is that the site’s home page today lauds the launch event, where a Home Office Minister unveiled the site.

That’s not too bad – having a Minister launch a website these days is probably quite a tough thing to sort. After all, with 3,500-odd sites, they surely haven’t launched all of them. But the launch is hardly important – what is vital is the content of the site.

There was good press coverage though – the site was widely reported in the professional technology press (Computing etc) and there was even a bit of mainstream coverage, with Mike Cross referring to it briefly in the Guardian (although I don’t think he meant the reference in a good way). A site like this will need to be linked to by tens of thousands of sites to be effective though. It will need to be seen as a definitive source, and that will take a lot more work.

The site has some useful stuff: there are a couple of “how to” guides, e.g. how to update Windows XP or Office (but there isn’t one for updating Mac OS X or any other operating system). There is a single advisory – defined as a problem that won’t affect enough users to justify an alert or email being issued – for problems in Firefox that can be fixed by upgrading to the new version. With 25 million downloads already in less than 100 days, I think Firefox is gaining enough ground to perhaps be given an alert of its own – if only to get the word out to yet more people that there are other options for browsers.

What worries me though is that the site is nearly empty. And if it’s to be a definitive source, it needs to have things that are hard to find elsewhere, or of much higher quality than you would find anywhere else – and information that is entirely vendor neutral.

There are plenty of things that could be featured that would improve IT safety – protecting against spyware, with accreditation of sites with good downloads perhaps? The right browser settings to give best protection, with the risks that you are still exposed to. A detailed study of phishing emails and how to recognise them? Perhaps an archived list of security measures you should already have taken? Maybe, just maybe, a tool that assesses the security of your setup – one that checks if your firewall is on, maybe even collects data from your PC on settings? Would you trust government to do that for you? I’m not sure if I would – but there are plenty of other sites that I would trust even less. Government moving in here could create a sea change in vendor behaviour.

I always thought that government should provide a definitive source for all the software patches you needed. You would log your configuration with a government site and then, when you visited, it would know what downloads you need and would be able to source them from a variety of places, bring them together, and allow you to download them. That would be a big leap of trust from where we are now, and it would require enormous vendor co-operation. But if government couldn’t put the stress on to get that, then who could?
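
At its simplest, that service is just matching a logged configuration against a catalogue of current versions. Here’s a rough sketch of the idea in Python – every product name, version number and URL below is invented for illustration, and a real service would need signed metadata and serious vendor co-operation:

# Sketch of matching a user's logged configuration against a patch
# catalogue. All names, versions and URLs are made up for illustration.
CATALOGUE = {
    # product: (latest version, where to get it)
    "windows-xp": ("sp2", "http://example.gov.uk/patches/windows-xp-sp2"),
    "office": ("2003-sp1", "http://example.gov.uk/patches/office-2003-sp1"),
    "firefox": ("1.0.1", "http://example.gov.uk/patches/firefox-1.0.1"),
}

def needed_downloads(logged_config):
    """Return download URLs for anything out of date in the logged config."""
    needed = []
    for product, version in logged_config.items():
        latest, url = CATALOGUE.get(product, (None, None))
        if latest is not None and version != latest:
            needed.append(url)
    return needed

# A visitor who logged this configuration would be offered the Firefox update.
print(needed_downloads({"windows-xp": "sp2", "firefox": "1.0"}))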

Still, maybe it’s just me, reflecting on yet another poor rugby performance from my national side, who have just lost to Ireland. Three losses in a row is enough to put any supporter in a bad mood (not even counting the fact that they’ve only won 5 of their last 14 games). The wooden spoon beckons – and beating Italy would hardly count as escaping it. Perhaps I should support Wales more often.

They Had No Choice

Driving up to town today, via Hyde Park, I was struck by a statue at the side of the road on Park Lane. It’s striking – there’s a full statue of a mule, a carving of an elephant, a dog and some other animals. A quick search shows what it is – a memorial to the hundreds of animals that served and died in various wars, with special note of the 60 that have won the equivalent of the Victoria Cross. Who’d have thought?

Animals in War Memorial

The BBC has the full story. They tell of:

– 54 animals – 32 pigeons, 18 dogs, three horses and a cat – commended for their service in World War II. Among these heroes were:

– Rob, a para-dog who made more than 20 parachute drops while serving with the SAS on top-secret missions in Africa and Italy.

– Ricky, a canine mine-detector who continued with his dangerous task of clearing a canal bank in Holland despite suffering head injuries.

– Winkie, a pigeon that flew 129 miles with her wings clogged with oil to save a downed bomber crew.

– Mary of Exeter, another pigeon, which flew back with her neck and right breast ripped open, savaged by hawks kept by the Germans at Calais.

– Search and rescue dogs Beauty, Peter, Irma and Jet, who located survivors buried in the debris of the London Blitz.

– Metropolitan Police horses, Olga, Regal and Upstart, who faced their fear of fire and the hail of flying bombs.

Perhaps the only odd thing is that it’s in the centre of a busy traffic island – along one of the few stretches of road in London where you can do 40mph on each side. That makes it eye-catching as you make the right turn towards Bond Street, but perhaps not the easiest place to visit for a closer look.

iSync to the beat

I’ve just taken delivery of a new phone – a Sony Ericsson P910i. Why that one? Well, the Treo 600 I’ve been using for the last 12 months or so has given up the ghost – that horribly designed stubby aerial that only an American phone needs has started to wobble, making call quality poor. The Treo 650 isn’t going to be around until mid-March and I needed a phone. I’ve had a P800 and a P900 before, so this seemed a good stopgap until the Treo 650 makes landfall. I actually wanted a sexy V3 from Moto, but the software quality on those is not great and, with 500 addresses, the contacts function is too slow to use.

What I don’t get is why Sony Ericsson haven’t figured out that Apple is a cool place to be. Syncing with the Mac requires a degree in Computer Science. SE should be partnering with Apple and providing great software that works out of the box. Here’s how I have to sync (thanks to Lozishere):

First step: go to /system/library/application support/syncservice and find the file in /501 called SymbianConduitDefaults.plist. Drag this file to the desktop. Now delete all the folders contained in /system/library/application support/syncservice/. They will be called things like 501, 508 etc.
Second step: open and then quit iSync.
Third step: open the file you just dragged to the desktop. Open it in TextEdit if it asks you for an application. Delete the file’s contents and replace it with the following:

kBTEmptyFolderIsOkayReally
kBTFilteringDestinationFolderID          F 126881400.11
kBTFilteringDestinationFolderName        General
kNSSyncConduitFilteringContactGroupMap
kNSSyncDeviceID                          00-0f-de-87-3d-56
kNSSyncDeviceName                        LoZ
kNSSyncDeviceShouldSlowSyncCalendars
kNSSyncDeviceShouldSlowSyncContacts
kNSSyncDeviceUseCalendars
kNSSyncDeviceUseContacts
kSymbianConduitModelKey                  P910-1
kSymbianHasDeviceSynced

What on earth is all that about? What happened to plug and play? Or cradles and hotsync buttons? God knows how this guy figured out what needed to be done. Apple cannot go mainstream until two things are sorted: (i) easy interfacing with devices from multiple suppliers and (ii) proper rendering and interoperation of major websites (there are still too many, e.g. HSBC online banking, Parcelforce etc, that don’t work properly – the business case is not there for them).
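
For what it’s worth, the first step could at least be scripted. A minimal sketch, assuming the paths quoted above are right – and you’d want to back everything up before letting anything like this loose:

import shutil
from pathlib import Path

# The conduit folder, as quoted in the instructions above.
SYNC_DIR = Path("/system/library/application support/syncservice")
DESKTOP = Path.home() / "Desktop"

# First step: park SymbianConduitDefaults.plist on the desktop...
shutil.move(str(SYNC_DIR / "501" / "SymbianConduitDefaults.plist"),
            str(DESKTOP / "SymbianConduitDefaults.plist"))

# ...then delete the numbered folders (501, 508 and so on).
for folder in SYNC_DIR.iterdir():
    if folder.is_dir() and folder.name.isdigit():
        shutil.rmtree(str(folder))

# The second step (opening and quitting iSync) and the file edit in the
# third step still have to be done by hand.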

Tomorrow’s task is going to be to follow the instructions above and see if I can make it work. I happen to know that getting a Moto V3 to sync, even to a PC, is hard though – for whatever reason they decided not to support Outlook (!!?) and so you have to export all your data to a text CSV file, reformat the fields to map to Moto standards and then import again. Doing that the other day took a friend six hours!
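
Mind you, the remapping itself is mechanical enough that a short script ought to beat six hours of hand-editing. A sketch – the field names on both sides are guesses for illustration, so check them against a real export before trusting any of this:

import csv

# Map Outlook export headers to Moto-style headers. Both sets of names
# are guesses for illustration - check your own export's headers first.
FIELD_MAP = {
    "First Name": "firstname",
    "Last Name": "lastname",
    "Mobile Phone": "cellphone",
    "E-mail Address": "email",
}

with open("outlook_export.csv", newline="") as src, \
     open("moto_import.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=list(FIELD_MAP.values()))
    writer.writeheader()
    for row in reader:
        writer.writerow({moto: row.get(outlook, "")
                         for outlook, moto in FIELD_MAP.items()})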

I’m all for integration, but first I want interoperability and interconnectivity. No, scratch that. Actually, I don’t really care about integration. I’d like interop first; we can worry about photo iPods some other time.

Snow Fun

Running outdoors in this weather is no fun at all. The books say I should be doing “wind sprints” – they didn’t say I had to do them in what feel like gale-force winds. To make things worse, I’ve injured my leg – some kind of shin splint – which has slowed me down enormously. If I get round the course now, I’ll be pretty pleased. Anything less than 5 hours would be a miracle given the pain I get over even short runs. Still, physio, ice and stretching have been prescribed and I’m going to get through it, come what may.

Delivering value online

The Australian National Audit Office have been checking up on how the folks down under are doing with their e-government initiatives. As in all audit reports, there is a brief discussion of any positives before rapidly getting down to uncovering the negatives. The key points, though, are relevant I suspect to any and every country with an e-government programme:

First, on website cost comparison

21. While agencies were able to provide estimates of the recurrent costs of their websites, they used different methods to calculate these costs and included a range of different items. Agencies had not conducted activity based costing of their websites. This made it difficult to compare the costs of websites against each other. The major item in most agencies’ recurrent costs of their websites was salaries for the staff responsible for managing the website. IT cost information was limited, and, where such costs were provided, most were relatively small.

22. Websites in agencies at similar stages of Internet service delivery displayed wide variations in costs. However, it was not apparent whether these differences were related to the stage of website development and/or the size of the agency, or to other factors not identified. Further, there was insufficient comparable data to determine whether cost differences were related to degrees of website efficiency and effectiveness.

23. Only one agency had conducted a cost-benefit analysis to determine whether the Internet was the most effective form of delivery for their online service. No agency had calculated an expected return on investment for providing the service. Despite having information on both costs and benefits, and having outlined this as one of the principles to be used in determining whether a particular service should be provided online, other agencies did not include a cost-benefit analysis in their business cases.

And then on monitoring success

26. Three agencies had developed performance indicators for their online services. This meant that half of the agencies had not identified how the success of the program would be measured, such as by meeting estimated targets or achieving reduced costs. As well, while agencies included information on various e-government activities related to a number of their programs in their annual reports, few had reported externally on any specific performance indicators for their websites or online services.

27. ANAO considered that some agencies would have difficulty in determining appropriate performance indicators for their websites, because some of the websites’ objectives or aims were very general or not clearly specified. ANAO noted, however, that agencies were already collecting much of the information required to develop adequate indicators to assess performance.

28. Despite including evaluation plans in their business cases, most agencies had not evaluated their website redevelopments or new online services, although most planned to. Further, agencies did not generally have an integrated monitoring and evaluation policy for their Internet service delivery.

And they recommend

34. ANAO suggests that to improve their management of e-government, and their measurement of the efficiency and effectiveness of Internet service delivery, agencies:

  • establish coherent arrangements for management of their websites to further their more efficient use;
  • develop internal policies and guidelines for the Internet and encourage agency staff to use them;
  • quantify the benefits and costs of their websites;
  • consider using AGIMO’s Demand and Value Assessment Methodology to assess websites and online service delivery;
  • identify the audience for their website and online services, and consult potential users about their needs;
  • assess demand for the delivery of services via the Internet, and specify targets for achievements against objectives; and
  • compare the performance of their websites with that of other agencies or sites, to assist in assessing whether the website is efficient and effective.


The report includes the comments from the various agencies audited and all agree with the findings and the recommendations. I think that’s the first time I’ve ever seen that in a public audit report.

Someone once said to me that the NAO (the UK equivalent of the folks that did this report) know what their report will say within 2 weeks of starting their work but negotiating the wording takes a further 18 months, which is why reports are often published so long after the event being reviewed. The work on this audit was carried out from February to May 2004 and it appears to have been published on Feb 10th 2005. Maybe it’s the same in Australia?

The Dawn of the CPU/hr

At breakfast with Jonathan Schwartz yesterday he was talking about Sun’s recent launch of a true utility – a grid computer available to all. His pitch is essentially that computing should become a service like water, electricity or gas: one that you can turn on or turn off at will. The missing piece is that for something to become a true utility there needs to be transparent pricing, i.e. a clear and indisputable metric in units that are standard to everyone. With electricity, this was the KW/h – and the move to utility led to ubiquity.

The comparison between KW/h and CPU/h isn’t straightforward – “CPU” after all is a moving target. Sun have helpfully defined what they mean though – a 2.4GHz Opteron (and there are accompanying stats on disk storage etc) – along with issuing a none-too-subtle challenge to IBM. Unlike electricity, you should get progressively more for less as time goes on – the folks who manage the grid perform upgrades and, hopefully, the price falls as the cost of computing is driven down by ubiquity and accompanying widespread use. I haven’t noticed my electricity bill go down recently but, as Jonathan says, the first person to have his house fully wired was JP Morgan and he needed full-time staff to manage the generator; since then, bills have certainly come down.

Whenever I looked at the task manager application on my PC I was always amused to see it registering mostly 95% idle (when I worked on VAX systems, they labelled that time “System tasks” rather than “idle” – in case any senior management happened to look at it and wonder why they were paying such huge bills, I guess). So in terms of CPU/h, I am paying through the nose for “CPU” and getting very little “hour”. And, if I’m paying through the nose, then any corporate or public sector entity is getting nosebleeds unless they’re running intensive activities all the time (oil exploration surveys come to mind) or they’re running on out-of-date hardware and really sweating their assets.

We’re all used to paying for computing as “capital cost” though – we buy a laptop or a desktop for £1000, £2000 or £3000 – and then we manage it operationally (for perhaps £1,000-£3,000 a head from what I hear). Over 3-5 years of depreciation that’s a lot of money for probably relatively few truly productive CPU hours. We probably don’t even know how many or how much they cost.
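
To put rough numbers on it: say £2,000 of capital written off over four years plus £2,000 a year of support comes to £2,500 a year. If the machine is genuinely busy for only 100 of its roughly 2,000 working hours, that is £25 for every productive CPU hour.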

What Sun have offered here is a benchmark – a pure number that allows direct comparison with our own costs. I don’t think anyone has tried that before. We don’t know, of course, whether that’s Sun’s true cost of service provision (with appropriate margins built in etc) or whether it’s a loss leader (their accounts a few quarters down the line will perhaps tell that story). But we do have a number that anyone can compare to their own data.

The problem, I think, is that few will have the data to really determine the cost in CPU/h terms. To start, maybe it will be enough to sum the cost of the data centre and divide by the number of processors and the hours in a year. That, for most people, will be more than Sun’s $1. And if, like every system I’ve ever seen, you’re mostly idle, then all of a sudden it starts to look like $10, $100 or $1,000 an hour. Or maybe more?
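
The back-of-the-envelope version is easy enough to sketch – every input below is made up, and the only point is how quickly idleness inflates the number:

# Back-of-envelope cost per CPU-hour for an in-house data centre.
# Every input is invented for illustration - plug in your own numbers.
annual_cost_usd = 5_000_000   # kit, staff, power, space
cpus = 200                    # processors in the room
utilisation = 0.05            # fraction of time doing real work

hours_per_year = 24 * 365
raw = annual_cost_usd / (cpus * hours_per_year)
effective = raw / utilisation  # cost per genuinely useful CPU-hour

print(f"raw: ${raw:.2f}/CPU-hr, effective: ${effective:.2f}/CPU-hr")
# With these numbers: raw is about $2.85, effective about $57 -
# against Sun's $1.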

Then come all the objections, all of the comparisons, the dependencies, the issues and restrictions. It will be something like “we couldn’t move our data to Sun’s place because of confidentiality” or because of “security” or “data protection” or “we don’t run their software stack” or whatever. I think that’s the beauty of Sun’s move though – they’re provoking a debate and some folks will take on that challenge and run the numbers and see if there’s maybe a way that they can make use of Sun. Others will run the numbers and look for ways to cut their own costs and get more efficient. And others still will ignore it because they don’t really want to know what their own costs are and how far away they are from true utility computing.

If I could hook a truly dumb terminal up to their grid and run my own basic computing needs against it (I really don’t need a grid, but I’ll take it if it’s there) with ubiquitous wireless connectivity, I think my computing bill would be $10 a year or less. I wonder if there’s a model there for the folks who don’t have PCs yet, who don’t quite know why they need one – a fully subscription-based system with minimal hardware, practically zero management and a simple monthly fee (after all, it works for satellite TV). You turn your terminal on and you start paying for it – just like water, electricity and gas. I’m sure that Jonathan is not yet ready to get into the consumer market with this – he wants weather forecasting or seismic surveys or something – but maybe once it’s proven there is a model with the right partner to do that.

I don’t think Sun will succeed at making this a profitable business, but I do give them credit for trying. And I’d love to see a debate provoked around what the true costs are for others.