Sidewinder - Data security is a big problem in a small world

Originally published on the Microsoft NHS Resource Centre on 22 March 2011

Waistbands may be getting bigger, but the world is most definitely shrinking. The Internet is now everywhere, and that means everywhere is just a mouse click (or a screen tap) away. Even the screen you are looking at is shrinking: be it a laptop, Zoostorm Slate or smartphone. Yep, there's no doubt that miniaturisation is hitting us in every regard but one: the problem of data security is getting bigger all the time.

I know that the NHS doesn’t stretch as far as Canada, but bear with me as the story of a missing hard drive in an Edmonton hospital is more relevant than you might at first imagine.

Apparently, the unencrypted back-up drive (why are the drives that go missing always unencrypted, I find myself screaming once again) went missing from the Misericordia Community Hospital after being placed under a desk during the relocation of some other equipment. It contained thousands of patient images relating to some 233 individuals, including images of babies who died before birth and pictures of surgical procedures. Speaking at a news conference, the president of the company which runs the hospital stated that a staff member had not followed policy, and insisted that the hospital has “a very solid policy that just wasn't followed”. To which the obvious, and achingly sensible, reply is: well, that just isn't good enough then.

Just as, somewhat nearer to home in this small world, it isn't good enough that an unnamed junior doctor ignored the security policy of Hull and East Yorkshire Hospitals NHS Trust by deciding to take home a laptop containing (you guessed it: totally unencrypted) data on more than one thousand patients, including names and treatments given. Not only was the laptop stolen, but the doctor concerned then further ignored security policy by not reporting the theft for an incredible two weeks after it occurred. In this case, the doctor went through the standard disciplinary procedures that the Trust has in place and is now returning to work, hopefully a little wiser about the need to take security seriously.

The trouble is that both cases highlight the problem facing those in the NHS who have to deal with security issues: it is becoming clear that policy alone is not enough.

You might think that the answer lies with the “culture of care” concept that I espoused in these pages just a couple of months ago when I (rather surprisingly) grasped the David Cameron nettle to suggest that we are all in this together. My argument then, in a nutshell, was that employees had to care about security policy in order to stand any chance of being bothered to follow it. Neither of the cases mentioned above would have happened if the staff in question had followed the data security policy put in place by their respective hospital bosses.

So are the bosses actually to blame? It’s an interesting question, to which the answer is a partial “yes”. The most well-meaning and watertight security policy is pretty worthless if a “can't-be-a*sed” employee can break it without even trying that hard. (And some of those employees aren’t lazy – they’re professionals who signed up to a life in healthcare to heal people, not worry about IT).

The Canadian Privacy Commissioner said after the Edmonton affair that he was “perplexed as to how we motivate people to obey and follow all these security rules and policies we have in place”, which, frankly, is just nonsense.

Perhaps he should ask an IT security consultant for some advice; any consultant worth the fee would inform him that policy has to be backed up with procedure and sanction. That means using available technology to educate, remind, record and enforce:

  • Educate staff about general security best practice as well as Trust-specific security policy;
  • Remind them of these things at regular intervals when a recorded audit trail shows they may have forgotten;
  • Enforce Trust policy, all the while, using technology on endpoint machines.
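To make the “remind” step concrete, here is a minimal sketch of the kind of check a scheduled job might run, assuming a hypothetical audit trail recording when each staff member last acknowledged the security policy (the interval, names and data shape are all invented for illustration):

```python
from datetime import date, timedelta

# Hypothetical reminder interval; a real Trust would set its own policy.
REMINDER_INTERVAL = timedelta(days=90)

def staff_due_reminder(last_acknowledged, today):
    """Return the staff whose most recent policy acknowledgement,
    according to the recorded audit trail, is older than the
    reminder interval."""
    return sorted(
        name for name, acked in last_acknowledged.items()
        if today - acked > REMINDER_INTERVAL
    )

# Example audit trail: one recent acknowledgement, one long overdue.
audit_trail = {
    "j.smith": date(2011, 1, 10),
    "a.jones": date(2010, 6, 1),
}
print(staff_due_reminder(audit_trail, date(2011, 3, 22)))  # → ['a.jones']
```

The point of keeping the check against a recorded audit trail, rather than simply mailing everyone every quarter, is that the reminder lands only on those the record shows may have forgotten.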

That might have helped in the case of both the missing back-up drive and the stolen laptop, not least if the technology part of the equation had encrypted the data on the devices. At least then there would be less of a worry about sensitive patient information being open to the prying eyes of the thief.

Of course, while that solves the bigger problem of keeping patient data private, no matter where it may end up, it doesn't address the smaller matter of the missing kit itself.

This has always seemed odd to me. After all, my local supermarket has taken to putting tags on pieces of meat to prevent people walking out of the shop without paying for them, so why can't my local hospital put tags on its portable hard drives, laptops, smartphones and the like? Millions of pounds' worth of medical and IT equipment goes missing every year from NHS hospitals, yet a simple RFID-tag-based asset tracking system would alert staff if equipment were being moved outside a specific safe zone.
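The safe-zone logic behind such a system is not complicated. Here is a minimal sketch, with the tag IDs, zone names and reader locations all invented for illustration:

```python
# Hypothetical register of which zones each tagged asset may occupy.
SAFE_ZONES = {
    "drive-0042": {"records-office", "server-room"},
    "laptop-07": {"ward-3", "doctors-mess"},
}

def check_tag_read(tag_id, reader_zone):
    """Given a tag read from an RFID reader in a known zone, return an
    alert message if the asset is outside its safe zones, else None."""
    zones = SAFE_ZONES.get(tag_id)
    if zones is None:
        return f"ALERT: unknown tag {tag_id} seen in {reader_zone}"
    if reader_zone not in zones:
        return f"ALERT: {tag_id} detected outside safe zone ({reader_zone})"
    return None

print(check_tag_read("drive-0042", "main-exit"))
# → ALERT: drive-0042 detected outside safe zone (main-exit)
```

A reader at each exit, a register like the one above, and a check like this on every tag read is all it would take to raise the alarm before the kit leaves the building.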

If my phone gets stolen, I can use a web browser to geo-locate it and then securely erase all the data on it. This isn’t science fiction, it's common-or-garden enough fact to be available to the ordinary consumer. All it needs is a little focus on the bigger issues and the security problem would start shrinking as well...