Competent healthcare providers are great at medical things, be it measuring fasting blood sugar to diagnose diabetes, swabbing the backs of our throats, or clearing plaque off our grubby molars.
Securing electronic devices or health records? Not so much.
That's the takeaway from a study by the Ponemon Institute, which surveyed 80 healthcare organisations in the US and found that 75% don't secure medical devices containing sensitive patient data, while 94% have leaked data in the last two years (mostly due to staff negligence).
Fittingly enough, the study, the Third Annual Benchmark Study on Patient Privacy & Data Security [PDF], was paid for by ID Experts, a firm that sells identity-theft protection services.
The study also found that 69% of respondent organisations don't secure FDA-approved medical devices such as insulin pumps or wireless heart pumps.
The thinking being, most likely, that securing such devices just isn't healthcare's job, the report suggests:
"This finding may reflect the possibility that they believe it is the responsibility of the vendor - not the health care provider - to protect these devices."
Indeed, concerns over medical device security recently prompted the US Government Accountability Office (GAO), with prodding from Congress, to issue a report recommending that the US Food and Drug Administration (FDA) start thinking about how to secure insulin pumps and implantable defibrillators against targeted attacks.
Meanwhile, just as medical providers move toward electronic records and health information exchanges where they can share files, the survey finds that cyberattacks against healthcare organisations are growing in frequency.
That's underlined by the frequent headlines detailing data breaches at medical facilities.
Here are just three of the healthcare breach headlines that have appeared since 30 November 2012:
- Alere Home Monitoring in Waltham, Mass. reported that more than 100,000 patients were affected by a breach involving a stolen laptop.
- University of Virginia Medical Center's Continuum Home Infusion likewise reported that almost 2,000 patients were affected by a breach stemming from a pharmacist losing a USB stick.
- For its part, Christus St John Hospital in Houston told an undisclosed number of patients who participated in the St John Sports Medicine program that an unencrypted USB drive containing sensitive information had gone missing, according to HealthcareInfoSecurity.com.
Obviously, as with many industries, human error is at the heart of most data breaches, be it stolen laptops, lost USB memory sticks, or misplaced fill-in-the-blank devices.
As it goes with rocket scientists, so too does it go with healthcare providers: Ponemon Institute's survey found that the top three causes for data breaches at healthcare providers were, in fact, lost or stolen computing devices, employee mistakes, and third-party snafus.
The price of all these breaches is increasing. Ponemon calculates that the average cost to a breached organisation will hit $2.4 million over the course of two years, up slightly from $2.2 million in 2011 and $2.1 million in 2010.
The study relies on 324 interviews with 80 healthcare organisations, including hospitals or clinics that are part of a healthcare network (46%), integrated delivery systems (36%) and standalone hospitals or clinics (18%).
The participants came from diverse departments: security, administrative, privacy, compliance, finance and clinical.
Here are more findings from the report:
- 94% of organisations had at least one data breach in the last two years. The average number for each participating organisation was four data breach incidents in the past two years.
- The average economic impact of a data breach over the past two years for the responding healthcare organisations was $2.4 million. That's up almost $400,000 since the study was first conducted in 2010.
- The average number of lost or stolen records per breach was 2,769. The types of lost or stolen patient data most often included medical files and billing and insurance records.
- 52% discovered the data breach as a result of an audit or assessment, while 47% learned of it through employees.
- More than half (54%) of organisations have little or no confidence that their organisation has the ability to detect all patient data loss or theft.
- 81% permit employees and medical staff to use their own mobile devices such as smartphones or tablets to connect to their organisation's networks or enterprise systems. However, 54% of respondents say they are not confident that these personally owned mobile devices are secure.
- 91% of hospitals surveyed are using cloud-based services, yet 47% lack confidence in the ability to keep data secure in the cloud.
- Despite recent attacks on medical devices, 69% of respondents say their organisation's IT security and/or data protection activities do not include the security of FDA-approved medical devices.
It's interesting to note that while 94% of respondents had at least one data breach in the previous two years, 45% report that they had more than five incidents. That's up from only 29% that reported birthing data breach quintuplets (sorry! Couldn't resist!) in 2010.
What's up with that? Ponemon suggests that that particular finding underlines the importance of "determining the cause of a breach and what steps need to be taken to address areas potentially vulnerable to future incidents."
Fair enough.
Translate that into English, and you'll likely arrive at a number of conclusions - including one that NASA came to after a string of data breaches of its own.
To wit: after the US space agency suffered yet another unencrypted laptop theft in November 2012, it scrambled to require full-disk encryption agency-wide.
Or perhaps, similarly, this string of healthcare data breaches will lead to rules mandating encryption on USB keys.
That's a lesson the Greater Manchester Police learned the hard way after an unencrypted USB key was stolen from an officer's home - a data breach for which the UK Information Commissioner's Office slapped them with a £150,000 fine.
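None of the fines above would have been levied if the data on those devices had been encrypted. As a rough illustration only - not a substitute for proper device-level encryption, and with placeholder file names and passphrase - here's how a sensitive file can be scrambled with OpenSSL's `enc` command before it ever touches a USB stick:

```shell
# Encrypt a patient-records export with AES-256 before copying it to removable media.
# -pbkdf2 derives the key from the passphrase with a proper key-derivation
# function (requires OpenSSL 1.1.1 or later); -salt ensures the same passphrase
# produces different ciphertext each time.
openssl enc -aes-256-cbc -pbkdf2 -salt \
    -in records.csv -out records.csv.enc \
    -pass pass:CorrectHorseBatteryStaple

# To read the file back, decrypt with the same passphrase:
openssl enc -d -aes-256-cbc -pbkdf2 \
    -in records.csv.enc -out records.csv \
    -pass pass:CorrectHorseBatteryStaple
```

(Passing the passphrase on the command line exposes it to anyone who can list running processes, so real deployments use `-pass file:` or a prompt instead.) Better still, full-disk or whole-device encryption - BitLocker, FileVault, LUKS and the like - protects everything on the drive without relying on staff to remember an extra step for each file, which is exactly the kind of human error this survey keeps turning up.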
Unfortunately, medical skills can't be delivered in a vacuum nowadays. They can't be delivered without concern for security.
Here's hoping that our very smart, very capable healthcare organisations can stem this epidemic of security fumbles.