Privacy and Big Data

Big data is a buzzword in healthcare and probably in every industry. Google, Samsung, and Apple are getting into healthcare with data aggregation and sharing platforms.

New use cases for low-cost sensors with extended battery life are cropping up almost daily. There are sensors in wristbands and clothes. Our phones themselves are tracking our activity and whereabouts.

Our search and browsing histories are being captured. So are our social media connections and interactions. Health searches and education happen online, so health data is being captured this way, too. And we’re documenting many or most encounters in a digital record.

With tracking and new data come privacy concerns, and with good reason. Consumer groups are actively engaged in dialogues at all levels to try to address privacy issues for individuals.

The data being collected on us, even just the small sliver of it that is relevant to health, is staggering. With programs like OpenNotes putting patients in control of their health records, the potential for abuse of that data is definitely there. Health data is valuable for many purposes: identity theft, facilitating healthcare fraud, or helping the Chinese government with cancer research.

I’m not fear-mongering; I’m just being realistic. I’m not personally concerned about big health data and my privacy, but I understand and respect why people and groups are.

The real problem is that big data and personal privacy don’t really play nice together. In a world where the data points on individuals are growing every day, we are losing some of our privacy and control over what organizations know about us. There’s little we can do about it.

The main response to this problem is putting people in control of how their data is shared. I don’t think that fixes anything. Here’s why.

Too much data. The sheer volume of data about us is crazy. Most of us are being tracked all the time. Some of it is active, like when you’re at the doctor and you get your vital signs recorded, or at a smart kiosk at a pharmacy (more of those are getting installed every day). Some of it is passive, such as the restaurants that you frequent or the apps on your phone or the searches you do on Google. All in all, it’s a lot of data, and much of it is relevant to your overall health and wellbeing.

Privacy policies people can’t understand. I covered this last week with the new study that found most mHealth apps don’t have privacy policies, and those that do are written at a college reading level. That’s just the tip of the iceberg. The lack of simple, standard privacy policies has rendered privacy notices useless because people have given up trying to read and understand them. My favorites are the 40-something-page Apple agreements I’m supposed to read on my phone. Even HHS complicates matters by stating on its site, “Any covered entity, including a hybrid entity or an affiliated covered entity, may choose to develop more than one notice…” And this is just for covered entities under HIPAA, not the myriad apps and companies that aren’t defined as covered entities under HIPAA.
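
For context on how “college level” gets measured: readability is typically scored with a formula like the Flesch-Kincaid grade level. Here’s a minimal Swift sketch of the idea. This is my own illustration, not code from the study, and the syllable counter is a deliberately crude vowel-group heuristic:

```swift
import Foundation

// Crude syllable estimate: count runs of vowels in the word.
func syllableCount(_ word: String) -> Int {
    let vowels = Set("aeiouy")
    var count = 0
    var previousWasVowel = false
    for ch in word.lowercased() {
        let isVowel = vowels.contains(ch)
        if isVowel && !previousWasVowel { count += 1 }
        previousWasVowel = isVowel
    }
    return max(count, 1)
}

// Flesch-Kincaid grade level:
// 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59
func fleschKincaidGrade(_ text: String) -> Double {
    let sentences = max(text.components(separatedBy: CharacterSet(charactersIn: ".!?"))
        .filter { !$0.trimmingCharacters(in: .whitespacesAndNewlines).isEmpty }
        .count, 1)
    let words = text.components(separatedBy: .whitespacesAndNewlines)
        .filter { !$0.isEmpty }
    let wordCount = max(words.count, 1)
    let syllables = words.reduce(0) { $0 + syllableCount($1) }
    return 0.39 * Double(wordCount) / Double(sentences)
         + 11.8 * Double(syllables) / Double(wordCount) - 15.59
}

// Paste in a privacy policy and see roughly what grade level it demands.
let policy = "The covered entity may disclose protected health information for treatment, payment, or health care operations purposes."
print(String(format: "Approximate grade level: %.1f", fleschKincaidGrade(policy)))
```

A score of 13 or higher is roughly “college level,” which is where most of these policies land.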

Granular sharing is not feasible. When we choose to put control in the hands of individuals, it’s not simply a yes/no option. It includes the type of data to be shared, the types of individuals who can see each type of data, and whether those individuals can read, write, or both. Take a look at the screenshot below. This is the standard permissions screen from Apple HealthKit. Imagine a PHR app that connects to HealthKit: it would have tons of entries, with read and write options for each. The amount of information being tracked makes granular sharing exceptionally difficult.

[Screenshot: Apple HealthKit sharing permissions screen]
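
To make the combinatorics concrete, here’s a minimal sketch in Swift of what requesting granular HealthKit access looks like. The requestAuthorization API is Apple’s real one; the handful of data types is a hypothetical subset, and a production PHR app would enumerate dozens:

```swift
import HealthKit

let healthStore = HKHealthStore()

// Every data type needs its own read and/or share (write) authorization.
// This handful is illustrative; a full PHR app would list dozens of types.
let readTypes: Set<HKObjectType> = [
    HKObjectType.quantityType(forIdentifier: .heartRate)!,
    HKObjectType.quantityType(forIdentifier: .stepCount)!,
    HKObjectType.quantityType(forIdentifier: .bloodGlucose)!,
    HKObjectType.quantityType(forIdentifier: .bodyMass)!,
]
let shareTypes: Set<HKSampleType> = [
    HKSampleType.quantityType(forIdentifier: .bloodGlucose)!,
    HKSampleType.quantityType(forIdentifier: .bodyMass)!,
]

healthStore.requestAuthorization(toShare: shareTypes, read: readTypes) { _, error in
    // The user grants or denies each type individually on the permissions
    // screen. Denied read access isn't even reported back to the app;
    // it just looks like missing data.
    if let error = error {
        print("Authorization request failed: \(error.localizedDescription)")
    }
}
```

Every type the app touches adds another row to that permissions screen, which is why per-type consent starts to collapse under its own weight once an app tracks more than a few kinds of data.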

We need to resign ourselves to the fact that we’re giving up some of our privacy with big data. That’s not necessarily a bad thing, as additional data can be used to benefit us, like targeting interventions and quickly learning from those interventions to improve outcomes over time. 

Apple’s recent change to its developer agreement is a good thing for privacy. By limiting the use of HealthKit data for marketing and resale, Apple is curbing a certain amount of abuse.

Compliance frameworks like HIPAA, while flawed in many ways, do at least expose participants (covered entities, business associates, and subcontractors) to real risk, in the form of financial penalties, for failing to secure health data or for selling it. That’s a good thing, as it helps align incentives around security and privacy.

There is also the strategy of simply trusting companies like Apple and Epic to do the right thing. That’s less appealing to me. As the ecosystem of apps and sensors producing health data grows, and as companies like Apple and Epic integrate that data, the lines of who controls it only get blurrier.

Regardless of where you stand, big data and the associated privacy concerns are going to continue to be an important dialogue. I don’t think there are great answers to these questions today and there may never be. What do you think?

Travis Good, MD/MBA, is co-founder of Catalyze.
