
Study reveals privacy risks in 'trusted' health apps

Interview: Zulfikar Abbany, September 25, 2015

About 500 million people use health and wellness apps on smartphones - clearly, we want to trust such tools. But should we? Dr Kit Huckvale of Imperial College London found "unaddressed privacy risks."

Doctor using a tablet computer (Image: imago/Westend61)

DW: Given the number of people using health apps on smartphones, watches and other devices, your study has revealed a massive breach of data and privacy.

Dr Kit Huckvale: Well, we see it as a big risk, for sure. What we found was that a large number of the apps we looked at had the potential to place users' data in jeopardy. But the study wasn't designed to look at whether that had actually occurred in practice. So we haven't shown that someone has gone in and stolen users' data. We found that a lot of these apps don't take appropriate precautions to protect against that kind of thing happening.

But the privacy risk in health apps has been debated for years. And, for instance, you found that 66 percent of apps sending identifying information had no encryption, four apps transmitted both identifying and health data unencrypted, and some didn't even have a privacy policy. That really is very serious, isn't it?

It certainly is a concern. As you say, it's been known for a while that apps in general - and health apps specifically - out in the wider world may use poor security and privacy practices. But we were interested in this very specific set of apps - accredited apps - and I guess the purpose of accreditation, partly, was to address these privacy concerns, so a user could say, "It comes from an accredited source, so I don't need to worry as much about whether there are privacy issues, because the accreditation program will have taken care of that for me."

A range of apps

So people have been putting their trust in the NHS Health Apps Library, which accredits apps… Give us an idea of what kind of health apps we're talking about and whether they're apps for personal or professional use.

The apps we looked at were exclusively those intended for patient use, and they covered a range of uses: apps for people with a long-term condition, both as a source of information on how to manage that condition and as a way to record diary data - for example, a person with diabetes could log their insulin and blood glucose over time, perhaps with the intention of sharing that with their health care professional.

The researchers used a "man-in-the-middle" hacking method that allows data to be intercepted

And then a second key class of apps we looked at were those for people who are maybe otherwise well but are looking to make some kind of change in their health or lifestyle - apps helping people lose weight, so they're recording weight and food intake, or those interested in stopping smoking.

For the actual study, I understand you performed a hack - a "man-in-the-middle" hack. What is that and how does it work?

Well, in fact, we did two things. We pretended to be users - we set ourselves up on these apps and over a period of time tried to use them as a [real] user might. And then we set ourselves up in a hacker role: we looked into the devices to see what was being written onto the storage of the smartphones and tablets we were testing on, and we also set ourselves up on the network, so we could look at all the traffic coming off each device and see whether any of it related to the apps we were testing - and, if so, what data were being sent and to where.

And you're right - that put us a little in the position of a hacker, because in some cases we were able to see data that were being sent without encryption, so we could read in plain text the information we had entered into the apps. That is exactly the kind of information a general user might enter, and it could be accessible to a third party.
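For readers who want a concrete picture of what such interception involves, the sketch below shows - in the most general terms, and not as the study's actual tooling - how a simple relay sitting between an app and its server can read anything sent without encryption. The host, port and labels are placeholders.

```python
# Minimal sketch of a "man-in-the-middle" style logging relay for plain HTTP traffic.
# Illustrative only - not the researchers' setup. Host/port values are placeholders.
import socket
import threading

LISTEN_PORT = 8080             # port the test device is pointed at
UPSTREAM_HOST = "example.com"  # placeholder for the server the app talks to
UPSTREAM_PORT = 80             # plain HTTP: no TLS, so payloads are readable

def pipe(src: socket.socket, dst: socket.socket, label: str) -> None:
    """Copy bytes one way and print them - this is the 'interception' step."""
    while True:
        data = src.recv(4096)
        if not data:
            break
        print(f"[{label}] {data!r}")  # unencrypted identifying or health data appears here in clear text
        dst.sendall(data)

def handle(client: socket.socket) -> None:
    upstream = socket.create_connection((UPSTREAM_HOST, UPSTREAM_PORT))
    threading.Thread(target=pipe, args=(client, upstream, "app -> server"), daemon=True).start()
    pipe(upstream, client, "server -> app")

def main() -> None:
    listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    listener.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    listener.bind(("0.0.0.0", LISTEN_PORT))
    listener.listen()
    print(f"Relaying {LISTEN_PORT} -> {UPSTREAM_HOST}:{UPSTREAM_PORT}")
    while True:
        conn, _ = listener.accept()
        threading.Thread(target=handle, args=(conn,), daemon=True).start()

if __name__ == "__main__":
    main()
```

The point of the sketch is simply that, without encryption, anyone positioned on the network path - not just researchers in a controlled test - can read the traffic as it passes.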

Industry standards

So you intercepted encrypted data - but can encrypted data be cracked and read?

The purpose of doing that was to understand what data were being sent. In general, the level of encryption being used was an industry standard, so I think we can assume that where encryption was present, it provided an appropriate level of protection.
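As a rough illustration of what "industry standard" encryption means in practice, the sketch below - not a check from the study itself, and using a placeholder hostname - asks a server which TLS protocol version and cipher suite it negotiates.

```python
# Minimal sketch: inspect the TLS version, cipher suite and certificate a server offers.
# The hostname is a placeholder, not one of the apps examined in the study.
import socket
import ssl

HOST = "example.com"  # placeholder endpoint
PORT = 443

context = ssl.create_default_context()  # validates certificates by default
with socket.create_connection((HOST, PORT)) as tcp:
    with context.wrap_socket(tcp, server_hostname=HOST) as tls:
        print("Protocol:", tls.version())  # e.g. 'TLSv1.3'
        print("Cipher:  ", tls.cipher())   # (name, protocol, secret bits)
        cert = tls.getpeercert()
        print("Issued to:", dict(pair[0] for pair in cert["subject"]))
```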

The idea behind the NHS Health Apps Library is, in a sense, to reassure people that these apps meet UK data protection standards. Does this mean then that the NHS is not fulfilling its duty, or that the Data Protection Act in the UK - and similar acts elsewhere - are insufficient?

I'm not an expert in law, but my understanding is that data protection principles are relatively well established and understood, and I think ultimate responsibility lies with the developers of these apps; they are ultimately responsible for ensuring they comply with the law.

Do we place too much trust in our own ability to track our health - and in the apps we use to do it? (Image: picture-alliance/dpa/Maxppp Mignot)

The NHS's accreditation process asks developers to assert that they have taken the steps needed to comply with the law - that's the process we understood was in place when we were looking at these apps [August 2013 to January 2014]. And I guess our data suggest that relying on that self-declaration may not be enough, and that some more interventionalist [sic] approach may be necessary.

Warning: No privacy policy

What about the apps that had no privacy policy? In those cases, should we worry that anything could potentially be done with the data, such as its being sold to third parties?

From the perspective of the user, the lack of a privacy policy is a signal as to whether you should be placing more or less trust in a particular app. Perhaps I should clarify that when I talk about responsibility ultimately lying with the developer, I'm talking about responsibility in a legal sense. If we're talking about ethical responsibility, then accreditation programs that are making claims that they've checked the apps are also involved in that discussion, so they need to make sure their processes are up to scratch.

If a privacy policy is missing, that's a problem, because app developers are required to disclose the uses to which data will be put - so there's a problem within the legal framework. For users, that leaves us in a grey area. But the advice to users is simple: if in doubt, treat a missing privacy policy as a signal that maybe you shouldn't be using that app.

Dr Kit (Christopher) Huckvale is a qualified doctor, researcher at Imperial College London, and lead author of the study.