Digital health technologies are a growing presence in our day-to-day lives - from the step-counter in your smartphone, to online consultations with a GP, to artificial intelligence (AI) virtual patient monitoring. The term ‘digital health’ captures technology of varying complexity, all with the shared aim of engaging with and improving an individual’s health and lifestyle while increasing efficiency. Given the evolution and rising popularity of these technologies on the consumer market, we look at some important data protection considerations that are setting the tone in this new era of digital health products.

What’s so special about health data?

As health, genetic or biometric data is particularly sensitive, its misuse poses greater risks to data subjects. The GDPR therefore designates it as a ‘special category of personal data’ that must be given additional protections. Digital health technology companies need to take particular care when processing this category of data.

When trying to make their app ‘fit for purpose’, our digital health technology clients often ask us questions like:

  • How do I process health data lawfully?

  • What privacy notices and pop-up messages should my app display?

  • If my digital health app uses AI, does that impose any additional restrictions?

  • Are there any restrictions around using automated decision-making?

1. Processing health data lawfully

Organisations can only process special category data lawfully under the GDPR if:

  • They have a lawful basis for the data processing in the same way as for processing other personal data. Common examples of a lawful basis under Article 6 of the GDPR are contractual necessity and legitimate interests; and

  • They can also satisfy one of the exceptions in Article 9(2) of the GDPR. A common example of an exception is the data subject explicitly consenting to the processing of their special category data.

No link between the two is required. In other words, the choice of lawful basis under Article 6 does not affect the special category condition that applies.

Generally speaking, a data controller that provides digital health technologies to users may choose to rely on obtaining the user’s ‘explicit consent’ in order to lawfully process the special category data. Consent has a specific meaning for the purposes of the GDPR: it must be freely given, specific, informed and unambiguous, and given by a clear affirmative act.

For example, if the digital health technology involves the use of a fitness app, the data controller may require the user to check an onscreen box indicating his or her consent to the specific data processing in question. In order for this consent to be ‘informed’, the data controller must provide adequate and transparent information to the data subject, for example by displaying a written privacy notice and informing the individual of their right to withdraw consent at any time.
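As a concrete illustration, the sketch below shows one way a fitness app might record that explicit consent. It is a minimal sketch only: the names, types and structure are our own illustrative assumptions, not a prescribed or complete implementation.

```typescript
// Hypothetical sketch: capturing and recording explicit consent in a fitness app.
// All names and structures here are illustrative assumptions.

interface ConsentRecord {
  userId: string;
  purpose: string;        // the specific processing purpose consented to
  noticeVersion: string;  // the privacy notice shown at the time
  givenAt: Date;
  withdrawnAt?: Date;     // consent must be as easy to withdraw as to give
}

const consentLog: ConsentRecord[] = [];

function recordConsent(
  userId: string,
  purpose: string,
  noticeVersion: string,
  boxTicked: boolean
): ConsentRecord {
  // A pre-ticked box is not a 'clear affirmative act' under the GDPR,
  // so consent is only recorded when the user actively ticks the box.
  if (!boxTicked) {
    throw new Error("Explicit consent was not given; processing must not start.");
  }
  const record: ConsentRecord = { userId, purpose, noticeVersion, givenAt: new Date() };
  consentLog.push(record);
  return record;
}

function withdrawConsent(userId: string, purpose: string): void {
  // Withdrawal stops future processing for that purpose, while the record
  // still evidences that consent existed for earlier processing.
  for (const record of consentLog) {
    if (record.userId === userId && record.purpose === purpose && !record.withdrawnAt) {
      record.withdrawnAt = new Date();
    }
  }
}

// Usage: consent is tied to one specific purpose, not bundled with others.
recordConsent("user-123", "heart-rate analysis for personalised coaching", "v2.1", true);
withdrawConsent("user-123", "heart-rate analysis for personalised coaching");
```

The design points are that consent is recorded against one specific purpose, is only captured on a clear affirmative act, and is as easy to withdraw as it was to give.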

2. Transparency: privacy notices and information

Data controllers collecting special category health data from users of their digital health technologies face more challenges than those collecting ‘normal’ personal data. Processing personal data in a transparent manner is a key requirement to show compliance with the GDPR. However, fitness trackers and health apps often do not tell individuals exactly what, and how much, of their health data these technologies will process, or for what purposes. To address this, digital health technology companies should be as transparent as possible in the information they show to users about their processing activities.

A good starting point is a carefully drafted and publicly available privacy notice. Under the GDPR, privacy notices must provide clear, intelligible and concise information to individuals on what personal data is collected, and how that data is processed. In particular, when dealing with vulnerable adults or children, information about data processing must be especially transparent; drafting privacy notices appropriate to the level of understanding of these audiences will require special consideration.

Challenges for wearables and apps

A challenge that may arise in the context of mobile apps, wearables and fitness trackers is how information about privacy is provided to users, and how this disclosure can remain GDPR-compliant. Providing sufficient information may prove more difficult on wearables and other small-screen devices. With this in mind, digital health technology companies may need to consider alternative means of providing information, for example through easily accessible online privacy notices and appropriate linking and layering of full privacy policies.
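One common approach is ‘layering’: a short first layer that fits on the device screen, linking through to the full notice. The sketch below illustrates the idea; the names, wording and URL are hypothetical placeholders, not recommended content.

```typescript
// Hypothetical sketch of a 'layered' privacy notice for a small-screen wearable:
// a short first layer on the device, linking to the full notice elsewhere.

interface NoticeLayer {
  summary: string;       // the short text that fits on the device screen
  fullNoticeUrl: string; // link to the complete, publicly available notice
}

function firstLayerFor(dataType: string): NoticeLayer {
  // The first layer answers the immediate questions (what is collected and why)
  // and defers the detail to an easily accessible online notice.
  return {
    summary: `We collect your ${dataType} to show your activity trends. Tap for details.`,
    fullNoticeUrl: "https://example.com/privacy", // placeholder URL
  };
}

console.log(firstLayerFor("heart rate"));
```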

AI and transparency

Digital health technology companies, and in particular those that deploy “black-box” or complex AI interfaces, need to consider a number of issues in relation to transparency. Before launch, they must ensure that they are able to show users clear and detailed information about how their technologies will collect and process health data. They need to balance this against the desire to protect their trade secrets and the details of any customised AI algorithms they deploy.

3. Minimising data vs. maximising AI: striking the right balance

The GDPR principle of ‘data minimisation’ means that an organisation may only process personal data that is ‘adequate, relevant and limited’ to what is necessary for its purposes.

Digital health technology companies need to pay particular attention to this rule and consider how to reconcile it with their own AI technologies, which often collect data automatically and require large volumes of data to work most effectively.

In many ways, AI technology is still in its infancy, and adopting a restrictive approach from the outset that limits how the technology can collect and process data may not be ideal for developers. AI technology struggles to learn from minimal amounts of data, which makes it less useful to a user, and developers may in turn have less incentive to create and improve it if they don’t have the volume of data they feel they reasonably need.

The real challenge will be how to reconcile the requirements of GDPR with the realities of AI technology.
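In practice, one way to give effect to data minimisation at the point of collection is to keep, for each stated purpose, only the fields that purpose actually needs. The sketch below is a simplified illustration of that idea; the purposes and field names are invented for the example.

```typescript
// Hypothetical sketch of data minimisation at the point of collection:
// only fields on an allow-list for a stated purpose are kept; the rest are dropped.

type SensorReading = Record<string, unknown>;

// Illustrative mapping of processing purposes to the fields they actually need.
const fieldsNeededFor: Record<string, string[]> = {
  "step-counting": ["timestamp", "stepCount"],
  "sleep-analysis": ["timestamp", "heartRate", "movement"],
};

function minimise(reading: SensorReading, purpose: string): SensorReading {
  const allowed = fieldsNeededFor[purpose] ?? [];
  const kept: SensorReading = {};
  for (const field of allowed) {
    if (field in reading) kept[field] = reading[field];
  }
  return kept; // fields the purpose does not need never reach storage
}

// Usage: location and heart rate are discarded for simple step counting.
const raw = { timestamp: Date.now(), stepCount: 42, heartRate: 71, location: [53.3, -6.2] };
console.log(minimise(raw, "step-counting")); // { timestamp: ..., stepCount: 42 }
```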

4. Automated decision-making: Diagnostic decisions and beyond

Under Article 22 of the GDPR, individuals have the right not to be subject to a decision based solely on automated processing where that decision has a legal or similarly significant effect on them. In practice, this means human intervention is required at some point in the decision-making process. This will likely come into play for digital health companies if, for example, AI technology is making diagnostic decisions, or a company is basing decisions in relation to health insurance on data from a health-tracking app.

The automated decision-making rules apply more strictly to health data. A company can only rely on the Article 22 exceptions (explicit consent, contractual necessity, or specific authorisation under EU or Member State law) if, in addition, the data subject has given explicit consent for specific purposes, or the processing is necessary for reasons of substantial public interest. Both of these conditions still require measures to be in place to protect the data subject’s rights.

Since the implementation of the GDPR we have seen a general move away from a consent-based approach to permitting the processing of personal data. Despite this, explicit consent by the data subject is still likely to be the primary option for companies wishing to process health-related personal data using automated decision-making technology. This is because it may be difficult to satisfy the alternative options of contractual necessity or substantial public interest.

Any consent an individual provides must always meet the GDPR requirements. The health technology should also facilitate the exercise of data subjects’ rights: for example, it should be accurate in its decision-making, it should be non-discriminatory, and individuals should be able to challenge its decisions. We believe that the solution for most AI digital health providers may be to ensure that the AI that underpins the technology is not the only way to make significant decisions about an individual. However, providers will also need to consider the meaningfulness of the non-AI involvement. For example, have they provided employees with robust training to allow them to question the AI decision?
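As an illustration of that ‘human in the loop’ idea, the sketch below routes any AI recommendation with a significant effect to a trained reviewer who is free to overrule it. It is an illustrative outline under our own assumptions, not a description of any particular provider’s system.

```typescript
// Hypothetical sketch of keeping a human in the loop for significant decisions:
// the AI output is only a recommendation, and significant outcomes are routed
// to a trained reviewer who can overrule it.

interface AiRecommendation {
  subjectId: string;
  outcome: string;
  confidence: number;         // model confidence, 0..1
  significantEffect: boolean; // e.g. a diagnostic or insurance decision
}

interface FinalDecision {
  outcome: string;
  decidedBy: "ai" | "human-reviewer";
  reviewerNote?: string;
}

function decide(
  rec: AiRecommendation,
  humanReview: (r: AiRecommendation) => FinalDecision
): FinalDecision {
  // Article 22 concerns decisions based *solely* on automated processing, so
  // decisions with a significant effect (and, as a design choice here,
  // low-confidence ones) are never left to the model alone.
  if (rec.significantEffect || rec.confidence < 0.8) {
    return humanReview(rec);
  }
  return { outcome: rec.outcome, decidedBy: "ai" };
}

// Usage: the reviewer is free to disagree with the model's recommendation.
const decision = decide(
  { subjectId: "p-9", outcome: "refer to cardiologist", confidence: 0.62, significantEffect: true },
  (r) => ({ outcome: r.outcome, decidedBy: "human-reviewer", reviewerNote: "Agreed after reviewing ECG." })
);
console.log(decision);
```

For the intervention to be meaningful, the reviewer must have the training and authority to question the AI output, not merely rubber-stamp it.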

Final thoughts (from the humans)

The obligations imposed by the GDPR may appear daunting to proponents of digital health technology. The potential financial and reputational fallout from a technology that gathers unnecessary amounts of personal data, or that is technically insecure, could be significant for an organisation.

Despite this, it is possible to develop practical digital health technologies that meet these requirements. The best examples we see often come from active engagement with, and joint consideration of, both the technology itself and the legal principles that underpin it. If companies do this at the outset, bringing together technical, commercial and legal stakeholders, they should be well positioned to manage the complex laws and regulations governing this area, while also providing cutting-edge and revolutionary technologies that have the potential to enhance the lives of their customers.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.