Transmission of health data: pitfalls in health apps & fitness trackers

Digitisation is permeating all areas of life. In the healthcare sector in particular, the eagerness to drive the digital transformation is immense, on both the public and the private side. The market is accordingly already well stocked with a variety of fitness trackers and health apps, and even health insurers promote the use of their own apps and reward the purchase of fitness bracelets with bonuses. Given the growing trend towards self-measurement, this development is no surprise. The digitalisation of the health sector has the potential to significantly improve the quality of medical care: more reliable diagnoses through artificial intelligence (AI), better coverage of rural areas through new communication channels, and a marked reduction in public expenditure through more efficient healthcare.

The key question: what should be considered when transmitting health data?

Fitness trackers only develop their full potential in conjunction with a corresponding app that visualises the collected data in a clear and appealing way. In most cases, users must first register and create a profile in order to benefit fully from the app. This profile and the collected personal data are usually transmitted to a central server, stored there and synchronised continuously. The requirements placed on such data transmission are heightened by the fact that the data being processed is regularly health data. In addition, hackers are following the development of the self-optimisation market closely: wearables have become a popular target, with payment, user and health data in their sights. From a technical point of view, data controllers must therefore ensure a higher level of data security. So what should be considered with regard to such transmission?

Legal challenges – tightened requirements within the health sector

1. Combining individual pieces of data can also create a personal reference
Unquestionably, the General Data Protection Regulation (GDPR) applies whenever data collected via a fitness bracelet or an app (user profile, IP address, etc.) is processed, since such data carries a direct personal reference. In addition, particularly sensitive health data is regularly affected: the combination of a fitness bracelet and an app can even allow ECGs to be recorded. To receive optimised training recommendations, for example, users may enter further data. It must be kept in mind that combining individual values can allow conclusions to be drawn about the state of health of the person affected. The Body Mass Index (BMI), for instance, can be derived by combining additionally entered data such as weight and height. Companies are therefore often unaware of the full extent of the health data they process. However, a complete overview of the data being processed is necessary in order to fulfil the comprehensive transparency and information requirements under Article 13 GDPR.
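To make this concrete, the following minimal sketch (with purely illustrative field names, not taken from any specific app) shows how two values that look harmless on their own combine into a derived health indicator:

```python
# Minimal illustration: two seemingly harmless profile values combine into
# a derived health indicator. Field names are illustrative only.

def bmi(weight_kg: float, height_m: float) -> float:
    """Body Mass Index: weight in kilograms divided by height in metres squared."""
    return weight_kg / (height_m ** 2)

profile = {"weight_kg": 82.0, "height_m": 1.78}  # ordinary-looking profile fields
print(round(bmi(**profile), 1))                  # -> 25.9, a health-related value
```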

2. Use the data protection impact assessment to your advantage
The blacklists published by the data protection authorities show that where sensor readings from fitness bracelets or smartphones (heart rate monitors, acceleration sensors, etc.) are stored centrally, a data protection impact assessment (DPIA) regularly has to be carried out in accordance with Article 35 GDPR. This obligation should not be seen as a chore, but as an opportunity. A DPIA helps companies evaluate which data they process and what they need to bear in mind when transmitting it in order to comply with the General Data Protection Regulation. In this way, data protection and data security challenges can be identified at an early stage of development, preventing later legal complications and penalties.

3. Know the specific legal bases for the processing of health data
Where health data is processed (and transmission is a form of processing), data controllers cannot rely on the general legal bases of Article 6 (1) GDPR alone; the processing remains inadmissible even if one of the legal bases of that article applies. The lawfulness of processing health data is governed by the stricter requirements of Article 9 (2) GDPR. If health data is collected and processed via a fitness tracker or a corresponding app, the consent of the person affected must regularly be obtained.

In general, the same requirements apply to consent under Article 9 (2) (a) GDPR as to consent under Article 6 (1) (a) GDPR. However, given the sensitivity of health data and the correspondingly narrow interpretation of exemptions, data controllers should be particularly rigorous when formulating declarations of consent and inform data subjects fully about the intended processing. In particular, the fact that the data does not remain on the end device but is transmitted to a central server of the company should be highlighted and explained comprehensively.

In addition, if companies intend to process personal data for a purpose that is not necessary for providing the actual service, the so-called prohibition of coupling must be observed. If the data is to be further processed for marketing purposes, for example, that consent may not be bundled into the central consent form; it must be obtained separately. The same applies to transfers to third parties, which are common in the field of health apps.

Checklist Consent

  • voluntary
  • given for a specific purpose (blanket consent is inadmissible)
  • given in an informed and unambiguous manner
  • presented in an intelligible and easily accessible form
  • expressed in clear and plain language

4. Choose conscientious data processors
The server operator and the provider of the app or fitness tracker are often not identical. Server operation is frequently outsourced to a specialised service provider, who is then able to access the personal data. Such a service provider regularly qualifies as a data processor, which makes the conclusion of a data processing agreement necessary. In accordance with Article 28 (1) and (3) (c) GDPR, and in order to avoid reportable data breaches, data controllers must ensure by contract that the service provider also adopts adequate, state-of-the-art safeguards (technical and organisational measures) when processing the data. This applies all the more in the context of highly sensitive health data.

Because of existing or advantageous infrastructures, companies often turn to providers from abroad such as Amazon Web Services. In doing so, data controllers must ask the following questions: has the European Commission issued an adequacy decision confirming that the respective country provides an adequate level of data protection? Has the provider been certified under the EU-US Privacy Shield? If not, data controllers and their processors must put in place appropriate safeguards under Article 46 (1) GDPR, such as standard contractual clauses.

The security of health data plays a key role

Data security is of particular importance when health data is processed. Hackers show increasing interest in wearables and health apps and, unfortunately, providers counter this with appropriate safeguards only in the rarest of cases. This deficiency becomes critical when an app is used by hospitals, for example to monitor patients. In this context, the continued availability, confidentiality and integrity of health data are paramount: compromising a single connection or manipulating individual measurements can lead to harmful or even life-threatening misdiagnoses.

In addition, data controllers should not only protect the connection between the end device and the server against so-called “man-in-the-middle” attacks, but also pay attention to the transfer between the wearable and the app, as this link is also a popular target for hackers. Transport encryption (TLS, formerly SSL) should therefore be flanked by default with protection of the transmitted data itself, for example through payload encryption or integrity checks based on keyed hashes.
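As a rough illustration, here is a minimal sketch in Python; the endpoint URL, the header name and the key handling are assumptions for illustration only, not a description of any particular product:

```python
# Hedged sketch: TLS transport encryption plus a keyed-hash integrity tag on
# the payload itself. The endpoint and the shared key are placeholders.
import hashlib
import hmac
import json

import requests  # third-party HTTP client; verifies TLS certificates by default

SHARED_KEY = b"key-provisioned-to-this-device"  # placeholder, not a real key

def send_measurement(measurement: dict) -> None:
    body = json.dumps(measurement, sort_keys=True).encode("utf-8")
    tag = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()  # keyed hash over the payload
    requests.post(
        "https://api.example-tracker.invalid/v1/measurements",  # hypothetical endpoint
        data=body,
        headers={"Content-Type": "application/json", "X-Integrity-Tag": tag},
        timeout=10,
    )

send_measurement({"user": "u-123", "heart_rate_bpm": 72, "ts": "2019-07-15T10:00:00Z"})
```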

From a technical point of view in particular, this is an opportunity to stand out from the competition.

Reliable anonymisation of health data remains difficult

Whenever possible, personal data should be anonymised. Anonymised data falls outside the scope of the GDPR, which significantly reduces the administrative effort involved in using it.

With regard to health data, however, successful anonymisation within the meaning of the GDPR (making re-identification impossible) seems doubtful. A 2013 study, for example, showed that four to five blood glucose or cholesterol readings were enough to unambiguously identify individuals within a population of around 60,000 patients.

At the very least, therefore, the pseudonymisation of personal data should be pursued.
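A minimal sketch of what such pseudonymisation could look like, assuming a keyed hash and a secret key kept apart from the data set (both are illustrative choices, not a prescribed method):

```python
# Hedged sketch of keyed-hash pseudonymisation. Whoever holds the key can
# re-create the pseudonym for a known identifier, so this is pseudonymisation,
# not anonymisation. Key handling and identifier format are assumptions.
import hashlib
import hmac

PSEUDONYM_KEY = b"store-this-key-separately-from-the-data"  # placeholder

def pseudonymise(patient_id: str) -> str:
    return hmac.new(PSEUDONYM_KEY, patient_id.encode("utf-8"), hashlib.sha256).hexdigest()

record = {"patient_id": "DE-4711", "cholesterol_mg_dl": 212}
record["patient_id"] = pseudonymise(record["patient_id"])  # same input -> same pseudonym
print(record)
```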

Recommendation for action and Conclusion

The data protection and technical challenges associated with the processing of health data are considerable. Companies must therefore take their obligations seriously in order to survive in the market for fitness trackers and health apps in the long term.

To avoid data breaches and life-threatening incidents, data controllers should first of all ensure particularly secure connections between tracker, app or smartphone and the server. Transfers to third parties should be avoided where possible; ideally, companies operate their own servers. As far as financing is concerned, it can make sense to dispense with advertising and instead develop paid apps that stand out through particularly high levels of data security.

Handling enquiries from data subjects – what is really relevant?

The proper handling of enquiries from data subjects benefits every business. A practised approach not only ensures compliance but can also streamline and accelerate the underlying business processes. Responding to data subject requests should therefore be a fixed component of a good data protection management system within the organisation. But how does one deal with data subjects' enquiries properly, and what does the GDPR require?

Recognise an enquiry from a person affected

First of all, a request from a data subject must be recognised as such. This may sound simple, but it is not always so! After all, most data subjects are legal laypersons who are unlikely to use the term “data subject” or to name the specific right they seek to assert. Nor do they have to: it is up to the data controller to work out what the data subject actually wants. If a message contains phrases like the following, one should assume that it is a data subject's request and, as a precaution, treat it as such:

  • “I have a question relating data protection” or “I want to know something about how my data is being handled”
  • Terms like information, disclosure, blocking, limitation of processing, deletion, right to be forgotten or objection are used
  • “Where did you get my data from?”
  • The person objects to unauthorised, unwanted advertising or spam
  • “Please unsubscribe from the newsletter”
  • “Please correct the following information about me”
  • The person threatens to involve a lawyer, issue a warning or notify the data protection authority
  • The person asks to speak to the data protection officer

These phrases may indicate the different types of request, each of which corresponds to one of the data subject rights under the GDPR:

  • The right to information (right of access)
  • The right to erasure
  • The right to restriction of processing
  • The right to rectification
  • The right to data portability
  • The right to object and to withdraw previously granted consent

The Response – what should be considered?

Requests may be made by any means, for example orally, electronically, by telephone or in writing. The response, however, should always be given in writing or, where appropriate, electronically. Electronic responses should be handled with care, as data security during transmission must be ensured. In addition, as required by law, the answer must be precise, transparent, comprehensible and easily accessible to the person concerned. We also advise documenting the entire process and all communication in full!

Furthermore, there is of course a deadline to observe. The answer must be given without undue delay and in any event within one month of receipt of the request. An exceptional extension to a total of three months is possible, but it is better not to rely on it, as it is only permissible in rare cases.

It should also be noted that the data subject must always be informed if their request cannot be met, for example if the person requests the deletion of their data but statutory retention requirements preclude it.

If the request reaches a processor, the processor does not have to answer it but should forward it to the data controller instead. A data processing agreement (DPA) should precisely regulate this part of the commissioned processing.

The Response – the Process

What is the best way to proceed once a request has been received? We recommend the following approximate sequence:

  1. Forward the request to the data protection coordinator of the company
  2. Specify the deadline: note the time of receipt of the request
  3. Send acknowledgment of receipt to the person concerned
  4. Research: search the data sources for the data subject's information
  5. Identification: Data protection coordinator identifies data subject based on data sources
  6. Data protection officer can be contacted at any time
  7. Change of database according to the type of request
  8. Send a reply via data protection coordinator
  9. Documentation of the process by data protection officer

The fourth point in particular causes difficulties in practice, because the data on the requesting person must first be located. A well-maintained record of processing activities (in accordance with Article 30 GDPR) that clearly lists where data is held is enormously helpful here.
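To keep the deadline and the processing steps visible, an incoming request can be tracked in a simple record such as the following sketch (the field names and the simplified month arithmetic are assumptions, not a prescribed format):

```python
# Hedged sketch of a tracking record for an incoming data subject request.
# Field names and the simplified month arithmetic are illustrative assumptions.
from dataclasses import dataclass
from datetime import date, timedelta

def one_month_later(received: date) -> date:
    """Crude approximation of the one-month deadline, for illustration only."""
    return received + timedelta(days=30)

@dataclass
class SubjectRequest:
    request_type: str      # e.g. "access", "erasure", "rectification"
    received_on: date
    acknowledged: bool = False
    completed: bool = False

    @property
    def response_deadline(self) -> date:
        return one_month_later(self.received_on)

req = SubjectRequest(request_type="access", received_on=date(2019, 7, 15))
print(req.response_deadline)  # 2019-08-14 under the simplified calculation
```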

The consequences of the request

The consequences of the data subject’s request depend on the specific type of query:

Right to information (right of access): If the data subject requests access, he or she must be informed about the stored data and provided with the information to the legally defined extent. The person specifically has a right to information about the purposes of processing (e.g. e-mail marketing), the categories of personal data being processed (e.g. contact details), the recipients of the data (especially outside the EU) and the retention period of the personal data.

Right to rectification: This includes the right to have incorrect data concerning a person corrected and incomplete data supplemented.

Right to restriction of processing: Here the data controller must ensure that the stored personal data is no longer processed beyond mere storage.

Right to erasure (“right to be forgotten”): If the deletion of personal data is requested, the consequence is self-explanatory: the relevant data must be deleted. In some cases, however, anonymisation may suffice, provided that re-identification of the person is technically and organisationally excluded.

Right to data portability: If a person asserts the right to data portability, he or she must receive the data from the data controller in a structured, commonly used and machine-readable format, provided the processing is carried out by automated means and is based on consent or is necessary for the performance of a contract. Where technically feasible, the data can also be transmitted directly to another data controller.
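What “structured, commonly used and machine-readable” can look like in practice is shown by this hedged sketch of a JSON export (the field names and categories are illustrative assumptions):

```python
# Hedged sketch of a portability export in a structured, commonly used,
# machine-readable format (JSON). Field names and categories are illustrative.
import json
from datetime import datetime, timezone

def export_for_portability(subject_id, records):
    """Bundle the subject's records into one JSON document for hand-over."""
    package = {
        "subject_id": subject_id,
        "exported_at": datetime.now(timezone.utc).isoformat(),
        "records": records,
    }
    return json.dumps(package, indent=2, ensure_ascii=False)

print(export_for_portability(
    "u-123",
    [
        {"category": "contact", "email": "jane.doe@example.com"},
        {"category": "fitness", "steps_on_2019_07_14": 10432},
    ],
))
```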

Right to object: An objection renders data processing that was in itself permissible inadmissible for the future. The right may be exercised against processing based on public or legitimate interests. It should be noted that consent, too, can be withdrawn.

Conclusion: Ensure compliance with structure and routine!

In fact, it is not that difficult to deal with such enquiries properly. Being familiar with the response procedure and with what each type of request requires is half the battle.

However, the devil is in the details!

It always depends on the individual case and on how the answers are structured in terms of language and content. It may even be necessary to deviate from the usual procedure. For this reason, we recommend entrusting an experienced data protection expert with integrating data subjects' requests into the company's data protection management system and with establishing a proper and complete record of processing activities.

Please do not hesitate to contact the IT and legal experts of ISiCO Datenschutz GmbH! We can show you how to deal with data subjects' enquiries safely and correctly and help you create your own record of processing activities. We also train your employees in the proper handling of data subjects' rights and would be happy to take on the role of data protection officer in your company!

The biggest GDPR myths: consent – what is right and what is wrong?

GDPR myth busters – Part 1

The GDPR has been effective since 25 May 2018, and both before and after that date there was a great deal of commotion. Hardly any other topic has been discussed and written about as much. Unfortunately, the numerous publications with supposedly “best” references and recommendations have not left data controllers with a clear understanding of what it is all about; on the contrary, there is still widespread confusion about implementing the GDPR. The best example: the flood of newsletter e-mails before and after 25 May 2018 asking recipients to give their consent again (or for the first time), or simply announcing that newsletters would no longer be sent for lack of consent. What is true and what is false? Lots of questions, but few satisfying answers. With this series, we want to provide the answers: What myths surround the GDPR? What is correct, and what can be refuted?

The biggest GDPR myths about consent:

In data protection law, the basic principle is that the processing of personal data is prohibited unless the law or the data subject permits it. The data subject's permission is called consent, and it is a linchpin of the GDPR. The law places strict requirements on consent. But what exactly does it involve, and is everything you hear about it true? Nine misconceptions about consent under the GDPR:

1. Consent must always be obtained in writing!

When it comes to the form of consent, the GDPR accommodates the data controller. Section 4a of the former German Federal Data Protection Act (FDPA), for example, required consent to the processing of personal data to be given in writing. Under the GDPR, written form is no longer necessary: a statement or other clear affirmative action suffices, so in theory even a nod would do. Only in theory, however, because data controllers must be able to prove that consent was given effectively (obligation of documentation and proof), which is difficult with nothing but a nod. Actively ticking a check box (opt-in), by contrast, is sufficient.

Caution: A check box that is already ticked and has to be unticked (opt-out) does not constitute effective consent.

2. Consents obtained before the GDPR became applicable are invalid!

That is not correct. The European legislator decided, in favour of controllers, that consents obtained before 25 May 2018 remain in full force and effect. Consent that was once effectively obtained does not need to be obtained again, as long as it also complies with the provisions of the GDPR. Since the previous requirements, in Germany for example, are very similar to those of the GDPR, a renewed request for consent will not be necessary in most cases.

Caution: There are special requirements for the consent of minors.

3. Minors cannot give effective consent!

It is true that the GDPR sets rigid age limits for effective consent. It stipulates that minors can only effectively consent to the processing of their personal data from the age of 16. Member States may lower this age limit, down to an absolute minimum of 13 years. For the processing of personal data of persons under 16, data controllers need the approval of the person's legal representative.

4. Consents can also be obtained later!

The myth that the timing of obtaining consent does not matter is untrue. “Consent” is a legal term and means prior agreement. Its counterpart is “authorisation”, i.e. subsequent approval, which does not legitimise the processing of personal data.

5. For consent, the double opt-in procedure is mandatory!

First, one thing is certain: opt-out is obsolete! Consent designed as an opt-in (ticking a check box) must be granted actively. Whether the so-called double opt-in is mandatory for granting consent is questionable. In the double opt-in procedure, the tick is set first; in addition, a link provided, for example when subscribing to a newsletter, must be clicked to confirm the consent. The advantage of this procedure is that it simplifies proof that consent was granted, which the data controller needs in order to meet its obligations of documentation and proof. If this proof can also be provided with a single opt-in, that is enough, so a double opt-in does not necessarily have to be carried out. That said, the double opt-in offers more protection against sanctions, and two is better than one anyway!
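A hedged sketch of how such a double opt-in could be wired up; the in-memory storage, the mail dispatch and the confirmation URL are assumptions for illustration, not a reference implementation:

```python
# Hedged sketch of a double opt-in flow: the address only counts as consented
# once the confirmation token from the e-mail link has been redeemed.
# Storage, mail dispatch and the link layout are illustrative assumptions.
import secrets
from datetime import datetime, timezone

PENDING = {}    # token -> e-mail address awaiting confirmation
CONFIRMED = {}  # e-mail address -> proof of consent (documentation)

def register_opt_in(email: str) -> str:
    token = secrets.token_urlsafe(32)
    PENDING[token] = email
    # A real system would now send an e-mail containing a link such as
    # https://newsletter.example.invalid/confirm?token=<token>  (hypothetical)
    return token

def confirm(token: str) -> bool:
    email = PENDING.pop(token, None)
    if email is None:
        return False  # unknown or already used token
    CONFIRMED[email] = {"confirmed_at": datetime.now(timezone.utc).isoformat()}
    return True

token = register_opt_in("jane.doe@example.com")
assert confirm(token)  # consent is documented only after confirmation
print(CONFIRMED)
```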

6. One consent for all data processing is sufficient!

The idea that one blanket consent legitimises unlimited data processing is grossly wrong. Consent is governed by the principles of voluntariness and purpose limitation. The person affected must be informed precisely about the purpose for which his or her data will be used, and can then decide which processing operations to agree to.

Caution: Consent is not voluntary if the provision of a service is made dependent on consent to processing that is not necessary for providing that service (so-called prohibition of coupling).

Example: A customer orders goods in an online shop. During the ordering process, the customer is informed that the data entered (e.g. e-mail address, postal address, telephone number) may also be used for advertising purposes, and must consent to this use in order to continue the ordering process. Since that consent is not needed for the provision of the service (shipping the product), the prohibition of coupling applies.

7. The withdrawal of consent must work exactly like the granting!

Consent, once given, must be revocable with effect for the future. A new element in this context is Art. 7 (3) sentence 4 GDPR: withdrawing consent must be as easy as giving it. However, this does not mean that the withdrawal procedure must be exactly the same as the one used for granting consent. In online commerce in particular, consent is usually given once via opt-in before the data processing begins, and there is no way to remove the tick later unless a customer account exists. The keyword is simplicity: as long as there is an easy way to withdraw consent, that is sufficient. For newsletters, for example, this can be achieved by placing an unsubscribe link at the end of each e-mail or an opt-out option in the privacy policy.

Caution: The possibility of withdrawal must be pointed out clearly and in an easily readable form.

8. I cannot process any data without consent!

If there is no consent and no permission under Art. 6 (1) (b) or (c) GDPR, data processing can, under certain conditions, still be based on legitimate business interests, insofar as these are not overridden by the rights and interests of the data subject (Article 6 (1) (f) GDPR). Data processing is therefore possible even without consent. According to the recitals of the GDPR, economic interests and, in particular, direct marketing are expressly recognised as legitimate corporate interests. However, companies must actually carry out a balancing of interests, and the balance must come out in their favour.

9. I need permission for every cookie!

The subject of cookies and consent remains confusing under the new data protection law. What is clear, in any case, is that cookies regularly involve personal data and must therefore be measured against the GDPR. This means that the use of cookies is generally prohibited unless the data subject has consented or one of the other legal grounds of Article 6 GDPR applies. As described above, data processing can be permitted under Art. 6 (1) (f) GDPR if there is a legitimate interest, which includes, among other things, commercial corporate interests (e.g. advertising). It must always be weighed up whether the interest in the data processing outweighs the data subject's interest in the protection of his or her data.

There are, however, good reasons why the setting of cookies for advertising purposes can be based on an overriding interest under Art. 6 (1) (f) GDPR, as long as the data is collected in pseudonymous form. Pseudonymisation would take sufficient account of the users' legitimate interests, which effectively brings back the opt-out option. Even in this constellation, however, it is advisable to use a cookie banner informing website users about the setting of cookies and their purposes.

Caution: The person affected must always be given the opportunity to object and must be clearly informed about this opportunity.

Digitalisation in healthcare and data protection – can the two get along?

(Including references to German legislation in particular, but not exclusively)

German State Minister for Digitisation Dorothee Bär gave an interview to the German newspaper Welt am Sonntag on 23 December 2018, in which she explained that digitisation in the healthcare sector could be achieved, among other things, by easing data protection regulations.

“In Germany we have the strictest data protection laws worldwide and the highest privacy requirements. This blocks many developments in the health sector, so we have to disarm at one point or another, delete some rules and loosen others” (see interview). In particular, she had in mind the planned introduction of the electronic health record by 2021.

Regardless of whether there actually is a causal connection between data protection and slow digitisation, the question is whether the constraints of the data protection rules in Germany can simply be “eased”, or even need to be!

The General Data Protection Regulation regulates the processing of health data in various places and should therefore be taken into account in all matters of digitisation in the health sector. This article is intended to provide an overview of the GDPR's data protection requirements in the health sector and to highlight some areas in which the Member States may adopt their own national rules.

Background: Regulations on health care within the GDPR

The GDPR does not regulate the health system explicitly. Originally, an Article 81 “Processing of personal data for health purposes” was envisaged, but it was dropped in the course of the negotiations on the final version. Nevertheless, the GDPR contains provisions that specifically relate to health data and thus have a significant impact on digitisation within the healthcare sector.

The processing of personal data is generally prohibited under the GDPR but permitted, for example, under Article 6 (1) (a) if the person affected consents to it. Among the other grounds for permission, data processing is also lawful if a balancing of interests comes out in favour of the processing (for example, a company's legitimate interests). For health data, however, Article 9 GDPR provides special rules. Health data is considered particularly worthy of protection under Article 9 because it is particularly sensitive, and its processing is therefore only permissible under the strict requirements of Article 9 (2) GDPR in conjunction with Article 6 GDPR.

The scope of application of Article 9 (2) GDPR

Businesses are urgently advised to monitor the legislation of the EU and the Member States closely, as Article 9 (2) (a) allows the respective legislators to stipulate that, in certain areas, the prohibition on processing health data cannot be lifted by the data subject's consent. If legislators make use of this option, the processing of certain health data would be unlawful despite consent, since Article 9 GDPR does not provide a contractual basis for processing as an alternative.

Unless and until such a national rule exists, however, consent that meets the requirements of Article 9 GDPR can serve as the legal basis for the processing of health data. Such consent becomes relevant especially in connection with the use of health apps.

Health Apps

An important use case for the processing of health data in the course of digitisation are health apps, such as heart rate monitors or sports trackers. First of all, Article 9 (2) (a) GDPR allows personal data to be processed on the basis of the data subject's consent. The extensive information and transparency obligations of the GDPR must be observed: the person affected must know at all times what happens to his or her data. Without their permission, health data may not be disclosed to third parties (such as insurance companies).

Article 9 (2) GDPR contains a number of other permissive rules that health apps can draw on. One example is the protection of vital interests where the person affected is no longer able to give consent (for example, when an app reports a heart attack and the person has to be located). For this permissive rule to apply, however, there must be an actual and specific threat to a vital interest; the permanent tracking of data that might (eventually) become relevant to a vital interest is not covered. Likewise, data manifestly made public by the data subject beforehand (such as posting a lower resting pulse achieved through regular training) falls under a permissive rule.

In addition, the protection of individual and public health can serve as a legal basis. Where a person's health requires certain illnesses and symptoms to be treated professionally, this may also justify falling back on the data collected by the app. Public health mainly covers the spread of health threats, such as epidemics, which an app can help track. These justification grounds of individual and public health for operating health apps lead to the related question of whether the electronic medical record is admissible under the GDPR. In any case, the GDPR provides sufficient legitimacy for such apps, so a loosening of German national data protection regulations does not seem necessary in this respect.

The Electronic Medical Record

Both Article 9 (2) (h) (individual health) and Article 9 (2) (i) (public health) enable Member States to determine when the processing of health data is necessary for these two purposes. These so-called opening clauses give Member States considerable flexibility to adopt their own national regulations without violating the GDPR. In this context, corresponding provisions could be introduced in the German Social Security Code V (SGB V) allowing an electronic patient record not only in the format of a health card but also as an app on the smartphone. Although the opening clauses of the GDPR and the SGB V thus make it possible to introduce the electronic patient record, its use in an individual case must always depend on the patient's consent (or on one of the other permission grounds of Article 9 (2) GDPR). Insofar as State Minister for Digitisation Bär has spoken of relaxing data protection standards in order to introduce and use the electronic medical record, there are considerable doubts as to whether using the patient record without the patient's consent and outside the provisions of Article 9 (2) GDPR would be compatible with the GDPR.

The current reform proposal for Section 305 (1) SGB V expressly stipulates such a consent requirement for the use of the electronic patient record. However, it remains uncertain whether “providers of electronic medical records” could undermine the consent requirement under the proposed version, since the current wording of the reform proposal allows such providers easier access. This would probably amount to a violation of the GDPR's consent requirement.

Big Data

Big data analyses of health data are used in a variety of applications. Such analyses can, for example, help prevent the spread of diseases and optimise the needs-based supply of medicines and medical devices. For both reasons, they may be based on Article 9 (2) (i) GDPR (protection of public health and ensuring the quality of medicinal products and medical devices). However, data subject rights, such as the right of withdrawal, which is usually difficult to implement for big data analyses, must always be considered. To meet these challenges, the principles of pseudonymisation and anonymisation must be taken into account. This also applies if the Member States decide to make use of the opening clauses that are likewise available in this context.

Conclusion and Recommendation for Action

Under the GDPR, health data is particularly sensitive and therefore particularly worthy of protection. Regardless of whether such data is processed on the basis of national law or of the GDPR itself, a legal basis for the processing must always be ensured. If the legal requirements are met in each individual case and their conformity with the GDPR has been verified, the processing of health data in new digital applications is admissible.

Artificial Intelligence and Data Privacy – application scenarios

The possible fields of application for Artificial Intelligence (AI) are diverse. The General Data Protection Regulation (GDPR) is often cited as a showstopper; with the following article, we want to demonstrate that this is not the case. To this end, we look at very specific fields of application for AI that are already a reality in numerous companies and institutions, “despite” the GDPR. The examples illustrate a selection of data protection challenges that must be overcome in order to deploy the respective AI with confidence.

Chatbots and digital assistants as a first point of contact – the appropriate legal basis without risking a conversion killer

The trend in customer service is moving towards digital assistants that recognise users' problems and needs in advance (so-called “predictive analytics”) and proactively propose appropriate solutions. Prominent examples of digital assistants are Alexa (Amazon), Siri (Apple) and Google Assistant (Google).

For a chatbot to take the step from so-called machine learning to deep learning, large amounts of user data must first be collected, structured and prepared for use by an AI. Companies regularly collect the required data via the input field of the chat window. The immediate linking of the user's input with the IP address of his or her end device, however, creates a personal reference that brings the processing within the scope of the General Data Protection Regulation (GDPR). The processing of personal data in turn requires a legal basis. For chatbots and digital assistants, declarations of consent are conceivable, but their implementation involves considerable effort: before the chatbot can actually provide any assistance, it must first obtain the consent of the requesting party, and a person seeking help does not want to wade through data protection information before getting a solution to his or her problem.
Since the chatbot regularly only becomes active on the initiative of the person seeking advice, companies may, with the right structuring, rely on other legal bases instead. The chatbot can then begin its actual work immediately, because the advice seeker does not have to be comprehensively informed up front. In the case of chatbots, data controllers usually fulfil their transparency and information obligations through the privacy policy and a corresponding notice.

More reliable diagnoses in the health sector – take protective measures at an early stage

In the health sector especially, Artificial Intelligence has the potential to enable significant progress, if not breakthroughs. In the field of imaging techniques, AI can already be used today to greatly simplify and complement the work of physicians.
The basic requirement for the use of AI is always that the information to be analysed is already digitised: CT scans, electrocardiograms, close-up images. In healthcare in particular, this basic requirement is met in many areas.
The challenge, however, is that health data, which is particularly sensitive, is being processed. First, large amounts of reference data are needed to train the AI. Then, in order to detect irregularities and make a diagnosis, the AI must match the patient's information (e.g. in the form of a CT scan) against the reference data. In both cases personal data is involved, since the data comes directly from a natural person.

For the processing of health data, the GDPR imposes special restrictions and requires appropriate safeguards. The focus should be on pseudonymisation or, ideally, anonymisation of the data, since anonymised data falls outside the scope of the GDPR. Where anonymisation is not possible, institutions and companies should concentrate on recognised pseudonymisation techniques, as the GDPR fundamentally rewards any kind of risk-minimising measure. For long-term studies or similar projects, the additional use of a “trust centre” is recommended; in that case, the pseudonymised data can only be re-assigned to individuals by involving the trust centre, i.e. a third party that securely stores the assignment rule.
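A minimal sketch of this trust-centre separation, assuming a simple in-memory mapping purely for illustration (a real trust centre would of course be an organisationally and technically separate party):

```python
# Hedged sketch of the trust-centre idea: the study data only carries a random
# pseudonym, while the assignment rule is held by a separate party.
# Class name, storage and identifier format are illustrative assumptions.
import secrets

class TrustCentre:
    """Separate party that alone can map pseudonyms back to identities."""

    def __init__(self):
        self._assignments = {}  # pseudonym -> real patient identifier

    def pseudonym_for(self, patient_id: str) -> str:
        pseudonym = secrets.token_hex(16)
        self._assignments[pseudonym] = patient_id
        return pseudonym

    def reidentify(self, pseudonym: str):
        return self._assignments.get(pseudonym)

trust_centre = TrustCentre()
study_record = {
    "subject": trust_centre.pseudonym_for("patient-4711"),  # no direct identifier
    "finding": "no irregularities detected in CT scan",
}
print(study_record)
```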
Companies and institutions should lay the technical foundations for the protection of personal data, so-called “Privacy by Design”, as early as the development stage. A data protection impact assessment (DPIA) can be helpful here. Because of the high risks regularly associated with the processing of health data, the obligation to carry out a DPIA applies particularly often in the health sector. With the help of a DPIA, specific risks can be identified and targeted protective measures taken to ensure the security of the data.
Apart from that, data controllers should above all comply with the extensive transparency obligations of the GDPR whenever AI is applied.

More cost-effective results through automated decision-making – use exemptions

Another field of application for Artificial Intelligence is automated decision-making. This form of data processing already affects our lives today, especially in bank lending. It is problematic, though, that in many cases the decisions of self-learning algorithms cannot be fully understood because of their black-box character.
One possible consequence of an opaque, fully automated case-by-case decision is the exclusion or even discrimination of certain persons or groups. To protect data subjects against such consequences, the GDPR requires that a decision with legal effect may not be made by an algorithm alone; rather, a person must be interposed who ultimately makes the decision. The result of the so-called ADM system (ADM = Algorithmic Decision Making) may only be used as an aid to decision-making.
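As a hedged sketch of what “aid to decision-making” can look like in code (all names and the scoring stub are illustrative assumptions, not a description of any real system):

```python
# Hedged sketch of a "human in the loop": the model output is only a
# recommendation, and a person takes the final decision with legal effect.
# The scoring function, threshold and names are illustrative assumptions.

def model_score(applicant: dict) -> float:
    """Stand-in for the ADM system's output, e.g. a credit score."""
    return 0.42  # dummy value for illustration

def recommend(applicant: dict) -> str:
    return "approve" if model_score(applicant) >= 0.5 else "decline"

def decide(applicant: dict, reviewer) -> str:
    recommendation = recommend(applicant)
    # The algorithm never decides alone: the reviewer confirms or overrides.
    return reviewer(applicant, recommendation)

def case_handler(applicant: dict, recommendation: str) -> str:
    print(f"Recommendation for {applicant['name']}: {recommendation}")
    return recommendation  # in practice: the case handler's own judgement

print(decide({"name": "Jane Doe"}, case_handler))
```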
Companies that do not want to give up the cost savings, and in some cases even more neutral decisions, still have ways to use AI without a “human in the loop”: once again, it is conceivable to obtain the consent of the data subject, with all the disadvantages that entails.

Its use is considerably simpler, however, if the data processing is necessary for entering into or performing a contract between the data subject and the data controller. Where automated decision-making is contractually agreed, its use is likewise deemed necessary. The same applies to compliance with a legal obligation: in Germany, for example, provisions of the German Banking Act and the German Civil Code require a creditworthiness check for (real estate) consumer loan agreements, based on information such as income, expenses and other relevant circumstances.

A fully automated credit check should also be qualified as necessary for all other contracts where the solvency of the counterparty is a decisive criterion. Ultimately, it always depends on the specific case.

AI and data privacy are not mutually exclusive

The GDPR often gives data controllers a choice of legal bases on which to base their processing activities, and the advantages and disadvantages of each should be weighed carefully. Any obligation to carry out a data protection impact assessment, as is often the case in the health sector, should not be seen as a necessary evil but as an opportunity.
The examples above show that the GDPR and its numerous requirements on the handling of personal data by no means prevent the use of Artificial Intelligence. Anyone who thinks early on about how a concrete AI implementation can be brought into line with the GDPR, ideally during the development phase, will also benefit in the long term: unnecessary follow-up costs for later adaptations are avoided, and the risk of reputational damage due to inadequate data security or a data protection incident can be effectively contained.

Do you need advice on Artificial Intelligence and data protection? Please do not hesitate to contact us!

Implementation of the GDPR: current state of legislation in the EU Member States (UPDATE)

Update on July 15, 2019: Our overview is updated continuously. New entries for Estonia and Italy.

The General Data Protection Regulation (GDPR) has been effective since 25 May 2018, and the companies affected have had to adapt to far-reaching changes throughout the EU. In 1995, the European Community adopted Directive 95/46/EC, which has since served as the benchmark for data protection in the EU Member States. The Directive had to be transposed into national law. By contrast, the GDPR, as a regulation, is directly applicable in the Member States and takes precedence over national law (see Article 288 (2) TFEU). In principle, the GDPR is therefore the most important source for resolving data protection issues. However, it contains a large number of opening clauses granting the Member States their own regulatory powers.

Germany was the first state to pass a corresponding GDPR adaptation law, which has also applied since 25 May 2018 and replaces the previous Federal Data Protection Act (FDPA). As many companies are uncertain about the current legal status of data protection rules in other European countries, we have compiled an overview showing the state of legislation in most EU Member States. We update this overview regularly.

Belgium

In Belgium, the GDPR is primarily applied directly. However, a law has been passed to transform the former “Commission for the Protection of Privacy” into a supervisory authority that meets the requirements of the GDPR, accompanied by an increase in its budget.

Official title: „Loi du 3 décembre 2017 portant création de l’Autorité de protection des données“.

Current status of the legislative procedure: Concluded on 10 January 2018

Source text: link

Further information (in national language): link

 

Bulgaria

In Bulgaria, the draft of an implementation law is still in the public debate. The slow implementation process is criticized nationally. According to a statement by the Bulgarian Industry Federation: “The lack of information on the forthcoming legislative changes creates the conditions for disseminating misinformation from various sources and prevents companies from adequately preparing themselves.” (Source: СЕГА)

Title of the act to implement the GDPR: „ЗАКОНА ЗА ЗАЩИТА НА ЛИЧНИТЕ ДАННИ“.

Current status of the legislative procedure: In public debate since 14 May 2018

Source text: link

Recitals to the draft: link

Further information (in national language): link

 

Denmark

In Denmark, the GDPR is supplemented by Denmark's own data protection act, which contains additional provisions for various areas.
Official title: „Lov om supplerende bestemmelser til forordning om beskyttelse af fysiske personer i forbindelse med behandling af personoplysninger og om fri udveksling af sådanne oplysninger (databeskyttelsesloven)“.

Current status of the legislative procedure: Concluded on 17 May 2018

Source text: link

Further information on process (in national language): link

Further information (in national language): link

 

Estonia

In Estonia, the new Data Protection Act on the adaptation of data protection law in Estonia to the GDPR entered into force in January 2019.

Official title: „Isikuandmete kaitse seadus 616 SE“.

Source text: link

Further information (in national language) is to be provided

 

Finland

In Finland, the government introduced a draft for a new data protection law on 01 March 2018.

Official title: „Hallituksen esitys eduskunnalle EU:n yleistä tietosuoja-asetusta täydentäväksi lainsäädännöksi HE 9/2018“.

Current status of the legislative procedure: First reading concluded

Source text: link

Further information on process (in national language): link

Further information (in national language): link

 

France

The law on implementation was adopted by the National Assembly on 14 May 2018.

Official title: „Projet de loi relatif à la protection des données personnelles (JUSC1732261L)“.

Current status of the legislative procedure: concluded on 14 May 2018

Source text: link

Further information on process (in national language): link

 

Greece

A bill was published on 20 February 2018. The consultation on the draft took place on 05 March 2018.

Official title: „Νόμος για την Προστασία Δεδομένων Προσωπικού Χαρακτήρα σε εφαρμογή του Κανονισμού (ΕΕ) 2016/679“.

Current status of the legislative procedure: consultation on 05 March 2018

Source text: link

Information on process (in national language): link

Further information (in national language): link

 

Ireland

The transposition law was adopted on 18 May 2018 by the Dáil Éireann.

Official title: „Data Protection Bill 2018“.

Current status of the legislative procedure: concluded on 18 May 2018

Source text: link

information on process (in national language): link

Further information (in national language): link

 

Italy

In Italy, the main GDPR adaptation law has been adopted, although the legislative process has not yet fully concluded.

Official title: „Decreto Legislativo 10 agosto 2018, n. 101“. This decree amended the Italian Data Protection Act (Decreto Legislativo 30 giugno 2003, n. 196) in order to adapt Italian data protection law to the GDPR requirements.

Current status of the legislative procedure: The new law of 10 August 2018 contains transitional provisions. In particular, in cases initiated before the new law enters into force, lower fines may be imposed.
In addition, an opinion of „Commissione speciale su atti urgenti del Governo“ (a special commission on urgent government matters) on further transposition laws is expected.

Source text: link

information on process (in national language): link_1 & link_2

 

Croatia

The Implementation Act was adopted by Parliament on 27 April 2018 and entered into force on 25 May 2018.

Official title: „ZAKONO PROVEDBI OPĆE UREDBE O ZAŠTITI PODATAKA – NN24/2018“.

Current status of the legislative procedure: concluded on 27 April 2018

Source text: link

Further information (in national language): link

 

Latvia

On Thursday, April 12, the Saeima supported the draft law on the processing of personal data at first reading.

Official title: „Personas datu apstrādes likums“.

Current status of the legislative procedure: Draft in the second reading in the Saeima.

Source text: link

Information on process (in national language): link

Further information (in national language): link

 

Lithuania

In Lithuania, the legislative process is still ongoing.

Official title: „LIETUVOS RESPUBLIKOS ASMENS DUOMENŲ TEISINĖS APSAUGOS ĮSTATYMAS“.

Current status of the legislative procedure: The bill was scheduled to be examined by the Seimas on 26 June 2018.

Source text: link

Further information (in national language): link

 

Luxembourg

In Luxembourg, the process is still ongoing.

Official title: „Projet de loi portant création de la Commission nationale pour la protection des données et la mise en oeuvre du règlement (UE) 2016/679 du Parlement européen et du Conseil du 27 avril 2016 relatif à la protection des personnes physiques à l’égard du traitement des données à caractère personnel et à la libre circulation de ces données, portant modification du Code du travail et de la loi modifiée du 25 mars 2015 fixant le régime des traitements et les conditions et modalités d’avancement des fonctionnaires de l’Etat et abrogeant la loi du 2 août 2002 relative à la protection des personnes physiques à l’égard du traitement des données à caractère personnel“.

Current status of the legislative procedure: Opinions and amendments to the draft are obtained.

Source text: link

Further information (in national language): link

 

Malta

In Malta, the process is still ongoing.

Official title: „l-Att tal-2018 dwar il-Protezzjoni u l-Privatezza tad-Data“.

Current status of the legislative procedure: Third reading on 24 May 2018.

Source text: link

Further information (in national language): link

 

Netherlands

In the Netherlands, the procedure has already been completed.

Official title: „Uitvoeringswet Algemene verordening gegevensbescherming“.

Current status of the legislative procedure: Completed on 15 May 2018 as a “hamerstuk” (i.e. adopted without debate).

Source text: link

Further information (in national language): link

 

Austria

In Austria, the implementation was carried out by several amendment acts to the national data protection law.

Official title: „Datenschutzgesetz DSG“.

Current status of the legislative procedure: Last amendment to act on 15 May 2018.

Source text: link

Further information (in national language): link

 

Poland

In Poland, the process is still ongoing. A draft is being dealt with in the Sejm.

Official title: „Ustawa o ochronie danych osobowych“.

Current status of the legislative procedure: Statement from the „Konferencja Rektorów Akademickich Szkół Polskich“ on 23 May 2018.

Source text: link

Further information (in national language): link

 

Portugal

In Portugal, the process is still ongoing.

Official title: „Proposta de Lei 120/XIII“.

Current status of the legislative procedure: Draft submitted on 26 March 2018.

Source text: link

 

Romania

In Romania, a law implementing the GDPR came into force on 31 July 2018. The Romanian legislature makes scant use of the GDPR's opening clauses to clarify open questions and definitions; for example, it remains unclear how the requirements of Article 37 GDPR for appointing a data protection officer are fleshed out. On the other hand, the law includes privileges for authorities and parties in the event of GDPR violations, for example lower fines than those imposed on private entities.

Official title: Law No. 190/2018

Current status of the legislative procedure: Concluded on 31 July 2018.

Source text: link

Further information (in national language): link

 

Sweden

In Sweden the process is already concluded.

Official title: „Ny Dataskyddslag“.

Current status of the legislative procedure: concluded on 18 April 2018

Source text: link

Further information (in national language): link

 

Slovakia

In Slovakia the process is already concluded.

Official title: „Zákon o ochrane osobných údajov a o zmene a doplnení niektorých zákonov“.

Current status of the legislative procedure: concluded on 30 January 2018

Source text: link

Information on process (in national language): link

 

Slovenia

In Slovenia, the process is still ongoing.

Official title: „OSNUTEK ZAKONA O VARSTVU OSEBNIH PODATKOV (ZVOP-2)“.

Current status of the legislative procedure: Government bill submitted on 23 January 2018

Source text: link

 

Spain

In Spain, the process is still ongoing.

Official title: „Proyecto de Ley Orgánica de Protección de Datos de Carácter Personal“.

Current status of the legislative procedure: The draft was approved by the Council of Ministers on 11 April 2018 for referral to the Cortes Generales

Source text: link

Information on process (in national language): link

 

Czech Republic

In the Czech Republic, the process is still ongoing.

Official title: „Návrh zákona o zpracování osobních údajů“.

Current status of the legislative procedure: Government bill submitted on 21 March 2018

Information on process (in national language): link

 

Hungary

In Hungary, the process is still ongoing.

Official title: „jogról és az információszabadságról szóló 2011. évi CXII. törvény jogharmonizációs célú módosításáról szóló törvény“.

Current status of the legislative procedure: Draft of 25 September 2017

Source text: link

 

United Kingdom

In the United Kingdom the process is already concluded.

Official title: „Data Protection Act 2018“.

Current status of the legislative procedure: concluded on 23 May 2018

Source text: link

Information on process (in national language): link

 

Cyprus

In Cyprus, the process is still ongoing.

Official title: „της προστασίας των φυσικών προσώπων έναντι της επεξεργασίας των δεδομένων προσωπικού χαρακτήρα και για την ελεύθερη κυκλοφορία των δεδομένων αυτών Νόμος του 2018.“.

Current status of the legislative procedure: Consultation on draft on 04 April 2017

Source text: link

Information on process (in national language): link

Appointing a group data protection officer (EU): What are the benefits?

The GDPR, applicable since 25 May 2018, also brought considerable changes for the role of the “group data protection officer”, which did not previously exist in a comparable form. For the first time, the GDPR explicitly states that a group of undertakings may appoint a single data protection officer (DPO). This has consequences for matters such as data protection impact assessments, liability issues and other questions that require strict compliance with the requirements of the GDPR. In contrast to the prior legal situation under the German Federal Data Protection Act (BDSG), the GDPR expressly provides for, and thus facilitates, the appointment of a single data protection officer for a group of undertakings.

What’s different about a group data protection officer?

The provisions of Article 37 (2) of the GDPR state that a group of undertakings (group) may appoint a single data protection officer. As a consequence, it is no longer necessary for each single company within a group to appoint a separate data protection officer; instead, one data protection officer is responsible for all companies within the group.

What are the tasks of the data protection officer?

The tasks of an appointed group data protection officer essentially correspond to those of a data protection officer appointed for a single company: he or she monitors compliance with the data protection requirements within the contracting company or group. Like a DPO responsible for just one company, a group data protection officer must meet the requirements of the GDPR and must therefore possess certain professional qualifications, the ability to discharge his or her tasks in compliance with the GDPR, and relevant experience of such matters. Keeping pace with a rapidly changing legal and technological field is therefore one of the primary tasks of a group data protection officer. Although the GDPR does not require any specific legal background for this role, it is hard to imagine anyone managing the complex requirements of the GDPR without legal knowledge and practical experience of the law. The tasks of a group data protection officer thus also include familiarity with the procedures for transferring data between the individual companies and knowledge of internal processes.

Article 39 of the GDPR sets out his or her tasks in detail. His or her primary responsibility is to inform employees of a company or the overall group of the data protection requirements that apply under the GDPR. It may be expected that the group data protection officer is not just in possession of the necessary legal knowledge for this purpose but is also able to communicate it in a clear and coherent manner. Large groups of undertakings, in particular, may arrange for training sessions to be prepared and then conducted by selected employees. A key element of such training is that any language barriers are eliminated.  It is also necessary for the group data protection officer to be given access, with no unreasonable barriers, to the individual companies, e.g. for the purpose of evaluating training. An unreasonable barrier may, for example, exist if the group data protection officer needs more than one day to travel to a specific company. The group must therefore ensure ease of access between its various locations, thereby allowing the group data protection officer to fulfil his or her duties. For the group data protection officer, this means that he or she must be located within the EU, even if the group has companies that lie outside the EU. In addition, the requirement of easy accessibility applies not only to the individual companies but also to the data subjects and supervisory authorities.

Language skills and easy accessibility are also required for the second key task of a group data protection officer: he or she is responsible for monitoring the implementation of the GDPR by the companies in the group. This additional task has consequences for liability issues, as in the event of a breach, and in contrast to the prior situation, the GDPR provides for high administrative fines. The contracting company, in this case the group, is generally liable for any breach of duty by the data protection officer. In addition, companies or groups may also be liable if they do not provide the data protection officer with sufficient support in the performance of his or her duties. If, then, the group data protection officer is not granted the opportunity to obtain an overview of the processes relevant to data protection within the individual companies, in particular the cooperation between the individual companies, and can therefore not adequately perform his or her duties, a liability risk to the company may arise on two counts.

In addition, the group data protection officer is responsible for cooperation with the supervisory authorities. In this context, it is important that he or she acts as the company's point of contact and works towards transparent cooperation with the supervisory authorities.

Further tasks of the group data protection officer include providing advice regarding the data protection impact assessment, which must be carried out in cases in which there is an increased risk to the rights of data subjects to data protection, i.e. when an intrusion into the privacy of the individual is particularly far-reaching. CCTV is a good example of this. A high risk of this type must always be assumed when particularly sensitive data (e.g. health data) is processed. By means of the data protection impact assessment, the group data protection officer should carefully consider whether the interests of data subjects override the interests of the group (or an individual company) or vice versa in a company, in several companies, or in the process of transferring data between individual companies in a group. If there is a risk to the interests and rights of data subjects, the group data protection officer must then initiate the action required to better protect the data of data subjects in advance. This may be done, for example, by providing information on how data will be processed in good time or by means of appropriate technical defaults (privacy by design).

What are the benefits of appointing a group data protection officer?

Similarly to an external data protection officer, a group data protection officer can easily obtain an overview of all issues relevant to data protection within a group of undertakings. The question of who is responsible for complying with GDPR requirements frequently arises, in particular when personal data is transferred between individual companies; the group data protection officer can set out such matters transparently and thus ensure compliance with the GDPR. In addition, companies can use a group data protection officer to meet their documentation obligations and thus comply with the burden-of-proof rules under the GDPR. Overall, a group data protection officer considerably reduces the risk of the high administrative fines provided for in the GDPR by creating uniform group standards. Companies can also make financial savings by appointing a single data protection officer. Standardised data protection requirements and structures that apply to the entire group of undertakings can also help fulfil the transparency requirements of the GDPR.

Recommended action

Due to the efficiency and transparency benefits involved in appointing a group data protection officer, groups of undertakings are strongly urged to consider this course of action. The complex procedures relating to data transfer between companies and the involvement of processors, in particular, require a broad view of, and a consistent method for handling, issues relevant to data protection in order to avoid the high administrative fines provided for in the GDPR.