With the digital revolution, the future has well and truly arrived in the business world – be it through almost limitless networking possibilities or the ability to work from virtually anywhere. In adopting the very latest IT solutions, however, future-oriented companies are also becoming increasingly dependent on technology. As companies shift every aspect of their business from the analogue to the digital world, they provide hackers with more and more targets for cyberattacks. And then there is the human risk factor: a single careless act can jeopardise otherwise protected infrastructures, be it the use of an insecure password, the opening of an email with an infected attachment or link, or the (accidental) disclosure of information. Increasingly complex attacks call for increasingly professional technologies to defend against the sophisticated methods employed by cybercriminals. Studies suggest that the question is not so much whether a company will suffer a cyberattack, but rather when and how the attack will take place – and how serious its effects might be. Good risk management, strong technical and organisational measures, and a careful strategy are therefore all the more important for cybersecurity within companies!

Types of cyberattacks

A cyberattack is a malicious attempt to compromise IT systems. By launching a targeted attack on a specific information technology structure, hackers or criminal organisations attempt to plant malware inside IT systems in order to cause damage. While some attacks are politically motivated, many hackers are driven by the desire for financial gain when stealing data. Selling stolen data or extorting ransoms can be quite lucrative.

Cyberattacks come in many different forms and are becoming more and more sophisticated. Most attacks are highly complex, making traceability very difficult.

Among the most prominent means and methods for cyberattacks are:

  • Spam emails
  • Malicious software (malware or junkware)
  • Drive-by exploits
  • Brute force attacks
  • DDoS attacks
  • Phishing emails
  • Ransomware

What threats are associated with cyberattacks?

The goal of most cyberattacks is to steal data, whereby personal data is of particular economic value, as evidenced by data-driven business models. Cyberattacks can also encrypt data on which a company depends in order to operate successfully. The data is then usually only released after a considerable ransom has been paid. The economic impact can be enormous, as can the damage to the company’s reputation if the attack becomes known to the public. This is often accompanied by damage to the company’s internal data infrastructure, which is the foundation of any digital company – possibly resulting in operational disruptions or even preventing the company from operating altogether. The misuse of data is also a major threat for affected companies.

Some cyberattacks are a form of industrial espionage, for example with the aim of gaining competitive advantages by stealing information about the victim’s corporate strategy.

Another threat inherent in cyberattacks is the manipulation of communication channels. In a “man-in-the-middle attack”, for example, hackers secretly insert themselves between two communicating parties in order to eavesdrop, steal information or even manipulate the content before passing it on to the intended recipient.

It is important to note that any such attack triggers legal obligations. Especially when personal data is involved, the regulations of the General Data Protection Regulation (GDPR) must be observed. Companies that process personal data are obliged to ensure the appropriate security of personal data and to report data protection incidents. The principle of integrity and confidentiality includes, among other things, protecting data against accidental loss and damage as well as unauthorised processing (cf. Art. 5(1)(f) GDPR).

How to respond to a cyberattack

Spread the word internally: If a cyberattack occurs, it makes sense to immediately contact the relevant units, such as your information and IT security officers. In addition, the data protection officer, the IT department and of course management should be informed of the attack. They can then take immediate action based on an incident response plan.

Convene the crisis team: Especially in situations that call for swift and prudent action, it makes sense to run things past a crisis team set up specifically for this purpose, in order to avoid a chaotic internal response and to take uniform action against the attack. It is essential to involve the aforementioned internal units in the crisis team, as well as whichever department has been targeted.

Collect information: In order to stop an attack that has already begun, as well as to be able to prevent future attacks, it is sensible to collect certain information about the attack. Among other things, this includes investigating how the attack came to light, what impact the attack may have on the company’s core services, why the attack happened, and the likely impact on third parties such as customers or business partners.

Comply with notification obligations: Furthermore, it may be necessary to notify the competent data protection supervisory authority and the data subjects affected by the attack. Under certain conditions, companies are subject to this obligation under the GDPR.

Depending on the incident, it may also be advisable to bring the attack to the attention of law enforcement authorities and file a criminal complaint.

To put an end to the attack, it usually makes sense to take the affected system – the one used as a gateway by the hackers – offline. In some cases, however, the only way to overcome a cyberattack is to completely reinstall the affected system, as it is usually the entire system that is considered compromised. Cybercriminals use a variety of methods, some of which can cause damage even on a network that has been taken offline. For this reason, companies should be sure to create regular back-ups and store them securely and separately from the rest of the infrastructure.

What cybersecurity steps can companies take?

Cybersecurity is all about taking protective measures to secure and protect systems, networks and programs from unauthorised digital access.

Companies take cybersecurity measures in order to ensure the confidentiality of data and information. In addition, they aim to protect the integrity of personal data. Another goal of cybersecurity is to protect the availability of information used in the company against threats from cyberspace.

Cybersecurity should therefore be embedded as part of the company’s internal risk management. Companies should rely on a combination of strategy, technology and user awareness training. With the right cybersecurity risk management, internal vulnerabilities can be identified and administrative solutions and measures found to offer appropriate protection to the company and its digital infrastructure.

Possible cybersecurity measures

In order to prevent cyberattacks, it is advisable to take preventive measures to protect the company’s internal structures.

It is sensible to implement security gateways for individual network transitions, such as a proxy firewall (also called an application firewall or gateway firewall). A proxy firewall is essentially a security system for a communication network. Using a proxy firewall, no direct connection is established between your own network and another network such as the internet. The firewall is placed between the networks. It filters all requests from the internet, for example, either forwarding them or blocking them. This allows harmful viruses or similar to be detected and warded off before they have a chance to enter your own system.
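
To make the filtering idea concrete, here is a minimal sketch of an HTTP forward proxy that blocks requests to hosts on a denylist before they ever leave the internal network. It is only an illustration under simplified assumptions (plain HTTP, a hypothetical denylist, no caching or authentication); real deployments rely on dedicated gateway products.

```python
# Minimal sketch of the filtering principle behind a proxy firewall.
# Assumptions: plain HTTP forward proxying only, hypothetical denylist entries.
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer
from urllib.parse import urlsplit
from urllib.request import urlopen

BLOCKED_HOSTS = {"malware.example", "phishing.example"}  # hypothetical policy

class FilteringProxy(BaseHTTPRequestHandler):
    def do_GET(self):
        # For a forward proxy, self.path contains the absolute target URL.
        host = urlsplit(self.path).hostname or ""
        if host in BLOCKED_HOSTS:
            self.send_error(403, "Blocked by proxy policy")    # never reaches the internal client
            return
        try:
            with urlopen(self.path, timeout=10) as upstream:   # the proxy fetches on the client's behalf
                body = upstream.read()
                status = upstream.status
        except Exception:
            self.send_error(502, "Upstream request failed")
            return
        self.send_response(status)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    ThreadingHTTPServer(("127.0.0.1", 8080), FilteringProxy).serve_forever()
```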

Furthermore, system segmentation, i.e. the division of computer networks into several smaller subnetworks, is often a successful tool in preventing cyberattacks. With network segmentation, companies can determine whether all network traffic should remain within one part of the network, or whether it – or at least certain categories of it – should be allowed to pass over into other network segments. This makes it much more difficult for hackers and cybercriminals to penetrate the entire network, as the individual segments are shielded from one another and an intruder cannot simply move from one segment to the next.
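
As a simplified illustration of the underlying policy logic, the following sketch checks whether a connection between two hosts crosses a segment boundary and whether that flow is on an allowlist. The subnets, ports and flows are hypothetical placeholders; in practice this logic lives in firewall rules, VLAN configuration or software-defined networking policies.

```python
# Minimal sketch of a segmentation policy check with hypothetical subnets.
import ipaddress
from typing import Optional

SEGMENTS = {
    "office":  ipaddress.ip_network("10.0.10.0/24"),
    "servers": ipaddress.ip_network("10.0.20.0/24"),
}
ALLOWED_CROSS_SEGMENT_FLOWS = {("office", "servers", 443)}   # only web traffic may cross

def segment_of(address: str) -> Optional[str]:
    ip = ipaddress.ip_address(address)
    return next((name for name, net in SEGMENTS.items() if ip in net), None)

def is_allowed(src: str, dst: str, port: int) -> bool:
    src_seg, dst_seg = segment_of(src), segment_of(dst)
    if src_seg is None or dst_seg is None:
        return False                       # unknown hosts are denied by default
    if src_seg == dst_seg:
        return True                        # traffic within a segment is not restricted here
    return (src_seg, dst_seg, port) in ALLOWED_CROSS_SEGMENT_FLOWS

print(is_allowed("10.0.10.5", "10.0.20.8", 443))   # True: permitted cross-segment flow
print(is_allowed("10.0.20.8", "10.0.10.5", 22))    # False: blocked at the segment boundary
```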

It is also essential to implement a patch management system that identifies software updates and “patches” and makes these available on endpoints such as computers, mobile devices or servers. Installing updates and patches regularly and at short notice is an effective means of eliminating software vulnerabilities that could otherwise be exploited as gateways for cyberattacks. By using patch management, companies can reduce the security risks associated with software and applications. Employees should therefore be made aware of and obliged to install updates and use the latest version of a given application. As far as possible, regular updates should be enforced technically.
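
At its core, any patch management process is a comparison between what is installed and what the vendor has most recently released. The sketch below illustrates that comparison with a hypothetical inventory; real tooling would pull both sides from an asset database and vendor update feeds.

```python
# Minimal sketch of a patch check against a hypothetical software inventory.
INSTALLED = {
    "laptop-042": {"browser": (124, 0, 1), "pdf-reader": (11, 2, 0)},
    "server-07":  {"webserver": (2, 4, 57)},
}
LATEST = {"browser": (125, 0, 0), "pdf-reader": (11, 2, 0), "webserver": (2, 4, 62)}

def outdated(installed, latest):
    for host, packages in installed.items():
        for name, version in packages.items():
            newest = latest.get(name, version)
            if version < newest:            # tuple comparison, e.g. (124, 0, 1) < (125, 0, 0)
                yield host, name, version, newest

for host, name, have, want in outdated(INSTALLED, LATEST):
    print(f"{host}: update {name} from {have} to {want}")
```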

The devices and software used by employees should be equipped with effective virus protection from the outset. Ideally, this will prevent malware from infiltrating systems in the first place. It also makes sense to use two-factor authentication wherever possible, as this provides much better assurance that only authorised persons have access to systems. Effective password management that mandates strong passwords also increases the security of IT systems. And since IT security breaches are often caused by an employee clicking on an infected link or similar, employee training is another effective measure: regular staff training and clear rules of conduct on the part of the company can raise awareness of cyberattacks before they happen.
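
One widely used second factor is the time-based one-time password (TOTP) generated by authenticator apps. The sketch below shows the core of the scheme as standardised in RFC 6238; the Base32 secret is a made-up example value, and production systems should of course rely on a vetted library rather than hand-rolled code.

```python
# Minimal TOTP sketch (RFC 6238): HMAC over the current 30-second time step.
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, step: int = 30, digits: int = 6) -> str:
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // step                        # number of elapsed time steps
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                                # dynamic truncation
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))   # matches an authenticator app configured with the same secret
```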

AI-based tools embedded in company systems as a defence against cybercrime can be particularly effective, as they can detect and ward off attacks fully automatically.
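
A minimal sketch of the underlying idea, assuming scikit-learn is available and using made-up per-host features (logins per hour, gigabytes sent, distinct destinations): an isolation forest is trained on normal behaviour and then flags observations that deviate strongly from it. Commercial detection products combine far more signals and context.

```python
# Minimal anomaly-detection sketch with hypothetical traffic features.
from sklearn.ensemble import IsolationForest

# Baseline observations of normal behaviour: [logins/hour, GB sent, distinct destinations]
baseline = [[12, 3.1, 5], [10, 2.8, 4], [11, 3.0, 6], [13, 3.3, 5], [9, 2.7, 4], [12, 3.2, 5]]
model = IsolationForest(contamination=0.1, random_state=0).fit(baseline)

observation = [[180, 45.0, 90]]          # sudden spike in activity
print(model.predict(observation))        # -1 flags an anomaly, +1 means "looks normal"
```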

There are already some frameworks on the market for cybersecurity risk management that can also be used to identify measures. Well-known frameworks include ISO 27001 and the BSI-Grundschutz certification process. It is important to document any measures taken.

Conclusion

Cybersecurity and good risk management serve to protect companies from threats to their systems. Anyone who wants to be armed against ransomware, malware and other cyberattacks needs robust cybersecurity. Regularly reviewing your cybersecurity strategy can reveal vulnerabilities in your own company well before an attack occurs and help ward off cyberattacks. Nevertheless, it is not possible to be completely protected against all cyberattacks at all times, which is why your company should know exactly how to act in the event of an IT security breach. In particular, it is advisable to document all measures taken – so that the system can be reviewed regularly, but also in order to comply with your accountability requirements under the GDPR (cf. Art. 5(2) GDPR). The Federal Office for Information Security (BSI) also provides companies with comprehensive information on how to manage, report and prevent incidents.

Feel free to contact us – our data protection experts will support you in developing a suitable cybersecurity strategy for your company!


The EU-wide General Data Protection Regulation (GDPR) came into effect on 25 May 2018. Ever since then it has presented companies with enormous challenges. Especially in the area of what are known as the rights of the data subject, many things have changed compared to the previous legal situation. Data subjects have been given a variety of tools to help them monitor and manage how their personal data is handled. Since the GDPR came into effect, the supervisory authorities in Germany and other EU countries have already imposed a large number of administrative fines, many of them for non-compliance with the rights of data subjects. The list ranges from not granting access, to missed deadlines and failing to delete data despite the right to erasure. The right to data portability under Art. 20 GDPR also poses a major challenge for companies.

What are “rights of the data subject”?

Rights of the data subject means the rights of any individual affected by data processing pursuant to Art. 12 et seq. GDPR. They protect the right to “informational self-determination” (Art. 2(1) in conjunction with Art. 1(1) of the German constitution) and serve to provide information and transparency.

Art. 12(3) GDPR stipulates that data subject requests must receive a response “at the latest” within one month. An extension by a further two months is possible in exceptional cases. However, this extension cannot be justified by arguing, for example, that the company is generally too busy to respond sooner; it must be assessed on a case-by-case basis.
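
As a practical illustration of the deadline arithmetic, the sketch below uses the third-party python-dateutil package to add calendar months to the date of receipt; month-end edge cases and the exact legal interpretation of “one month” should still be checked with your data protection officer.

```python
# Minimal sketch of deadline tracking for a data subject request.
# Assumes the python-dateutil package is installed.
from datetime import date
from dateutil.relativedelta import relativedelta

received = date(2024, 1, 31)
deadline = received + relativedelta(months=1)     # 2024-02-29: one month, clamped to the month end
extended = received + relativedelta(months=3)     # 2024-04-30: with a justified two-month extension
print(deadline, extended)
```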

What rights of the data subject does the GDPR define?

1. The controller’s duty to inform (Art. 13, 14 GDPR)

Art. 13 and Art. 14 GDPR together form a single complex. Together with Art. 15 GDPR, the provisions constitute an essential component (“Magna Carta”) of the rights of the data subject. It is only through the information obtained with the help of Art. 13 GDPR that the data subject can properly assess a data processing operation and properly exercise their rights as a data subject. Art. 13 GDPR is therefore of fundamental importance.

The EU legislator has fleshed out the principles of fair and transparent processing by specifying certain information which the controller is obliged to provide. In this sense, Art. 13(1) GDPR stipulates that the data subject must be informed above all of the contact details of the controller, the purpose (for each individual data processing operation separately) and the duration of the data processing, as well as information about the recipients of the personal data, the legal basis of the data processing, and a comprehensible explanation of how the interests of the data subject were weighed against those of the controller.

According to Art. 13(1) and (2) GDPR, the data subject must also be informed of all rights of the data subject, i.e. that they have a right of access, a right to rectification, to erasure, to restriction of processing, a right to object, and a right to data portability. What’s more, the data subject must be informed about the extent to which decision-making is based exclusively on automated data processing (especially profiling). It is important to note here that the data subject must be provided with all this information at the point where the data is collected, e.g. when subscribing to a newsletter or concluding a purchase contract online, but possibly even before concluding a purchase contract, e.g. when registering for a user account.

Art. 12(1) GDPR requires that the information to be provided to the data subject is presented in a “transparent, intelligible and easily accessible form, using clear and plain language”. This means that the respective addressees must be able to understand the information – so privacy notices, for example, should avoid ambiguous wording, foreign words and complicated syntax and instead use everyday language. Under the GDPR, the information can be provided orally, in writing or electronically. Particularly with regard to children, attention must be paid not only to the aforementioned obligation to use simple language, but also to the fact that the language is appropriate for the age group. According to Art. 13(4) GDPR, the obligation to provide information does not apply where and insofar as the data subject already has the information. If there is any doubt, it is up to companies to prove this.


Art. 14 GDPR also regulates corresponding information obligations in the event that the data was not collected by the controller itself, but by third parties (e.g. information about creditworthiness obtained from credit agencies). Where data was collected from third parties, the company’s information obligations are basically comparable to those under Art. 13 GDPR. In addition, the company has a duty to disclose the source of the information. Unlike under Art. 13 GDPR, the information does not have to be provided immediately in all cases, but at the latest within a maximum period of one month after obtaining the data. If, however, the personal data is to be used to communicate with the data subject, the notification has to be given no later than when the first contact is made.

2. The controller’s active duty to inform corresponds to the data subject’s extensive right of access (Art. 15 GDPR)

Art. 15 GDPR grants a right to comprehensive information regarding the personal data processed as well as specific circumstances of the data processing. This right of access is limited by conflicting rights of third parties. This has the particular consequence that access does not have to be given to information pertaining to trade secrets. Art. 15 GDPR is highly relevant in practice and is likely to become even more so in the future.

The right of access is structured in two stages. The first stage gives the data subject the right to know whether or not personal data concerning them is being processed. If this is not the case, the controller must inform the data subject accordingly. If the data subject’s data is being processed, then the second stage gives the data subject a right of access to that personal data and to certain additional information.

In order to establish this right, the data subject may request access to information about data processing at reasonable intervals. In principle, there are no formal requirements for requesting access. In the event of a data subject access request, the controller must above all provide information about the purpose of the data processing, the categories of personal data processed and the recipients or categories of recipients to whom the data may have been disclosed.

In addition, the right of access covers further information such as

  • The envisaged storage period or the criteria used to determine that period
  • Information about the individual rights of the data subject (such as the right to rectification, erasure, restriction of processing, right to object, right to lodge a complaint with a supervisory authority)
  • The existence of automated decision-making, including profiling, and any further consequences
  • In the case of data transfers to third countries or to international organisations, information about appropriate safeguards.

In addition, pursuant to Art. 15(3) GDPR, the data subject has a right to receive a copy of the personal data undergoing processing free of charge. The controller may charge a reasonable fee for any further copies. The data subject is not considered to be requesting a “further” copy if they submit a new request for access and the data held by the controller has changed significantly since the last copy was sent. However, there is still a great deal of controversy in terms of what and how much exactly is covered by the right to a copy of the data. The information to be provided by the controller can be very extensive indeed, depending on the amount of data involved. In these cases, it is advisable to prepare the data accordingly as part of the access process – a process which should be integrated into ongoing business processes well in advance.
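
What such a prepared copy can look like in practice is sketched below: the personal data is collected from the relevant systems and written out as a structured, machine- and human-readable export. The record layout and contents are purely hypothetical examples.

```python
# Minimal sketch of preparing the Art. 15(3) GDPR data copy as a JSON export.
import json
from datetime import date

subject_data = {                                           # hypothetical example records
    "master_data": {"name": "Jane Doe", "email": "jane@example.org"},
    "orders": [{"id": "A-1001", "date": "2024-02-12", "total_eur": 59.90}],
    "newsletter": {"subscribed": True, "since": "2023-11-01"},
}

export = {
    "controller": "Example GmbH",
    "generated_on": date.today().isoformat(),
    "personal_data": subject_data,
}

with open("access_copy_jane_doe.json", "w", encoding="utf-8") as f:
    json.dump(export, f, ensure_ascii=False, indent=2)
```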

3. The right to rectification (Art. 16 GDPR)

If the data subject’s personal data has been processed incorrectly, the data subject has the right to rectification without undue delay. The right of the data subject to rectification is closely related to the right of access under Art. 15 GDPR. Without the right of access to the personal data concerning them, the data subject would not be able to exercise their right to rectification. The right to rectification has two components: the data subject may request both that inaccurate data be rectified and that incomplete data be completed or supplemented.

4. The right to erasure (Art. 17 GDPR)

The right to erasure (Art. 17 GDPR) is also known as the “right to be forgotten”. The data subject has the right to obtain from the controller the erasure of personal data concerning them without undue delay and the controller has the obligation to erase personal data without undue delay where one of the following grounds applies:

  • Storing the data is no longer necessary in relation to the purposes for which it was collected
  • The data subject withdraws the consent to data processing which they gave previously
  • The data subject has objected to the processing and there is no legitimate interest in the processing (in the case of Art. 21(2) GDPR, erasure must take place regardless of the controller’s interest in the processing)
  • The data has been unlawfully processed
  • The company is obliged to erase the data due to a legal obligation (under EU law or the national law of a Member State).
  • The personal data has been collected in relation to the offer of information society services referred to in Art. 8(1) GDPR.

In addition, according to Art. 17(3) GDPR, there are a number of derogations where the erasure obligation does not apply. The most important derogation is where the obligation to erase data does not apply due to a legal obligation, for example if there is a duty to retain data for a longer period under employment, tax or commercial law.

5. The right to restriction of processing

According to Art. 18 GDPR, the data subject has a right to restriction of processing. This provision is intended to strike a provisional balance between the data subject’s interests – namely in the protection of their right to “informational self-determination” – and those of the controller in processing the personal data. The data subject has the right to obtain from the controller the restriction of processing where one of the following applies:

  • The data subject contests the accuracy of the data
  • The processing is unlawful
  • The controller no longer needs the personal data for the purposes of the processing, but the data is required for the establishment of legal claims
  • The data subject has objected to processing pursuant to Art. 21(1) GDPR pending the verification of whether the legitimate grounds of the controller override those of the data subject.

According to Art. 18 GDPR, once processing has been restricted, data may now only be processed under particularly narrow conditions and for special purposes. The personal data does not have to be erased, but may no longer be processed in any other way. To this end, the data whose processing is to be restricted needs to be marked and treated accordingly.

6. Right to data portability (Art. 20 GDPR)

The right to data portability is a new right created by the GDPR. The provision is intended to give the data subject more efficient control over their data and to counter lock-in effects by facilitating “provider switching”. This is to promote competition. The provision gives the data subject the possibility to obtain data stored about them (for example, on social media) in an appropriate portable format for the purpose of transmission or, where appropriate, to have the data transmitted directly to the other provider. This is to prevent monopolies, for example because the data subject fears that setting up a new profile with a competing provider would take them too much time.

However, this provision only covers data which the data subject has provided to the controller. In particular, this means data that the data subject themselves used when creating the user account or when posting on social media. The question of whether the provision applies to data collected through interaction with the controller’s service, such as data collected by smart devices or “wearables”, has yet to be clarified.

Since it is quite possible for the data provided by the data subject to contain information not only about themselves but also about third parties, Art. 20(4) GDPR specifies that the right to data portability must not adversely affect the rights and freedoms of others. This means that in the case of data concerning third parties, the fundamental rights and interests of the person making the request must be weighed against those of the other data subjects. After all, the right to data portability does not apply if it would be used for unfair or abusive purposes.

7. Do the rights of the data subject apply equally in all Member States?

One of the aims of the GDPR is to create a uniform level of data protection in all Member States. However, at many points the GDPR contains so-called “opening clauses” (e.g. Art. 85(2) GDPR), which allow Member States to adopt their own national regulations within certain limits. In Germany’s case, it is particularly important here to take what’s known as media privilege into account, which the German legislator has regulated in Sect. 55 of the Interstate Broadcasting Treaty (RStV). In abstract terms, it can be said that this media privilege leads to the extensive exemption of the press, broadcasting and telemedia from data protection requirements.

Conclusion and recommended action: How important are the rights of the data subject?

The rights of the data subject are one of the central pillars of the GDPR. The supervisory authorities punish infringements with hefty administrative fines. For individuals, the rights of the data subject are a means of both communicating with and monitoring controllers. No company can avoid compliance with the GDPR. It is one of the fundamental legal obligations of any company towards its customers. For this reason alone, it is vital that companies attach great importance to how the public perceive their approach to their data protection duties. A well-managed data protection department that swiftly, comprehensively and reliably complies with the rights of all data subjects sends a strong message. Companies should always take requests from data subjects seriously but also use them to self-monitor and improve the quality of their existing data protection processes.


On 28 January 2022, the European Data Protection Board (EDPB) published guidelines on the right of access (Guidelines 01/2022). They serve first and foremost as a guide to ensure that the General Data Protection Regulation (GDPR) is applied consistently in all Member States of the European Union. While not legally binding, they are used by data protection authorities, data protection consultants and, under certain circumstances, even courts, which is why companies should definitely be familiar with them. The EDPB addresses some questions that the courts have in some cases answered inconsistently in recent years, such as how wide-ranging the right of access really is. In addition, companies can derive important recommendations from the guidelines. In this article, we present the guidelines and provide companies with valuable practical tips on how to proceed when faced with an access request.

When do companies have to provide access under Art. 15 GDPR?

First of all, it is important to understand when companies are required to provide access in the first place. The law itself states that a data subject has the right to obtain from the controller confirmation as to whether or not the controller processes their personal data. The controller is the party which, alone or jointly with others, determines the purposes and means of the processing of personal data. So if a company determines whether, for what purposes, and how personal data is processed, it is considered a controller under the GDPR. It should be noted that the data subject’s right of access is unconditional. This means that the controller need not check why a request has been made and whether it meets certain requirements. In principle, natural persons are always entitled to request access, in which case the company must respond.

Applying a kind of three-step test, companies should then determine to what extent they have to provide access. It is important to consider here whether the company even processes any personal data of the person requesting access.

  • If this is not the case, then the company needs to inform the person of this.
  • If the company does process the person’s data, then the data subject has a right of access to their personal data and to the information referred to in Art. 15(1)(a)–(h), (2) GDPR.

If a company processes personal data, then under Art. 15(3) GDPR it also has to provide a copy of the personal data that is the subject of the processing. The first copy is always free of charge – no matter how much it costs the company to issue it. The copy must comprehensively list all the data mentioned in Art. 15(1) GDPR. Simply providing a brief summary is not sufficient.
Companies may charge a reasonable fee for any additional copies. In this context, it is often not clear whether the data subject has submitted a new request, which would then again entitle them to a free copy, or whether it is a question of merely providing an additional copy. The EDPB explains that this depends on the specific request and how it relates to the first request in terms of time and scope. If the data subject requests a different amount of personal data at a later date, then the company should assume that the request is for a new copy and not for a further copy.

What data and information does the company have to provide access to?

The data

So far, understanding of the scope of the right of access has been characterised by inconsistent case law. The EDPB states that a very broad understanding should be taken as a basis: an access request covers all of the data subject’s personal data. Besides basic personal data like name or address, a variety of other data must also be included, such as medical findings, creditworthiness indicators, activity logs and search activities. Pseudonymised data is also listed as personal data which controllers have to provide. Communication history – something often requested when the parties have communicated by email – is also considered data which is subject to disclosure. This even applies if the emails have already been deleted but the server provider still has access to them. The Hanover Regional Labour Court (LAG) took a different view (judgment of 9 June 2020, ref. 9 Sa 608/19), holding that Art. 15 GDPR did not apply to email correspondence that an employee had conducted or received themselves. The EDPB, on the other hand, bases its position on Art. 4 No. 1 GDPR, which provides for a comprehensive definition of personal data, and argues that the right of access under the GDPR must be read just as broadly – not restrictively. This was also the understanding of the Cologne Higher Regional Court (OLG) when it considered the case (judgment of 23 October 2020, ref. 20 U 57/19) of an insurance policyholder who had requested access from their insurance company. The court took the view that the right of access also covered information about the history of the premium account, the establishment of the insurance contract, and the correspondence stored about the data subject.


A controller may only ask the data subject to specify their request in more detail if the amount of data processed about them is particularly extensive. A narrower scope may also be considered if the data subject has explicitly requested only certain data.

By way of example, the EDPB lists the following data which has to be disclosed, taking into account the rights and freedoms of others:

  • Special categories of personal data: sensitive data such as health data
  • Personal data relating to criminal convictions and offences
  • Data knowingly and actively provided by the data subject, such as account data or data collected via forms, etc.
  • Observed data provided by virtue of using a service or device, such as access logs, history of website usage, search activities, location data, clicking activity, keystrokes, etc.
  • Data derived from other data, such as a credit score or a country of residence derived from the postcode
  • Data inferred from other data but not directly provided by the data subject, such as algorithmic results, results of a health assessment or a personalisation or recommendation process
  • Pseudonymised data.

In line with this, the Federal Court of Justice (BGH) had already come down in favour of a broad understanding of the right of access in a key ruling (judgment of 15 June 2021, ref. VI ZR 579/19). Among other things, access must also be given to information the data subject already knows and to internal notes.

As regards requests from employees, companies should be aware of the EDPB’s view that elements used to decide whether to promote someone, give them a raise or assign them a new role must be classified as personal data to which the employee has a right of access. This might include their annual performance review, training requests or assessments of their career potential. The Hamm Regional Labour Court (LAG) took the same line in its judgment of 11 May 2021 (ref. 2 AZR 363/21), regarding data about an employee’s performance and conduct as personal data within the meaning of Art. 4 No. 1 GDPR.

In response to a request, companies are also required to disclose inaccurate or unlawfully processed data. This helps data subjects to gain a true picture of all processing operations. The right of access goes further here than the right to data portability under Art. 20 GDPR, which allows data to be taken from one controller to another: with data portability, only data generated at the respective company itself can be transferred, whereas the right of access also covers derived personal data, including data obtained from other providers.

The information to be provided

The data subject may request the following information regarding the processing of the personal data which is the subject of their access request:

  • The purposes for which the data is processed
  • The categories of data, such as health data, biometric or genetic data
  • Information about recipients to whom the data has been disclosed
  • To the extent possible, the envisaged storage period or the criteria used to determine that period
  • The existence of the right to request that data be rectified or erased, that its processing be restricted, or the right to object to processing
  • The existence of a right to lodge a complaint
  • All available information as to the source of the data
  • The existence of automated decision-making, for example by means of artificial intelligence
  • Information about data transfers to third countries, insofar as the data was transferred to a third country.

According to the EDPB, one way for companies to communicate this information is to reuse existing text from their privacy notice or from their records of processing activities (cf. Art. 30 GDPR). However, it should first be checked carefully whether such boilerplate actually answers the specific request precisely enough. Reusing it makes most sense for general information that is largely identical across requests, such as the notification about the right to rectification – and for this kind of content, companies can also employ automated processes.


How should companies grant access?

How can companies retrieve the necessary data?

According to the EDPB, it is important to take into account not just data found in IT systems, but also all non-IT filing systems. Companies are therefore also required to research and disclose data from paper files. Personal data held in a computer’s memory in binary form or stored on video recordings is likewise considered a potential source of data for such data subject requests. Against the background of the data protection principles of “privacy by design” and “privacy by default”, companies should already have implemented suitable ways of quickly locating requested data in their IT systems – in other words, data protection-friendly default settings on the technical side for their data processing procedures.

Form

Companies are required to communicate the data as well as the accompanying information in written or other form, including electronically where appropriate, and a permanent copy of the information should be provided. If the data subject wishes to receive the information orally, for example, companies should comply – but they should still be able to provide copies of the data if requested. As a rule, requests for access are made electronically. According to the EDPB, if the request was sent electronically, then the copy of the data should also be sent by electronic means of communication, for example as a PDF file attached to an email. Another option is an online self-service tool that processes requests automatically. Things are only different if the data subject expressly requests that the information be provided in a particular form, for example as a written document sent by post; companies should then comply with this wish as well.
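
A minimal sketch of the electronic delivery route, assuming a PDF export already exists and a hypothetical internal mail relay; in practice the requester’s identity must be verified first and a sufficiently secure transmission channel must be used.

```python
# Minimal sketch: sending the data copy as a PDF attachment by email.
import smtplib
from email.message import EmailMessage
from pathlib import Path

msg = EmailMessage()
msg["Subject"] = "Your access request under Art. 15 GDPR"
msg["From"] = "privacy@example.com"
msg["To"] = "data.subject@example.org"
msg.set_content("Please find attached the copy of your personal data and the related information.")
msg.add_attachment(Path("access_report.pdf").read_bytes(),
                   maintype="application", subtype="pdf",
                   filename="access_report.pdf")

with smtplib.SMTP("mail.example.internal") as smtp:   # hypothetical internal relay
    smtp.send_message(msg)
```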

Presenting the information

All notifications must be conveyed in a precise, transparent, comprehensible and easily accessible form, in clear and simple language. A particular challenge for companies is often the sheer volume of data to be disclosed, which can conflict with the requirement to keep things concise. One solution the EDPB mentions here is a layered approach, in which the information is communicated in different layers. In the first layer, companies could communicate information about the processing and the rights of the data subject as well as providing initial information about which personal data is processed. The second layer would then provide more detailed personal data.

Granting access: The clock is ticking

If a company receives a request for access, it must take action immediately. Companies need to respond to the request as soon as possible, and in any case within one month. In exceptional cases, for example if a company receives a large number of highly complex requests, the deadline can be extended by a further two months.

Limitations

A data subject’s right to obtain a copy of their processed data may not adversely affect the rights and freedoms of others. With regard to the limitations, the EDPB states that trade secrets and intellectual property of the company itself also fall under the rights and freedoms of others, and as such these must not be infringed. In any case, the company would be required to prove any impact on its own rights and freedoms.

Should there be a conflict between the interests of the person requesting access and those of others, the EDPB does see the possibility of redacting the relevant passages or otherwise making them unrecognisable. This makes it possible to grant access to the information while protecting the interests of another person or of the company itself. Cologne Regional Court (LG) came to a similar conclusion (judgment of 24 June 2020, ref. 20 O 241/19) with regard to disclosing data by sending a claims file: in principle, an insurance company is not permitted to transmit an entire claims file due to the interests of third parties, but the personal data of those third parties can be redacted so that access can still be granted.
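
The sketch below illustrates the redaction idea on a plain-text excerpt with hypothetical third-party names; in practice, reliable redaction usually requires manual review in addition to any automated matching.

```python
# Minimal redaction sketch: mask third-party names and email addresses before disclosure.
import re

THIRD_PARTY_NAMES = ["Erika Mustermann", "Max Mustermann"]          # hypothetical third parties
EMAIL_PATTERN = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def redact(text: str) -> str:
    for name in THIRD_PARTY_NAMES:
        text = text.replace(name, "[REDACTED - third party]")
    return EMAIL_PATTERN.sub("[REDACTED - email address]", text)

claims_note = "Witness Erika Mustermann (erika@example.org) confirmed the damage report."
print(redact(claims_note))
```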

Companies have a right of refusal if requests are manifestly unfounded. The EDPB emphasises that this can only be assumed in very few cases. Companies may also refuse to grant access if requests are frequently repeated and thus excessive in character.

Companies can find further restrictions on the right of access in Sect. 34 of the German Federal Data Protection Act (BDSG). Among other things, under this provision data subjects do not have a right of access if the data has been stored only because it may not be erased due to legal or statutory retention requirements – especially in the case of personal data found in commercial books, annual financial statements or business letters. The same applies if the data is used exclusively for the purposes of data backup or data protection monitoring, for example backup copies or log files. In both cases, however, providing access must also involve disproportionate effort, which primarily requires weighing the interests of the data subject against those of the controller.

Checklist for requests for access

Companies should take requests for access seriously and process them carefully. Otherwise, they may risk fines or claims for damages under the GDPR. The following checklist will help add structure to the process:

  • Check whether a request is a request for access to personal data under Art. 15 GDPR.
  • Check whether the sender is entitled to obtain access – by checking their identity to determine whether they are a data subject or whether they are entitled to make a request.
  • Find out about the scope of the request – which data does the person want access to? If necessary, you can ask the data subject to specify their request.
  • The right of access extends to processed personal data and further information pursuant to Art. 15(1)(a)–(h) GDPR. The EDPB generally considers it to be possible to use existing text taken from the controller’s privacy notice to convey the information from Art. 15(1)(a)–(h) GDPR.
  • The EDPB takes the view that an automated process should be set up for granting access; especially with large numbers of requests, this relieves the workload of company staff. It is advisable to create a list documenting the date of the request, the requesting person and their contact details, the date and nature of the processing concerned, and the member of staff responsible within the company (a minimal sketch of such a request log follows this checklist). Alternatively, the EDPB recommends setting up a self-service tool that processes requests automatically.
  • Consider the interests of other persons and weigh up whether it is still possible to provide information. For example, redacting certain information could be sufficient.
  • Check whether the request is manifestly unfounded or excessive. If so, you can charge a reasonable fee for granting access or refuse the request altogether.
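
A minimal sketch of such a request log, using the fields mentioned in the checklist above plus a simple response deadline; the example entry is invented and the one-month period is only approximated here.

```python
# Minimal sketch of a request log for data subject access requests.
# Assumes the python-dateutil package is installed for the month arithmetic.
from dataclasses import dataclass, field
from datetime import date
from dateutil.relativedelta import relativedelta

@dataclass
class AccessRequest:
    received: date
    requester: str
    contact: str
    scope: str                   # nature of the data/processing the request concerns
    handler: str                 # responsible member of staff
    deadline: date = field(init=False)

    def __post_init__(self):
        self.deadline = self.received + relativedelta(months=1)   # Art. 12(3) GDPR response period

request_log = [
    AccessRequest(date(2024, 3, 4), "Jane Doe", "jane@example.org",
                  "customer account and order history", "privacy team"),
]
print(request_log[0].deadline)   # 2024-04-04
```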

We are happy to support you with all questions regarding EDPB guidelines. Contact us now!

Implementing data security and data protection appropriately within your company is a complex task. Given the large number of data processing operations that are carried out every day at different points and the equally large number of legal rules that have to be observed, it is easy to lose track of exactly what is going on. This is why companies should opt for a comprehensive data protection management system (DPMS) to help them keep track and not risk being fined for legal violations. In this article, we will show you what is most important.

I. DPMS: A brief introduction

A DPMS is a way to centrally manage and take care of all your data protection requirements. It defines processes, determines who is responsible for what, and introduces control mechanisms. The most important processes include: maintaining records of processing activities; handling data subject requests, complaints and data protection incidents; and conducting regular staff training. As for determining who is responsible for what, it is important to clearly separate your different responsibilities at team or department level and to appoint data protection coordinators. In addition, there should always be close cooperation with the person in charge of data protection. Regular review processes and internal audits are recommended for control purposes.




II. Designing a DPMS

In order to be able to define a meaningful structure for your DPMS, it is first important to know what companies are generally required to do under data protection law. Any collection, use, archiving or even erasure of personal data constitutes a processing operation that must comply with the General Data Protection Regulation (GDPR). Besides defining responsibilities, legal bases or documentation practice, there are requirements such as ensuring compliance with the data protection principles of “privacy by design” and “privacy by default”, and special regulations for data transfers to third countries, for processing data on the controller’s behalf, and other scenarios. This all results in a large number of overarching goals that your DPMS should achieve – and on the basis of which your DPMS can be structured. These include in particular:

  • Transparency
  • Purpose limitation
  • Data minimisation
  • Accuracy
  • Storage limitation
  • Integrity and confidentiality
  • Accountability.

The many requirements can cause data protection management within your company to become highly time-consuming and tedious for the staff involved. Given that there are different tasks with different deadlines, and differently structured documents sent out to different departments, the whole affair is very prone to errors. This makes a carefully crafted DPMS all the more important.




III. Key content in a DPMS

One important component of any DPMS is the records of processing activities. The DPMS also contains all the documentation necessary for accountability purposes under Art. 5(2) GDPR and serves as a basis for the monitoring processes that are mandatory under Art. 32 GDPR. In addition, it covers the erasure concept, service provider management, the documentation of technical and organisational measures, data security and data protection impact assessments (DPIAs). Other points to include in your DPMS are processes for handling data subjects’ rights, data protection incidents and requests from authorities, as well as staff training.

Dealing with data subjects’ rights is another central part of a DPMS. The most important thing here is to make sure that you are guaranteeing these rights in the legally prescribed form. This includes informing data subjects in plain language about the processing of their personal data and what specific rights they have, for example the right of access or to have their data erased. In order to answer requests from data subjects in full and on time, companies need to set up an effective system that includes a whole host of factors, such as who is responsible and who to contact, appropriate tools for answering or forwarding requests, as well as erasure and deadline management systems.

Finally, it is crucial to react properly to data breaches. Under the GDPR, breaches must be reported to the supervisory authority within 72 hours at the latest, and in the case of a particularly high risk, also to the data subjects themselves. A complete DPMS therefore also includes processes for dealing with data breaches and reporting obligations. Your staff need to know the potential scenarios in which data breaches may occur and how to report them.
It is not enough simply to draw up such a list once: regularly reviewing and updating it is crucial to keep the DPMS functioning and to comply with accountability requirements under data protection law.

IV. The PDCA cycle

When it comes to implementation, it is advisable to apply a four-phase PDCA cycle (Plan, Do, Check, Act), which is highly suitable for a DPMS and can therefore also be found in the standard data protection model. Its phases are repeated with the aim of continuously learning and improving. The first step should be to obtain an overview of all processes which involve processing personal data. Subsequently, compliance with the data protection requirements is reviewed. It is not only important to record individual data breaches that have already occurred: following a risk-based approach, the respective processes in the company also need to be evaluated with regard to the potential risk of data breaches. In the case of a high risk, you should then use the results obtained and take action to improve the situation even before a breach occurs. For example, measures can be taken to implement data protection principles such as data minimisation or to ensure the fulfilment of data subjects’ rights. It is also important to ensure that the DPMS integrates well with existing systems and processes. This results in the following phases within the PDCA cycle:

  1. Plan: Planning, specification, documentation
  2. Do: Implementation, logging
  3. Check: Review, audit, assessment
  4. Act: Improvement.

V. What else belongs in a DPMS

One of the advantages of a DPMS is that it can employ uniform documentation. This makes it easy to set deadlines and priorities and to send notifications or reminders early on. Through change management, adjustments can be made quickly to different documentation such as the records of processing activities or a DPIA, while the documents remain synchronised and up to date. The DPMS can also be used to carry out comprehensive compliance checks and risk analyses. In addition, a DPMS is a good way to work on data protection as a team: with responsibilities clearly distributed, direct communication channels and uniform documentation, misunderstandings and additional work can be prevented. A digital DPMS is not a must, but it is an advantage: it offers a central database as the basis for all requirements under data protection law, which your staff can access via a user-friendly dashboard according to their responsibilities. In addition, communication within the team can be made easier with task assignment or comment functions.

VI. Conclusion

A DPMS makes it much easier to comply with data protection obligations. Documenting and reviewing the GDPR requirements promotes clarity and helps avoid data breaches while reducing time and effort. A good DPMS also enables you to react quickly to changes. Constantly reviewing the data protection standard and the possibility of flexible adjustments ensure that all measures always remain up to date. Dedicated DPMS software can make things even more straightforward, as it lets you centralise and automate data protection throughout your company. In the long term, this can be a way of improving processes as a whole and not only in the area of data protection. Companies that want to implement a DPMS should be guided by the requirements of the GDPR and use the records of processing activities as a basis for creating a suitable, comprehensive system.

Please contact us if you need advice on setting up a DPMS.

Digitalisation is permeating all areas of life. In the healthcare sector in particular, the eagerness to drive the digital transformation is immense – on both the state and the private side. The market is therefore already full of fitness trackers and health apps, and even health insurers promote the use of their own apps and reward the purchase of fitness wristbands with bonuses. In view of the growing trend towards self-measurement, this development is no surprise. The digitalisation of the health sector has the potential to significantly improve the quality of medical care: more reliable diagnoses through the use of artificial intelligence (AI), better coverage of rural areas through new communication channels, and a drastic reduction in public expenditure by optimising healthcare.

The key question: what should be considered when transmitting health data?

Fitness trackers only develop their full potential in conjunction with a corresponding app that visualises the collected data in a clear and understandable way. In most cases, users must first register and create a profile in order to benefit fully from the app. This profile and the collected personal data are usually transmitted to a central server, where they are stored and continuously synchronised. The requirements placed on such data transmission are all the stricter because the data being processed is regularly health data. In addition, hackers are following the development of the self-optimisation market closely: wearables have become a popular target, with payment, user and health data the main focus. From a technical point of view, data controllers must therefore ensure a higher level of data security. So what should be considered with regard to such transmission?

Legal challenges – stricter requirements in the health sector

1. Combining individual data points can also create a personal reference
The General Data Protection Regulation (GDPR) unquestionably applies when data collected via a fitness wristband or an app (user profile, IP address, etc.) is processed, since such data has a direct personal reference. What is more, particularly sensitive health data is regularly affected: the combination of a fitness wristband and an app can even allow ECGs to be recorded, and users may enter further data in order to receive optimised training recommendations, for example. It must be borne in mind that combining individual data points may allow conclusions to be drawn about the health of the person concerned. The so-called body mass index (BMI), for instance, can be derived by combining additional details such as weight, age and height. Companies are therefore often unaware of exactly how much health data they actually process. Yet a complete picture of the data being processed is necessary in order to fulfil the comprehensive transparency and information requirements under Art. 13 GDPR.
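
A tiny worked example of how quickly such a derived health indicator arises from seemingly harmless inputs:

```python
# Two user-entered values are enough to derive the BMI - already a statement about the user's health.
def bmi(weight_kg: float, height_m: float) -> float:
    return round(weight_kg / height_m ** 2, 1)

print(bmi(82.0, 1.78))   # 25.9
```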

2. Use the data protection impact assessment to your advantage
The common blacklist published by the data protection authorities shows that if measurement data from sensors installed in fitness wristbands or smartphones (heart rate monitors, acceleration sensors, etc.) is stored centrally, a data protection impact assessment (DPIA) regularly has to be carried out in accordance with Art. 35 GDPR. This obligation should not be seen as a chore, but as an opportunity: a DPIA helps companies to evaluate what data they are processing and what they have to bear in mind when transmitting data in order to comply with the GDPR. In this way, data protection and data security challenges can be identified at an early stage of development, preventing later legal complications and penalties.

3. Know the specific legal bases for processing health data
Where health data is processed (and this includes its transmission), data controllers cannot rely on the general legal bases of Article 6 (1) GDPR alone: the processing of health data remains inadmissible even if one of the legal bases of that article applies. Its lawfulness is instead governed by the stricter requirements of Article 9 (2) GDPR. If health data is collected and processed via a fitness tracker or a corresponding app, the data subject's consent must regularly be obtained.

In general, the same requirements apply to consent under Article 9 (2) (a) GDPR as to consent under Article 6 (1) (a) GDPR. However, given the sensitivity of health data and the correspondingly narrow interpretation of exemptions, data controllers should be particularly rigorous when formulating declarations of consent and should inform data subjects fully about the intended processing. In particular, the fact that the data does not remain on the end device but is transmitted to a central server of the company should be highlighted and explained comprehensively.

In addition, if companies intend to process personal data for a purpose that is not necessary for the provision of the actual service, the so-called prohibition of coupling must be observed. If, for example, the data is to be further processed for marketing purposes, this consent may not be combined with the central consent form; it must be obtained separately. The same applies to transfers to third parties, which are common in the field of health apps.

Checklist Consent

  • given voluntarily
  • for specific purposes (blanket consent is inadmissible)
  • given in an informed and unambiguous manner
  • in an understandable and easily accessible form
  • in clear and simple language

4. Choose conscientious data processors
The server operator on the one hand and the provider of the app or fitness tracker on the other are often not identical: server operation is frequently outsourced to a specialised service provider, which is then able to access the personal data. Such a service provider regularly qualifies as a data processor, so a data processing agreement must be concluded. In accordance with Article 28 (1) and (3) (c) GDPR – and in order to avoid reportable data breaches – data controllers must ensure by contract that the service provider also adopts adequate, state-of-the-art safeguards for the protection of the data (technical and organisational measures). This applies especially in the context of highly sensitive health data.

Due to existing or advantageous infrastructures, companies often rely on providers from abroad, such as Amazon Web Services. In doing so, data controllers must ask the following questions: has the European Commission decided that the respective country ensures an adequate level of data protection? Has the provider been certified under the EU-US Privacy Shield? If not, data controllers and their processors must provide appropriate safeguards under Article 46 (1) GDPR, such as standard contractual clauses.

The security of health data plays a key role

Data security is of particular importance in the context of health data processing. Hackers are showing increasing interest in wearables and health apps, yet providers only rarely respond with appropriate safeguards. This deficiency becomes critical when an app is used by hospitals, for example to monitor patients. In this context, the continued availability, confidentiality and integrity of health data are of paramount importance: compromising individual connections and manipulating individual measured values can lead to harmful or even life-threatening misdiagnoses.

In addition, data controllers should not only protect the connection between the end device and the server against so-called “man-in-the-middle” attacks, but also pay attention to the transfer between wearable and app, which is likewise a popular target for hackers. Encrypting the connection (TLS, formerly SSL) should therefore be flanked by default by protection of the transmitted data itself, for example through payload encryption and the hashing or pseudonymisation of identifiers.
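The following Python sketch illustrates the idea only; the endpoint URL, field names and key handling are assumptions for illustration, not a reference implementation. A measurement is prepared for transmission over a certificate-verified TLS connection, with the raw user identifier replaced by a keyed hash so that it never travels with the payload:

```python
import hashlib
import hmac
import json
import urllib.request

# Secret held by the app backend; in practice it would live in a key store,
# never hard-coded (assumption for illustration only).
PEPPER = b"replace-with-a-securely-managed-secret"

def pseudonymous_id(user_id: str) -> str:
    """Keyed hash of the user ID so the raw identifier is not transmitted."""
    return hmac.new(PEPPER, user_id.encode(), hashlib.sha256).hexdigest()

payload = json.dumps({
    "subject": pseudonymous_id("user-4711"),   # hypothetical user ID
    "heart_rate": 62,
    "recorded_at": "2019-05-01T06:30:00Z",
}).encode()

req = urllib.request.Request(
    "https://api.example-tracker.invalid/v1/measurements",  # assumed endpoint
    data=payload,
    headers={"Content-Type": "application/json"},
)

# urllib verifies the server certificate by default, so the transport is
# protected by TLS; the payload itself no longer contains the raw identifier.
# urllib.request.urlopen(req)  # left commented out: the endpoint is fictitious
```

A keyed hash (HMAC) rather than a plain hash is used here so that the identifier cannot simply be recomputed by anyone who guesses likely user IDs.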

Strong data security is also an opportunity to stand out from the competition, especially from a technical point of view.

Reliable anonymisation of health data remains difficult

Whenever possible, personal data should be anonymised. Anonymised data falls outside the scope of the GDPR, which significantly reduces the administrative effort involved in using the data.

With regard to health data, however, successful anonymisation within the meaning of the GDPR (i.e. re-identification being impossible) appears doubtful. A 2013 study, for example, showed that four to five blood glucose or cholesterol readings from a pool of around 60,000 patients were enough to unambiguously identify the individuals concerned.

At the very least, therefore, the pseudonymisation of personal data should be pursued.

Recommendation for action and Conclusion

The data protection and technical challenges associated with the processing of health data are considerable. Companies must therefore take their obligations seriously in order to survive in the market for fitness trackers and health apps in the long term.

To avoid data breaches and life-threatening incidents, data controllers should first of all ensure particularly secure connections between the tracker, the app or smartphone and the server. Transfers to third parties should be avoided where possible; ideally, companies operate their own servers. As for financing, it can make sense to dispense with advertising and instead develop paid apps that stand out through particularly high levels of data security.

The proper handling of enquiries from data subjects benefits any business. A well-practised approach not only ensures compliance but can also optimise and accelerate the entire process. Responding to data subject requests should therefore be a firm component of a good data protection management system within the organisation. But how does one deal with data subjects' enquiries properly? And what does the GDPR require?

Recognising an enquiry from a data subject

First of all, a request from a data subject must be recognised as such. This may sound simple, but it is not always! After all, most data subjects are legal laypersons who are unlikely to use the term “data subject” or to name the specific right they wish to assert. Nor do they have to! It is up to the data controller to work out what the data subject actually wants. If a message contains phrases such as the following, it should be assumed that it constitutes a data subject request and, as a precaution, be treated as such:

  • “I have a question relating to data protection” or “I want to know something about how my data is being handled”
  • Terms such as information, disclosure, blocking, restriction of processing, deletion, right to be forgotten or objection are used
  • “Where did you get my data from?”
  • The person objects to unauthorised, unwanted advertising or spam
  • “Please unsubscribe me from the newsletter”
  • “Please correct the following information about me”
  • The person threatens to involve a lawyer, issue a warning or notify the data protection authority
  • The person asks for the data protection officer

Such phrases may indicate the different types of request, which correspond to the respective data subject rights under the GDPR:

  • The right to information (right of access)
  • The right to erasure
  • The right to restriction of processing
  • The right to rectification
  • The right to data portability
  • The right to object (and to withdraw previously granted consent)

The Response – what should be considered?

Requests can be made by any means – orally, electronically, by telephone or in writing. The response, however, should always be given in writing or, where appropriate, electronically. Electronic responses should be handled with care, as data security during transmission must be ensured. In addition – as required by law – the answer must be precise, transparent, comprehensible and easily accessible to the data subject. We also advise documenting the entire process and all communication in full!

Of course, there is also a deadline to consider. The answer must be given without undue delay and at the latest within one month of receipt of the request. Although an extension of up to three months is possible in exceptional cases, you had better not rely on it – it is only permissible in rare cases.
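Purely as an illustration of how such deadlines might be tracked internally (a sketch, not legal advice; the month-end clamping is a simplification), the one-month and three-month dates can be derived from the date of receipt:

```python
from datetime import date
import calendar

def add_months(d: date, months: int) -> date:
    """Return the same day `months` later, clamped to the last day of the month."""
    month_index = d.month - 1 + months
    year = d.year + month_index // 12
    month = month_index % 12 + 1
    day = min(d.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)

received = date(2019, 1, 31)                 # date the request was received (example)
standard_deadline = add_months(received, 1)  # reply due within one month -> 2019-02-28
extended_deadline = add_months(received, 3)  # absolute maximum in exceptional cases

print(standard_deadline, extended_deadline)
```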

It should also be noted that the data subject must always be informed if their request cannot be met – for example, if the person requests the deletion of their data but statutory retention requirements preclude this.

If the request reaches a processor, the processor does not have to answer it but must forward it to the data controller. The data processing agreement (DPA) should regulate this procedure precisely.

[bluebox]

The Response – the Process

What is the best way to proceed when a request has been received? We recommend the following approximate sequence:

  1. Forward the request to the company's data protection coordinator
  2. Fix the deadline: note the time of receipt of the request
  3. Send an acknowledgment of receipt to the person concerned
  4. Research: search the data sources for the data subject's information
  5. Identification: the data protection coordinator identifies the data subject on the basis of the data sources
  6. The data protection officer can be contacted at any time
  7. Update the database according to the type of request
  8. Send the reply via the data protection coordinator
  9. Documentation of the process by the data protection officer

[/bluebox]

In practice, the fourth step in particular causes difficulties, because the data on the requesting person must first be found. A well-maintained record of processing activities (in accordance with Article 30 GDPR), which clearly lists the data, is enormously helpful here.

The consequences of the request

The consequences of the data subject’s request depend on the specific type of query:

Right to information (right of access): If the data subject requests access, they must be informed about the stored data and provided with the information to the legally defined extent. Specifically, the person has a right to information about the purposes for which their personal data is processed (e.g. e-mail marketing), the categories of personal data processed (e.g. contact details), the recipients of the data (especially outside the EU) and the retention period of the personal data.

Right to rectification: This includes the right to have incorrect data concerning a person corrected and incomplete data completed.

Right to restriction of processing: Here the data controller must ensure that the stored personal data is not processed any further beyond mere storage.

Right to erasure (“right to be forgotten”): If the deletion of personal data is requested, the consequence is self-explanatory: the relevant data must be deleted. In some cases, however, anonymisation may suffice, provided that re-identification of the person is technically and organisationally excluded.

Right to data portability: If a person asserts their right to data portability, they must receive their data from the data controller in a structured, commonly used and machine-readable format, provided the processing is carried out by automated means and is based on consent or necessary for the performance of a contract. Where technically feasible, the data can also be transmitted directly to another data controller.
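As a rough sketch of what “structured, commonly used and machine-readable” can mean in practice (the data and field names below are invented for illustration), an export could be provided as JSON, with CSV as an option for flat parts of the data set:

```python
import csv
import io
import json

# Hypothetical extract of a customer's data, e.g. gathered from the CRM and
# the newsletter tool (field names are assumptions for illustration).
subject_data = {
    "profile": {"name": "Jane Doe", "email": "jane.doe@example.com"},
    "newsletter": {"subscribed": True, "since": "2018-06-01"},
    "orders": [{"id": "A-1001", "date": "2019-02-14", "total_eur": 49.90}],
}

# JSON is one structured, commonly used and machine-readable format ...
with open("data_portability_export.json", "w", encoding="utf-8") as f:
    json.dump(subject_data, f, ensure_ascii=False, indent=2)

# ... CSV is another option for flat parts of the data set, e.g. the orders.
buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=["id", "date", "total_eur"])
writer.writeheader()
writer.writerows(subject_data["orders"])
print(buffer.getvalue())
```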

Right to object: An objection renders data processing that was previously permissible inadmissible for the future. The right may be exercised against processing based on public or legitimate interests. Note that consent, too, can be withdrawn.

Conclusion: Ensure compliance with structure and routine!

In fact, it is not that difficult to deal with such enquiries properly. Knowing how to respond and what to consider for each type of request is half the battle.

However, the devil is in the details!

It always depends on the individual case and on how the answers are structured in terms of language and content; it may even be necessary to deviate from the usual procedure. We therefore recommend entrusting an experienced data protection expert with integrating data subject requests into the company's data protection management system and with establishing a proper and complete record of processing activities.

Please do not hesitate to contact the IT and legal experts of ISiCO Datenschutz GmbH! We can show you how to deal with data subjects' enquiries safely and correctly and help you create your own record of processing activities. We also train your employees in the proper handling of data subjects' rights and would be happy to take on the role of data protection officer in your company!

GDPR myth busters – Part 1

The GDPR has been in effect since 25 May 2018. Before and after that date there was a great deal of activity: hardly any other topic has been discussed and written about as much. Unfortunately, the numerous publications with the allegedly “best” guidance and recommendations have not led to a real understanding among data controllers of what it is all about. On the contrary, there is still widespread confusion about how to implement the GDPR. The best example: the flood of newsletter e-mails before and after 25 May 2018 asking recipients to give their consent again (or for the first time), or simply announcing that no further newsletters would be sent due to a lack of consent. What is true and what is false? Lots of questions, but few satisfying answers. With this series, we want to provide the answers: what myths exist about the GDPR, what is correct, and what can be refuted?

The biggest GDPR myths about consent:

In data protection law, the basic principle is that the processing of personal data is prohibited unless the law or the data subject permits it. This permission by the data subject is called consent, and it is a linchpin of the GDPR. The law places strict requirements on consent. But what exactly does it involve, and is everything you hear about it true? Nine misconceptions about consent under the GDPR:

1. Consent must always be obtained in writing!

As to the form of consent, the GDPR accommodates data controllers. Under Sec. 4a of the former German Federal Data Protection Act (FDPA), for example, written form was prescribed for consent to the processing of personal data. Under the GDPR, written form is no longer required: a “statement or other clear affirmative action” suffices, so in theory even a nod would be enough. Only in theory, however, because data controllers must be able to prove that consent was given effectively (obligation of documentation and proof), and that is difficult with nothing but a nod. Actively ticking a checkbox (opt-in), by contrast, is sufficient.

Caution: A checkbox that is already ticked and has to be unticked (opt-out) does not constitute effective consent.
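The following sketch shows one conceivable way to record an active opt-in so that the obligation of documentation and proof can be met; the record structure and field names are assumptions, not a prescribed format:

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json

@dataclass
class ConsentRecord:
    """Minimal evidence of an actively given opt-in consent."""
    user_id: str    # account the consent belongs to (hypothetical ID scheme)
    purpose: str    # the specific purpose consented to
    wording: str    # exact text shown next to the (unticked) checkbox
    opted_in: bool  # True only if the user actively ticked the box
    timestamp: str  # when the consent was given (UTC, ISO 8601)

def record_consent(user_id: str, purpose: str, wording: str, ticked: bool) -> ConsentRecord:
    if not ticked:
        # A pre-ticked or unticked box is never stored as consent.
        raise ValueError("No active opt-in - consent must not be recorded.")
    return ConsentRecord(
        user_id=user_id,
        purpose=purpose,
        wording=wording,
        opted_in=True,
        timestamp=datetime.now(timezone.utc).isoformat(),
    )

record = record_consent(
    "user-4711",
    "newsletter",  # one consent per purpose
    "Yes, I would like to receive the newsletter.",
    ticked=True,
)
print(json.dumps(asdict(record), indent=2))
```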

2. Consents obtained before the GDPR came into force are invalid!

That is not correct. The European legislator decided in favour of controllers: consents obtained before 25 May 2018 remain in full force and effect. Consents once effectively obtained under data protection law therefore do not need to be obtained again, as long as they also comply with the requirements of the GDPR. Since the previous requirements – in Germany, for example – are very similar to those of the GDPR, a renewed request for consent will not be necessary in most cases.

Caution: There are special requirements for the consent of minors.

3. Minors cannot give effective consent!

It is true that the GDPR sets rigid age limits for effective consent. It stipulates that minors can only effectively consent to the processing of their personal data from the age of 16. Member states may lower this age limit to an absolute minimum of 13 years. For the processing of personal data of persons below the applicable age limit, data controllers need the approval of the person's legal representative.

4. Consents can also be obtained later!

The myth that the timing of obtaining consent does not matter is false. “Consent” is a legal term and means prior agreement; its counterpart, subsequent approval, does not legitimise the processing of personal data.

5. For consent, the double opt-in procedure is mandatory!

First, one thing is certain: opt-out has become obsolete. Consent designed as an opt-in (i.e. ticking a checkbox) must be granted actively. Whether the so-called double opt-in is mandatory for granting consent is questionable. In the double opt-in procedure, the box is ticked first; in addition, a link provided, for example, when subscribing to a newsletter must be clicked to confirm the consent. The advantage of this procedure is that it simplifies proving that consent was granted, which matters for the data controller's obligations of documentation and proof. If this proof can also be provided with a single opt-in, that is enough, so a double opt-in does not necessarily have to be carried out. That said, a double opt-in offers more protection against sanctions – and two is better than one anyway!
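A double opt-in can be implemented with very little machinery. The sketch below is illustrative only (the confirmation URL and the in-memory store are placeholders): it generates a random token for the confirmation link and only treats the subscription as confirmed once that token is redeemed.

```python
import secrets
from typing import Dict, Optional

# In-memory store of pending subscriptions; a real system would persist this
# (the structure is an assumption for illustration only).
pending: Dict[str, str] = {}

def start_double_opt_in(email: str) -> str:
    """Step 1: the user actively ticks the box; a confirmation link with a random token is sent."""
    token = secrets.token_urlsafe(32)
    pending[token] = email
    return "https://newsletter.example.invalid/confirm?token=" + token  # placeholder URL

def confirm(token: str) -> Optional[str]:
    """Step 2: the subscription only counts as confirmed once the link is clicked."""
    return pending.pop(token, None)

link = start_double_opt_in("jane.doe@example.com")
print("Confirmation link (would be e-mailed):", link)

# Simulating the click on the link:
print("Confirmed subscription for:", confirm(link.rsplit("token=", 1)[1]))
```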

6. One consent for all data processing is sufficient!

The idea that one blanket consent is sufficient to legitimise unlimited data processing is grossly wrong. The core principles of consent are voluntariness and purpose limitation: the data subject must be informed precisely of the purpose for which their data will be used and must then decide for themselves which processing operations they wish to agree to.

Caution: Consent is not voluntary if the provision of a service is made conditional on consent to processing that is not necessary for that service (the so-called prohibition of coupling).

Example: A customer orders goods in an online shop. During the ordering process, the customer is informed that the data entered (e.g. e-mail address, postal address, telephone number) may also be used for advertising purposes. In order to continue the ordering process, the customer must consent to the use of their data for advertising purposes. Since this consent is not needed for the provision of the service (shipping the product), the prohibition of coupling applies.

7. Withdrawing consent must work in exactly the same way as granting it!

Once given, consent must be revocable with effect for the future. New in this context is Art. 7 (3) sentence 4 GDPR: withdrawing consent must be as easy as giving it. However, this does not mean that the withdrawal procedure has to be identical to the one used for granting consent. In online retail in particular, consent is usually given once via opt-in before the data processing begins, and there is often no way to untick the box later unless a customer account exists. The key word is simplicity: as long as there is an easy way to withdraw consent, that is sufficient. For newsletters, for example, an unsubscribe link at the end of each e-mail or an opt-out option in the privacy policy will do.

Caution: The data subject must be made clearly and legibly aware of the possibility of withdrawal.

8. I cannot process any data without consent!

Even without consent, data processing is possible. Where neither consent nor a permission under Art. 6 (1) (b) or (c) GDPR applies, processing can, under certain conditions, be based on legitimate business interests, insofar as these outweigh the rights and interests of the data subject (Art. 6 (1) (f) GDPR). According to the recitals of the GDPR, economic interests and direct marketing in particular are expressly recognised as legitimate corporate interests. However, companies must actually carry out a balancing of interests, and the outcome must be in their favour.

9. I need permission for every cookie!

The subject of cookies and consent remains confusing under the new data protection law. What is clear, in any case, is that cookies regularly involve personal data and must therefore be measured against the GDPR. This means that the use of cookies is generally prohibited unless the data subject has consented or one of the permissions of Article 6 GDPR applies. As described above, data processing can be permitted under Art. 6 (1) (f) GDPR if there is a legitimate interest, which includes, among other things, commercial corporate interests (e.g. advertising). It must always be weighed whether the interest in the data processing outweighs the data subject's interest in the protection of their data.

For good reasons, however, the setting of cookies for advertising purposes can be based on an overriding interest in accordance with Art. 6 (1) (f) GDPR, as long as the data is collected in pseudonymous form. Pseudonymisation accommodates the users' legitimate interests, so that in effect one arrives back at an opt-out option. Even in this constellation, however, it is advisable to use a cookie banner informing website users about the cookies set and their purposes.

Caution: The person affected must always be given the opportunity to object and must be clearly informed about this opportunity.

(The following refers in particular, but not exclusively, to German legislation.)

On 23 December 2018, German State Minister for Digitisation Dorothee Bär gave an interview to the German newspaper Welt am Sonntag in which she argued that digitisation in the healthcare sector could be advanced, among other things, by easing data protection regulations.

“In Germany we have the strictest data protection laws worldwide and the highest privacy requirements. This blocks many developments in the health sector, so we have to disarm at one point or another, delete some rules and loosen others” (see the interview). In particular, she was referring to the planned introduction of the electronic health record by 2021.

Regardless of whether there actually is a causal connection between data protection and slowed digitisation, the question arises whether the constraints of German data protection regulations can simply be “eased” – or even have to be.

The General Data Protection Regulation governs the processing of health data in various places and must therefore be taken into account in all matters of digitisation in the health sector. This article provides an overview of the data protection requirements of the GDPR in the health sector and highlights some areas in which member states can adopt their own national regulations.

Background: Regulations on health care within the GDPR

The GDPR does not regulate the health system explicitly. Originally, an Article 81 “Processing of personal data for health purposes” was envisaged, but it was dropped in the discussions leading to the final version. Nevertheless, the GDPR contains provisions that specifically relate to health data and thus have a significant impact on digitisation within the healthcare sector.

Under the GDPR, the processing of personal data is generally prohibited but permitted, for example, under Article 6 (1) (a) if the data subject consents to the processing. Among other legal bases, processing is also permissible if a balancing of interests comes out in favour of the processing (for instance, in view of a company's interests). For health data, however, Article 9 GDPR provides special rules: health data is considered particularly worthy of protection because it is particularly sensitive. Its processing is therefore only permissible under the strict requirements of Article 9 (2) GDPR in conjunction with Article 6 GDPR.

The scope of application of Article 9 (2) GDPR

Businesses are urgently advised to monitor the legislation of the EU and the member states closely, as Article 9 (2) (a) GDPR allows the respective legislators to stipulate that, in certain areas, the prohibition on processing health data cannot be lifted by the data subject's consent. If legislators make use of this option, the processing of the health data concerned would be unlawful despite consent, since Article 9 GDPR does not provide for a contractual basis for processing.

Until such a national rule exists, however, consent that meets the requirements of Article 9 GDPR can serve as the legal basis for the processing of health data. Such consent becomes particularly relevant in connection with the use of health apps.

Health Apps

Health apps, such as heart rate monitors or sports trackers, are an important use case for the processing of health data in the course of digitisation. First of all, Article 9 (2) (a) GDPR permits the processing of personal data on the basis of the data subject's consent. The extensive information and transparency obligations of the GDPR must be observed: the data subject must know at all times what happens to their data. Without their consent, health data may not be disclosed to third parties (such as insurance companies).

Article 9 (2) GDPR contains a number of other permissions on which health apps can rely. One example is the protection of vital interests where the data subject is no longer able to give consent (for example, when an app reports that the person is having a heart attack). For this permission to apply, however, there must be an actual and specific threat to a vital interest; the permanent tracking of data that might (eventually) affect a vital interest is therefore not covered. Likewise, data manifestly made public by the data subject beforehand (such as posting a lower resting pulse achieved through regular training) falls under a permission.

In addition, the protection of individual and public health can serve as a legal basis. Where an individual's health requires certain illnesses and symptoms to be treated professionally, this may also justify drawing on the data needed from the app. Public health mainly covers the spread of health threats, such as epidemics, which an app can help to track. These justifications for operating health apps lead to the related question of whether the electronic medical record is admissible under the GDPR. In any case, the GDPR provides sufficient legal bases for such apps, so a loosening of German national data protection rules does not appear necessary in this respect.

The Electronic Medical Record

Both Article 9 (2) (h) (individual health) and Article 9 (2) (i) (public health) enable member states to determine when the processing of health data is necessary for these two purposes. These so-called opening clauses give member states considerable flexibility to adopt their own national regulations without violating the GDPR. In this context, corresponding provisions could be introduced in the German Social Security Code V (SGB V) that allow an electronic patient record not only in the form of a health card but also as an app on the smartphone. Although the opening clauses of the GDPR and the SGB V thus make it possible to introduce the electronic patient record, its use in each individual case must still depend on the patient's consent (or on one of the other permissions of Article 9 (2) GDPR). Insofar as State Minister Bär spoke of relaxing data protection standards in order to introduce and use the electronic medical record, there are considerable doubts as to whether the use of the patient record would be compatible with the GDPR without the patient's consent and outside the provisions of Article 9 (2) GDPR.

The current reform proposal for Section 305 (1) SGB V expressly stipulates such a consent requirement for the use of the electronic patient record. However, it remains uncertain whether “providers of electronic medical records” could undermine the consent requirement under the proposed version, since its current wording allows easier access for such providers. This would probably constitute a violation of the GDPR's consent requirement.

Big Data

Big data analyses of health data are used in a variety of applications. Such analyses can, for example, help prevent the spread of diseases and optimise the needs-based supply of medicines and medical devices. For both reasons they may be based on Article 9 (2) (i) GDPR (protection of public health and ensuring the quality of medicinal products and medical devices). However, data subject rights – such as the right of withdrawal, which is often difficult to implement in big data analyses – must always be considered. To meet these challenges, the principles of pseudonymisation and anonymisation must be taken into account. This also applies if member states decide to make use of the opening clauses that are available in this context as well.

Conclusion and Recommendation for Action

Under the GDPR, health data is particularly sensitive and therefore particularly worthy of protection. Regardless of whether such data is processed on the basis of national law or of the GDPR itself, a legal basis for the processing must always be ensured. If the legal requirements are met in each individual case and their conformity with the GDPR has been verified, the processing of health data in new digital applications is admissible.

The possible fields of application of Artificial Intelligence (AI) are quite diverse. The General Data Protection Regulation (GDPR) is often cited as a showstopper – with the following article, we want to demonstrate that this is not the case. To this end, we look at very specific fields of application for Artificial Intelligence that are already a reality in numerous companies and institutions – “despite” the GDPR. The examples illustrate a selection of data protection challenges that must be overcome in order to deploy the respective AI with confidence.

Chatbots and digital assistants as the first point of contact – choosing the appropriate legal basis without risking a conversion killer

Customer service is increasingly being supplemented by digital assistants that proactively recognise users' problems and needs (so-called “predictive analytics”) and propose appropriate solutions. Prominent examples of digital assistants are Alexa (Amazon), Siri (Apple) and Google Assistant (Google).

For a chatbot to take the step from so-called machine learning to deep learning, large amounts of user data must first be collected and structured or otherwise prepared for use by an AI. Companies regularly collect the required data via the input field of the chat window. However, immediately linking the user's input with the IP address of their end device creates a personal reference that opens up the scope of application of the General Data Protection Regulation (GDPR). The processing of personal data in turn requires a legal basis. In the case of chatbots and digital assistants, declarations of consent are conceivable, but their implementation involves considerable effort: before the chatbot can provide any assistance, it must first obtain the consent of the requesting party – and a person seeking help does not want to wade through data protection information before receiving the solution to their problem.
Since the chatbot regularly becomes active only on the initiative of the person seeking advice, companies may – with the right structuring – rely on other legal bases instead. The chatbot can then begin its actual work immediately, because the advice seeker does not have to be informed in full beforehand. In the case of chatbots, data controllers usually fulfil their transparency and information obligations via the privacy policy and a corresponding notice.
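Independently of the chosen legal basis, the personal reference created by the IP address can be weakened before chat logs are stored. The following Python sketch shows one conceivable risk-minimising measure (an illustration of the general idea, not something prescribed here): the host part of the client IP is zeroed out before the event is written to the training data store.

```python
import ipaddress

def truncate_ip(raw_ip: str) -> str:
    """Coarsen the client IP before storing chat logs: zero the host part
    (last octet for IPv4, last 80 bits for IPv6) so that individual users
    are much harder to single out."""
    ip = ipaddress.ip_address(raw_ip)
    prefix = 24 if ip.version == 4 else 48
    network = ipaddress.ip_network(f"{raw_ip}/{prefix}", strict=False)
    return str(network.network_address)

# Hypothetical chat event as it might be written to the training data store.
event = {
    "client_ip": truncate_ip("203.0.113.42"),   # stored as 203.0.113.0
    "message": "How do I reset my password?",
}
print(event)
```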

More reliable diagnoses in the health sector – take protective measures at an early stage

Artificial Intelligence has the potential to bring significant progress, if not breakthroughs, especially in the health sector. In the field of imaging techniques, AI can already greatly simplify and complement the work of physicians today.
The basic requirement for the use of AI is always that the information to be analysed is already digitised: CT scans, electrocardiograms, close-up images. In healthcare in particular, this requirement is met in many areas.
The problem, however, is that health data – particularly sensitive data – is processed. First, large amounts of reference data are needed to train the AI. Then, in order to detect irregularities and make a diagnosis, the AI must match the patient's information (e.g. a CT scan) against the reference data. In both cases personal data is involved, since the data comes directly from a natural person.

For the processing of health data, the GDPR imposes special restrictions and requires appropriate safeguards. The focus should be on pseudonymisation or, at best, anonymisation of the data; anonymisation would take the processing out of the scope of the GDPR altogether. Where this is not possible, institutions and companies should concentrate on recognised pseudonymisation techniques, as the GDPR fundamentally rewards any kind of risk-minimising measure. For long-term studies or similar projects, the additional use of a “trust centre” is recommended: the pseudonymised data can then only be re-identified by involving the trust centre, i.e. a third party that securely stores the assignment rule.
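The trust-centre approach can be pictured as follows. In this simplified Python sketch (class and field names are invented for illustration), the research or AI side only ever sees random pseudonyms together with the medical payload, while the assignment rule is held exclusively by the trust centre:

```python
import secrets
from typing import Dict

class TrustCentre:
    """Third party that alone stores the assignment rule (pseudonym -> patient ID)."""

    def __init__(self) -> None:
        self._assignment: Dict[str, str] = {}

    def pseudonymise(self, patient_id: str) -> str:
        token = secrets.token_hex(8)
        self._assignment[token] = patient_id
        return token

    def resolve(self, token: str) -> str:
        # Only the trust centre can re-identify, e.g. to inform a patient of a finding.
        return self._assignment[token]

trust_centre = TrustCentre()

# The study/AI side only ever sees pseudonyms plus the medical payload.
record = {"subject": trust_centre.pseudonymise("patient-0815"), "hba1c": 5.4}
print(record)

# Re-identification requires involving the trust centre.
print(trust_centre.resolve(record["subject"]))
```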
Companies and institutions should lay the technical foundation for the protection of personal data – so-called “privacy by design” – during the development stage. A data protection impact assessment (DPIA) can be helpful here. Due to the potentially high risks regularly associated with processing health data, the obligation to carry out a DPIA applies particularly often in the health sector. With the help of a DPIA, specific risks can be identified and targeted protective measures taken to ensure the security of the data.
Apart from that, data controllers should above all comply with the extensive transparency obligations of the GDPR whenever AI is applied.

More cost-effective results through automated decision-making – use exemptions

Another field of application for Artificial Intelligence is automated decision-making. This form of data processing already affects our lives today, especially in bank lending. It is problematic, however, that in many cases the decisions of self-learning algorithms cannot be fully understood because of their black-box character.
One possible consequence of an incomprehensible, fully automated individual decision is the exclusion or even discrimination of certain persons or groups of persons. To protect data subjects against such consequences, the GDPR requires that a decision with legal effect must not be made by an algorithm alone. A person must be involved who ultimately takes the decision; the output of the so-called ADM system (ADM = algorithmic decision-making) may then only serve as an aid to decision-making.
Companies that do not want to forgo cost savings – and in some cases even more neutral decisions – can still use AI in a way that does not require a “human in the loop”: once again, obtaining the data subject's consent is conceivable, with all its disadvantages.

However, the use of automated decision-making is considerably simpler if the data processing is necessary for entering into, or the performance of, a contract between the data subject and the data controller. If automated decision-making is contractually agreed, its use is necessary in this sense. The same applies to compliance with a legal obligation: in Germany, for example, provisions of the German Banking Act and the German Civil Code require a creditworthiness check for (real estate) consumer loan agreements, based on information such as income, expenses and other relevant circumstances.

A fully automated credit check should also qualify as necessary for other contracts in which the solvency of the counterparty is a decisive criterion. Ultimately, it always depends on the specific case.

AI and data privacy are not mutually exclusive

The GDPR often provides data controllers with a choice of legal bases on which to base their processing activities. The advantages and disadvantages of each legal basis should be weighed carefully against one another. Any obligation to carry out a data protection impact assessment, as is often the case in the health sector, should not be seen as an unavoidable evil but as an opportunity.
The examples above show that the GDPR and its numerous requirements on the handling of personal data by no means prevent the use of Artificial Intelligence. Anyone who thinks early on – ideally during the development phase – about how a concrete AI implementation can be brought into line with the GDPR will also benefit in the long term: unnecessary follow-up costs caused by later adaptations are avoided, and the risk of reputational damage due to a lack of data security or a data protection incident can be contained effectively.

You need advice on data protection? Please do not hesitate to contact us!

True to the motto “the future is now”, artificial intelligence is regarded as the key technology of the future, yet it has long since established itself in the present all over the world. More and more companies are discovering the advantages of machine learning, deep learning and big data for themselves: processes can be accelerated, optimised and made more profitable. This is made possible by algorithm-based automated data processing that takes the place of human decisions and evaluations.

Because of its stringent requirements, the GDPR is often called an obstacle to development. In fact, it regulates precisely the kind of data processing that is relevant to the application of artificial intelligence. This is where the instruments of anonymisation and pseudonymisation become relevant: they can overcome the challenges the GDPR poses for AI companies and minimise compliance effort.

However, only AI companies that use anonymisation and pseudonymisation in a legally compliant manner can benefit from this advantage. Below, we explain step by step the challenges you may encounter when applying anonymisation and pseudonymisation, the benefits these tools offer, and what you should bear in mind when using them.

Operational challenges in implementing GDPR requirements for automated processes

The big legal question is whether the GDPR applies to the data processed by an AI company at all. The GDPR does not apply where no personal data is involved; in such cases, an AI company can work with anonymisation. The GDPR is, in particular, not applicable to anonymised data, so entrepreneurs in the AI sector can avoid the data protection requirements altogether.

However, if personal data is involved and the company has to deal with the data protection requirements of the GDPR, two steps follow:

Step 1: Is the data processing legally justified? This is the case, for example, if the data subject has consented to the processing of their personal data.

Step 2: The operational implementation of the GDPR. What requirements does the GDPR impose on the specific data processing, and how can these be implemented?

At this point, AI users encounter practical challenges both in justifying the data processing (step 1) and in implementing the requirements of the GDPR (step 2), for example in terms of personnel and financial burdens. Pseudonymising the data provides considerable relief here, since pseudonymisation minimises the risk of data protection incidents and is therefore privileged under the GDPR.

Anonymise data – avoid the GDPR!

The advantage of anonymisation is obvious: the GDPR does not apply to anonymised data, so an AI company can process such data freely. From a data protection point of view, nothing then stands in the way of using Artificial Intelligence. In many cases, however, AI users need the personal reference, so that removing it would be disadvantageous for them. Anonymisation is then out of the question.

Exploit the balancing of interests for the benefit of the AI user!

Pseudonymisation has a simplifying function for AI users. It plays a role in step 1, the justification of the processing, because data can be processed not only on the basis of consent.

One possible justification is processing on the basis of the data controller's overriding interests. Since these interests must be weighed against those of the data subject, pseudonymisation has the appealing effect for AI companies that the balancing of interests tends to come out in their favour: pseudonymised data is better protected than non-pseudonymised data, which makes it easier for the processing to be lawful.

Eliminate data subject requests by means of pseudonymisation!

Furthermore, pseudonymisation has the advantage for AI companies that certain data subject requests may lapse. Implementing the requirements that the GDPR imposes on data controllers in the context of data subject rights (step 2) becomes easier, because the time-consuming processing of data subjects' enquiries – and the associated costs and workload – can be avoided. Under Art. 11 (2) GDPR, data subjects can no longer exercise the rights in question if the data controller is unable to identify them; this can include pseudonymised data.

AI entrepreneurs face the challenges described above primarily because of the principle of transparency: the person concerned must always be informed comprehensibly about the use of their data in AI systems. If artificially intelligent systems replace human decisions, the decision-making process must be explainable.

In addition, the data controller must demonstrate overall compliance: they are obliged to provide evidence that the legal requirements of the GDPR are being met (accountability). The processing must therefore be capable of being presented comprehensibly not only to the data subjects but, in case of doubt, also to business partners and authorities.

Where automated processes are used, these requirements lead to a general compliance problem, since automated processes are harder for the data controller to track and verify. Particularly in deep learning, AI users often do not even know how their system will evolve, because they are working with changing, self-adapting models. Depending on the models selected (keyword: black box), fulfilling the transparency obligations and information rights requires considerable documentation effort.

This problem is reflected in specific requirements of the GDPR, which grants special rights to persons affected by automated decisions (Art. 13–15 GDPR). In such cases, the data subject also has a right of access and a right to information about the logic involved as well as the scope and intended effects of the automated processing (Art. 13 (2) (f), Art. 15 (1) (h) GDPR). Higher costs, more staff and greater compliance effort for handling data subjects' enquiries are the consequence of not pseudonymising data.

Risk minimisation through pseudonymisation – interaction of the compliance areas

Pseudonymised data is also advantageous in the context of the data protection impact assessment. The idea behind the DPIA is a risk analysis and evaluation of the data processing, carried out by the data controller. It is not always required, but it is mandatory in particular for profiling and other types of automated decisions that are frequently used in the context of AI.

Moreover, the data protection impact assessment is not a one-off exercise: the development and production processes must be monitored continuously in order to detect new risks and meet the accountability requirements. This is often difficult given the agile development of AI systems and requires good communication between developers and legal advisers, who can provide legal guidelines for development. The implementation of the DPIA thus interacts with other compliance areas.

If the data is pseudonymised, however, the risk to be assessed is lower than it would otherwise be, and consulting the supervisory authority – as would be necessary in the case of a high residual risk – may not be required. Pseudonymisation often even removes the obligation to carry out a data protection impact assessment altogether, and it makes it easier for the data controller to meet the accountability obligations. It must take place before the data processing begins, that is, while the AI is still being built.

What are anonymised and pseudonymised data?

Anonymous data is the flip side of personal data: data is either personal or anonymous, and the GDPR does not apply to anonymous data. Anonymous data can arise in two constellations. Either the data is anonymous from the outset, simply because it does not refer to an identified or identifiable person, or personal data was initially available and was subsequently anonymised.

In contrast to anonymised data, pseudonymised data is still personal data; pseudonymisation is merely a special form of processing it. The scope of application of the GDPR therefore remains open, and the processing of pseudonymous data is, in principle, prohibited unless a legal basis applies.

Pseudonymous data is information that can only be attributed to a person by accessing additional information that is kept separately and protected (see Article 4 (5) GDPR). In this way, the direct assignment of data to a person is no longer possible. The difference to anonymisation is that the information enabling identification is not deleted but stored separately. The guiding principle is functional separation: it must be technically ensured that users of the pseudonymised data have no access to the separately kept information, so that the person cannot be identified.

In order to exploit the advantages of anonymisation and pseudonymisation to the full, AI companies should consider these procedures at an early stage of development. One should start with the raw data, so that, for example, addresses can already be masked in the course of pseudonymisation. Care should be taken to ensure that no non-pseudonymised personal information is fed into machine learning. Only in this way can the advantages of anonymisation and pseudonymisation be realised and privacy by design be implemented effectively.

Conclusion: Anonymisation and pseudonymisation are effective means of solving data protection problems

The GDPR poses challenges for the application of algorithm-based artificial intelligence, but it does not stop innovation. With the instruments of anonymisation and pseudonymisation, it provides users with procedures they can employ to mitigate these challenges. This allows AI users to harness big data, to apply machine learning and deep learning systems in accordance with data protection law, and to avoid rising costs, workloads and legal violations.

However, only AI companies that use anonymisation and pseudonymisation in accordance with the legal requirements benefit from this advantage. If data has not been properly anonymised, the GDPR remains applicable, and a company that is unaware of this and does not process the data in accordance with the GDPR exposes itself to sanctions. We therefore recommend obtaining appropriate legal advice on this matter.

We can help you create an anonymisation concept. Account must be taken of all the means that might reasonably be used by the data-controlling company or by other persons to identify a person; identification must be made irreversibly impossible, and the cost and time that identification would require must also be considered. The law does not prescribe specific techniques for anonymisation; the deletion or generalisation of identifying features are examples of measures that may come into question.
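As a simple illustration of deletion and generalisation (whether the result actually meets the GDPR threshold for anonymisation depends on the remaining re-identification risk in the concrete data set), direct identifiers can be dropped and quasi-identifiers coarsened:

```python
from typing import Dict, List

def anonymise(records: List[Dict]) -> List[Dict]:
    """Drop direct identifiers and generalise quasi-identifiers:
    the exact age becomes a ten-year band, the postcode is shortened to its region."""
    out = []
    for r in records:
        decade = (r["age"] // 10) * 10
        out.append({
            "age_band": f"{decade}-{decade + 9}",
            "region": r["postcode"][:2] + "xxx",
            "cholesterol": r["cholesterol"],
        })
    return out

# Invented sample records for illustration only.
patients = [
    {"name": "Jane Doe", "age": 47, "postcode": "10115", "cholesterol": 212},
    {"name": "John Roe", "age": 52, "postcode": "80331", "cholesterol": 198},
]
print(anonymise(patients))
```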

In order to apply pseudonymisation successfully and reap its benefits in full, companies must ensure that the type of pseudonymisation they use complies with the requirements of the GDPR. Guidance on pseudonymisation solutions is now available in a white paper issued by the Ministry of the Interior, which entrepreneurs and their legal advisers can follow. One possibility is, for example, for the data subject to choose a user ID or to be assigned one by a third party. The data controller can also assign a pseudonym to a data subject by means of a code number if the controller knows the person's identity.

Should you need advice or have questions, ISiCO Datenschutz GmbH is your ideal contact. Together with our affiliated law firm Schürmann Rosenthal Dreyer Rechtsanwälte, we work successfully in the field of data protection law. By implementing the requirements of the GDPR in a legally compliant way, you build trust with potential customers, who will choose you over the competition – and you set a benchmark among European AI companies. We know what matters; you can count on our expertise!

Please also read our article on “Artificial Intelligence and Data Protection: application cases”!

You need advice on your AI project? Trust in the expertise of our ISiCO consultants. Please contact us directly.