Data Protection update - July 2021

Welcome to our Data Protection bulletin, covering the key developments in data protection law from July 2021.

Data protection

Cyber security

Civil litigation

Data protection

The EDPB has adopted guidelines on the concepts of controller and processor under the GDPR

At its 51st plenary session on 7 July 2021, the European Data Protection Board ("EDPB") adopted the final version of its new Guidelines on the concepts of "controller" and "processor" under the GDPR. It is important that the meaning of these concepts, as well as any associated terms, is understood clearly by any entity that deals with data throughout the European Union ("EU") and the European Economic Area ("EEA"). Whether an actor is a "controller" or a "processor" depends on the actual activities it undertakes, rather than how it is described. Our key takeaways are as follows:

  • Controllership may be defined by law or may derive from a factual analysis of the underlying circumstances of the case. While it is likely that the terms of a contract will assist in identifying the controller, where the factual circumstances suggest otherwise, such terms may not be conclusive.
  • An entity may be classified as a controller regardless of whether or not it has access to the personal data being processed.
  • Joint controllership exists where the processing would not occur without the participation of multiple entities. Where this is not the case, data exchanged between the parties should be considered a transmission between separate controllers.
  • The processor is always a separate entity to the controller and must process data in accordance with the controller's instructions. However, the processor is able to determine the practical implementation of these instructions, by choosing suitable technical and organisational means.
  • A controller must only use processors which have sufficient capability to implement the necessary technical and organisational measures for the processing, and a formal contract should be entered into between the two entities.
  • Joint controllers should agree on how the responsibilities for GDPR compliance will be allocated between themselves. While the format for this is not prescribed by the GDPR, the EDPB recommends that joint controllers should enter into a binding contract to govern their arrangement. In any event however, and regardless of the terms of any arrangement, data subjects will be able to exercise their rights against any of the joint controllers.

The EDPB has adopted guidelines on codes of conduct as tools for transfers

During the same plenary session mentioned above, the EDPB adopted its Guidelines on codes of conduct as tools for transfers (the "CoC Guidelines"). It is intended that the CoC Guidelines will clarify the application of Articles 40(3) and 46(2)(e) of the GDPR, which refer to "approved codes of conduct" as appropriate safeguards for the international transfer of personal data to third countries.

A code of conduct must go through a two-stage approval process before being formally adopted – first it must be approved by a competent supervisory authority within the EEA; then it must be granted general validity by the European Commission (the "Commission"). The CoC Guidelines are therefore intended to act as a guide, not only for supervisory authorities and the Commission, which may refer to them when performing assessments of codes, but also for code authors, who may refer to the checklist to ensure that formal requirements and thresholds are met (notably, this checklist takes account of the Schrems II decision). It is also hoped that the CoC Guidelines will ensure a consistent approach across the various supervisory authorities tasked with evaluating and assessing such codes of conduct.

As part of its public consultation, the EDPB has welcomed comments on the guidelines, which can be submitted up until 1 October 2021.

UK indicates its intention to enter data 'adequacy' deals and to publish details of candidate countries

The EU's recognition of the UK's data protection regime as being equivalent to that of the EU GDPR (reported on in our previous update, accessible here) ensured that data flows into the UK from the EU and the EEA could continue uninterrupted.

Now, the UK government has indicated its intention to enter into data adequacy deals with the "fastest growing economies". These trade deals will reportedly seek to promote innovation and trade through the free flow of personal data from the UK to jurisdictions worldwide. The government has stated that it will attempt to use this opportunity to reduce the obligations on organisations which use data for the purposes of tackling global issues, such as climate change and disease prevention.

However, the government will also seek to ensure that personal data is protected to the usual high standard. Secretary of State for Digital, Culture, Media and Sport ("DCMS") Oliver Dowden has said that the government will focus on using the power of data to drive innovation and boost the economy, while also ensuring that people's safety and privacy is protected.

It is anticipated that details of the government's plans for new UK data adequacy decisions will be published later this month. John Whittingdale, Minister of State for Media and Data, recently explained (during the 34th Annual International Conference: Resetting Privacy: Winning Trust, Privacy Laws & Business, online, July 5-7, 2021) that the decisions about candidate countries will depend on trade relationships and the strength of their data protection regimes. Whittingdale also indicated that there would be a move to increase the number of jurisdictions that the UK considers to have "adequate" data protection regimes, thereby removing the need for additional safeguards when making transfers to those jurisdictions.

The UK government will need to balance the desire to remove barriers to data flows against the risk that the Commission may reassess the UK's own adequacy decision, if onward transfers from the UK are seen as providing insufficient protection for personal data of European origin.

EU confirms its commitment to international data talks, despite the slow progress

On 28 June 2021, a Commission official said (during the UK-EU-Japan digital policy coordination online event, June 28, 2021) that international talks on data flows were taking "baby steps". Silvia Viceconte, of the Commission's digital department, said the discussions were focused on finding "commonalities" between governments in enabling "trusted" data flows, which may include such tools as adequacy agreements and trade or multilateral arrangements.

Some countries have responded to concerns about the misuse of data by introducing strict laws to restrict data flow from their countries and to oblige companies to store and process data on domestic servers. Such concerns have arisen in particular since the Court of Justice of the European Union's ("CJEU") decision in Schrems II in July 2020, which declared the EU-US Privacy Shield invalid. The Privacy Shield was used by some 5,300 companies for trans-Atlantic data flows and many have responded by attempting to keep their personal data locally and minimise overseas transfers.

Nevertheless, international governments are aware of the importance of the free flow of data to economic development. This has led to international talks, such as at the Organisation for Economic Co-operation and Development ("OECD"), on having common international standards to avoid the need for localisation requirements. Viceconte said (during the same online event mentioned above) that identifying the commonalities is "long, and at times quite conflicting".

The OECD's Committee on the Digital Economy Policy previously warned that requirements imposed by governments to access personal data held by companies could restrict international data flows.

According to Viceconte, the EU has identified the importance of making the talks more inclusive of developing countries.

EDPB resolves WhatsApp fine dispute between Ireland's DPC and EU regulators regarding the adequacy of a €50 million penalty

The EDPB has used the Article 65 process under the GDPR to mediate the dispute between Ireland's Data Protection Commission ("DPC") and a number of EU data regulators regarding the DPC's decision in December 2020 to impose a penalty of €50 million on WhatsApp for sharing user data with Facebook.

The proposed fine reportedly ranged from €30 million to €50 million, the upper end equal to the highest fine imposed in Europe so far, but EU regulators considered it insufficient. The parties had been in talks but had thus far failed to reach a compromise, resulting in the EDPB's intervention. The supervisory authorities of each of the 27 member states will vote in mid-August on the individual issues of difference if arbitration has proven unsuccessful.

WhatsApp declined to comment on the use of the Article 65 process; however, it does have the right to be heard under the EDPB's procedural guidelines. Depending on the outcome of this procedure, WhatsApp could be required to change how it handles user data.

Cyber security

Incoming: cross-border regulatory sandboxes

Stephen Almond, the ICO's Director of Technology and Innovation, recently hinted (during the 34th Annual International Conference: Resetting Privacy: Winning Trust, Privacy Laws & Business – 5-7 July 2021) that, following an increase in the number of regulatory sandboxes established across the globe, cross-jurisdictional sandboxes will arrive in the UK "very soon".

A regulatory sandbox is a mechanism which enables businesses to test innovative services and products, such as fintech, in a controlled environment supervised by regulators. Almond further explained that it was "the sort of thing that we're going to need to start getting our heads around, particularly where a country has a very different legislation in relation to this." Of particular note, however, was Almond's indication that the ICO would actively collaborate with other regulatory authorities when designing further experimental tools.

ICO issues enforcement notice against marketing consultancy for mischaracterising itself as a processor

The ICO has issued an enforcement notice against the global email marketing consultancy, Emailmovers Limited ("EML"), for failing to comply with its obligation to process data lawfully, fairly and transparently under Article 5(1)(a) of the UK GDPR. The ICO's investigation revealed that EML had classified itself as a data processor when it was in fact acting as a data controller. This misclassification resulted in EML's failure to identify the lawful basis upon which it processed personal data. The enforcement notice stated that "In response to a request for policies concerning privacy and data protection, EML provided a number of policies. None of those policies addressed the manner in which, and the purposes for which, EML processed personal data provided to it by third parties in business to consumer marketing".

The enforcement notice requires EML to meet the following requirements within three months:

  • notify all data subjects of the information required by Article 14 of the UK GDPR;
  • cease processing personal data where data subjects cannot be provided with notice in compliance with Article 14 of the UK GDPR;
  • cease processing personal data which was purportedly obtained and processed on the basis of consent; and
  • keep appropriate records in relation to consent.

Update on the Luxembourg DPC's proposed fine against Amazon

Last month we reported that the Luxembourg National DPC ("CNPD") proposed to issue a fine of over $425 million to, Inc (see our June bulletin here). It has recently been disclosed in a filing which Amazon made with the SEC that, in fact, the proposed fine is even larger, totalling €746 million ($884.9 million) – nearly 15 times larger than the largest fine issued for breaches of the GDPR to date.

The proposed fine is understood to arise from a complaint filed by La Quadrature du Net, a French privacy advocacy group, in relation to Amazon's targeted system of advertising.

Amazon's stated position on the CNPD's proposed fine is: "[t]he decision relating to how we show customers relevant advertising relies on subjective and untested interpretations of European privacy law, and the proposed fine is entirely out of proportion with even that interpretation."

TikTok receives fine from Dutch DPA

The data protection regulator of the Netherlands ("DDPA") has imposed a €750,000 fine on TikTok following an investigation which began in May 2020. The fine was imposed because the app provided its privacy policy only in English rather than in Dutch. It was therefore deemed not to comply with the GDPR requirement to provide information to data subjects about how the app collects, processes and uses personal data.

The app was investigated because of the DDPA's concerns about the privacy of children, who are treated as an especially vulnerable category of data subjects. During the course of the investigation, TikTok established a headquarters in Ireland and so the DDPA has transferred some of the results of its investigations to the Irish DPC which may take further enforcement action in the future.

TikTok has appealed against the fine.

We previously reported on an investigation into TikTok by the French Data Protection Authority ("DPA") in our September 2020 update here.

COVID-19 testing company receives a fine for inappropriate use of staff WhatsApp groups

The Danish DPA has fined the COVID-19 testing company Medicals Nordic I/S ("MN") €80,700 for a breach of the GDPR. MN created WhatsApp groups for employees of its different COVID-19 testing centres, through which employees transmitted data including the social security numbers and health information of data subjects. There were inadequate security measures to prevent the inappropriate disclosure of personal data, as other members of the WhatsApp groups, including those who had left MN's employment, could continue to access the personal data being transmitted.

In addition to the fine, the Danish DPA has reported MN to the Danish Police Authority for processing confidential information and health information in connection with COVID-19 tests without the necessary security measures having been established.

In our May bulletin we gave some advice for organisations which communicated and ran businesses through WhatsApp.

Italian DPA fines Foodinho for breaches of GDPR for its use of algorithms to manage riders doing food deliveries

The Italian DPA ("Garante") has issued a €2,600,000 fine to Foodinho ("FO") for breaches of various Articles of the GDPR through its use of algorithms to manage riders doing food deliveries. The decision drew particular attention to Article 22 of the GDPR, which protects data subjects against decisions based solely on automated processing, including the right to obtain information about a specific decision, to object to it, and to ask for a human review.

The investigation by Garante found that FO did not provide adequate information to employees about the functioning of the algorithm and did not include safeguards to ensure accuracy, which meant that discriminatory reviews from clients affected riders' ratings. Additionally, FO did not guarantee procedures to protect the rights in Article 22 of the GDPR, even though the algorithm could cause a rider to be excluded from job opportunities.

In its press release, Garante said that FO had not "adequately informed the workers on the functioning of the system and did not guarantee the accuracy and correctness of the results of the algorithmic systems used for the evaluation of the riders". Garante also found that FO did not "guarantee procedures to protect the right to obtain human intervention, express one’s opinion and contest the decisions adopted through the use of the algorithms in question, including the exclusion of a part of the riders from job opportunities".

Civil litigation

High Court provides guidance on claims arising out of data breaches

In Warren v DSG Retail Ltd [2021] EWHC 2168 (QB), Saini J provided some helpful guidance on the nature of the claims available in data breach cases arising out of unauthorised access to systems.

The proceedings arise from a data breach that affected the personal data of 14 million customers of Dixons Carphone. Between July 2017 and April 2018, an attacker exploited weaknesses in its computer systems to install malware on 5,390 tills at Currys PC World and Dixons Travel stores. Private information was being collected for nine months before the attack was uncovered.

The severity and nature of the attack led the ICO to issue a Monetary Penalty Notice requiring Dixons Carphone to pay £500,000, the maximum under the Data Protection Act 1998.

The Claimant pursued a claim seeking £5,000 for alleged misuse of private information ("MPI"), breach of confidence ("BoC"), breach of the Data Protection Act 1998, and negligence, on Dixons Carphone's part.

Dixons Carphone applied to strike out each of these claims, save for that relating to its alleged breach of the Data Protection Act 1998 in failing to have in place “appropriate technical and organisational measures to [prevent] unauthorised or unlawful processing of data”, primarily on the basis that it had not taken positive steps to breach the Claimant's confidence or misuse his private information.

Saini J granted Dixons Carphone's application, finding that:

“[T]he Claimant’s claim is that the DSG failed in alleged duties to provide sufficient security for the Claimant’s data. That is in essence the articulation of some form of data security duty. In my judgment, neither BoC nor MPI impose a data security duty on the holders of information (even if private or confidential). Both are concerned with prohibiting actions by the holder of information which are inconsistent with the obligation of confidence/privacy. Counsel for the Claimant submitted that applying the wrong of MPI on the present facts would be a “development of the law”.”

And, in relation to the Claimant's MPI claim:

“I accept that a ‘misuse’ may include unintentional use, but it still requires a ‘use’: that is, a positive action.”

And, in relation to the Claimant's negligence claim, he concurred with the Court of Appeal's finding in Smeaton v Equifax Ltd [2013] 2 All ER 959 that "there is neither need nor warrant to impose such a duty of care where the statutory duties under the DPA 1998 operate" and, in any event: "[a] cause of action in tort for recovery of damages for negligence is not complete unless and until damage has been suffered by the claimant. Some damage, some harm, or some injury must have been caused by the negligence in order to complete the claimant's cause of action… a state of anxiety produced by some negligent act or omission but falling short of a clinically recognisable psychiatric illness does not constitute damage sufficient to complete a tortious cause of action".

The Claimant's claim for breach of the Data Protection Act 1998 is presently stayed pending the outcome of Dixons Carphone's appeal of the ICO's Monetary Penalty Notice to the First-tier Tribunal.

European Court of Justice delivers ruling on the exercise of non-Lead Supervisory Authorities' powers over cross-border data processing

The European Court of Justice ("ECJ") has delivered its preliminary ruling, in Facebook Ireland Ltd, Facebook Inc, Facebook Belgium BVBA v Gegevensbeschermingsautoriteit (Case C-645/19) EU:C:2021:483, on whether only the lead supervisory authority ("LSA"), as opposed to another supervisory authority, is permitted to bring proceedings in court for alleged infringement of the GDPR regarding cross-border processing.

The ECJ concluded that "it is the responsibility of the LSA, as a general rule, to adopt a decision with respect to the cross-border processing (…), to give notice of that decision to the main establishment (…) of the controller or processor (…), and to inform the other supervisory authorities concerned and the European Data Protection Board of the decision". The ECJ said that the "'one-stop shop' mechanism might be jeopardised if a supervisory authority [other than the LSA] could exercise the power [to bring proceedings in court] other than those where it has competence".

The ECJ noted that there are limited exceptions to this general rule including the urgency procedure, under Article 66 of the GDPR, where a Supervisory Authority has sought assistance from the LSA which has not then provided the requested information. We reported on a decision of the CJEU in relation to the urgency procedure and the response of the Irish DPC (the LSA for Facebook) in our June update here. Another exception relates to a situation under Article 56(2) of the GDPR where a Supervisory Authority, other than the LSA, receives a complaint concerning the processing of personal data if the subject matter relates only to an establishment in its own member state or substantially affects data subjects only in that member state.

This decision is of particular significance given the resolution of the European Parliament (which we reported in our May update) calling for infringement procedures to start against the Irish DPC because of an "insufficient level of enforcement of the GDPR".

Supreme Court of Vienna refers Schrems' dispute with Facebook to the CJEU

The Supreme Court of Vienna ("OGH") has sought a preliminary ruling from the CJEU in relation to certain issues engaged by the proceedings between Max Schrems and Facebook.

Mr Schrems had asked for access to his personal data along with raising concerns about the basis of Facebook's processing of his personal data.

The issues referred to the CJEU are:

  1. Whether the basis for processing the personal data by Facebook is correct. Facebook argues that, after the GDPR came into force, it was entitled to process data subjects' personal data on the basis of contract between data subjects and Facebook (i.e. pursuant to Article 6(1)(b) GDPR: "processing is necessary for the performance of a contract to which the data subject is party or in order to take steps at the request of the data subject prior to entering into a contract") rather than on the basis of the data subjects' consent;
  2. Whether the processing of data on Facebook, and on websites which use Facebook's buttons or advertising, for any purpose is compatible with the requirement of data minimisation under the GDPR; and
  3. Whether the use of sensitive personal data (such as political opinions or health information) for personalised advertising by Facebook is permitted.

In addition, the OGH gave a partial judgment to Mr Schrems, ordering Facebook to pay him €500 because it did not provide him with access to all of his data. The OGH found that "the information provided was incomplete"; rather, the defendant only provided information on the personal data that it itself considered "relevant" for its own purposes. The OGH described the fact that Facebook had provided online tools to enable Mr Schrems to access his own data as "not sufficient". It was insufficient as he would have had to search through at least 60 data categories, containing thousands of data points, requiring several hours of work; a task described as an "Easter egg search".

British Airways settles data breach class action

British Airways ("BA") has settled a UK group litigation claim, the largest class action personal data claim so far in the UK (with over 16,000 claimants), arising from a data breach in 2018. The breach involved the personal data of around 430,000 of BA's customers being compromised by a hacker gaining access via systems used to permit staff and contractors to work remotely. The settlement sum has not been disclosed and the settlement did not include any admission of liability by BA.

The breach had already led to the ICO issuing BA with a Monetary Penalty Notice requiring it to pay £20 million which we analysed here.

Data breach claimants lose bid for anonymity in court

In Various Claimants v Independent Parliamentary Standards Authority [2021] EWHC 2020, the Claimants, over 200 employees and former employees of MPs, sought an order permitting their names and addresses to be withheld from the Claim Form. Their claim against the Independent Parliamentary Standards Authority relates to a data breach which resulted in the unlawful disclosure of their personal data; they argued that including their details on the Claim Form would undermine the purpose of the claim and might create a personal safety risk.

Nicklin J refused the Claimants' application, holding that, if the application were to succeed, it would have the effect of granting anonymity to claimants in "most if not all" data breach claims.

He considered that: (1) the Claimants had "fall[en] a way short of demonstrating a credible risk that if [they] were named (…) they would be exposed to some risk of harm"; and (2) the "civil justice system and the principles of open justice cannot be calibrated upon the risk of irrational actions of a handful of people engaging in what would be likely to amount to criminal behaviour".

However, Nicklin J granted an order restricting access by non-parties to documents on the Court File containing the confidential and private information of the Claimants.