Data Protection update - February 2021

Welcome to our Data Protection bulletin, covering the key developments in data protection law from February 2021.

Data protection

Cyber security

Regulatory enforcement

Civil litigation

Data protection

An adequacy decision on the horizon: the future of the EU-UK data flow

On 19 February 2021, the European Commission issued its anticipated draft adequacy decision for data flows between the EU and the UK which, if adopted, would mean that personal data can continue to freely flow between the EU and UK without the need for additional security measures.

Contrary to speculation, the European Commission did not include special terms and conditions acknowledging the recent Schrems II ruling, but instead confirmed that existing UK law is sufficient without any additional data safeguards.

However, even if this adequacy decision is adopted, the UK and the EU are still subject to separate regulatory regimes. Following the end of the transition period for the UK’s withdrawal from the EU on 1 January 2021, organisations that process data in the EU and the UK are now subject to both the EU GDPR and the UK GDPR. Depending on their operations, businesses may need to take certain action including appointing an EU or UK representative and understanding which supervisory authority will be their lead authority.

The European Commission has reported that the basis for this draft adequacy decision was a careful assessment of the UK's data protection laws. However, despite this draft decision, the European Commission still appears to be concerned about a potential future scenario in which UK legislators weaken the approach to data protection. In fact, the European Commission has suggested that the continued adequacy of the UK's data protection laws would be reliant on the UK's adherence to both Convention 108 of the Council of Europe and the European Convention on Human Rights. The EU is likely to revisit its decision if the UK's alignment with the EU's position changes.

Further, the European Parliament's Civil Liberties Committee (CLC), despite not playing any formal role in the adequacy decision process, has reportedly supported a motion stating that the UK needs to reform its privacy standards before an adequacy decision is finalised and issued. Whether this will impact the adoption of the draft decision remains to be seen.

This draft decision will provide some assurance about the continuing free flow of personal data between the EU and UK. However, the Commission has made it clear that there is a degree of contingency, and so businesses should ensure that they keep up to date with ongoing regulatory changes and potential issues.

Cookie Laws in France: the CNIL encourages compliance whilst Google challenges cookie-related fines

Early in the month, the CNIL sent letters and emails to approximately 300 organisations, including 200 public ones, to remind them of the new cookie guidelines published on 1 October 2020 (the “Guidelines”).

The new Guidelines recommend that user consent be obtained for each individual site (as opposed to group sites) and require cookie tools to offer users the option to refuse cookies with the same simplicity as accepting them. The CNIL also retracted its position on ‘cookie walls’ in the Guidelines, asserting that the lawfulness of cookie walls must be assessed in each case such that there cannot be a single sweeping ban on cookie walls. The Guidelines also softened the conditions around the use of analytics cookies and restricted the ability of businesses to treat silence as acceptance of the cookie settings.

The letters and emails sent by the CNIL this month reminded businesses of the need to audit sites and apps to comply with these Guidelines by 31 March 2021, when the transition period for implementation of the Guidelines ends. The CNIL regularly investigates the cookie compliance of the most used websites in France, and this move to encourage compliance follows an observation that many websites in the public sector continue to be non-compliant with the Guidelines.

This reminder comes in the same month that Google filed a challenge against the EUR 100,000,000 fine imposed by the CNIL in relation to cookie violations, demonstrating the increasing seriousness of non-compliance with cookie regulations.

New data protection tools

During February, various agencies and entities published helpful data protection tools to assist businesses in navigating data protection legislation. We have briefly outlined three of these tools below.

The European Union Agency for Cybersecurity (“ENISA”) released a report on data pseudonymisation for controllers and processors which examines solutions and techniques of pseudonymisation in different example cases, with a specific focus on healthcare scenarios. It also examines the application of basic pseudonymisation techniques in common cybersecurity use cases, such as the use of telemetry solutions. This report concludes that controllers and processors should engage in data pseudonymisation based on a security and data protection risk assessment, taking due account of the overall context and characteristics of personal data processing.

The Information Commissioner’s Office (“ICO”) launched its data analytics tool which highlights some of the key data protection points that organisations need to consider when undertaking any project, for example, when using software to automatically assess patterns in data sets and make certain classifications or predictions about data subjects. This toolkit asks questions about the way in which personal data is processed, particularly in relation to obligations of lawfulness, accountability, governance and ensuring data subject rights. Once the questions are answered and the risk assessed in the context of the processing activities, the ICO’s toolkit will generate a report containing tailored advice for the project.

Lastly, the Department for Digital, Culture, Media and Sport (“DCMS”) has published a prototype trust framework that sets out draft rules and standards for organisations wanting to provide or use digital identity products and services. The framework sets out specific standards and requirements for organisations which provide or use digital identity services including informing users of changes made to their digital identity, having account recovery processes, following guidance on how to choose secure authentication and best practice recommendations for information security and encryption.

EDPS comments on proposals for the digital future

On 10 February 2021, the EDPS published opinions on two of the European Commission’s proposals; one on the Digital Services Act and another on the Digital Markets Act. Both opinions consider the digital future and data protection rights.

The Digital Services Act introduces new rules and responsibilities for businesses providing online intermediary services. In the opinion, the EDPS confirmed his support for the Digital Services Act’s promotion of transparent and safe online environments. The EDPS has also made certain recommendations intended to offer solutions which better protect individuals against targeted advertising and content moderation. These recommendations include additional safeguards such as prohibiting profiling for the purpose of content moderation, introducing minimum interoperability requirements and promoting the development of EU technical standards.

The Digital Markets Act, on the other hand, introduces new rules for platforms such as search engines, social networks, messaging services and online intermediation services in the digital sector. These rules are aimed at preventing these platforms from imposing unfair conditions on businesses and consumers as well as ensuring the openness of digital services. In its opinion on the Digital Markets Act, the EDPS welcomes the proposal to promote fair and open digital markets by regulating gatekeeping online platforms. However, the EDPS highlights that competition, consumer, and data protection laws should be complementary, and so encourages closer co-operation between authorities in order to establish a clear legal basis and structure. The EDPS suggests that the relevant oversight authorities clarify the scope of their individual data portability obligations and take steps to provide for effective anonymisation and re-identification tests.

Cyber security

Cyber-crime on the rise: the recent increase in unprecedented cyber-insurance claims

Allianz, the well-known German insurer, has reported a 950% increase in cyber-insurance claims over the last three years, and warned of the growing threat to businesses from cybercrime.

The insurer has reported that just seven years ago cyber risk ranked low in its surveys of risk experts from over 100 countries. Today, however, cyber risk ranks top of these surveys, a trend that has become increasingly apparent as the ever-evolving risk landscape has been exacerbated during Covid-19. As companies move towards a work-from-home environment, they simultaneously face more ransomware incidents, larger data breaches and more robust regulations.

One recent example of such an incident is the major data breach of the Foxtons Group, in which hackers stole the personal and financial information of the real estate agency’s customers before uploading the data onto dark web forums. The hope is that the onset of attacks during the Covid-19 crisis will encourage an increase in awareness, resulting in more security measures being introduced.

Despite the huge advances companies have made in recent years with regard to their cyber security, the importance of digital assets and their security remains a point of focus in the insurance industry, particularly as the risk of cyber-attacks continues to grow.

Regulatory enforcement

The ICO resumes probe into the adtech industry

In our November 2020 bulletin we reported that the ICO had halted its investigation into the adtech industry. As part of the investigation, which began in 2018, the ICO produced a report, in June 2019, which highlighted a number of concerns with certain aspects of the adtech industry – in particular, the use of Real-Time Bidding (“RTB”). The report highlighted a “lack of maturity” amongst market participants and that “[the ICO] do not think these issues will be addressed without intervention”. The ICO’s decision to pause the investigation in September 2020 therefore came as something of a surprise, and prompted threats of legal action from the Open Rights Group.

On 22 January 2021, the ICO announced that its investigation has now resumed. The statement highlighted that the focus of the ICO’s work in the near future will be the continuation of a series of audits focusing on data management platforms. The spotlight will also be placed on the role of data brokers in the adtech ecosystem, a wholly unsurprising move in view of the enforcement action taken against Experian in October of last year.

The ICO warned organisations operating in the adtech arena that they “should be assessing how they use personal data as a matter of urgency”.

News of the resumption follows shortly after the publication of a blog post by Simon McDougall, the ICO’s Executive Director of Technology and Innovation, which suggested that some adtech participants “have their heads firmly in the sand” and that “[i]t is now clear to us that engagement alone will not address all these issues”. McDougall noted that “a number of justifications for the use of legitimate interests as a lawful basis for the processing of personal data in RTB” were “insufficient”, and that the Data Protection Impact Assessments the ICO had seen were “generally immature”. Given the “lack of maturity” in the industry, the ICO “anticipate that it may be necessary to take formal regulatory action”.

Such action has long been called for by campaigners, including Dr Johnny Ryan of Brave, whose September 2018 complaint, co-authored by Jim Killock and Michael Veale, triggered the ICO’s investigation.

The recent $10 million fine handed to Grindr by the Norwegian Data Protection Authority (“NDPA”), in connection with the unlawful sharing of personal data with third party advertisers, has placed further scrutiny on the role of participants in the adtech ecosystem. Various adtech providers embroiled in the facts underlying the Grindr fine continue to be subject to investigation by the NDPA and it remains to be seen whether they too will face regulatory sanction arising from their use of personal data received from Grindr.

Two firms receive fines totalling £270,000 for making unsolicited marketing calls

Two firms have received fines totalling £270,000 for making unlawful marketing calls to numbers registered with the Telephone Preference Service (“TPS”), a service that allows people to opt out of receiving cold calls. Call Centre Ops was fined £120,000 for making 159,461 unsolicited direct marketing calls, and House Guards of Bournemouth was fined £150,000 for making 699,966 nuisance calls. It is illegal for companies to make marketing calls to phone numbers registered with the TPS. To avoid breaking the law, companies are encouraged to sign up to the TPS and screen their call lists against the register, which the TPS distributes monthly.

Data controller and its data processor fined €225,000 for failing to implement adequate security measures in relation to a credential stuffing attack

The CNIL has fined an anonymous data controller and its processor a total of €225,000. The CNIL found that the controller and its processor had failed to ensure that adequate security measures were in place to protect its customers’ personal data in the event of a credential stuffing attack. The data controller is a website that regularly sells products to several million consumers. The data processor operated the data controller’s website. During the CNIL’s investigation, it was discovered that the data processor had been the victim of a number of credential stuffing attacks. Such attacks allowed unauthorised third parties to gain access to around 40,000 customer accounts between March 2018 and February 2019. The data controller and its processor were found to have breached Article 32 GDPR, which required them to protect the security of the customers’ personal data. In reaching a determination on the sanction to impose, the CNIL’s Restricted Committee found that neither company acted quickly enough to implement measures that would effectively resist credential stuffing attacks.

Civil Litigation

Class actions launched against Facebook and TikTok for data protection failings

A class action has been filed in the High Court against Facebook, relating to claims that the social media platform allowed This Is Your Digital Life, a third-party app, to access the personal information of users, and their friends, without their knowledge or consent between November 2013 and May 2015 (which personal data was ultimately supplied to Cambridge Analytica).

The class representative is British journalist, Peter Jukes. Compensation is sought from Facebook on behalf of a class of around one million affected users in England and Wales. Should the claim succeed, Facebook could face a liability running into the billions.

Separately, a class action was also filed in the High Court against TikTok, by a representative on behalf of all other children under 16 years of age who are, or were, users of TikTok and/or Musical.ly. The claim alleges that TikTok misused private information and processed the personal data in breach of the duties imposed by the GDPR. In a judgment issued shortly before Brexit (the claim having been issued prior to Brexit to secure certain procedural advantages), Warby J approved an application that the class representative's anonymity be preserved.

Court of Appeal ruling provides guidance on the assessment of “appropriate technical and organisational measures” in the context of law enforcement processing

The Court of Appeal recently handed down judgment in M v Chief Constable of Sussex Police [2021] EWCA Civ 42, in which it provided some helpful guidance about what constitutes “appropriate technical and organisational measures” in the context of law enforcement processing.

Lieven J in the High Court had dismissed a judicial review application brought by a teenage girl (“M”). M had argued that Sussex Police’s safeguards for disclosing sensitive personal data to the Brighton & Hove Business Crime Reduction Partnership (“BCRP”), under an Information Sharing Agreement (“ISA”), were unlawful. The key question for the High Court was whether the ISA met the requirements of Part 3 of the Data Protection Act 2018 (“DPA 2018”), which implements the Law Enforcement Directive 2016/680 (“LED”).

The data protection principles under Part 3 DPA 2018 are very similar to those under the GDPR. The sixth principle requires personal data processed for law enforcement purposes to be processed in a manner that uses “appropriate technical or organisational measures”. Lieven J held that “on a holistic assessment”, the ISA, and the overall package of governance controls, did, in fact, provide sufficient safeguards.

The Court of Appeal agreed. The holistic approach employed by Lieven J was the correct one. The LED (and by extension Part 3 DPA 2018), much like the GDPR, “is not prescriptive about the measures that must be taken, so long as they are “appropriate”. It does not attempt to micro-manage how a data controller complies with its requirements”.

The Court of Appeal's decision is also notable as it confirmed that: "the Judge was right, for the reasons that she gave in her supplementary judgment, to make a nominal award in the sum of £500" in respect of unlawful "sensitive processing", which is broadly equivalent to unlawful processing of special category data for the purposes of GDPR.

Upper Tribunal judgment provides guidance on the application of regulation 22 of PECR 2003

The Upper Tribunal recently handed down judgment in Leave.EU and Eldon Insurance Services v Information Commissioner [2021] UKUT 26 (AAC), in which it provided guidance on the Court’s approach to regulation 22 of Privacy and Electronic Communications Regulations 2003 (“PECR”).

This case relates to the inclusion of marketing content in Brexit-related newsletters. Subscribers had consented to receipt of the newsletters, but not to the receipt of marketing materials, which were contained within the newsletters. The ICO issued the two companies – Leave.EU and Eldon Insurance Services – with fines amounting to £105,000 for spam emailing. The Upper Tribunal refused the two companies’ appeals against the ICO’s decision (those appeals having previously been refused by the First-tier Tribunal).

In reaching its decision, the Upper Tribunal held that:

  • “mixed content” emails are caught by regulation 22: the definition of direct marketing under the Data Protection Act 1998 is “self-evidently” broad; and the reference to “unsolicited communications” in regulation 22 may be read as a reference to “unsolicited information” – an email is “simply the vehicle by which the ‘communication’ (which may contain different types of information) is delivered to the subscriber – it is not the ‘communication’ itself”. In determining this issue, the concept of a primary purpose test, as had been proposed by the Appellants, was expressly rejected;
  • the focus should therefore be on the content of an email, as opposed to the stated purpose of that email;
  • it was not the case that the subscribers’ acceptance of a “loosely drafted privacy policy” amounted to the provision of consent to receive such marketing material; and
  • the appellant could not rely on the absence of complaints to the ICO as a defence to its breach.

Further guidance in relation to procedural aspects of class actions arising from data breaches pursued under Group Litigation Orders ("GLOs")

In a recent judgment in Weaver & Ors v British Airways Plc [2021] EWHC 217 (QB) (the BA Group Claim), Saini J provided guidance on two issues of importance from the perspective of group claims under GLOs, and which, depending on the Supreme Court's decision in Lloyd v Google, could conceivably become of considerable importance to the conduct of class actions arising from data breaches more generally:

  • End dates for sign-up to the GLO Group: per paragraphs 38 - 40, an argument by the Claimants that the end date for joining the group without further application be extended until a date one year after trial on liability (a split trial has been ordered) was rejected, and a short extension to the end-date ordered.
  • Advertising costs: per paragraphs 47 – 51, Saini J held that the costs of advertising the claim were irrecoverable from BA, noting that such costs were: “essentially general overheads” and incurred by the Claimants' solicitors in “getting the business in”.

As at the date of the hearing, 22,230 Claimants had been joined as Claimants under the GLO (approximately 5% of the total number of affected data subjects based in the UK). Accordingly, assuming that their claims succeed and each Claimant recovers £1,000 in damages, this exposure would eclipse the fine levied by the ICO arising out of the data breach.

Court of Appeal judgment addresses data protection issues in the context of a disclosure order relating to personal devices

The Court of Appeal recently handed down judgment in Phones4U Ltd v EE Ltd & others [2021] EWCA Civ 116.

Roth J in the High Court had ordered certain of the Defendants to request former senior officers to voluntarily provide third party IT consultants, engaged by the Defendants, with personal devices which may have been used to send and receive work-related communications.

The third issue considered by the Court of Appeal holds the most interest from a data protection perspective: was the mechanism directed by Roth J, in relation to the provision of the devices to the third party IT consultants, appropriate and proportionate?

The Court of Appeal said it was. The order required the IT consultant to limit the information disclosed to the Defendants, to return the devices and to delete any copies. It therefore provided sufficient protection for the non-work-related data stored on the devices. The (seemingly tentative) submission by counsel on behalf of Vodafone, one of the Defendants, that the order breached the GDPR was dismissed: it was “clear that any data processing that is undertaken by the IT consultants will indeed (i) be with the consent of the Custodians as data subjects under article 6.1(a) of the GDPR, and (ii) be necessary for the IT consultant as data controller…to undertake “for compliance with a legal obligation to which the controller is subject” under article 6.1(c)”.

Cyber insurers deny coverage to Experian for losses arising out of data breaches

In Experian PLC v Zurich Insurance PLC and anor (claim number CL-2020-000670), two insurers have defended their refusal to pay Experian's $18 million claim for coverage of its U.S. legal fees in a pair of class actions arising from: (1) inaccurate reporting of US consumers' credit history; and (2) a 2015 data breach. Of particular interest is the insurers' defence that the relevant policies exclude coverage for "wilful and/or reckless acts, errors or omissions falling within the Deliberate Acts Exclusion", which it is said applies in relation to losses caused by the data breach because Experian failed to update software, leaving it vulnerable to hackers. The Court's approach on this issue will be of interest to insurers and policyholders alike, as will its analysis as to whether the Court is willing to make a declaration, as sought by Experian, that the insurers are liable to cover financial penalties that Experian may face arising from the data breach.