Data Protection update - May 2022

Welcome to our data protection bulletin, covering the key developments in data protection law from May 2022.

Data protection

Cyber security 

Enforcement

Civil litigation

Data protection

European Commission publishes FAQs on new standard contractual clauses

The European Commission has published a questions and answers document ("Q&A") on using its two sets of standard contractual clauses ("SCCs"): those governing the relationship between controllers and processors within the European Economic Area ("EEA"); and those for transfers of personal data to countries outside the EEA.

The Q&A is based on feedback from stakeholders and their real-life experiences of using the SCCs, and offers practical guidance on their use. Designed to assist organisations with their compliance programmes, the Q&A sets out 44 questions, with additional information to be added as new issues or questions arise.

While the Commission is not itself a competent supervisory authority, as the author of both sets of SCCs its views are highly relevant. Some key points to note from the Q&A include:

  • Parties using the SCCs "may not include a general exculpation from liability." This suggests that liability caps are prohibited in relation to breaches of the SCCs, not only in relation to data subjects but also as between the contracting parties.
  • On how the SCCs must be executed, the Q&A states that the parties must "enter into a legally binding agreement to abide by" the SCCs and in particular "fill in the annexes to the SCCs and sign Annex I". However, the Q&A also states that there are not "any requirements on how signature should be formalised … [which] is left to national law governing the agreement." This suggests that the SCCs should usually be formalised by signature (which may be by electronic means, where permitted by national law). However, given the emphasis on "signature", it is unclear whether click-through buttons will be acceptable.
  • The Q&A is ambiguous as to whether the SCCs may be incorporated into a commercial contract by reference, as is common practice. It states that the SCCs must be "signed by and binding on all the parties", but also that they may be incorporated into a contract "in accordance with civil law requirements from the chosen jurisdiction". The Q&A also emphasises the importance of making it clear which modules, options and specifications have been chosen. This is not necessarily incompatible with incorporation by reference.
  • The Q&A confirms that the transfer SCCs should not be used where the data importer is itself subject to the GDPR, as the Commission states that this would "duplicate and, in part, deviate from the obligations that already follow directly from the GDPR." The Commission is currently developing an additional set of transfer SCCs for this scenario. The delay in producing these additional SCCs is a cause for concern for parties working to remediate their SCCs by 27 December 2022, as it means another round of remediation will be needed later for importers that are directly subject to the GDPR.
  • Inapplicable modules and options should be deleted from the SCCs. The parties "should only agree the clauses that are relevant for their situation."

By way of reminder, the new SCCs must be used in all new contracts for international data transfers entered into since 27 September 2021, and organisations must transition all existing contracts using the old SCCs to the new SCCs by 27 December 2022.

ICO launches AI and data protection risk toolkit

The Information Commissioner's Office ("ICO") has launched its AI and data protection risk toolkit (the "Toolkit"), which accompanies the ICO's guidance on AI and data protection. The Toolkit is designed to help businesses understand AI-specific risks to individual rights and freedoms and sets out practical steps to mitigate, reduce or manage them. The Toolkit is optional, but offers a way to complement data protection impact assessments where processing using AI is likely to result in a high risk to individuals.

The Toolkit sets out risks for businesses to consider at the different stages of developing AI and provides guidance on the controls to be carried out. Risk is split into ten categories that align with the key principles of the UK GDPR, and the Toolkit groups practical steps into three tiers:

  • mandatory steps, which cover the relevant legal requirements;
  • best practice steps, which are recommended; and
  • good practice steps, which are optional.

For each risk area, the Toolkit allows businesses to summarise their assessment of the risk, describe any practical steps that will be taken to reduce it, and select a residual risk rating of low, medium or high.

The Toolkit offers several benefits. First, it gives organisations greater confidence that their use of AI complies with data protection law. Second, it provides a platform for organisations to identify and understand risks to individuals' rights. Lastly, it provides clarity about the laws that apply to processing using AI, paving the way for organisations to innovate responsibly.

The ICO is encouraging businesses to provide case studies on their use of the Toolkit, which will be used to help guide its future development. The Toolkit is primarily focussed on AI based on machine learning, but the ICO intends to expand its scope in later versions. 

EDPB consults on guidance to address inconsistency in GDPR fines

The European Data Protection Board ("EDPB") is consulting on guidelines on the calculation of administrative fines under the GDPR ("Guidelines"). The draft Guidelines have been issued to ensure that supervisory authorities ("SAs") across the European Union follow the same methodology in calculating data protection-related fines.

The methodology in the Guidelines will be of interest to organisations that may be subject to a fine from an SA. It sets out a five-step process for SAs to follow (sketched after the list below):

  1. Identifying the processing operations: SAs must establish whether there is sanctionable conduct that has led to one or several infringements of data protection law. During this step, SAs must clarify whether any of the infringements are serious enough to attract a fine.
  2. Finding the starting point for further calculation: SAs are then to assess the seriousness of the infringement in order to determine the starting point for calculating the fine. Seriousness is measured as low, medium or high, giving due regard to the nature, duration and gravity of the infringement. SAs should also consider the turnover of the organisation in question with a view to imposing an effective, dissuasive and proportionate fine.
  3. Evaluating aggravating and mitigating circumstances: SAs should evaluate past or present behaviour of the organisation as regards its data protection compliance and increase or decrease the fine accordingly.
  4. Identifying the relevant legal maximums for the different processing operations: SAs must ensure that they do not impose fines that exceed the maximum legal limits prescribed by the GDPR.
  5. Analysing whether the final amount meets the requirements of effectiveness, dissuasiveness and proportionality: SAs should consider whether further adjustments to the fine are necessary, for example a reduction due to an organisation's inability to pay.
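
For illustration, the five steps can be read as a simple calculation pipeline, sketched below in Python. This is not an official formula: the draft Guidelines do not prescribe numeric multipliers, so the seriousness tiers and adjustment mechanics here are placeholder assumptions; only the statutory cap (the higher of €20 million or 4% of worldwide annual turnover for the most serious infringements, per Article 83(5) GDPR) is taken from the legislation.

```python
# Illustrative sketch only: the step structure mirrors the EDPB's draft
# Guidelines, but all multipliers below are invented placeholders.

# Step 2 input: assumed seriousness tiers (the Guidelines use low/medium/high
# but do not fix percentages).
SERIOUSNESS_FACTORS = {"low": 0.05, "medium": 0.15, "high": 0.50}

def calculate_fine(seriousness: str, annual_turnover: float,
                   aggravating: float, mitigating: float) -> float:
    # Step 1 (identifying the sanctionable conduct) precedes any arithmetic:
    # the SA first decides which infringements are serious enough to fine.

    # Step 2: find a starting point from seriousness and turnover.
    starting_point = SERIOUSNESS_FACTORS[seriousness] * annual_turnover

    # Step 3: adjust for aggravating and mitigating circumstances,
    # e.g. past infringements increase the fine, cooperation reduces it.
    adjusted = starting_point * (1 + aggravating - mitigating)

    # Step 4: cap at the legal maximum. For the most serious infringements,
    # Article 83(5) GDPR sets this at the higher of EUR 20m or 4% of
    # worldwide annual turnover.
    legal_maximum = max(20_000_000, 0.04 * annual_turnover)
    fine = min(adjusted, legal_maximum)

    # Step 5: final effectiveness / proportionality check, e.g. a further
    # reduction for a genuine inability to pay (none applied here).
    return fine

# Example: a medium-seriousness infringement by a company with EUR 500m turnover.
print(f"EUR {calculate_fine('medium', 500_000_000, aggravating=0.2, mitigating=0.1):,.0f}")
```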

Although approaches by SAs may continue to differ, the Guidelines offer a standardised approach to enforcement, which is important for legal certainty.

The Guidelines are open for consultation until 27 June 2022, after which a final version is expected to be formally adopted towards the end of 2022.

EU agrees on tougher cybersecurity laws and finalises Data Governance Act

On 13 May 2022, the European Parliament and EU Member States reached a political agreement on a directive on measures for a high common level of cybersecurity across the EU ("NIS2").

NIS2 will expand the scope of the first NIS Directive ("NIS1") in response to the increase in cyberattacks and threats since the original legislation was introduced. NIS2 will cover medium and large entities from more sectors that are critical for the economy and society, such as certain healthcare providers, pharmaceuticals, waste management services, digital service providers and manufacturers of critical products. Expanding the scope of the legislation effectively obliges more entities and sectors to take cybersecurity risk management measures. NIS2 also strengthens cybersecurity requirements, streamlines reporting obligations and creates stricter enforcement requirements. The political agreement is now subject to formal approval by the two co-legislators, with NIS2 expected to come into force by 2024.

Last month, the European Parliament approved the Data Governance Act ("DGA"), which aims to boost data sharing in the EU by giving start-ups and other businesses access to more large-scale data that they can use to develop new products and services. Public sector bodies will be encouraged to share data through 'European data spaces', and new rules will make it easier for organisations and individuals to share data for the benefit of society. The DGA is now awaiting formal adoption by the Council and is expected to come into force in mid-to-late 2023. We covered some of the key obligations under the DGA in our December / January Data Protection Bulletin.

French SA publishes cookie wall guidance

On 16 May 2022, the French SA, the CNIL, published guidance on assessing the legality of cookie walls (the "Guidance").

“Cookie walls” are tools used by websites to require users to accept cookies or other trackers in order to access website content. Under the GDPR, consent is only valid if it is freely given.

In a previous version of its guidelines published in 2019, the CNIL prohibited cookie walls on the basis that they did not allow for freely given consent. However, the French Council of State ruled in June 2020 that the CNIL did not have the legal power to impose a blanket ban on all types of cookie walls, indicating instead that the legality of cookie walls needed to be assessed on a case-by-case basis.

The new Guidance looks at cookie walls in this context and explains that their legality must be assessed by reference to whether users who refuse to consent to tracking are offered a real and fair alternative. The Guidance specifies that one such alternative might be to charge a subscription fee, designed to compensate for the loss of advertising revenue, which must be paid before the site can be accessed (i.e. a paywall). The Guidance makes it clear that no cookies should be placed where the user has opted for paid access (except for cookies that are essential for the website to function correctly) and that the paywall price must be reasonable. The Guidance notes that a 'reasonable' paywall price is to be determined on a case-by-case basis.

To read the Guidance in full (French only), please click here.

ICO publishes materials for best interests of children self-assessment 

When organisations design and develop online services likely to be accessed by a child, standard 1 of the UK Children's Code ("Code") requires the best interests of children to be a primary consideration. Additionally, organisations must consider how using children's data may impact the rights they hold under the United Nations Convention on the Rights of the Child ("UNCRC").

The ICO has created a four-step self-assessment (here) to help organisations comply with the Code and the UNCRC.

  • Understand rights: Organisations need to understand children's rights under the UNCRC.
  • Identify impacts: Organisations should map the demographics of the children that use their services as well as maintaining records of how, why and when these services process children's data. Using the ICO's best interest frameworks, organisations should identify potential impacts their products or services could have on the rights of children.
  • Assess impacts: Organisations should consider the likelihood and severity of potential impacts on the rights of the child through an evidence-based assessment.
  • Prioritise actions: Organisations should create a plan of action for addressing the risk areas highlighted in the risk assessment. The ICO's self-assessment risk tool contains an action plan worksheet to assist with this step.

The ICO's four-step assessment is not compulsory and can be adapted as organisations see fit, as long as it can be demonstrated that the best interests of children have been thoroughly considered as part of the organisation's data protection impact assessments. For more information, click here.

Cyber security

Government requests views on plans to improve app security and privacy

The UK government is holding a call for views on plans to improve the security and privacy of apps and app stores. App developers, app store operators and security and privacy experts are all encouraged to provide feedback. A review of the app store ecosystem, conducted by the government between December 2020 and March 2022, made two key findings: firstly, that malicious and poorly developed apps continue to be accessible to users; and, secondly, that prominent app stores are failing to (i) adequately signpost app requirements to developers; and (ii) provide sufficient explanations and justifications where an app or update is rejected.

The government intends to ensure that online threats are tackled with a robust set of interventions including a voluntary Code of Practice applying to all app store operators and developers. The Code of Practice would require app stores to have a vulnerability reporting process for each app so flaws can be identified and resolved. The call for views is being held to gather feedback on the proposed interventions and whether additional proposals should be taken forward. App developers are also encouraged to give their views on the review and feedback processes they have encountered when creating apps on different app stores. The call for views will run until 29 June 2022. The government will then review the feedback provided and publish a response later this year.

Enforcement

UK fines Clearview £7.5M for privacy breaches

The ICO has issued a Monetary Penalty Notice fining Clearview AI Inc ("Clearview") just over £7.5 million for breaches of the GDPR and UK GDPR arising from its collection of images of people from social media without their knowledge. The ICO has also issued an Enforcement Notice requiring that Clearview: (1) delete all information of UK residents from its systems; and (2) stop obtaining and using the personal data of UK residents which is publicly available on the internet.

The U.S. company has a database of approximately 20 billion facial images obtained by scraping data from publicly available sources such as social media. Clearview uses this data to operate an AI-based identity matching service that it sells to entities such as law enforcement. This service made headlines in March when it was revealed that the Ukrainian government was using it to identify Russian soldiers killed during the invasion of Ukraine.

In reaching its decision, the ICO considered that Clearview had:

  • failed to provide a lawful basis for collection of the relevant personal data;
  • failed to use personal data in a fair and transparent manner, given that data subjects had not been made aware or would not have reasonably expected that their personal data would be used for such purposes;
  • failed to have adequate procedures in place to prevent personal data being retained indefinitely;
  • failed to meet the higher data protection standards applicable to biometric data, which is defined as "special category data" under both the GDPR and UK GDPR; and
  • impeded data subjects' access rights by requiring additional personal information, including photos, before responding to data subject access requests. The ICO considered that this "may have acted as a disincentive to individuals who wish to object to their data being collected and used".

Hoan Ton-That, Chief Executive of Clearview, has said that in imposing this fine the ICO had "misinterpreted my technology and intentions" and that "it breaks [his] heart that Clearview AI has been unable to assist when receiving urgent requests from UK law enforcement agencies seeking to use this technology". Meanwhile, lawyers for Clearview have disputed the ICO's jurisdiction to impose such a fine, given that Clearview presently does no business in the UK.

AEPD fines Google €10M for unlawful transfer of personal data and failure to facilitate right to erasure

The Spanish data protection authority (the "AEPD") has imposed a fine of €10 million on Google LLC after finding that the company had violated Articles 6 and 17 of the GDPR.

The complaints against Google concerned the transfer of requests for the removal of content from Google's various products and platforms, including YouTube and Google's search engine, to a third party known as the 'Lumen Project'. Specifically, the AEPD explained that, to enable the removal of content, Google required users of the relevant forms to accept the transfer of copies of their removal requests to "lumendatabase.org", an independent research project established for the purpose of studying requests for the removal and withdrawal of online content, on which the requests would subsequently be published.

In addition, the AEPD found that the information Google provided to users about the transfer of personal data to the Lumen Project was inadequate for the purposes of the GDPR. That information was limited to a notice inserted in the Google forms used for the submission of requests, stating that "Google does not suppress any information contained in the requests it receives and that, instead, it is the Lumen Project which anonymises the user's contact details".

The AEPD rejected Google's claim that its actions served 'legitimate interests' (i.e. that its contribution to the Lumen Project aided transparency and accountability, and helped to avoid abuse and fraud), finding instead that users were not appropriately informed about the legal basis said to justify the transfer of their personal data to the Lumen Project. The AEPD found that Google's privacy notice contravened Article 6 of the GDPR, as it stated that Google did not share information with companies outside Google unless the interested party gave their consent. Finally, the AEPD found that Google violated Article 17 of the GDPR because the Google forms did not allow users to exercise their right to have their data erased or to object to its transfer.

French SA fines processor for failing to enter into Data Processing Agreement

The CNIL has fined Dedalus Biologie SAS, a software publisher that sells and services solutions for medical laboratories, €1.5 million after finding that a data breach at the company led to the dissemination of the personal data (including special category data) of approximately 500,000 individuals.

The fine confirms that processors may be held solely responsible for ensuring that a contract or other legal act governing the processing, as required by Article 28(3) of the GDPR, is in place with the controller for whom they process data.

Civil litigation

Google sued for using the NHS data of 1.6 million Britons 'without their knowledge or consent'

Google's artificial intelligence arm, DeepMind, is being sued by Andrew Prismall in a representative action over claims that it "obtained and used a substantial number of confidential medical records without patients' knowledge or consent". Google is alleged to have obtained data belonging to 1.6 million patients from the Royal Free NHS Trust (the "Trust") in London in order to test a smartphone app designed to detect acute kidney injuries. Google and the Trust entered into a five-year data-sharing agreement for this purpose, and the app was subsequently used by the Trust on a discounted basis.

In July 2017, the ICO ruled that the data-sharing agreement was unlawful, deeming the deal to be a breach of the Data Protection Act 1998. As a result of the ICO's investigation, the Trust was asked to sign an undertaking committing it to act in accordance with data protection law, with the assistance of the ICO. The Trust has not been named as a defendant in this representative action.

SMO v TikTok discontinued shortly after commencement

Our March bulletin discussed how the proceedings in SMO v TikTok Inc. and Others [2022] EWHC 489 (QB) were stayed pending the outcome of the UK Supreme Court's decision in Lloyd v. Google [2021] UKSC 50 (which we covered in detail here).

Following the judgment in Lloyd, TikTok applied to have the case against it struck out on the basis that Lloyd was fatal to SMO's claim. In Lloyd, the Supreme Court held that a representative action could be brought to establish liability under CPR 19.6, but that damages could only be determined following an individualised assessment. The Supreme Court described this assessment as a "bifurcated process" and emphasised that damages were not available under the Data Protection Act 1998 merely for "loss of control" of personal data; claimants would instead need to show proof of damage or distress.

It has now been reported that SMO has elected to discontinue the claim against TikTok, just one in a string of discontinuances following the decision in Lloyd, as claimants like SMO have been compelled to reconsider the viability of pursuing their claims.

US Federal Judge rules that large data breach class action against Marriott may proceed

In the US, a federal judge in Maryland has granted class certification to 133 million American consumers in a claim against the Marriott hotel chain and its data security vendor Accenture arising out of a 2018 data breach. The breach, discovered by Marriott at the time of its acquisition of Starwood, compromised 133.7 million guest records of Starwood customers, including passport numbers, dates of birth and credit card details. The case will proceed initially as a class action on behalf of the first group of claimants, approximately 45 million customers in California, Connecticut, Florida, Georgia, Maryland and New York.

Readers will recall that Marriott was sanctioned by the ICO over this data breach, receiving a Monetary Penalty Notice fining it £18.4m.