Data Protection update - December 2023/ January 2024
Welcome to the Stephenson Harwood Data Protection bulletin, covering the key developments in data protection law from December 2023 and January 2024.
In December, the Council of the European Union and the European Parliament reached an historic agreement on laws to regulate the use of AI, marking the world's first comprehensive AI legal framework.
Elsewhere in enforcement news, the Court of Justice of the European Union ("CJEU") delivered three significant judgments in December on: (i) the use of automated processing, including profiling, to create individual credit scores under the GDPR; (ii) the requirement to enable data subjects to exercise their right to object and to obtain erasure of their personal data; and (iii) the conditions under which supervisory authorities can impose fines on controllers for breaches of the GDPR.
In January, data protection guidance was issued by the UK supervisory authority, the ICO, on (i) keeping employment records and (ii) recruitment and selection, and by the French supervisory authority, the CNIL, on (iii) undertaking transfer impact assessments.
28 January 2024 was International Data Protection Day. This day marks the anniversary of the Council of Europe's first data protection convention, Convention 108, being opened for signature. The occasion encourages governments, parliaments and national data protection bodies to raise awareness of data protection and the right to privacy.
- UK CoA rules out UK GDPR immigration exemption, again
- CJEU decision on data breaches, accountability and non-material damages
- EU confirms all pre-GDPR adequacy decisions
- ICO publishes two new sets of employment guidelines for consultation
- CNIL publishes guidance on Transfer Impact Assessments
- EDPB DPO coordinated investigations identify areas of improvement for DPO role, recognition and resources
- AI Act introduced and unanimously approved by EU Member States
- ICO consultation series on generative AI and data protection
- UK government considers publishing "tests" to trigger AI regulation
Enforcement and civil litigation
- CJEU publishes new judgment on a "wrongful" infringement of EU GDPR and the fines imposed
- CJEU ruling on automated decision making and credit scoring under GDPR
- EDPB releases urgent binding decision on Meta's behavioural advertising
- Requirement to report even minor personal data breaches
- Victims of Capita breaches head to the High Court
UK CoA rules out UK GDPR immigration exemption, again
On 11 December 2023, the Court of Appeal (the "CoA") ruled that a revised amendment to the Data Protection Act 2018 ("DPA 2018"), which disapplied certain UK GDPR data subject rights for activities relating to immigration control (the "Immigration Exemption"), is still unlawful. The amendment, narrowed in scope to limit the exemption to processing carried out by the Home Secretary only, was introduced after the CoA quashed the wording of a similar exemption in 2021.
It was held that the Immigration Exemption failed to comply with Article 23 of the UK GDPR, which governs the UK government's ability to introduce exemptions to the legislation. The CoA rejected the notion that a non-binding policy document setting out the government's approach to abuse, access and transfer was an adequate safeguard for such exemptions. Further, the exemption failed to address the risks to the rights and freedoms of data subjects, thereby breaching the UK GDPR on multiple grounds.
The CoA ruled that the UK government must set out clear safeguards for the immigration exemption and implement these within three months of the decision. These must address how vulnerable individuals within the immigration system can access their personal data.
CJEU decision on data breaches, accountability and non-material damages
On 14 December 2023, the CJEU published a significant judgment relating to data breaches, accountability and non-material damages in a case involving the Bulgarian National Revenue Agency ("NAP"). In 2019, the NAP suffered a cyberattack, leading to the unauthorised disclosure of personal data relating to more than six million individuals. Although no fraud or identity theft occurred, claims were brought against the NAP for non-material damage caused by the fear and distress that the affected individuals' data might be misused.
In response to questions referred to it by the Bulgarian Supreme Administrative Court, the CJEU made several key findings:
- A data subject's fear that their personal data might be misused following a data breach is capable of constituting "non-material damage" entitling them to compensation, even if nothing happens to their data after it is obtained;
- Unauthorised personal data access or disclosure does not automatically mean that a controller has failed to meet its data security obligations. Courts must assess the appropriateness of a controller's security measures;
- The EU GDPR's principle of accountability is to be interpreted as meaning that, in an action for damages, the burden of proving that its security measures are appropriate lies on the controller. Courts must conduct an objective assessment of the appropriateness of those measures, and an expert report alone cannot be considered sufficient proof; and
- A controller will be required to compensate data subjects who have suffered damage in the event of unauthorised personal data disclosure or access by a third party, e.g. a cybercriminal, unless the controller can prove that it is not responsible. To do so, the controller must prove that it has not breached the EU GDPR or that any breach did not directly cause harm to the individual.
This CJEU judgment underlines the importance for controllers of documenting every step of their handling of a data breach: their security measures, decisions and assessments, any advice taken, incidents experienced, and concerns raised by employees, customers or their Data Protection Officer ("DPO").
EU confirms all pre-GDPR adequacy decisions
In a report published on 15 January 2024, the European Commission confirmed that Andorra, Argentina, Canada, the Faroe Islands, Guernsey, the Isle of Man, Israel, Jersey, New Zealand, Switzerland, and Uruguay continue to have sufficient data protection safeguards. This means that the adequacy decisions that had been previously granted to these countries prior to the GDPR coming into force remain in place, so unrestricted data flows to them under the EU GDPR continue to be permitted.
ICO publishes two new sets of employment guidelines for consultation
The ICO has issued two new sets of employment guidelines, which are now available for public consultation.
The first set of guidelines covers employment records. It provides guidance on the types of employee records employers should keep and how to keep them lawfully. Other areas covered include appropriate lawful bases for keeping such records, retention periods for workers' personal data, how to handle sickness and injury records, and what to consider regarding employment records during mergers and acquisitions.
The second set of guidelines focuses on recruitment and selection. It covers issues such as how to process candidates' information fairly and lawfully during the various stages of recruitment, examples of the lawful bases most likely to apply when processing candidates' data, how to limit how much data is collected, and when a Data Protection Impact Assessment would need to be conducted for recruitment purposes.
Consultations on these guidelines close on 5 March 2024.
CNIL publishes guidance on Transfer Impact Assessments
The CNIL has organised a public consultation on a draft guide to conducting a transfer impact assessment ("TIA"). This is aimed at organisations transferring personal data outside of the European Economic Area ("EEA"). It provides a practical methodology and checklist for assessing the level of data protection regulation in the country of destination and the need for the adoption of supplementary measures.
The guide identifies six different steps for a data exporter to follow when carrying out a TIA, as follows:
- Know your transfer: describe the transfer.
- Document the transfer tool used: determine whether a TIA is required. A TIA is required only when one of the transfer tools under Article 46 GDPR is used.
- Evaluate the legislation and practices in the country of destination and the effectiveness of the transfer tool: identify whether there are any factors that could undermine the effectiveness of the appropriate safeguards put in place for the transfer.
- Identify and adopt supplementary measures: identify the supplementary measures that need to be put in place where the existing protections in the destination country are not satisfactory.
- Implement the supplementary measures and the necessary procedural steps: outline an action plan for the operational implementation of the additional measures identified in the previous step.
- Re-evaluate at appropriate intervals: monitor the level of data protection in the destination country and any developments that may affect it, and consider when a reassessment of the transfer will be appropriate.
This is in line with the European Data Protection Board's guidance on the same topic. Use of the guide is not mandatory; it serves as guidance to organisations on what a TIA should contain. The public consultation will end on 12 February 2024.
EDPB DPO coordinated investigations identify areas of improvement for DPO role, recognition and resources
In January 2024, the European Data Protection Board ("EDPB") adopted a report based on findings of investigations conducted throughout 2023 by 25 Data Protection Authorities ("DPAs") across the EEA. These investigations aimed to delve into the roles, challenges, and effectiveness of DPOs five years after the implementation of the EU GDPR. Some of the DPAs are conducting enforcement action on the basis of their findings, whereas others simply treated the exercise as information gathering.
One major finding of the investigation was that many organisations had still not appointed a DPO, despite it being a mandatory requirement for some. In response, the report recommended that DPAs issue clearer guidance on the applicable DPO requirements and introduce further awareness campaigns to promote existing guidance. In contrast, some organisations had voluntarily appointed a DPO even where it was not mandatory, because of the benefits they found in having one.
For organisations with DPOs, challenges that were outlined in the report include: insufficient resources provided to DPOs, insufficient expert knowledge and training of DPOs, DPOs not being fully entrusted with the tasks required under the GDPR, lack of independence and lack of reporting by DPOs to the organisation's highest management levels.
To address these challenges, specific recommendations were proposed. For instance, to address the problem of insufficient resources, controllers and processors should conduct a clear case-by-case analysis of what resources a DPO needs. To provide an additional resource, a deputy DPO should be considered. To enhance expert knowledge, there should be an increased use of certification mechanisms and initiatives, as well as collaboration with universities and use of market-led training courses. To ensure DPO independence, the report suggested that organisations and DPOs formalise DPO duties and conditions for performing them in an engagement letter.
AI Act introduced and unanimously approved by EU Member States
On 8 December 2023, the Council of the European Union and the European Parliament reached an historic provisional agreement on laws to regulate the use of artificial intelligence in the EU (the "AI Act"). The AI Act marks the world's first comprehensive legal framework to regulate the use of AI, aiming to ensure that "AI systems placed on the European market and used in the EU are safe and respect fundamental rights and EU values".
On 2 February 2024, the Council of EU Member States unanimously approved the final compromise text of the AI Act and formally recommended it for consideration by the European Parliament. Before the AI Act can come into force, the European Parliament’s Internal Market and Civil Liberties Committees will need to approve the AI Act on 13 February 2024, followed by a plenary vote provisionally scheduled for 10-11 April 2024.
Formal adoption will then be complete. The AI Act will enter into force 20 days after publication in the Official Journal. From that point, the bans on prohibited practices under the AI Act will apply after six months, whereas the obligations on AI models will commence after one year. The remaining rules will come into force after a two-year implementation period, except for the rules on classifying AI systems that must undergo a third-party conformity assessment under other EU rules, which will take an additional year.
Please see our summary from December on what we know so far about the AI Act and key takeaways. Since our December summary, an unofficial complete draft has been leaked. Noteworthy points cover which AI practices are prohibited, which systems are high-risk, obligations for general purpose AI, and new enforcement measures. However, at the time of writing, an official version of the AI Act text is not yet available. Once available, we will publish a detailed breakdown of the AI Act.
Keep an eye out for upcoming events where we help clients digest the impact of the AI Act and what they will need to do to prepare.
ICO consultation series on generative AI and data protection
On 15 January 2024, the ICO launched a consultation series on generative AI and data protection. It focuses on how aspects of data protection law apply to the development and use of generative AI models. Over the coming months, the ICO will share a series of papers outlining its views on how to interpret specific requirements of the UK GDPR and Part 2 of the DPA 2018. The first chapter in the consultation series covers the lawful basis for training generative AI models on web-scraped data.
The ICO recognises that organisations would appreciate greater clarity on how data protection law applies to the development and use of generative AI and has set out the following questions for organisations to ask when engaging with such new technology:
- what is the appropriate lawful basis for training generative AI models?
- how does the purpose limitation principle play out in the context of generative AI development and deployment?
- what are the expectations around complying with the accuracy principle?
- what are the expectations in terms of complying with data subject rights?
Any stakeholder interested in or engaging with generative AI is invited to consider these questions and respond via the ICO consultation. Any input will be considered and used to update the current ICO guidance on generative AI.
UK government considers publishing "tests" to trigger AI regulation
Following the UK's AI Safety Summit in November 2023, the new AI Safety Institute (a government body composed of machine learning experts) is set to publish a series of tests to determine when AI should be regulated. These would stipulate criteria for the circumstances in which the UK government would enact regulation of powerful AI models.
If these "tests" are met, this would trigger an intervention and may result in new curbs being enacted on AI models by the UK government. Although the UK has said it currently plans to refrain from creating specific AI legislation in favour of more light-touch methods, the government's new tests seem to suggest continued concerns about the risks posed by the fast-developing technology. These tests will be included as part of the government's response to a consultation on its white paper on the regulation of AI in the UK, to be published in March.
Enforcement and civil litigation
CJEU publishes new judgment on a "wrongful" infringement of EU GDPR and the fines imposed
In a judgment in early December 2023, the CJEU clarified the conditions under which supervisory national data protection authorities can impose fines on controllers for breaches of the EU GDPR.
This follows a German case and a Lithuanian case, both contesting fines imposed for EU GDPR breaches. The two referring courts asked the CJEU to interpret the conditions under which national supervisory authorities may penalise EU GDPR infringements by imposing a fine.
In its judgment, the CJEU highlighted the following key points on liability for breaches of the EU GDPR:
- A controller may not have a fine imposed on it for infringing the EU GDPR unless the breach was committed "intentionally or negligently".
- The infringement need not have been committed by the controller's management body, nor by an identifiable individual.
- A controller may have a fine imposed on it even if operations were being carried out by a processor.
- Joint controllership arises solely where two or more entities have jointly determined the purposes and means of processing, rather than through the allocation of joint controllership in an agreement.
- The total worldwide turnover of the wider group should be considered in relation to the maximum amount for fines, rather than the turnover of the specific company at fault.
CJEU ruling on automated decision making and credit scoring under GDPR
On 7 December 2023, the CJEU delivered two new judgments on the scope and interpretation of the automated decision-making restrictions under the GDPR. German credit-rating agency Schufa Holding AG ("Schufa") was using automated processing, including profiling, to create individual credit scores that inform credit-related decisions such as loans.
The CJEU ruled that calculating credit scores automatically on the basis of probability constitutes automated decision-making that is restricted under Article 22. It is therefore not only lenders, but also those responsible for credit scoring that informs lending decisions, who must comply with the strict conditions under which automated decision-making is permitted. For a full summary of the CJEU decision, please see our blog post here.
EDPB releases urgent binding decision on Meta's behavioural advertising
On 7 December 2023, the EDPB published the text of its Urgent Binding Decision (the "Decision") which related to the processing of personal data by Meta Platforms Ireland Limited ("Meta") for behavioural advertising based on contract and legitimate interests.
On 26 September 2023, the Norwegian Data Protection Authority ("NO DPA") found infringements of Article 6(1) of EU GDPR, resulting in an urgent risk to the rights and freedoms of data subjects. Following this, the NO DPA adopted an order imposing a temporary ban on Meta regarding the processing of personal data of Norwegian data subjects for behavioural advertising relying on the legal bases of contract or legitimate interest. As this only applied for three months and was only applicable in Norway, the NO DPA submitted a request to the EDPB for a final order on the matter.
In November 2023, we commented on the Irish Data Protection Commission's ban on Meta processing personal data for behavioural advertising on the legal bases of contract and legitimate interest across the EEA.
In December 2023, the EDPB finally released its full Decision. The Decision bans the processing of personal data collected on Meta's products for the purpose of behavioural advertising across the entire EEA, to the extent that such processing relies on the legal bases of contract or legitimate interest.
The Decision means that consent is required for behavioural advertising using Meta's products. In response, Meta now requires users who do not consent to such processing to pay for a subscription to its products. This response will reportedly be examined by the EDPB.
Requirement to report even minor personal data breaches
The Polish Data Protection Authority ("Polish DPA") has fined a Polish insurance company approximately €24,000 over a seemingly minor personal data breach. The insurance company argued that the breach did not require notification to the Polish DPA (or to the data subject), as it was unlikely to result in a risk to the rights and freedoms of the individual. Although the breach was seemingly minor in scope and nature, the Polish DPA nonetheless imposed a fine for the failure to report it. Please see our full summary of the incident here.
Victims of Capita breaches head to the High Court
In March 2023, Capita, whose systems are used to administer pensions for around 450 organisations, covering millions of policyholders, suffered a cyber-attack. We commented here on the widely reported cyber-attack and the regulator's response. The attack impacted certain personal data relating to customers, suppliers and staff. The ICO stated in May 2023 that around 90 organisations had reported breaches connected to Capita, including a large number of pension schemes such as the Universities Superannuation Scheme (USS) pension fund, the UK's main pension fund for universities, which has reportedly written to all its 500,000 members.
In January 2024, it was reported that a number of pension-holders who believe their personal data was compromised in the March 2023 attack have joined a group action. On 14 January 2024, Manchester-based law firm Barings Law filed a claim under UK data protection law at the High Court. Barings Law is now representing more than 5,000 people in its claim, which it estimates could be worth up to £5 million. It claims that 50 new claimants are coming forward to join the action daily, and other law firms say they are planning further group actions over Capita. This could result in one of the largest data breach group actions seen in the UK.
Each month, we bring you a round-up of notable data protection enforcement action.
Yahoo!
Yahoo! was fined for its alleged violation of the ePrivacy Directive. Visitors to the Yahoo site had cookies placed on their devices without consent and had difficulty withdrawing that consent. Consent was also linked to the availability of Yahoo's email service, which should have been explained to users.
HelloFresh
The ICO fined HelloFresh for sending millions of spam emails and texts over a seven-month period. It was not clear what customers were opting into when they agreed to the opt-in statement, which included no reference to marketing via text.
Polish Ministry of Health
€23,000 (the maximum possible penalty)
The Polish Ministry of Health publicly disclosed, on a social media site, private health information of a doctor who had criticised new regulations introduced by the minister.
Amazon France Logistique
Amazon was fined by the CNIL for an "excessively intrusive" surveillance system set up to closely monitor the performance of staff. The level of tracking was held to be illegal.
Ministry of Defence
Details of 265 people were compromised in email data breaches in the weeks after the Taliban took control of Afghanistan in 2021. Such errors could have resulted in a threat to life.
OpenAI
The Italian DPA has reportedly notified OpenAI of a GDPR violation relating to the collection of mass volumes of personal data for algorithmic training and insufficient age protections. The draft findings have not yet been disclosed and further details are still to be confirmed.