Data Protection update - September 2023

Welcome to the Stephenson Harwood Data Protection bulletin, covering the key developments in data protection law from September 2023.

On 21 September, the Department for Science, Innovation and Technology published the Data Protection (Adequacy) (United States of America) Regulations 2023, which are set to come into effect on 12 October 2023. From this date, UK organisations will be able to transfer personal data to US entities certified under the UK Extension to the EU-US Data Privacy Framework (also known as the "UK-US Data Bridge") without the need to implement further transfer safeguards.

Also in September, the Information Commissioner's Office (the "ICO") announced that a joint Memorandum of Understanding ("MoU") had been signed by the UK Information Commissioner and the National Cyber Security Centre. The MoU sets out the collaborative approach both organisations will take in responding to major cyber incidents and promotes ongoing cooperation on cybersecurity guidance.

Elsewhere, TikTok found itself facing a €345 million fine from Ireland's Data Protection Commission for failing to sufficiently protect children's personal data on its video platform. TikTok has disagreed with the decision.

In this month's issue:

Data protection

Cyber security

AI

Enforcement and civil litigation

Data protection

UK-US Data Bridge Finalised

On 12 October 2023, the UK's extension to the EU-US Data Privacy Framework – the UK-US Data Bridge – will come into effect. The underlying regulations were published by the UK Department for Science, Innovation and Technology on 21 September 2023. See our blog post for a summary of the key takeaways.

ICO launches investigation into FemTech apps

Earlier this year, the UK's Information Commissioner stated that the ICO would focus its efforts on women's health apps and take action to ensure that non-compliant practices are corrected. This decision follows an ICO poll on FemTech apps, which found that over half of the women surveyed classed concerns around the security of their data and transparency over how it is used as significant.

FemTech apps are centred on monitoring and improving women’s health, such as fertility, menstrual cycle tracking and pelvic health – and so they rely on extremely sensitive and personal information. The investigation aims to ascertain the extent of user harm generated by FemTech apps, such as unduly complicated privacy policies, excessive and unnecessary data storage, and unexpected targeted advertising. The poll revealed that 17% of app users found an increase in baby or fertility-related adverts distressing.

In addition to ascertaining harms from the way in which FemTech apps use data, the ICO aims to discover best practices on how data is handled in this context. The ICO's call for evidence closes on 5 October 2023. The ICO has forewarned that it "will not hesitate to take regulatory action to protect the public if necessary".

ICO publishes guidance on workers' health data

The ICO recently published in-depth guidance to assist employers in understanding their data protection obligations under the UK GDPR and Data Protection Act 2018 ("DPA 2018") when processing workers' health data. Under the guidance, workers are broadly defined as "someone who performs work for an organisation", in turn capturing arrangements beyond the traditional employment sphere and covering informal work such as the gig economy. Under the UK GDPR, health data is special category personal data and in consequence warrants additional protection.

The ICO stated that the guidance is designed to achieve three key aims: to provide greater regulatory certainty, to protect workers' data protection rights, and to help employers build trust with their workers.

The guidance is split into two sections. The first section provides an overview of how UK data protection laws regulate the processing of workers’ health information and includes basics for achieving compliance. The second section explores the most prevalent employment practices concerning workers’ health information, such as sharing employee data and handling records of absence and sickness, and discusses legal requirements and good practice. The guidance is rounded off with checklists which act as quick reference guides for data protection considerations. The guidance is available here.

Department for Science, Innovation and Technology publishes The Data Protection (Fundamental Rights and Freedoms) (Amendment) Regulations 2023

On 11 September, the UK Department for Science, Innovation and Technology laid the Data Protection (Fundamental Rights and Freedoms) (Amendment) Regulations 2023 ("Data Protection Amendment Regulations") before parliament for sifting. The EU GDPR continues to apply in the UK following Brexit as the UK GDPR, supplemented by DPA 2018. The Data Protection Amendment Regulations will amend the existing UK data protection regime so that references to "fundamental rights and freedoms" pertain to the European Convention on Human Rights (which has been enshrined by the Human Rights Act 1998) as opposed to the Charter of Fundamental Rights of the European Union.

The most relevant rights in this context include the right to a private life and the right to freedom of expression, both of which will continue to apply under the new definition of "fundamental rights and freedoms". The right to the protection of personal data will not be transposed into the definition; however, individuals will benefit from this right in practice by virtue of the right to a private life and the protections contained in the UK GDPR and DPA 2018. At present, the UK Government does not intend to publish supporting guidance on the change. It considers the practical effect on organisations to be insignificant: the change largely replicates the current position while clarifying which rights are to be considered within domestic law, and so will not involve an increased regulatory burden.

Cyber security

ICO and NCSC announce intention to collaborate in responding to cyber incidents

The UK Information Commissioner and the CEO of the National Cyber Security Centre ("NCSC") have signed a joint MoU which sets out the collaborative approach that both organisations shall take in responding to major cyber incidents.

The MoU describes how the ICO and NCSC will cooperate regarding the development of cybersecurity standards and guidance. The NCSC CEO explained that the MoU "provides us with a platform and mechanism to improve cyber security standards across the board" whilst the Information Commissioner stated that the MoU "reaffirms our commitment to improve the UK's cyber resilience so people's information is kept safe online from cyber attacks".

The MoU emphasises that if an organisation provides information in confidence to the NCSC, such information will not be shared with the ICO without obtaining prior consent from the disclosing organisation.

The key takeaways from the MoU are as follows:

  • organisations will be encouraged to engage with the NCSC in relation to cybersecurity matters;
  • such engagement with the NCSC will be incentivised by the ICO, who may offer reduced regulatory penalties to organisations that engage with the NCSC;
  • the ICO will provide the NCSC with cyber incident information in an anonymised and aggregate format;
  • the NCSC and ICO will coordinate their response to minimise disruptions to an organisation attempting to mitigate harm following a cyber incident;
  • the NCSC and the ICO will provide feedback to each other consistently; and
  • the NCSC and the ICO will collaborate to enhance the guidance available.

MGM Resorts data breach

The hotel and resorts operator MGM Resorts recently suffered a cyber-attack which caused slot machines in its casinos on the Las Vegas Strip to stop working, slowed electronic transfers of winnings, left thousands of hotel room key cards unable to function and shut down large parts of its internal networks.

The hacking group 'Scattered Spider' is thought to be responsible for the attack. Scattered Spider conducts such attacks with the aim of encrypting or stealing data in order to demand ransom payments. The group is known to gather information about individuals from social media profiles and use it to make convincing, sophisticated phone calls to targeted employees and help desks, with the aim of extracting the passwords or digital codes needed to breach networks.

Although relatively new to the ransomware industry, Scattered Spider has already attacked 100 organisations, typically in the United States and Canada, and has been described by the chief technology officer at Mandiant (a cybersecurity firm and subsidiary of Google) as "very active, very disruptive and causing chaos".

AI

New House of Commons report on the Governance of Artificial Intelligence

Recently, the House of Commons Science, Innovation and Technology Committee released a report on the Governance of Artificial Intelligence. The report welcomes the Government's White Paper on AI Regulation, published in March 2023, but argues that an AI Bill is needed if the UK is to keep pace with AI regulation in the EU and the US. It also includes several recommendations for the UK Government to consider when framing its approach to AI, and highlights twelve challenges to AI governance:

  1. Bias – AI can perpetuate negative biases.
  2. Privacy - AI can allow individuals to be identified easily and their data to be used in undesirable ways.
  3. Misrepresentation - AI can produce material that misrepresents an individual.
  4. Access to data – Access to large datasets, which some AI software requires, is limited.
  5. Access to compute – Access to great computing power, which powerful AI requires, is limited.
  6. Black box - Some AI software cannot explain why or how it produces certain results.
  7. Open-source – Requiring code to be publicly available supports transparency and innovation.
  8. Intellectual property and copyright – Some AI software uses third party content.
  9. Liability – Policy should specify whether any liability lies with the developers or providers of AI technology should it cause harm.
  10. Employment – AI will affect the jobs that people do and the jobs that are available.
  11. International coordination - A global, coordinated response to the risks of developing AI would be beneficial due to the global nature of AI.
  12. Existential – Protections should be made for national security due to the threat that the advancement of AI may cause to human life.

Enforcement and civil litigation

TikTok receives €345 million Irish fine over failure to protect children's personal data

Following an inquiry, Ireland's Data Protection Commission (the "Irish DPC") adopted a final decision on 1 September 2023 which found that TikTok had failed to sufficiently protect the personal data of its users.

The Irish DPC's inquiry examined the extent to which TikTok complied with its obligations under the EU GDPR in relation to the processing of personal data relating to child users of the TikTok platform.

Comments and videos posted by children on TikTok were public by default, and additional features allowing users to repost videos with their own reactions were also enabled by default. The Irish DPC found that each of these factors put children's personal data at risk.

The Irish DPC's decision records infringement of Articles 5(1)(c), 5(1)(f), 24(1), 25(1), 25(2), 12(1), 13(1)(e) and 5(1)(a) GDPR.

Alongside an order to bring its processing of data into compliance, TikTok was fined €345 million, the fifth-largest penalty ever imposed under the EU GDPR.

A spokesperson for TikTok said: “We respectfully disagree with the decision, particularly the level of the fine imposed. The DPC's criticisms are focused on features…that we made changes to well before the investigation even began, such as setting all under 16 accounts to private by default.” As of 28 September 2023, TikTok are bringing a High Court action to appeal against the decision.

Amazon facing €170 million fine over workplace surveillance: is this a GDPR violation?

The French data protection authority, the CNIL, has stated that Amazon's logistics unit Amazon France Logistique ("Amazon France") had violated the EU's data protection rules by collecting large amounts of workers' data.

On 14 September 2023, an official at the CNIL stated at a public hearing that Amazon France's collection of large amounts of workers' data was not "necessary and proportionate" for the purpose of regulating productivity at its warehouses, and therefore constituted a violation of the EU GDPR.

The CNIL's investigation found that Amazon France had an electronic scanner in place, linked to each employee, which monitored numerous data points about each employee's performance. Notably, this included data on employees' "idle time", which the CNIL stated was neither necessary nor proportionate.

Amazon has disputed this finding, its lawyers stating that all information collected about its employees was necessary for the operation of its warehouses and distribution centres and for its employees' security. A spokesperson for Amazon France has stated that they are "confident that our systems are fully compliant with French and EU regulations".

The CNIL has suggested that Amazon may be fined €170 million for the violations of the EU GDPR. Despite disagreeing with the allegations, Amazon France is engaging with the CNIL and has stated a desire to work constructively with the regulator during the investigation.

Proposal to exclude any sector not already governed by EU law from the scope of the EU GDPR

On 14 September 2023, an advocate general from the European Court of Justice ("ECJ") controversially proposed excluding any sector not already governed by EU law from the scope of EU GDPR.

This discussion arose from controversy surrounding the publication by Austria's Independent Anti-Doping Agency (the "Agency") of an athlete's name alongside allegations of breaches of anti-doping rules. The athlete in question challenged the legality of this decision.

It was argued before the ECJ that any publication of such data constituted non-consensual data processing.

However, Advocate General Tamara Capeta has suggested that, although the right to the protection of personal data in the EU Charter of Fundamental Rights allows the EU to regulate data protection, it does not apply to data processing that falls outside the competence of EU law.

Capeta stated that sport, which the EU does not have competence to regulate (except as an economic activity), would therefore fall outside the scope of the EU GDPR. If followed, this opinion could result in any sector beyond the EU's competence to regulate being deemed outside the scope of the EU GDPR.

However, the ECJ has not yet reached its final judgment, and commentators have expressed doubts that it will follow Capeta's opinion.

Round-up of notable enforcement action

Each month, we bring you a round-up of notable data protection enforcement action.

  • Autostrade per L'Italia Spa – Italian DPA – €1,000,000: The company illegally profiled customers and web visitors without consent for targeted advertising, telemarketing and postal marketing.
  • Trygg-Hansa – Swedish DPA – €3,000,000: The company had failed to implement sufficient data security measures, resulting in unauthorised access to sensitive health data.
  • Iberico De Engergia – Spanish DPA – €70,000: Creditinfo Lánstraust, a financial information agency, had illegally registered defaulted small loans, which were reported to it by loan provider Commerce 2020 and debt collector AIC.
  • Groupe Canal Plus – French DPA – €750,000: Groupe Canal Plus failed to seek users' consent before sending emails relating to subscriptions to its TV service, thereby violating EU privacy rules.