Data Protection update - September 2025

Welcome to the latest edition of the Stephenson Harwood Data Protection update, covering the key developments in data protection and cyber security law in September 2025.

In data regulation news, the EU Data Act has become fully applicable; the European Commission has published a draft adequacy decision for Brazil; South Korea and the European Union (“EU”) have established a reciprocal data protection framework; the implementation timeline for the UK’s Data (Use and Access) Act 2025 (“DUAA”) has been updated; the European Data Protection Board (“EDPB”) has published guidelines on how the Digital Services Act and GDPR interact; and European Court of Justice (“ECJ”) Advocates General have provided guidance on refusing subject access requests and on public disclosure of data.

In cybersecurity news, the Information Commissioner’s Office (“ICO”) clarifies common misconceptions about storage and access technologies; and US whistleblower disclosures reveal Meta’s efforts to avoid liability over potential harm to kids and teens.

In enforcement and civil litigation news, a care home director has been convicted for obstructing a data subject access request; the Court of Appeal held there is no minimum threshold of seriousness for compensation in personal data claims; the Court of Justice of the European Union (“CJEU”) General Court dismissed a challenge to annul the EU-US data privacy framework; and the ECJ ruled that data subjects cannot obtain an injunction to prevent future data breaches and handed down its judgment on the question of whether pseudonymised data constitutes personal data.

As for key US updates, Google has been ordered to pay $425 million over an inadequate privacy policy; and the Chairman of the Federal Trade Commission has reminded US tech companies of their duty to protect the privacy and data security of American consumers.
 


Data Regulation

EU Data Act becomes fully applicable

Last year, we published a series of articles on the EU Data Act (the "Data Act"). The Data Act is a comprehensive piece of legislation aimed at fostering a fair and competitive digital environment in the European Union. It focuses on ensuring that data (both personal and non-personal) is shared more effectively, while protecting the rights of individuals and businesses.

Whilst the Data Act entered into force on 11 January 2024, it only became applicable on Friday 12 September 2025. 

The Data Act applies to companies (whether or not based in the EU) that operate and manage connected products and/or related services in the EU. It includes new, harmonised rules governing data access and sharing for connected devices and related services, and contains rules which affect the provision of cloud services, including rules governing contracts for these services. 

You can find our series on the Data Act here, which is intended to provide a helpful reminder of what the Data Act entails, and to share our insights on key parts of the Data Act.

The European Commission publishes Draft Adequacy Decision for Brazil

On 5 September 2025, the European Commission (the "Commission") published a draft adequacy decision which recognises Brazil as a country with an adequate level of protection for personal data. If adopted, it will allow personal data to flow from the EU to Brazil on the basis of Article 45 GDPR, without the additional safeguards otherwise required for third-country transfers – effectively treating such transfers as “essentially equivalent” to those within the EU.

The adoption procedure will involve the issuance of an opinion from the EDPB and approval by Member State representatives. The draft will also be scrutinised by the European Parliament and Council. Following this, the Commission will formally issue the decision, which will take immediate effect. The Commission will continue to review the adequacy decision at least every four years.

South Korea and the EU establish mutual data protection framework

South Korea has officially recognised the EU’s data protection regime as equivalent to its own, establishing a reciprocal framework for the free flow of personal data between the two jurisdictions. Together with the EU’s adequacy decision granted in favour of South Korea in 2021, this marks the first time such a two-way arrangement has been put in place. The arrangement is expected to boost cooperation in trade, travel, research, and global discussions on privacy and digital trust. 

The recognition comes after a 20-month review by South Korea’s Personal Information Protection Commission (“PIPC”) of the EU’s GDPR and related protections, benchmarked against South Korea’s Personal Information Protection Act (“PIPA”).

Effective immediately, the recognition allows data transfers to all EU member states plus Norway, Liechtenstein, and Iceland without additional safeguards, although financial data is excluded due to regulatory independence concerns. 

The agreement was enabled by certain amendments to South Korea’s privacy law in 2023, which introduced more flexible overseas data transfers. The recognition will be reviewed three months before September 2028 and may be amended or revoked at (or before) this point if EU protections are deemed inadequate or if significant risks to South Korean data subjects emerge.
 

Updated implementation timeline for the DUAA

The Department for Science, Innovation and Technology has updated its timeline on its plans for commencement of the DUAA. The DUAA implementation timeline details a four-stage phased approach to bringing the DUAA into force, the first stage of which we covered in detail in our July edition.

Stage 2 has seen two further commencement regulations made this month, which bring changes under the DUAA into effect. 

The Data (Use and Access) Act 2025 (Commencement No. 2) Regulations 2025 were made on 2 September 2025. These bring into force Section 124 (Retention of information by providers of internet services in connection with death of child) of the DUAA with effect from 30 September 2025. Section 124 amends the Online Safety Act 2023 and requires social media providers and other regulated online services to retain information when notified to do so by Ofcom, where Ofcom is required by a coroner (or equivalent authority) to obtain this information in connection with an investigation into the death of a child. 

The Data (Use and Access) Act 2025 (Commencement No. 3 and Transitional and Saving Provisions) Regulations 2025 were made on 4 September 2025. These regulations activate several sections of the DUAA that amend Parts 3 and 4 of the Data Protection Act 2018 (“DPA 2018”) governing the processing of personal data for law enforcement purposes and by the intelligence services.

 

  • Section 79 came into effect on 5 September 2025 and revises the DPA 2018’s legal professional privilege exemption for law enforcement processing related to data subject rights. 
  • Section 88 also came into effect on 5 September 2025 and updates the DPA 2018’s national security exemptions.
  • Sections 89 and 90 will come into effect on 17 November 2025 and amend the DPA 2018’s rules on joint processing by intelligence services and competent authorities.

Stage 3, which will include the main changes to data protection legislation, is still expected “approximately 6 months after Royal Assent”. However, changes to the ICO’s governance structure have been moved into Stage 4, which also includes the commencement of provisions on the National Underground Asset Register and the electronic registering of births and deaths.

The ICO is now not expected to be abolished until early 2026, once members of the Information Commission’s new Board have been appointed. Section 103 (Complaints by data subjects), requiring controllers to establish processes for handling complaints from data subjects, is also not expected to be commenced until approximately 12 months after Royal Assent (i.e. 19 June 2026).

The ICO has three open consultations on draft guidance to support the entry into force of the DUAA, with final guidance expected to be published later in the year. The first open consultation concerns guidance for controllers handling data protection complaints. This guidance aims to help controllers understand their new obligations under the DUAA, which introduces a requirement for controllers to implement a complaints-handling process. As a reminder, once in force, these rules will require organisations to:

  • provide data subjects with a clear process to submit complaints;
  • acknowledge the complaint within 30 days of receipt;
  • take appropriate steps to respond to the complaint without undue delay; and
  • inform complainants of the outcome without undue delay.

The ICO’s draft guidance explains what organisations must do to comply, offers practical advice, and includes examples of good practice for managing complaints. The consultation on this guidance is open until 19 October 2025.
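
For organisations building this into their case-handling systems, the rules above reduce to a simple deadline calculation. The following minimal Python sketch shows how a complaints log might track the 30-day acknowledgement window; the Complaint structure and field names are hypothetical illustrations, not taken from the draft guidance.

    from dataclasses import dataclass
    from datetime import date, timedelta

    ACK_WINDOW_DAYS = 30  # acknowledgement due within 30 days of receipt

    @dataclass
    class Complaint:
        reference: str
        received: date
        acknowledged: date | None = None

        @property
        def ack_deadline(self) -> date:
            # Latest date by which receipt must be acknowledged.
            return self.received + timedelta(days=ACK_WINDOW_DAYS)

        def ack_overdue(self, today: date) -> bool:
            return self.acknowledged is None and today > self.ack_deadline

    # Example: a complaint received on 19 June 2026 must be acknowledged by 19 July 2026.
    c = Complaint(reference="DP-001", received=date(2026, 6, 19))
    assert c.ack_deadline == date(2026, 7, 19)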

The second consultation concerns draft guidance on recognised legitimate interests. The guidance explains the new lawful basis for processing personal data, sets out the various recognised legitimate interests, and describes how organisations should use this basis. The ICO is also consulting on draft guidance specifically for public authorities regarding the use of recognised legitimate interests when carrying out a public task or official function. Both of these consultations will close on 30 October 2025.
 

EDPB issues new guidelines on navigating the DSA and GDPR

The European Data Protection Board (the “EDPB”) has issued its first guidelines clarifying how the Digital Services Act (the “DSA”) and the GDPR work together to protect users’ rights online and create a safer digital environment in the EU. The DSA, first introduced in 2022, is applicable to online intermediaries like search engines and platforms, and intersects with the GDPR in several key areas including:

  • notice-and-action systems for reporting illegal content;
  • recommender systems that determine how content is presented to users;
  • protection of minors, including prohibitions on profile-based advertising targeting minors using their data;
  • transparency in online advertising; and
  • a ban on profiling-based advertising using special categories of data.

The guidelines detail how GDPR obligations should be met when complying with the DSA and provide practical advice on cooperation between regulatory authorities to ensure consistent enforcement and legal certainty. The guidelines will be subject to public consultation in due course, giving stakeholders the opportunity to comment and provide feedback. 

You can find the EDPB guidelines here.


ECJ Advocates General provide guidance on refusing subject access requests and public disclosure of athletes’ data

Two noteworthy Advocate-General opinions were delivered this month. 

The first opinion was delivered by the ECJ Advocate-General, Maciej Szpunar, on the circumstances in which data subject access requests (“SARs” or “DSARs”) made under the GDPR can be refused in the EU. He opined that controllers may reject SARs if they can prove the data subject has an abusive intent; specifically, if the requests are made for reasons unrelated to data protection. 

Under GDPR Article 12(5), controllers can refuse or charge for SARs that are "manifestly unfounded or excessive," but must provide strong evidence to support this. This principle also applies to supervisory authorities under Article 57(4). 

The opinion stems from a request for a preliminary ruling from the District Court of Arnsberg, Germany, regarding an ongoing dispute between German optician Brillen Rottler and a data subject who, the company argued, systematically used SARs to provoke refusals and claim compensation. The Advocate-General stated that a history of seeking compensation alone does not prove abuse, but submitting requests after consenting to data processing may indicate an intent to exploit GDPR rights for non-data protection purposes. 

The opinion also touched on compensation, suggesting that all damages from GDPR infringements may be compensable under Article 82 GDPR, even if not directly linked to the processing of personal data, though this is debated among legal experts. The claimant must still prove actual material or non-material damage. 

The final ECJ judgment is still pending and is expected to influence defence strategies against requests from data subjects that are considered to have an abusive intent and clarify the conditions under which compensation is justified. 

The second opinion was delivered by Advocate-General Spielmann of the ECJ on the legality of publishing data on athletes’ anti-doping violations on the websites of Austrian authorities. The publication includes not only the athletes’ names but also their sport, the length of their suspension, and the reasons for it. In Austria, such publication is provided for by law. Its stated goals are to deter doping and inform potential sponsors or employers of suspensions. Four athletes challenged this practice, arguing that it violates the GDPR – which led the Austrian courts to request an interpretation of the GDPR by the ECJ. 

Spielmann’s opinion emphasises the principle of proportionality under EU law. He questions whether public online disclosure is necessary to achieve the intended objectives. Instead, he suggests that sharing such information only with relevant sports bodies, or using pseudonymised data online, could meet the same goals while better protecting athletes’ privacy and adhering to the GDPR’s data minimisation principle. The final decision by the Austrian court is still pending.

 

Cyber Security

ICO clarifies common misconceptions about storage and access technologies

The UK ICO has released updated guidance to clarify how laws apply to storage and access technologies such as cookies and tracking pixels. The guidance aims to address common misconceptions and follows ongoing efforts to give individuals more meaningful control over online tracking and to support responsible business innovation. 

The guidance makes clear that the rules on storage and access technologies are broader than many businesses realise. The Privacy and Electronic Communications Regulations (“PECR”) cover any information stored or accessed on a user’s device. This means compliance is required even if the data isn’t “personal” under the UK GDPR. 

The ICO has also reaffirmed that “strictly necessary” means that storage or access must be essential for delivering a service the user requests. This, importantly, must be judged from the user’s perspective and not simply what is useful for the business or its advertising model. Therefore, consent remains crucial for non-exempt technologies, and legitimate interests cannot be used as a substitute where the PECR requires consent.

The ICO also emphasised that its guidance is not limited to cookies. Any technology that stores or accesses information on a device, such as device fingerprinting, will be covered.

The ICO’s updated guidance aims to provide clarity, support innovation, and ensure privacy rights are respected. Businesses are therefore encouraged to review their use of storage and access technologies, ensure transparency, and obtain valid consent where required. Adopting these practices will help build user trust and keep organisations compliant as technologies evolve.
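
For engineering teams, the guidance reduces to a simple gate: storage or access may proceed without consent only where it is strictly necessary for a service the user has requested. A minimal Python sketch of that rule follows; the purpose categories and consent store are hypothetical illustrations, not drawn from the ICO guidance.

    from enum import Enum

    class Purpose(Enum):
        STRICTLY_NECESSARY = "strictly_necessary"  # e.g. a session cookie for a requested service
        ANALYTICS = "analytics"
        ADVERTISING = "advertising"

    # Hypothetical record of the user's choices, e.g. captured via a consent banner.
    user_consent: dict[Purpose, bool] = {
        Purpose.ANALYTICS: False,
        Purpose.ADVERTISING: False,
    }

    def may_store_or_access(purpose: Purpose) -> bool:
        # PECR-style gate: exempt purposes need no consent; everything else does.
        # The gate applies to any storage/access technology (cookies, localStorage,
        # fingerprinting), whether or not the information is personal data.
        if purpose is Purpose.STRICTLY_NECESSARY:
            return True
        return user_consent.get(purpose, False)

    assert may_store_or_access(Purpose.STRICTLY_NECESSARY)
    assert not may_store_or_access(Purpose.ADVERTISING)  # no consent given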
 

US whistleblower disclosures reveal Meta's efforts to avoid liability over potential harm to kids and teens

Recent whistleblower disclosures have claimed that Meta’s efforts to shield itself from legal liability - especially regarding child safety on its platforms - may have backfired, potentially exposing the company to even greater risk. According to documents submitted to the US Congress, Meta’s in-house legal team suppressed internal research highlighting serious risks to children, including harassment and underage use of its virtual reality (“VR”) products.

Six current and former Meta researchers allege that the company’s legal department routinely blocked or limited safety research, rather than addressing actual harms. This approach intensified after the “Facebook Files” were published, with Meta reportedly restricting internal briefings to insulate leadership, including CEO Mark Zuckerberg. 

The whistleblowers claim that Meta’s legal strategy was designed to obscure the company’s knowledge of risks, particularly around children’s use of VR headsets. This has drawn sharp criticism from lawmakers, who compared Meta’s actions to historical cover-ups by the tobacco industry. 

Regulatory scrutiny is seemingly increasing. The US Federal Trade Commission (“FTC”) has already accused Meta of violating a 2020 privacy settlement and proposed strict new penalties, including banning the monetisation of users under 18. Meta is challenging these actions in court, but the controversy is fuelling calls for greater accountability - including a potential repeal of Section 230 of the Communications Decency Act, which shields online platforms from being held legally liable for third-party content. 

For organisations in the data protection and cybersecurity space, this case underscores the importance of transparency, ethical risk management, and a culture that prioritises user safety.
 
 

Enforcement and Civil Litigation

Care home director convicted for obstruction of subject access request

In what we understand to be a first-of-a-kind criminal prosecution, a director of a care home in Yorkshire has been convicted of intentionally obstructing and frustrating a valid subject access request, contrary to Section 173 of the DPA 2018, and ordered to pay a fine of £1,100 plus costs of £5,440. 

In 2023, a daughter of a resident of Bridlington Lodge Care Home submitted a SAR to the care home in respect of her father’s personal data, seeking access to incident reports, CCTV footage, and care notes in particular. The daughter made the SAR on her father’s behalf, under a lasting power of attorney. 

Rather than complying with the SAR and providing the relevant data to the requester, the care home director instead took steps, over the course of approximately one month, to erase, block or otherwise conceal various data, with the intention of preventing its disclosure. The daughter complained to the ICO, which launched an investigation. 

When confronted, the director could not provide any plausible explanation as to why the care home had failed to respond to the SAR. He also attempted to avoid scrutiny by asking for the care home’s ICO registration to be cancelled. 

At trial, the director argued variously that the care home had already provided the requested data through a member of staff; that the care home manager, not himself, was responsible for responding to SARs; that the care home was not a data controller, but merely a building; and that the fact that the care home had previously deregistered from the ICO in 2016 meant that it now had no obligations under data protection law. 

In an unsurprising outcome, the Court rejected these arguments, and the director was convicted of the offence, under Section 173(3) DPA 2018, of altering, defacing, blocking, erasing, destroying or concealing information with the intent to prevent disclosure of all or part of the information that the requester would have been entitled to receive. 

The case highlights the compulsory nature of SARs and the serious penalties that can be imposed – on both organisations and individuals – for non-compliance with SAR obligations. Organisations would be well-advised to review their SAR policies and procedures for compliance with the statutory timeframes, to ensure staff receive appropriate and regular training, to maintain accurate records of SARs received and responded to, and to recognise the criminal liability that can result from attempts to obstruct or evade validly made SARs.
 

Court of Appeal confirms low value claims for non-material damage in respect of personal data are viable

The Court of Appeal in Farley and others v Paymaster (1836) Limited (trading as Equiniti) has provided clarity on what can constitute an infringement, and ground a compensation claim, under the UK GDPR and the DPA 2018. The judgment demonstrates that processing errors can constitute infringements of the UK GDPR without any need for a third party to have accessed the data. It further confirms that low-value claims are viable and that data subjects are not prevented from claiming for emotional harm. 

The High Court previously struck out over 95% of the claims brought by Sussex police officers affected by a data breach in 2019, which led to over 750 individuals’ pension scheme details being posted to incorrect addresses. The High Court dismissed the claims on the basis that a claim was only viable if there was proof that the data was ‘opened and read by a third party’. The High Court also stated that a claim could not be based on ‘risk or apprehension’, and a claim had to be for ‘serious distress’ in order to be actionable. 

The Court of Appeal overturned the High Court’s decision and held that disclosure is not necessary to establish infringement. The sending of personal data to the wrong address was found to be sufficient, as it constitutes processing under the UK GDPR. Claimants are therefore not required to prove that the letters were ‘opened and read by a third party’ to have a viable claim. The Court of Appeal further held that there is no minimum threshold of seriousness that distress must meet to be actionable, and that the low value of a claim is an insufficient ground to strike it out. 

We will provide a deeper dive into the implications of this case in due course.
 

CJEU General Court upholds EU-U.S. Data Privacy Framework

The CJEU General Court has dismissed a legal challenge brought by French MEP Philippe Latombe against the European Commission’s adequacy decision for the EU-U.S. Data Privacy Framework (the “Framework”), thereby affirming the validity of the transatlantic data transfer mechanism – at least, for now. Latombe’s action sought to annul the Commission’s July 2023 decision (the “Decision”), which recognised the U.S. as providing an adequate level of data protection for personal data transferred from the EU under the Framework. 

In its judgment, the General Court did not address the substantive merits of the Framework itself, but assessed the validity of the Decision on the facts and law as they stood at the time of its adoption (which accordingly does not take account of any changes implemented by the Trump administration). As a result, the Framework remains in force as the lawful basis for EU-U.S. data transfers, providing much-needed legal certainty for organisations relying on transatlantic data flows. 

This outcome is significant for businesses operating internationally, as it maintains the status quo and avoids immediate disruption to data transfers between the EU and the U.S. However, it is important to note that the Framework may still face further legal scrutiny in the future, and organisations should continue to monitor developments closely.
 

ECJ rules data subjects cannot obtain injunctions to prevent GDPR breaches

In a ruling handed down on 4 September 2025 in IP v Quirin Privatbank AG (C-655/23), the ECJ provided clarification on whether the judicial remedy of an injunction is available to data subjects under the GDPR. The ECJ ruled that the GDPR does not itself entitle data subjects to obtain an injunction before national courts to prevent future breaches of their data subject rights.

The case concerned a job candidate who took legal action against the German bank Quirin Privatbank AG, alleging that the bank had unlawfully disclosed his personal data to a third party during the recruitment process. The candidate sought both an injunction to prevent further unauthorised processing and €1,000 in compensation for non-material damage. On appeal, higher courts upheld the injunction but denied damages for lack of proven non-pecuniary harm. The case raised the central question of whether the GDPR gives a data subject the right to seek an injunction against a controller. 

In March 2025, the Advocate-General (the “AG”) opined that a data subject could seek an injunction against a controller, inferring such a right from Articles 5(1) and 6(1) of the GDPR. The AG’s reasoning was that, because these Articles provide the lawful bases for processing personal data, they include the right to prevent unlawful processing via an injunction. 

On 4 September 2025, the ECJ rejected the AG’s inference of a right of injunction from Articles 5(1) and 6(1), while agreeing with the AG that no such right arises under Article 17 (right to erasure) or Article 18 (right to restriction of processing). 

The ruling confirms that the GDPR itself provides no measures for data subjects to prevent future infringement of their rights by a controller. Rather, Article 82 (right to compensation and liability) of the GDPR provides for compensatory measures, not injunctive protection. The ECJ did, however, stipulate that Member States are not prevented from providing such remedies under national law.
 

ECJ provides guidance on personal data and pseudonymisation

The ECJ has reached an important decision in European Data Protection Supervisor (EDPS) v Single Resolution Board (SRB) (C-413/23), providing clear guidance on the status of pseudonymised personal data and confirming that pseudonymised data is not personal data in every case. 

The case arose from the SRB’s handling of comments submitted by shareholders and creditors of Banco Popular Español SA, following its resolution, which were pseudonymised and transferred to Deloitte (acting in its capacity as a controller) for assessment. Shareholders and creditors complained to the EDPS that they had not been informed in SRB's privacy statement that their data, collected as part of a consultation, would be shared with Deloitte. The EDPS concluded that Deloitte had received their personal data and that SRB had failed to meet its information obligations under Regulation 2018/1725 (the EU data protection regime imposing equivalent obligations to the GDPR on EU institutions). 

The SRB challenged the EDPS’s decision before the CJEU General Court on the basis that it was not required to inform the respondents of such processing, as the transmission of pseudonymised data to Deloitte no longer constituted personal data. The CJEU General Court partially annulled the EDPS’s findings. The EDPS then appealed to the ECJ. 

On 4 September 2025, the ECJ set aside the judgment of the CJEU General Court, referring the case back to it. The ECJ clarified several important points, including to reaffirm that information expressing the personal opinion or viewpoint of an individual necessarily “relates to” a natural person and therefore constitutes personal data. Importantly, it held that pseudonymised data is not automatically personal data in all contexts and for all recipients. Whether data is personal data must be assessed on a case-by-case basis, considering the “means reasonably likely” to be used for re-identification. 

The judgment clarifies that the assessment of whether a data subject is identifiable should be made at the time of data collection and from the perspective of the controller, rather than the recipient, of the pseudonymised data. 

Crucially, the ECJ found that SRB’s duty to provide information to data subjects under Regulation 2018/1725 applied prior to the transfer of data, regardless of whether Deloitte could identify the data subjects after pseudonymisation. 

This judgment provides important clarification on the treatment of pseudonymised data under EU data protection law. For organisations, it means that information rights and transparency obligations (i.e. informing individuals about data processing in their privacy notices) will still apply even where data is pseudonymised before transfer to a third party. Controllers cannot avoid their duties simply by pseudonymising data prior to disclosing it, and they must continue to observe the data protection principles of lawfulness, fairness and transparency. 

Crucially though, the judgment does still leave room for recipients to potentially treat pseudonymised data they receive as non-personal data on a case-by-case basis, to the extent it is not identifiable by them, applying a subjective test. This will be of interest to recipients wishing to make greater use of data.
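
By way of illustration, the Python sketch below shows what the judgment’s distinction looks like in practice: the controller retains the re-identification key, so the dataset plainly remains personal data in its hands, while the recipient sees only pseudonyms. All names and data here are invented.

    import secrets

    # Controller side: replace direct identifiers with random pseudonyms and keep
    # the lookup table ("additional information") separately and securely.
    comments = {"Ana García": "Objection to valuation", "Jan Novák": "No objection"}

    key_table: dict[str, str] = {}      # pseudonym -> real identity (controller only)
    pseudonymised: dict[str, str] = {}  # what the recipient receives

    for name, comment in comments.items():
        pseudonym = secrets.token_hex(8)
        key_table[pseudonym] = name
        pseudonymised[pseudonym] = comment

    # The recipient holds `pseudonymised` but not `key_table`. Whether that dataset
    # is personal data in the recipient's hands turns on the means reasonably likely
    # to be used for re-identification; for the controller, who retains the key, it
    # remains personal data.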
 

Key US Updates

Google ordered to pay $425 million by US federal jury for inadequate privacy policy

A US federal jury has ordered Google to pay $425 million for violating users’ privacy by collecting data when users believed they had opted out. The jury rejected Google’s defence that users were informed and had given consent. The case centred on Google’s privacy policy, which plaintiffs argued was confusing and unclear, while Google claimed it was transparent. 

The dispute involved Google’s Web & App Activity (“WAA”) and supplemental Web & App Activity (“sWAA”) settings. Users thought turning these off would stop data collection, but Google continued collecting data through its software tools. Google argued that its privacy policy and pop-up notifications explained ongoing data collection, but the jury found these disclosures insufficient and not prominent enough. 
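
The compliance lesson for engineers is that an opt-out must be honoured on every collection path, not just the primary one. The Python sketch below illustrates the pattern; the setting name and functions are invented for illustration and are not Google’s actual implementation.

    def send_to_analytics(event: dict) -> None:
        print("sent:", event)  # stand-in for a real analytics pipeline

    def record_event(event: dict, settings: dict) -> None:
        # Check the user's activity setting before any collection, including
        # collection routed through embedded SDKs and advertising tools.
        if not settings.get("web_and_app_activity", False):
            return  # user has opted out: drop the event
        send_to_analytics(event)

    record_event({"page": "/home"}, {"web_and_app_activity": False})  # dropped
    record_event({"page": "/home"}, {"web_and_app_activity": True})   # sent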

Evidence at trial revealed that even Google employees found the policy unclear, and internal emails showed confusion about what data was collected when the controls were off. The trial highlighted the challenge of making privacy policies clear and comprehensive, especially for companies with many products and billions of users. 

The verdict shows that whilst tech giants like Google can absorb a $425 million award, relying on privacy policies to defend against privacy allegations may become increasingly challenging and costly.
 

FTC Chairman concerned that the EU DSA and UK Online Safety Act risk censoring US tech companies

On 21 August 2025, the Chairman of the FTC sent letters to leading US technology companies, reminding them of their duty to protect the privacy and data security of American consumers - even when facing pressure from foreign governments. These letters were sent to major providers of cloud, data security, social media, and messaging services, including Amazon, Apple, Meta, Microsoft, and others. 

Chairman Ferguson highlighted growing concerns that foreign laws, such as the EU’s Digital Services Act (the “DSA”) and the UK’s Online Safety and Investigatory Powers Acts, could push companies to censor content or weaken encryption protections for Americans. He cautioned that complying with such demands could expose US consumers to increased surveillance, identity theft, and fraud. 

Importantly, Ferguson emphasised that U.S. companies must still comply with the FTC Act, which prohibits unfair or deceptive practices. If a company promises strong encryption or data security but secretly weakens those protections in response to foreign demands, it could face enforcement action for misleading consumers. 

The FTC’s message highlights a key challenge for companies attempting to comply with multiple international data laws, which often conflict with each other.

 

Round up of enforcement actions

Company – Authority – Fine/enforcement action: Comment

  • Informa D&B – AEPD – €1.8 million: The information services business Informa D&B has been fined by the Spanish data protection authority for breaching the EU GDPR. Informa unlawfully processed the personal data of self-employed individuals, in breach of Article 6(1) of the EU GDPR, and failed to inform those individuals of the processing.

  • Iberinform Internacional – AEPD – €720,000: Iberinform has been fined for processing self-employed persons’ personal data without a legal basis, in breach of Article 6(1) of the EU GDPR. Iberinform sourced personal data from a third party and used it beyond the original purpose of collection; it also failed to inform the affected individuals of the processing.

  • Vodafone – HDPA – €550,000: Vodafone Greece was fined after a subscriber’s identity was misused to activate multiple pre-paid SIM cards without consent. The failures stemmed from insufficient contracts, supervision and data accuracy controls, putting it in breach of its EU GDPR obligations.

  • Google & Shein – CNIL – €325 million and €150 million respectively: The French data protection authority fined both Google and Shein for failing to comply with cookie rules, after subsidiaries of each group placed cookies on French users’ devices, without their consent, when the users visited the respective sites or created accounts. Google’s US and Irish subsidiaries were fined €200 million and €125 million respectively, whilst the Irish subsidiary of the Shein group was fined €150 million.

  • Azienda Ospedaliero Careggi – Garante – €80,000: This Italian hospital was fined after an inspection revealed that it had breached the EU GDPR in several respects, including failing to update its privacy notices and inform patients of their rights, and failing to obtain informed and specific patient consent for the processing of personal data. It was further found that the hospital had failed to implement appropriate access controls restricting unauthorised persons’ access to health records, to implement an alert system to detect unauthorised access, and to conduct a DPIA.