Data Protection update – December 2022-January 2023

Welcome to our data protection bulletin, covering the key developments in data protection law from December 2022 and January 2023.

Data protection

Cyber security

Enforcement

Civil litigation

Data protection

Irish DPC threatens to take EDPB to court for overreaching its authority following Meta ruling

Following the European Data Protection Board's ("EDPB") binding determinations on two cases related to Meta Platforms Ireland Limited ("Meta"), the Irish Data Protection Commission ("DPC") threatened legal action against its fellow supervisory authorities and the EDPB.

Following an investigation into Meta, the DPC had prepared draft decisions ("Draft Decisions") in which it made a number of findings against the social media giant. However, when these were submitted to the DPC's fellow supervisory authorities for review, as required under the EU General Data Protection Regulation ("EU GDPR"), there was some disagreement over the DPC's conclusions and the proposed fines.

The Draft Decisions

The DPC's investigations into Meta stemmed from the introduction of new Terms of Service on the Facebook and Instagram platforms in May 2018. In these Terms of Service, Meta stated that the lawful basis on which it was processing users' personal data was no longer consent (Article 6(1)(a) EU GDPR), but rather "performance of a contract" (Article 6(1)(b) EU GDPR). Meta considered that acceptance of the Terms of Service (which users had to accept to continue using the platforms) created a contract between the user and Meta. On the basis that such a contract existed, Meta claimed that its processing of personal data for, amongst other things, personalised advertising purposes was a necessary part of that contract, and that it could therefore rely on the Article 6(1)(b) EU GDPR lawful basis to process users' personal data.

Complainants argued that the manner of acceptance of the Terms of Service amounted to a form of forced consent, as users had to choose to accept the terms or stop using the Facebook and Instagram services. It was argued that reliance on such forced consent was in breach of the EU GDPR.

At the conclusion of its investigations into the alleged breaches, the DPC outlined in its Draft Decisions that Meta had failed to satisfy its transparency obligations in relation to the processing (in breach of Articles 12, 13(1)(c) and 5(1)(a) EU GDPR). However, the DPC did not agree that Meta had forced users to consent, and found that Meta could rely on contract as a lawful basis for the processing.

The Draft Decisions were submitted to the DPC's peer supervisory authorities in the EU and EEA (also referred to as the Concerned Supervisory Authorities or "CSAs") for review and approval. Although the CSAs agreed with the finding that Meta had breached its transparency requirements under the EU GDPR, they considered that the fines proposed by the DPC were too low, and some CSAs raised objections to the DPC's findings in relation to the "contract" lawful basis on which Meta was relying.

As a result of these objections, the matters were referred to the EDPB.

The EDPB Determinations

Following the referrals, the EDPB issued its determinations on 5 December 2022 ("EDPB Determinations"). The EDPB Determinations upheld the DPC's underlying position in relation to Meta's breach of its transparency obligations, but found that, as a matter of principle, the "contract" lawful basis could not be used to process personal data for the purpose of behavioural advertising.

The EDPB also directed the DPC to conduct a fresh investigation into the use of personal data on the Facebook and Instagram platforms, in particular the use of special category data, as well as making clear that the level of administrative fines to be imposed on Meta should be commensurate with the severity of the breaches of the EU GDPR.

The final decisions

As the EDPB Determinations were binding, they were adopted by the DPC on 31 December 2022 in the supervisory authority's final decisions ("Final Decisions"). In the Final Decisions, the DPC found that Meta had failed to comply with its transparency obligations under the EU GDPR and was not entitled to rely on the "contract" lawful basis to process personal data for the delivery of personalised advertising on Facebook and Instagram, in breach of Article 6 EU GDPR. The DPC also increased the fines issued to Meta for these breaches, imposing a fine of €210 million in relation to the breaches concerning the Facebook platform and €180 million in relation to those concerning the Instagram platform. The DPC further instructed Meta in the Final Decisions to bring its processing into compliance with the EU GDPR within three months to avoid further penalties.

However, in an eye-catching statement announcing the Final Decisions to the public, the DPC took aim at the EDPB, stating that the EDPB "does not have a general supervision role akin to national courts in respect of national independent authorities and it is not open to the EDPB to instruct and direct an authority to engage in open-ended and speculative investigation". Furthermore, the DPC argued that the EDPB's directions were "problematic in jurisdictional terms" and challenged the consistency of the directions with the "cooperation and consistency arrangements laid down by the [EU] GDPR". 

The DPC has claimed that the EDPB's directions amount to an overreach so severe that the DPC may even consider it appropriate to bring an action for annulment before the Court of Justice of the EU ("CJEU") in order to set aside the directions. This would be an unprecedented course of action by a European supervisory authority. 

The DPC has since issued a relatively small €5.5 million fine to Meta in relation to its WhatsApp platform in a related case (as reported on below), with the DPC making similar assertions relating to directions given by the EDPB and once more threatening legal action in response to these directions.

The EDPB's Determinations on Facebook and Instagram can be read here and here.

Noyb, which made the original complaints against the Meta platforms, has published the DPC's decisions on Facebook and Instagram here and here respectively.

The DPC's public statement regarding the fines can be read here.

These fines follow the €265 million fine issued by the DPC to Meta in November, which we reported on in our November bulletin here.

The decisions raise important points for consideration by anyone relying on contract as a legal basis. As a result, controllers that have bundled agreement to several processing activities into their terms of service should review the legal basis they are relying on for these activities.

Irish DPC holds WhatsApp should not rely on contractual necessity as the legal basis for security and service improvement processing

The DPC has held that WhatsApp Ireland Limited ("WhatsApp") cannot rely on the "necessary for a contract" legal basis for its processing for security and service improvement purposes. It fined WhatsApp €5.5 million for unlawfully processing personal data in breach of Articles 12 and 13(1)(c) of the EU GDPR, finding that WhatsApp had failed to appropriately inform data subjects either of the purposes for which it was processing their personal data or of the lawful bases it was relying upon in doing so. WhatsApp has announced that it intends to appeal the fine.

This fine follows the DPC's recent fines for Facebook and Instagram, covered above in the Data Protection section of this bulletin. The WhatsApp fine is considerably smaller than the €225 million fine issued by the DPC to WhatsApp in September 2021, which related to similar issues, as covered in our August 2021 bulletin. WhatsApp now has six months to remediate these breaches.

This decision should prompt controllers to review the activities for which they rely on contractual necessity as the legal basis.

European Commission publishes draft adequacy decision on EU-US data transfers

As mentioned on our Data Protection Blog at the end of last year, the European Commission published a draft adequacy decision for the EU-US Data Privacy Framework ("Draft Adequacy Decision") on 13 December 2022. This followed the signature of a US Executive Order by President Biden on 7 October 2022 ("Executive Order"), which limited US public authorities' access to personal data, including for criminal law enforcement and national security purposes.

The Draft Adequacy Decision has now been transmitted to the EDPB for its opinion. Meanwhile the European Parliament has a right to review the Draft Adequacy Decision if it sees fit. If formally adopted, the Draft Adequacy Decision would recognise that the US ensures an adequate level of protection for personal data transferred from the EU to organisations in the US that are signed up to the EU-US Data Privacy Framework.

More information on the Executive Order and the Draft Adequacy Decision can be found on our Data Protection Blog, where we will also provide updates on the status of the Draft Adequacy Decision.

Privacy by Design to become new ISO standard

The International Organisation for Standardisation ("ISO") is to adopt a new ISO standard for Privacy by Design, ISO 31700, at the end of January 2023.

The concept of Privacy by Design, which is enshrined in Article 25 of the UK General Data Protection Regulation ("UK GDPR"), requires organisations to ensure personal data and the rights of data subjects are protected throughout their undertakings. The new Privacy by Design ISO standard will include detailed guidance and requirements on how to operate an undertaking in a manner compatible with individuals' data protection and privacy rights.

The introduction of ISO 31700 seeks to further incentivise companies to take a best practice approach when considering their obligations under applicable data legislation, by offering certification to companies that comply with the standard's requirements.

ISO 31700 is expected to be published on 31 January 2023 and, once published, will be available here.

Joint EU declaration outlines digital rights and principles for the Digital Decade

In a joint declaration ("Declaration"), the European Parliament, the Council of the European Union and the European Commission (together referred to here as the "EU") have outlined their plans in respect of digital rights and principles for the Digital Decade Policy Programme 2030 ("Digital Decade") - the EU's vision for digital transformation across the Union.

The Declaration sets out the EU's "political intentions and commitments" as well as examining "the most relevant rights in the context of the digital transformation", and notes in its preamble that it is also intended to act as a guide for policy makers. The six chapters of the Declaration set out the key principles for policy making in the Digital Decade, namely:

  1. Putting people at the centre of the digital transformation;
  2. Solidarity and inclusion;
  3. Freedom of choice;
  4. Participation in the digital public space;
  5. Safety, security and empowerment; and
  6. Sustainability.

Of particular note is the EU's commitment in Chapter V (Safety, security and empowerment) to ensure that "everyone has effective control of their personal and non-personal data in line with EU data protection rules and relevant EU law", which further enshrines the EU's commitment to ensuring the principles of the EU GDPR and the rights of individuals are upheld.

The Declaration also states in Chapter III (Freedom of choice) that everyone should be able to make "their own, informed choices in the digital environment" whilst being able to benefit from the advantages of artificial intelligence ("AI") and algorithms. Amongst the commitments made in the Declaration to this end, the EU commits to ensuring transparency for individuals when AI and algorithms are used, ensuring that algorithms avoid discrimination and are subject to human oversight, and ensuring that AI cannot be used to pre-empt human choice, particularly regarding health, education, employment and private life (all of which may relate to sensitive personal data).

The Declaration can be read in full here. Look out for a more detailed breakdown of the Declaration on our Data Protection Blog in the near future.

UK Government launches consultation on draft identity verification legislation

The UK Government has launched a consultation on draft legislation aimed at facilitating identity verification and the sharing of data for identity verification purposes by public bodies. 

Pursuant to the public service delivery power contained in the Digital Economy Act 2017 ("DEA 2017"), certain public bodies may disclose information to another public body for the purposes of achieving objectives specified in regulations (s 35(1) DEA 2017). The exact public authorities that may disclose information are listed in the DEA 2017 and may also be amended by regulations. 

The proposed Digital Government (Disclosure of Information) (Identity Verification Services) Regulations 2023 ("Identity Verification Regulations") would introduce a new objective into the DEA 2017 to enable data sharing between government departments. This data sharing would form part of a broader digital identity verification scheme to allow individuals to prove their identity online and use their digitally created identities to access public services. The Identity Verification Regulations would also add certain public bodies to the list of those who can share information under DEA 2017, including the Cabinet Office, the Department for Transport, the Department for Environment, Food and Rural Affairs and the Disclosure and Barring Service.

Under the proposals, individuals would present personal data to be validated against data already held by public bodies in order to confirm their identities. According to the consultation, only the minimum data necessary to validate an individual's identity will be requested and, once an individual's identity is confirmed, public authorities will only be able to share data if they have completed, amongst other things, a data protection impact assessment and a security plan.

The data sharing powers will apply across England, Scotland and Wales if passed into law. The consultations on the proposals and the Identity Verification Regulations end on 1 March 2023. If you wish to respond to the consultations, you may do so online here. The provisional timeline set out in the consultation paper would see the Identity Verification Regulations enacted in October 2023, with data sharing in place by December 2023.

UK and DIFC commit to strengthening data partnership

In a joint statement published on 15 December 2022, the UK Government and the Dubai International Financial Centre Authority ("DIFCA") have announced plans to strengthen the data partnership between the UK and the Dubai International Financial Centre ("DIFC").

Around 16% of the financial service companies operating in the DIFC, Dubai's principal financial hub and free zone, originate from the UK. Given the high proportion of UK companies operating in the DIFC, smooth and secure data flows between the jurisdictions are essential. Since August 2021, the UK Government and the DIFCA have been in discussions to establish a data bridge between the jurisdictions whilst ensuring that personal data transferred to the DIFC is effectively protected under local data protection laws.

As part of the renewed partnership, the joint statement notes that the UK Government and the DIFCA are working towards a new Memorandum of Understanding on data aimed at improving data protection regulatory cooperation. The UK is currently in the final stages of its adequacy assessment of the DIFC, and improved collaboration between the regulatory authorities in both jurisdictions is likely to increase the prospect of the UK deeming the DIFC an adequate jurisdiction for data transfers.

The joint statement by the UK Government and the DIFCA can be read here.

UK-Japan Digital Partnership established; flow of data between UK and Japan to be reviewed

In addition to the UK-DIFC data partnership, the UK Government has also announced a digital partnership with Japan (the "Partnership"). In a statement released on 7 December 2022, the UK Government confirmed that the Partnership will serve as a non-binding strategic framework for the respective governments of the UK and Japan to "jointly deliver concrete digital policy outcomes for their citizens, businesses and economies".

One of the key goals of the Partnership is to develop cross-border data flows between the countries through improved regulatory cooperation, data innovation and championing the safe international flow of data. Japan is currently considered to provide an adequate level of protection for personal data transferred from the UK, following the European Commission's 2019 adequacy decision. Similarly, Japan has designated the UK as a safe destination for transfers of data from Japan. Given these respective adequacy decisions, the Partnership will target increased collaboration on data innovation, such as sharing information on Japan's Trusted Web initiative, a distributed data management system, and increasing the security and resilience of data infrastructure in both the UK and Japan.

The UK Government's outline of the Partnership can be read in full here.

Cyber security

Investigations launched into Twitter following data breaches

The DPC has launched an investigation into Twitter International Unlimited Company ("Twitter") following media reports that datasets containing the email addresses and telephone numbers of approximately 5.4 million Twitter users were made available on the internet as a result of a hack.

Following conversations between Twitter's Dublin branch and the DPC, Twitter claims to have identified the vulnerability that was the source of the breach, which has since been notified to the DPC, and to have replied to the additional enquiries raised by the supervisory authority.

Having considered Twitter's responses, the DPC concluded that Twitter may have infringed one or both of the EU GDPR and the Irish Data Protection Act 2018. Consequently, the DPC has decided to commence further investigations to determine whether Twitter complied with its obligations as a data controller in relation to the processing of its users' personal data or whether the social media giant infringed any data protection laws in this instance.

Twitter has subsequently been the alleged victim of two larger-scale attacks, purportedly resulting in the personal data of 400 million and 200 million Twitter users respectively being leaked on hacker forums. In a statement on its privacy centre blog, Twitter has claimed that both of these alleged leaks were in fact collections of publicly available information and not the result of any hack.

The DPC's brief statement on its investigation can be read here.

Royal Mail unable to send post overseas following Russia-linked ransomware attack

The UK's primary postal service, Royal Mail, suffered severe disruption to its overseas delivery service following a ransomware attack, leaving it unable to send letters and parcels overseas between 11 and 18 January 2023.

The ransomware attack affected the preparation and tracking of mail for despatch abroad by scrambling the software on machines used to send international parcels. The impact extends well beyond the company and its own customers: the incident has disrupted the communications and businesses of people all over the world.

The attack has been claimed by LockBit, a Russia-affiliated criminal group, which gained access to Royal Mail's printers to print ransom notes threatening to publish stolen data if Royal Mail did not pay a ransom.

The Royal Mail has refused to pay the ransom and has reported the incident to the National Cyber Security Centre ("NCSC"), the National Crime Agency ("NCA") and the Information Commissioner's Office ("ICO"), with the ICO confirming its plans to investigate the data breaches. The NCA released a statement confirming that it is working with the NCSC to understand the impact of the ransomware attack.

As a result of the attack, Royal Mail has asked customers to refrain from sending international letters and parcels until the situation is resolved, and has warned that some may experience delays or disruption to items already shipped. The company has since confirmed, however, that it has resumed international despatches for parcels and letters sent before the ransomware attack and is continuing to work towards the full resumption of its services.

The Royal Mail's bulletin on the incident, which is updated regularly, can be read here.

TSB Bank fined £48.65 million following 2018 IT crash

TSB Bank plc ("TSB") has been fined a collective £48.65 million by the Financial Conduct Authority ("FCA") and Prudential Regulation Authority ("PRA") following a regulatory investigation into a 2018 IT crash that resulted in customers not being able to access TSB's banking services.

The crash, which resulted from a failed attempt to migrate TSB's systems to a new platform ("Migration"), left a "significant portion" of TSB's customers affected by issues with the continuity of TSB's banking services. It took a total of eight months for TSB's services to return to normal operation, with TSB having to hire over 2,000 new staff members to fix the issues caused by the Migration and paying out £32.7 million to affected customers to date.

Although the FCA noted, in its press release detailing the fine, that the Migration was "an ambitious and complex IT change management programme" it also noted the high-risk nature of the Migration, as well as TSB's failure to manage the Migration and the risks arising from its IT outsourcing arrangements adequately.

The FCA's Executive Director of Enforcement and Market Oversight, Mark Steward, noted in particular that TSB "failed to plan for the IT migration properly, the governance of the project was insufficiently robust and [TSB] failed to take reasonable care to organise and control its affairs responsibly and effectively, with adequate risk management systems".

The total was made up of a £29.75 million fine issued by the FCA and an £18.9 million fine issued by the PRA. These sums would have been higher had TSB not agreed to cooperate with the regulators to resolve the matter, which qualified it for a 30 per cent discount on the final penalties imposed.
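As a rough arithmetic check (our own calculation, rather than figures taken from the regulators' notices), a 30 per cent discount on the final penalties implies pre-discount fines of:

  £29.75 million ÷ 0.7 = £42.5 million (FCA)
  £18.9 million ÷ 0.7 = £27 million (PRA)

This would put the combined pre-discount total at £69.5 million.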

For more details on the issues caused by the Migration and the investigations by the regulators, the FCA's final notice can be read here and the PRA's final notice can be read here.

Passwords stored in Norton Password Manager may be at risk following cyber-attack

Nearly a million users of Gen Digital Inc.'s NortonLifeLock have had their accounts exposed following a cyber-attack in December 2022. The hackers utilised usernames and passwords exposed in attacks on other sites to access the NortonLifeLock users' accounts.

As well as access to users' anti-malware and anti-virus settings, the hackers may also have had access to users' Norton Password Manager accounts, which act as a virtual vault for an individual's passwords for other websites. This could have exposed the login credentials for users' banking, cryptocurrency and other trading accounts, which was likely the motivation behind the attack.

Gen Digital Inc. has confirmed that it became aware of the attack on 12 December 2022 following the detection of an unusually large volume of failed login attempts, and has since advised users to reset their passwords and take other steps to secure their accounts, such as setting up two-factor authentication (also known as 2FA).

Code of Practice to improve app privacy and security released by UK Government

The UK Government has published what it describes as a "world-first" voluntary code of practice to improve consumer protections when using apps ("Code"). The Code is aimed at three groups of stakeholders ("Stakeholders"), who will be the primary bodies responsible for its implementation. These are:

  1. App store operators, who are the persons or organisations responsible for operating an app store and who have capability to remove and add apps to the store;
  2. App developers, who are the persons or organisations that create or maintain apps and who have responsibility for ensuring their apps meet the requirements of the store and any legal requirements; and
  3. Platform developers, who are the persons or organisations responsible for producing operating systems, functionality and interfaces to enable third parties to implement additional functionality through apps.

The Code is based on eight core principles, namely:

  1. App store baseline security and privacy requirements;
  2. App baseline security and privacy requirements;
  3. Vulnerability disclosure process;
  4. Keep apps updated;
  5. Information for users;
  6. Guidance for developers;
  7. Feedback for developers; and
  8. Handling personal data breaches.

Underlying these principles are specific duties and responsibilities that apply to the relevant Stakeholders. For example, the Code requires Stakeholders to operate processes allowing security experts to report vulnerabilities in app software, as well as requiring Stakeholders to ensure that necessary app security updates are signposted to users and that security and privacy information is available to users in a clear and understandable way.

Importantly, the Code also requires all Stakeholders to comply with the broader requirements of data protection law including the UK GDPR and the Data Protection Act 2018.

The UK Government plans to work with the app industry to support the implementation of the Code over the next nine months, as well as exploring whether any existing legislation can be extended to cover apps or whether new regulation is required in relation to the Code.

Companies that adopt and comply with the Code will be able to declare this on their websites or app stores.

The Code can be read in full here.

Australian telecoms giant Telstra accidentally discloses over 100,000 customers' personal data

Australian telecoms giant Telstra Corp Ltd ("Telstra") has disclosed the data of 132,000 customers online due to an internal error, which it says amounted to a "misalignment of databases". Although Telstra has confirmed that this data breach was not the result of a cyber-attack, Australia has recently been subject to a number of high-profile data leaks and cyber-attacks, including the September 2022 cyber-attack against Optus, which alone compromised the personal data of 10 million customers, and a similar attack against Telstra in October 2022 that affected up to 30,000 current and former staff members.

Enforcement

Fines issued by the ICO tripled in 2022

Research has established that the total value of the fines issued by the ICO tripled in 2022 compared to the previous year. Two of the largest fines issued were the £4.4 million fine given to Interserve Group Limited, as reported in our October 2022 bulletin, and the £7.5 million fine given to Clearview AI Inc, as reported in our May 2022 bulletin. With a £27 million fine on the horizon for TikTok Inc and TikTok Information Technologies UK Limited for unlawful processing of children's data, as discussed in our September 2022 bulletin, this trend seems likely to continue.

ICO to publish all reprimands online going forward

The ICO has confirmed that, going forward, it will publish all reprimands on its website, including retroactively those issued from January 2022 onwards. This reflects the ICO's strategic approach to regulatory action, as summarised by John Edwards, the UK Information Commissioner: "members of the public, and those affected by a breach or infringement, are entitled to know that we have held the business or organisation to account, and that they have changed their practices as a result". Currently, only Enforcement Notices, Monetary Penalty Notices and summaries of the ICO's audit reports are published by the ICO.

In its announcement, the ICO flagged that reprimands represent the ICO taking action to raise data protection standards. The ICO wishes to be transparent with the public when it holds organisations to account, and about the exact steps those organisations must take to become compliant, so that they can be held accountable in this regard. The addition of reprimands to the website will provide useful guidance as to the standards expected by the ICO and the practices on which it is focussing its enforcement action.

Five companies making half a million unlawful marketing calls are fined £435,000 by ICO

The ICO has fined five companies a total of £435,000 for making almost half a million unlawful direct marketing calls to people registered with the Telephone Preference Service, in breach of the Privacy and Electronic Communications (EC Directive) Regulations 2003.

The ICO found that the companies were calling in an attempt to sell insurance products and services related to white goods, such as washing machines, kitchen appliances and boiler cover.

According to the investigation, in addition to the unlawful direct marketing calls, the companies were also targeting a specific vulnerable group of people: homeowners over the age of sixty with a landline. In some cases, the complaints were from individuals with dementia whose money was taken from their bank accounts.

There was also evidence of false and misleading statements being made and the use of pressure tactics during the calls. The tactics were aimed at getting the elderly people to provide payment details to the callers.

According to Andy Curry, Head of Investigations at the ICO, the ICO "will not stop investigating and taking robust action against companies, to protect people and especially the vulnerable". In his words, "the pressure tactics, and sometimes false or misleading statements these companies used were completely unacceptable".

The details of the enforcement action taken against each company can be read here.

Meta applies to annul EDPB decision regarding Instagram GDPR child privacy infringements

Meta has applied to annul the Binding Decision 2/2022 ("Decision") of the EDPB which, as discussed in our September 2022 bulletin, related to alleged breaches of the EU GDPR in Instagram's processing of children's personal data.

Meta's application proceeds on the basis of four core arguments, namely that the EDPB: (1) exceeded its competence; (2) disregarded the legitimate interests of the affected data subjects in reaching its finding that Meta could not rely on the legitimate interests lawful basis at Article 6(1)(f) of the EU GDPR; (3) did not conduct a procedurally fair assessment when reaching its decision; and (4) breached Article 83 of the EU GDPR in determining the fine to which Meta should be subject.

If the Decision is annulled, this is likely to support Meta's challenge to the €405 million fine issued to Instagram by the DPC for the breaches of the EU GDPR, as covered here. This was the second largest fine ever imposed for EU GDPR breaches, behind only Amazon's 2021 fine issued by the Luxembourg National Commission for Data Protection.

Meta's application to the EU General Court can be read in full here.

Apple, Microsoft and TikTok fined millions by CNIL for unlawful tracking and cookie violations

The French data protection supervisory authority ("CNIL") has fined Apple Distribution International ("Apple") €8 million, Microsoft Ireland Operations Limited ("Microsoft") €60 million and TikTok UK and TikTok Ireland (together "TikTok") €8 million for unlawfully tracking users and placing cookies for the purposes of serving digital advertising, in breach of Article 82 of the French Data Protection Act ("Article 82"), which implements the e-Privacy Directive.

Apple's fine related to unlawful tracking. The CNIL found that, on an older version of the iPhone's operating system (iOS 14.6), advertising identifiers were automatically "pre-checked" without obtaining users' consent. The placing of these identifiers breached Article 82 as they were not necessary for the provision of the service. The €8 million fine reflected the profit Apple had made indirectly from advertising revenues as a result of the data collected via the identifiers, the number of French data subjects impacted and the remedial steps taken by Apple.

In Microsoft's case, the CNIL found that it had breached Article 82 by placing cookies on users' systems without their consent via bing.com. Similarly, TikTok was found to have committed a breach by failing to provide users with an option to refuse the placement of cookies. TikTok and Microsoft both remedied their breaches through the introduction of a "REJECT ALL/REFUSE ALL" button in 2022.

The CNIL's statement detailing the action taken against: (i) Apple can be read in full here; (ii) Microsoft can be read in full here; and (iii) TikTok can be read in full here.

Civil litigation

High Court Judge finds that ICO is not obliged to investigate and reach a final conclusion on every complaint

In a judicial review brought by Ben Delo, the founder of the cryptocurrency exchange BitMEX, Mostyn J has ruled that the ICO is not under an obligation to investigate every complaint to the extent necessary to reach a final conclusion.

Mr Delo's judicial review related to a decision of the ICO on his complaint that Wise Payments ("Wise"), formerly known as TransferWise, had breached its duties under the UK GDPR by allegedly failing to provide the entirety of the personal data it held regarding Mr Delo following a data subject access request ("DSAR").

Wise had provided Mr Delo with some of the personal data relating to him, but not a copy of a suspicious activity report that it had submitted to the NCA. Wise maintained that it was not in breach of its duties under the UK GDPR as it was entitled to rely on an exemption under money-laundering rules.

Mr Delo submitted two complaints to the ICO, which the ICO determined required no further action. In light of these decisions, Mr Delo issued a claim for judicial review in order to obtain:

  1. A quashing order to quash the ICO's decisions;
  2. A mandatory order requiring the ICO to reopen its investigation; and
  3. A mandatory order requiring the ICO to remake its decision.

Mostyn J noted that, although his decision in the instant case had become academic by the time it was heard (as Wise had by that stage provided Mr Delo with all the personal data regarding him), it was still in the public interest for the hearing to proceed, as the extent of the ICO's obligation to reach a final conclusion on every complaint had not been considered in domestic or European case law since the GDPR came into force.

Mostyn J rejected Mr Delo's claim and held that the legislation requires the ICO to receive and consider complaints from data subjects, but also allows the ICO "broad discretion" as to whether it investigates those complaints further and the extent to which it does so. The ICO was, in his view, "an expert Regulator who is best placed to determine on which cases he should focus".

Mostyn J considered that the ICO had dealt with Mr Delo's complaints "to the letter". It was not obliged to request further documentation from Wise or to reach a definitive conclusion that Wise had complied with its duties under the UK GDPR; the ICO could conclude on the information available that it "appeared likely" that Wise had been compliant.

Mostyn J also noted that the ICO would have been aware that Mr Delo could bring a civil claim to obtain the undisclosed personal data, and opined that the availability of this route was a further justification for the ICO's decision not to take further action.

The judgment can be read in full here.

CJEU rules that data controllers must provide data subjects with details of recipients to whom their personal data has been disclosed

The CJEU has ruled that data controllers are obliged to provide data subjects with the identities of the recipients to whom the controller has disclosed personal data in response to DSARs.

Following a request for a preliminary ruling from the Austrian Supreme Court, the CJEU held that Article 15(1)(c) of the EU GDPR ("Article 15(1)(c)") must be interpreted as requiring controllers to provide data subjects with the specific identities of recipients who have received their personal data in response to a DSAR. The CJEU held that the right of access is necessary to enable the data subject to exercise other rights, such as the right to rectification or erasure. In the judgment, the CJEU found that informing the data subject of the specific identities of the recipients to whom their personal data has been disclosed effectively ensures the exercise of these rights.

However, the CJEU made clear that data controllers do not have to provide the specific identities of the recipients if:

  • It is impossible to identify the recipients; or
  • The data controller demonstrates that a data subject's request is manifestly unfounded or excessive.

Should it not be possible to provide the data subject with the specific identities of the recipients, the CJEU's decision makes clear that the data controller may provide the data subject with the categories of recipient only.

The CJEU emphasised that "the right to the protection of personal data is not an absolute right" and that, in some circumstances, it may be impossible to provide the specific identities of the recipients, in which case the provision of the categories of recipients would be appropriate.

The ruling from the CJEU clarifies the interpretation of the right under Article 15(1)(c). It remains unclear whether other rights under the EU GDPR, which similarly allow information to be provided to data subjects at two possible levels of detail, will be interpreted in a like fashion. For some organisations, providing data subjects with a list of all specific recipients of their personal data could be a particularly onerous task, more so if this ruling were also applied in the context of a data subject's right to be informed, which would require a specific list of recipients within a privacy notice.

The CJEU's judgment can be read in full here.

CJEU requires Google to remove data when proven to be "manifestly inaccurate"

The CJEU has ruled that a party requesting the removal of personal data due to inaccuracy does not need to evidence that inaccuracy in the form of a judicial determination, and that, when considering requests for the removal of personal data from image search results, the website operator to whom the request is directed must take into account the informative value of the photographs and any text elements accompanying them.

The request for a preliminary ruling stemmed from proceedings before the German Federal Court of Justice in which two individuals, managers of a group of investment companies, made de-referencing requests to Google LLC ("Google") after a search of their names led to an article critical of their group's investment model. The applicants claimed that the content of the article was inaccurate and defamatory. Google denied the request on the basis that it had no awareness of the article's alleged inaccuracy, referring also to the professional context to which the content related.

The CJEU held:

  • A party requesting de-referencing has an obligation to "establish the manifest inaccuracy of the information". However, this does not require them to produce a judicial decision against the publisher regarding the accuracy of the published information; and
  • Operators of a search engine "must ascertain whether displaying the photographs in question is necessary for exercising the right to freedom of information of internet users". It held there must be a weighing-up of the competing rights and interests depending on whether the scenario concerns articles containing photographs or photographs displayed in a list of results in the form of thumbnails. The CJEU noted that thumbnails displayed following a search of a person's name constituted a "particularly significant interference with the data subject's rights to private life and that person's personal data".

This case sheds some light on the commonly disputed exemptions to the right to be forgotten, and on the interpretation of the CJEU when balancing this right under the EU GDPR against the fundamental rights to privacy and freedom of information. The CJEU has emphasised in this judgment that the right to protection of personal data is not absolute and a balancing exercise with other fundamental rights must take place when considering it.

The CJEU's judgment can be read in full here.

Meta settles Cambridge Analytica scandal case for US$725 million

Meta has agreed to pay US$725 million, the largest settlement recovered in a US data privacy class action, to resolve a class action filed by a large class of Facebook users alleged to number between 250 and 280 million.

Meta was accused of allowing Cambridge Analytica, the consulting firm well known for, amongst other things, working on Donald Trump's 2016 presidential campaign, to access 87 million Facebook users' personal data without consent. The class action focused on four principal complaints:

  1. Apps' access to the data of a Facebook user's friends once the user had installed the app;
  2. Facebook's continued friend sharing with "whitelisted apps" after it claimed to have ended friend sharing;
  3. Facebook's sharing of sensitive personal information with business partners without disclosing this to users or their friends; and
  4. Facebook's failure to restrict or monitor third parties using sensitive personal information obtained from Facebook despite its stated policies.

Meta made no admission of wrongdoing in agreeing the settlement. According to the plaintiffs' notice of motion and motion to certify a settlement class and grant preliminary settlement approval, the alleged unlawful data-sharing practices have either been ended by Facebook or are subject to extensive monitoring under a 2020 US Federal Trade Commission ("FTC") consent order.

Epic Games agrees to settle 'Fortnite' claims for US$520 million

Epic Games ("Epic"), the creator of Fortnite, one of the world's most popular computer games, has agreed to pay US$520 million in a settlement over claims that it violated data protection laws.

The claims, which were brought by the FTC, alleged that Fortnite unlawfully collected the personal data of children and manipulated users into making unintentional purchases. In addition, the FTC alleged that Fortnite breached US child protection laws by not requiring parental consent for children under the age of 13 to access the game.

According to the FTC, Epic received millions of dollars from unauthorised purchases as a result of a "counterintuitive, inconsistent, and confusing button configuration", while privacy-invasive default settings misled users. In response, Lina Khan, the Chair of the FTC, stated that "protecting the public, and especially children, from online privacy invasions and dark patterns is a top priority for the Commission".

Although Epic has not admitted to any wrongdoing, it has implemented changes to its payment system, including requiring explicit consent from users to save their purchasing details and making it easier for users to obtain refunds.