Data Protection update - August 2022
Welcome to our data protection bulletin, covering the key developments in data protection law from August 2022.
- ICO updates and simplifies guidance on UK Binding Corporate Rules
- ICO guidance for small business on responding to data protection complaints published
- World first: Biometric data code of practice for criminal justice to be passed in Scotland
- DCMS publishes study on businesses and organisations affected by cyber security breaches
- Bank fined for creating enriched profiles for advertising purposes without consent
- Criteo subject to potential fine of $65 million for GDPR violations
- Datatilsynet upholds ban on use of Google Workspace against Municipality of Helsingør
- Privacy complaint targets Google over unsolicited ad emails
- Sensitive data ruling by CJEU could force broad privacy reboot
ICO updates and simplifies guidance on UK Binding Corporate Rules
The Information Commissioner's Office ("ICO") has published updated guidance on the use of binding corporate rules ("BCRs") as a data transfer mechanism for controllers and processors under the UK GDPR.
The ICO recognises that BCR applicants may seek both EU and UK BCRs and that the requirements for both jurisdictions may currently overlap. In response to this, the updated guidance is designed to simplify the UK BCR approval process. Supporting documents and commitments will only be requested once during the UK approval process and the referential table that organisations must complete has been revised.
Organisations already granted approval of their UK BCRs by the ICO will not need to take any further action, although organisations still awaiting UK BCR approval should expect engagement from the ICO based on the new guidance.
ICO guidance for small business on responding to data protection complaints published
The ICO has published a short guidance note to help small businesses deal with complaints about how they've used people's information. Although the guidance is primarily directed towards small charities, clubs or organisations, the principles it sets out for good complaints handling are a useful steer for all businesses. The guidance recommends taking the following steps:
- Step one: businesses should acknowledge receipt of a data protection complaint as soon as possible and let the complainant know what type of information they can expect to receive in response.
- Step two: businesses will then need to assess what has gone wrong by establishing all the relevant facts, as thoroughly, fairly and accurately as possible.
- Step three: if the investigation is likely to take some time, complainants should be provided with regular updates following the initial response.
- Step four: a comprehensive record of the complaint should be made. This should include the date of the data protection complaint, the date a response is due, the actions taken and reference to all relevant documents/conversations.
- Step five: once the investigation has come to a close, businesses should respond to the complainant, explaining any actions that have been taken and their decision-making process. The complainant should also be made aware that they have a right to complain to the ICO.
- Step six: after a response has been sent to the complainant, businesses should take the opportunity to reflect and consider whether they have identified anything to improve on.
World first: Biometric data code of practice for criminal justice to be passed in Scotland
Scotland is nearing approval of the world's first statutory Code of Practice on the use of biometric data for policing and criminal justice (the "Code"). On 7 September, the Code will be brought before Scottish ministers and, if unopposed, could come into effect as early as 16 November. The Code aims to address current gaps in legislation and to guide organisations working in policing and criminal justice through decisions on the adoption of new and emerging biometric applications and technologies.
DCMS publishes study on businesses and organisations affected by cyber security breaches
The Department for Digital, Culture, Media & Sport ("DCMS") has published an in-depth qualitative study which explores organisational experiences of cyber security breaches. The study aimed to understand the level of existing cyber security before a breach; the types of cyber-attacks affecting organisations; how businesses act in the aftermath of a breach; and the impact of such breaches. The DCMS hopes that the findings will help businesses and organisations understand the nature and significance of the cyber security threats they face and what others are doing to stay secure. Some of the key findings were as follows:
- All organisations that participated in the study had suffered a serious cyber security attack within the last four years;
- There was a consensus among participants that cyber-crime is a significant and growing business risk, with cyber attacks increasing in both volume and technical sophistication;
- Relatively few organisations were able to accurately quantify the financial impact of data breaches; and
- Very few organisations implemented a formal 'lessons learned' process in the aftermath of a data breach.
Bank fined for creating enriched profiles for advertising purposes without consent
The supervisory authority of Lower Saxony has fined a bank €900,000 for creating customer profiles, enriched with third-party data, for advertising purposes without consent.
The DPA held that such processing of large amounts of data could not be based on legitimate interests (Article 6(1)(f) GDPR). Processing on that basis requires a balancing exercise between the interests of the controller and the fundamental rights and freedoms of the data subject, in which the controller must take account of the reasonable expectations of data subjects. Here, the DPA held that data subjects could not reasonably expect large amounts of their personal data to be analysed by the controller for the purpose of tailoring its advertising.
The DPA further held that enriching customer profiles with data from a third-party source could not be based on legitimate interests either: such enrichment could link data from all areas of a person's life into a detailed customer profile, something a customer likewise could not reasonably expect.
Criteo subject to potential fine of $65 million for GDPR violations
The publicly traded adtech company, Criteo, has disclosed in its financial filings published on 5 August 2022 that it has been the subject of a proposed fine of approximately $65.4 million for alleged breaches of the GDPR.
While specific details of the investigation and reasons behind the proposed fine are unknown, Criteo’s chief legal officer Ryan Damon issued a statement saying that the firm "strongly disagrees" with the report’s findings, "both on the merits relating to the investigator’s assertions of non-compliance with GDPR and the quantum of the proposed sanction."
This news comes two years after France's supervisory authority, CNIL, launched an investigation into the company's data practices. The investigation arose from a complaint made by Privacy International which raised concerns that Criteo was processing internet users’ personal data – including special category data – without the appropriate user consent frameworks in place, as well as concerns that Criteo was not complying with high-level GDPR principles including fairness, transparency, accuracy and integrity. A final decision on the case and any associated fine is not expected until sometime next year, according to Criteo.
Datatilsynet upholds ban on use of Google Workspace against Municipality of Helsingør
The Danish data protection authority, Datatilsynet, has upheld its ban of 14 July 2022 against the Municipality of Helsingør's use of Google Workspace. The original decision, covered in our July bulletin, found that the use of Google’s Workspace productivity suite was incompatible with the GDPR as a result of Google’s non-compliant international data transfers. In upholding the ban, Datatilsynet specified that it applies until the Municipality brings its processing activities in line with the GDPR and carries out a data protection impact assessment that meets the content and implementation requirements of Articles 35 and 36 of the GDPR.
Privacy complaint targets Google over unsolicited ad emails
Austrian advocacy group noyb.eu has lodged a complaint with CNIL alleging that Google breached a 2021 ruling of the Court of Justice of the European Union by sending direct marketing emails to customers without requesting permission.
Google and CNIL did not immediately respond to requests seeking comment.
Sensitive data ruling by CJEU could force broad privacy reboot
A decision of the Court of Justice of the European Union (the "CJEU") handed down on 1 August 2022 could have major implications for online platforms that use background tracking and profiling to target users with behavioural ads or to feed recommender engines designed to produce 'personalised' content.
This decision arose from a referral from the Lithuanian courts and relates to national anti-corruption legislation that required publication of the names of officials' spouses or partners. The CJEU was asked to consider whether the publication of the name of a spouse or partner amounted to the processing of sensitive data because it could potentially reveal sexual orientation. The Court decided that it does. By implication, the same rule would apply to inferences connected to other types of special category data too: the mere possibility that an inference of special category data could be drawn can be sufficient to require the data to be treated as special category data. In contrast, the UK position as set out in ICO guidance is that such an inference must actually be drawn, or otherwise acted on as if it were true, for the data to be treated as special category. This signifies a possible divergence between the UK and EU approaches to what data is special category, as the CJEU ruling will not apply in the UK.
The judgment has several implications. Large online platforms have traditionally circumvented the strict controls on 'special category' personal data (such as health information, sexual orientation or political affiliation) by relying on a narrower interpretation of that term, while triangulating and connecting large amounts of personal information through behavioural tracking to enable sensitive inferences to be drawn about individuals. The CJEU's ruling indicates that such tracking is likely to intersect with protected interests and therefore entails the processing of sensitive data. Importantly, the CJEU has said that even incorrect inferences fall under the GDPR's special category processing requirements.
A tighter interpretation of existing EU privacy laws, therefore, poses a clear strategic threat to adtech companies and will necessitate significant changes in the use of targeted advertising. It may also have complex knock-on effects in other areas, as it may require the application of an Article 9 exemption to many more types of processing where there is potential for special category data to be inferred: for example, if an inference of religious belief could be drawn from CCTV footage of those entering a church, the judgment may make it harder to justify using CCTV around such sensitive locations.