Data Protection update - December 2021/January 2022
Welcome to our data protection bulletin, covering the key developments in data protection law from December 2021 and January 2022.
Stephenson Harwood's data protection hub is now live. The hub is a 'go-to' online resource where you can find all our data protection content in one place, including useful materials such as our handy pdf overview of the UK GDPR, guidance on data protection hot topics and our regular insights.
- IDTA and UK Addendum laid before UK Parliament
- Right of Access Guidelines adopted by EDPB
- Draft Immigration Exemption published
- European Data Governance Act
- UAE introduces new data protection law
- UK Government updates on data protection regime changes
- UK Government launches new National Cyber Strategy
- ICO publishes paper on end-to-end encryption
- UK Government introduces Product Security and Telecommunications Infrastructure Bill
- Google and Facebook fined by the French data protection agency
- European Parliament found to have engaged in unlawful data transfers and cookie consent practices
- Austrian Data Protection Authority (the "Austrian DPA") rules that the continuous use of Google Analytics violates the GDPR
- European privacy campaign group, noyb, files GDPR complaint against Amazon and Airbnb relating to allegedly unlawful automated decision making
- ICO fines update: EB Associates fined £140,000 by ICO for illegal cold calls
- Europol ordered to delete data concerning individuals with no criminal link
- Permission granted to serve proceedings out of the jurisdiction on US-based news organisation
- ECJ indicates support for consumer protection associations
- WhatsApp sues European Data Protection Board over €225 million fine
- The Gothenburg Court of Appeal upholds a fine of €4,800,000 imposed on Google by the Stockholm Administrative Court
IDTA and UK Addendum laid before UK Parliament
On 28 January 2022, the Secretary of State for the Department for Digital, Culture, Media & Sport laid before Parliament the UK's new transfer tools for international transfers of personal data under the UK GDPR. These tools are key for organisations transferring or receiving personal data subject to the UK GDPR and follow the consultation run by the Information Commissioner's Office ("ICO") last year.
The specific tools which have been presented are: (i) the international data transfer agreement ("IDTA"); (ii) the international data transfer addendum to the European Commission's Standard Contractual Clauses ("SCCs") for international data transfers (the "Addendum"); and (iii) transitional provisions (together, the "Tools").
The IDTA is intended to be used as a safeguard, to comply with Article 46 of the UK GDPR, for data transferred under the UK GDPR as a replacement for the old SCCs or "model clauses". The Addendum is proposed to accompany the EU SCCs to accommodate the UK GDPR. The Addendum may be particularly useful to organisations subject to both the UK and EU GDPR looking to have efficient data transfers documentation.
The IDTA and the Addendum are available for organisations to review here, but it must be noted that they are subject to Parliamentary approval and will not enter into force until 21 March 2022. We also note the published consultation responses in respect of the IDTA and the Addendum, and look forward to further guidance from the ICO about the Tools. In particular, the ICO will be publishing further guidance on conducting a transfer risk assessment ("TRA") and on international transfers of personal data more generally. The draft TRA previously published by the ICO has not yet been finalised. We have previously reported on the importance of the IDTA and the Addendum in our deep-dive on the draft proposals for international transfers post-Brexit here.
We will be publishing a more in-depth review of the new Tools, but our initial key takeaways are:
- The new Tools may be used as soon as they come into force, from 21 March 2022. The ICO website even states that they may be used "immediately", although if used prior to Parliamentary approval, this would come with a degree of risk.
- Under the transitional provisions, the old Directive EU SCCs will continue to be a valid UK GDPR transfer safeguard until 21 March 2024, provided that the relevant contract was concluded on or before 21 September 2022 and the processing operations remain unchanged. The ICO has confirmed that the reference to 21 September 2021 was an error.
- This presumably means that the new Tools will be mandatory for new UK transfers for contracts concluded on or after 22 September 2022. This gives a six-month window to start using the Tools for new UK transfers and a total of two years to remediate existing UK SCCs. Organisations already upgrading to the new EU SCCs may wish to align the timing of their UK remediation exercise with that of their EU SCCs remediation.
- The Addendum continues to take the approach that the new EU SCCs may be used for UK GDPR transfers, with minor amendments, but has been formalised and fleshed out compared to the draft version previously available. It also allows for incorporation of the relevant amendments by reference, which may help to cut down its overall length.
Right of Access Guidelines adopted by EDPB
During the January plenary session of the European Data Protection Board (the "EDPB"), the EDPB adopted guidelines on the rights of data subjects to access their personal data (the "Guidelines").
The Guidelines provide information in relation to: (i) the scope of the right of access; (ii) the information the controller has to provide to the data subject; (iii) the format of the request; (iv) the modalities for providing access; and (v) the notion of manifestly unfounded or excessive requests. All of these areas can cause issues for controllers receiving requests for access from data subjects. The Guidelines aim to analyse the various aspects of the right of access and to provide more precise guidance on how the right of access has to be implemented in different situations by providing a series of examples and variations.
The Guidelines also helpfully provide a series of questions which data controllers can ask themselves when they receive a request. These are annexed to the Guidelines as a flow chart and group the stages of responding to a request as: (i) interpreting and assessing the request; (ii) considering how to answer the request; and (iii) checking limits and restrictions. This structure will be helpful for organisations subject to the EU GDPR which are looking for support when considering how to respond to a right of access request.
The Guidelines will be subject to public consultation for a period of six weeks from being adopted.
Draft Immigration Exemption published
The UK Government has proposed a new draft immigration exemption (the "Draft Exemption") to replace the previous immigration exemption (the "Previous Exemption") in para 4 of Schedule 2 of the Data Protection Act 2018. For more background, please refer to our previous report and update on the Previous Exemption.
The Previous Exemption disapplied certain data protection rights where the processing was carried out for immigration purposes and the Home Office considered that the processing might "prejudice the maintenance of effective immigration control". However, in R (Open Rights Group and another) v Secretary of State for the Home Department and another [2021] EWCA Civ 800, the Court of Appeal held that the Previous Exemption was incompatible with Article 23(2) of the GDPR. The Court of Appeal suspended its declaration of the Previous Exemption's unlawfulness until 31 January 2022 to enable the UK Government to amend the Previous Exemption, which it now seeks to do through the Draft Exemption. The Draft Exemption will therefore need to enter into force by 31 January 2022 to meet the deadline in the Court of Appeal's judgment.
The Draft Exemption contains key changes to the Previous Exemption, including:
- Requiring the Secretary of State to have an immigration policy document in place, to explain the policies and processes for determining whether and to what extent the Draft Exemption applies in a case;
- Requiring the Secretary of State to decide its application on a case-by-case basis through applying this policy document; and
- Recording any decision taken in respect of the application of the Draft Exemption and informing the data subject of that decision (unless it would prejudice specific matters).
European Data Governance Act
On 30 November 2021, the European Commission, Council of the EU and the European Parliament announced their political agreement on a European Data Governance Act ("DGA"). The DGA was first proposed in November 2020 and, once it enters into force, will form the basis of a new European approach to data governance in accordance with EU laws on data protection, consumer rights and competition.
The DGA aims to facilitate the safe reuse of certain public sector data including trade secrets, personal data and data protected by IP. With this in mind, the DGA includes:
- Measures to facilitate trust in the sharing of data;
- Rules on neutrality to enable intermediaries to be organisers of data sharing;
- Methods to facilitate reusing certain data (such as health data) to advance research; and
- Procedures which give data subjects more control over their data by making it safer and easier to voluntarily share data further under clear conditions.
The DGA is now subject to adoption by the European Parliament and endorsement by the European Council and will be followed by a second major legislative proposal focussed on greater data sharing: the Data Act. The European Commission has announced that the results of a consultation on the Data Act (which ran from 3 June to 3 September 2021) will be announced shortly.
Both of these landmark pieces of legislation are outcomes of the European Strategy for Data which is intended to provide common European data spaces to improve the availability of data across society.
UAE introduces new data protection law
The United Arab Emirates ("UAE") has introduced its first federal data protection law: Federal Decree-Law No. 45 of 2021 on the Protection of Personal Data.
Some of the content of the UAE's new data protection law appears to take inspiration from recent data protection law introduced in the Dubai International Financial Centre (DIFC Law No.5 of 2021) and the GDPR. For example, the concepts of personal data, processing, controllers and processors, data subject rights, and fundamental data protection principles are all present in the new UAE law. However, there are areas where the new data protection law is different to the GDPR including fewer transparency obligations for controllers.
Look out for our forthcoming analysis of the new UAE law on our data protection law hub.
UK Government updates on data protection regime changes
The UK Government's publication "The Benefits of Brexit" includes an update on the consultation into the UK's data protection regime (which we reported on in our September update here). The publication says that the UK will create an "ambitious, pro-growth and innovation-friendly" data protection regime, and that the responses to the consultation will be published in Spring 2022 ahead of legislative change.
UK Government launches new National Cyber Strategy
The UK Government published its new National Cyber Strategy (the "Strategy") on 15 December 2021. The Strategy replaces the previous cyber security strategy (spanning 2016-2021) and commits £2.6 billion of investment into solidifying the UK's position as a "Science and Tech Superpower".
The Strategy has five pillars which represent the broad priority actions to be achieved by 2025. These are:
- Strengthening the UK cyber ecosystem, investing in people and skills and deepening the partnership between government, academia and industry;
- Building a resilient and prosperous digital UK, reducing cyber risks so businesses can maximise the economic benefits of digital technology and citizens are more secure online and confident that their data is protected;
- Taking the lead in the technologies vital to cyber power, building our industrial capability and developing frameworks to secure future technologies;
- Advancing UK global leadership and influence for a more secure, prosperous and open international order, working with government and industry partners and sharing the expertise that underpins UK cyber power; and
- Detecting, disrupting and deterring adversaries to enhance UK security in and through cyberspace, making more integrated, creative and routine use of the UK's full spectrum of levers.
There are specific outcomes and objectives detailed under each of the pillars, covering a broad range of sectors. Seven technologies identified in the Strategy will be specifically targeted, including 6G, AI, blockchain, internet of things and quantum technologies. To support the Strategy, new organisations will be established, including a National Cyber Advisory Board to "bring together senior leaders from the private and third sectors to challenge, support and inform the government's approach to cyber" and a National Laboratory for Operational Technology Security to enable testing, exercising and training on critical technologies to build capability in this area.
The Strategy will interface with other primary legislation, such as the National Security and Investment Act 2021, the Product Security and Telecommunications Infrastructure Bill and the Network and Information Systems Regulations. In doing so, the Strategy will ensure cyber security is appropriately considered for the next iteration of interconnected devices and, as we reported here, support the ICO's capability to ensure digital providers understand the risks associated with the provision of their services. The Cabinet Office seeks feedback on the Strategy and will provide annual reports on its implementation.
ICO publishes paper on end-to-end encryption
In November 2021, the ICO published a paper on the governance of end-to-end encryption ("E2EE"). The paper explains that E2EE is a technical measure which avoids some of the risks to which non-E2EE systems can be vulnerable. E2EE is also considered important for sharing commercial information because of its enhanced information security. The paper sets out the ICO's view that E2EE is a key enabler for compliance with data protection law and is directly relevant to the principles of integrity and confidentiality. According to the ICO, organisations must have a strong justification for not using E2EE methods, given its effectiveness and the confidence it inspires in data subjects.
However, the paper also recognises that E2EE makes the detection of harmful content more difficult, which is a key concern (particularly when the content is accessed by children). The ICO's paper identifies key factors which should be considered in order to reconcile these competing demands. These factors are:
- The demand from consumers for services that safeguard their privacy;
- The requirements that existing legislation places on businesses;
- The effectiveness of existing legislative and technical tools to ensure lawful access to data for law enforcement and national security purposes;
- The potential future development of technical solutions for detecting harmful content without weakening E2EE;
- The necessity, proportionality, targeting and effectiveness of any proposed legislative solutions;
- The social impact of any proposed legislative solutions on online safety and privacy for the population as a whole; and
- The economic impact of any proposed legislative solutions, both in terms of their direct costs to business and any indirect effects of weakening user trust in digital services.
Taken together, these factors demand, in the ICO's view, a multistakeholder approach to find solutions suited to all of these priorities. The ICO highlights the various bodies with which it is working (Ofcom, the National Cyber Security Centre and the FCA) and will seek the views of stakeholders regarding E2EE early this year.
UK Government introduces Product Security and Telecommunications Infrastructure Bill
The UK Government has introduced the Product Security and Telecommunications Infrastructure Bill (the "Bill") which aims to create a new regulatory scheme to make consumer connectable products more secure against cyber-attacks. The Bill will apply to consumer connectable products which can connect to the internet or other networks and can transmit and receive data. These include internet of things devices such as smart TVs, connected baby monitors and connected alarm systems. There are also changes intended to accelerate the deployment and expansion of mobile, full fibre and gigabit capable networks across the UK.
The Bill provides the legislative scope to allow minimum security requirements to be achieved by imposing responsibilities on manufacturers, importers and distributors. These may include measures such as requiring new updates to be automatically installed, banning default passwords, and preventing insecure devices from acting as a point of entry into a network. Until the requirements are published, it is unclear precisely what will be required of manufacturers, importers and distributors. However, reference is made to the government's work with the European Telecommunications Standards Institute to create EN 303 645, the "first globally-applicable technical standard for the cyber security of consumer connectable products", the details of which may provide some indication of the likely measures under the Bill. New penalties (both criminal and civil) will also be created to deter breaches of these responsibilities, including a potential fine of £10 million or up to 4 per cent of an organisation's global turnover, in line with the GDPR.
Google and Facebook fined by the French data protection agency
The Commission Nationale de l'Informatique et des Libertés (the "CNIL") has fined Google and Facebook €150 million and €60 million respectively for making it difficult for users to refuse online trackers, known as "cookies". The CNIL conducted a series of investigations following complaints it had received. Those investigations established that Facebook, google.fr and YouTube each offer a single button to accept cookies immediately, whereas several clicks are required to refuse them all. This was deemed to breach Article 82 of the French Data Protection Act, which concerns the obligation to obtain consent (and the conditions of consent) from internet users before placing cookies and similar technologies on their devices.
In addition to the fines, the restricted committee (the body of the CNIL responsible for issuing sanctions) ordered the companies to provide users of those websites in France with a means of refusing cookies as simple as the existing means of accepting them within three months, failing which each company would be fined €100,000 per day of delay.
Separately, a prior decision by the CNIL fining Google €100 million was upheld by the Conseil d'État (French Council of State) on 28 January 2022.
European Parliament found to have engaged in unlawful data transfers and cookie consent practices
A COVID-19 test booking website set up in September 2020 was operated by Ecolog, a third-party company commissioned by the European Parliament. It attracted numerous complaints, including from noyb, a privacy-focussed NGO, concerning third-party trackers and cookie consent banners. The European Data Protection Supervisor (the "EDPS") found that although health data was not processed through the European Parliament's website, cookies associated with Google Analytics and Stripe were being used to capture "online identifiers" of the website's visitors, and that such data constituted personal data which was being transferred to the US where those users were not "logged into the European Parliament's network". Further, the EDPS found that although the European Parliament was responsible for that data, there was no evidence that adequate measures had been taken to protect the personal data when transferred to the US, particularly in light of the obligations in respect of EU-US data transfers following Schrems II (Data Protection Commissioner v Facebook Ireland Limited & Maximillian Schrems (Case C-311/18)).
Instead, it was determined that Ecolog had copied the code it used when building another website for a test centre at Brussels International Airport, hence the presence of Stripe, a payment company. The EDPS also found that Google Analytics had been included to "minimise the risk of spoofing and for website optimisation purposes."
The EDPS' findings mirror those of the Austrian DPA highlighted below. Given the prevalence of Google Analytics, these decisions may present a significant operational headache both for Google and providers of US-based cloud services more generally.
Austrian Data Protection Authority (the "Austrian DPA") rules that the continuous use of Google Analytics violates the GDPR
Following the decision in Schrems II, noyb filed several complaints in relation to the use of Google Analytics. In the instant complaint, it was found that the website in question, netdoktor.at, had caused personal data to be transferred to, and stored and further processed in, the US as a result of its use of Google Analytics. The Austrian DPA determined this to constitute a breach of Article 44 GDPR.
In submissions to the Austrian DPA, Google argued that it had implemented various protective measures to safeguard personal data transferred to the US, including encryption. However, the Austrian DPA did not consider that these measures were sufficient to prevent US intelligence services from accessing that data, and therefore the safeguards required for a compliant transfer pursuant to Article 44 GDPR, taking into account the ECJ's decision in Schrems II, were not met. In making this finding, the Austrian DPA emphasised that IP addresses constituted personal data, and that security services, and other processors, often combine such data with other information as part of their profiling activities.
European privacy campaign group, noyb, files GDPR complaint against Amazon and Airbnb relating to allegedly unlawful automated decision making
noyb.eu has filed a complaint with the Luxembourg data protection authority against Amazon over the e-recruiting practices on Amazon's Mechanical Turk platform. It is alleged that Amazon uses a non-transparent automated decision-making process, in breach of Article 22 GDPR. This means that applicants are not afforded the opportunity to understand the criteria behind the decision-making process and are therefore unable to challenge the decision made on their application to work. The Mechanical Turk platform is operated by Amazon Mechanical Turk, an entity within the Amazon group. It is a crowd-sourcing platform that brings together various businesses and small independent workers globally, with such workers performing small tasks for small sums of remuneration. Such tasks include data validation and online content moderation.
noyb.eu has also filed a complaint against Airbnb for its use of automated decision-making practices. It is alleged that, as a result of these practices, users' ratings were improperly downgraded. Reviews and a rating of 4.8 stars or above are essential to becoming a "Superhost" on the platform, which comes with many benefits, including higher rental income and travel vouchers. Airbnb employs an algorithm to check reviews for compliance with its Review Policy; those detected as biased or irrelevant are automatically removed. In this case, it is alleged that Airbnb automatically deleted a 5-star review, which led to a downgrade in the complainant's overall rating. Following the complainant's unsuccessful requests for reasons for the removal, she contacted Airbnb's data protection officer to ascertain how her personal data was being processed and to obtain information on the automated deletions. It is reported that no response was received in the 18 months that followed, despite the GDPR providing Airbnb with one month to respond.
ICO fines update: EB Associates fined £140,000 by ICO for illegal cold calls
The ICO imposed a fine of £140,000 on EB Associates Group Limited ("EBAG") for breaching Regulation 21B of the Privacy and Electronic Communications Regulations ("PECR"). Regulation 21B PECR prohibits the making of unsolicited calls to an individual for the "purpose of direct marketing in relation to occupational pension schemes or personal pension schemes". The ICO determined that EBAG "positively encouraged" its "introducer appointed representatives" to make direct marketing calls concerning occupational and personal pension schemes by offering between £300 and £750 for each referral made to EBAG.
Europol ordered to delete data concerning individuals with no criminal link
Following an inquiry that commenced in 2019, on 3 January 2022 Europol, the EU's police agency, was ordered by the European Data Protection Supervisor (the "EDPS") to erase information concerning individuals with no proven link to crime. Europol's database contains at least 4 petabytes of data, including sensitive data on at least a quarter of a million current or former terror and serious crime suspects, as well as the numerous individuals with whom they came into contact, with such data having been accumulated over the last six years.
Europol has been given six months to assess new datasets and ascertain whether information can be kept, and 12 months to comply with the order of 3 January 2022.
Permission granted to serve proceedings out of the jurisdiction on US-based news organisation
The Court of Appeal has held, in Soriano v Forensic News LLC & ors [2021] EWCA Civ 1952, that the claimant can serve his data protection claim out of the jurisdiction on various defendant news organisations. The claimant is suing the defendants on a number of grounds relating to libel, misuse of private information, harassment and breaches of the GDPR. At first instance, Jay J refused permission to serve the claims for breaches of the GDPR out of the jurisdiction, a finding which the claimant appealed. On appeal, the claimant was required to demonstrate that he had a realistic prospect of proving at trial that the defendants were subject to the GDPR. In particular, this required him to demonstrate that:
- The defendants were "established" in the UK/EU and therefore Article 3(1) of the GDPR applied; and
- The defendants' activities engaged Article 3(2) of the GDPR because the defendants were either "offering of goods or services, irrespective of whether a payment of the data subject is required, to such data subjects in the Union… or monitoring of their behaviour as far as their behaviour takes place within the Union."
The Court of Appeal held that "establishment" could include even minimal commercial activity in the UK, and that offering Patreon subscriptions to UK-based customers was sufficient. In addition, there was a sufficiently arguable case that the defendants offered a service, related to the journalistic processing of personal data, to UK/EU readers, which met the Article 3(2)(a) and (b) extraterritorial scope of the GDPR. Finally, the assembly, analysis, sorting and reconfiguration of the claimant's personal data resulting in publication could arguably constitute monitoring, which would bring the defendants within the scope of Article 3 GDPR. Accordingly, the claimant was granted permission to serve his claims for breaches of the GDPR out of the jurisdiction.
ECJ indicates support for consumer protection associations
On 2 December 2021, the ECJ published Advocate General de la Tour's opinion in case C-319/20 (Facebook Ireland) responding to the question of whether consumer protection associations may be permitted to bring representative actions in relation to breaches of data protection law pursuant to the national law of Member States.
In summary, Advocate General de la Tour considers that Article 80 GDPR does permit consumer protection associations to bring representative actions in relation to breaches of data protection law where the national law of a Member State so provides.
Whilst the ECJ is not bound to follow an Advocate General's opinion, it typically does so. If it does so here, there is likely to be a significant uptick in such claims being pursued where national law permits (the UK has not enacted legislation permitting claims to be pursued on this basis).
WhatsApp sues European Data Protection Board over €225 million fine
As previously reported, WhatsApp was fined €225 million by the Irish Data Protection Commission ("DPC") for breaches of Articles 5(1)(a), 12, 13 and 14 of the GDPR and ordered to take specific steps to bring its processing into compliance (including by updating its privacy notices for both users and non-users so as to ensure compliance with Articles 13 and 14 of the GDPR). The fine followed an intervention by the EDPB which resulted in the fine levied by the DPC being substantially increased.
WhatsApp has appealed the decision and has now separately launched, before the EU's General Court, the first ever legal challenge against the EDPB in respect of its intervention.
It is reported that there are seven bases on which WhatsApp submits that the EDPB's intervention was unlawful, which include that the EDPB's conduct represented it "exceeding its competence" and "excessively interpreting" the GDPR.
The Gothenburg Court of Appeal upholds a fine of €4,800,000 imposed on Google by the Stockholm Administrative Court
In March 2020, the Swedish data protection agency imposed a fine of €7,200,000 on Google for breaches of the right to erasure under Article 17 GDPR. The fine was imposed because Google informed webmasters when search results had been removed, and arose from two complaints received from data subjects, Complainant No. 2 and Complainant No. 8. Google appealed the decision to the Stockholm Administrative Court, which upheld the fine but reduced its amount. Google then appealed to the Gothenburg Court of Appeal.
The Gothenburg Court of Appeal found that Google was not in breach of Article 17 GDPR with respect to Complainants No. 2 and 8. However, the Court of Appeal considered that informing webmasters when search results were removed constituted a breach of Article 5(1)(a) GDPR (the principle of lawfulness), Article 5(1)(b) GDPR (the principle of purpose limitation) and Article 6 GDPR (the absence of a valid legal basis for processing).