Data Protection update - August 2021

Welcome to our Data Protection bulletin, covering the key developments in data protection law from August 2021.

Data protection

Cyber security

Enforcement

Civil litigation

Data protection

ICO consults on its draft guidance for international transfers under the UK GDPR

On 11 August 2021, the Information Commissioner's Office ("ICO") published a consultation on its long-awaited draft guidance for international transfers of personal data ("Guidance") and associated transfer tools. These tools are relevant to anyone transferring or receiving personal data subject to the UK GDPR and come in the form of a transfer risk assessment ("TRA") and an international data transfer agreement ("IDTA"). These will be the new UK equivalents of the European Transfer Impact Assessment ("TIA") and Standard Contractual Clauses ("SCCs"). The use of UK-specific acronyms may demonstrate that the ICO is seeking to take its own path after Brexit. Alongside these documents, the ICO also published a UK addendum to allow use of the European Commission's own SCCs in a UK context.

The draft tools are designed to help organisations navigate international transfers subject to the UK GDPR in as simple a manner as possible.

For more detailed guidance, see our commentary here and here.

DCMS statement signifies UK's post-Brexit data-related priorities

On 26 August 2021, the Department for Digital, Culture, Media and Sport ("DCMS") unveiled its post-Brexit global data plans, which are intended to facilitate its pursuit of a "new era of data-driven growth and innovation". Within these plans, New Zealand's Privacy Commissioner John Edwards was named as the UK Government's preferred next Information Commissioner, with Digital Secretary Oliver Dowden stating that Edwards' "vast experience makes him the ideal candidate to ensure data is used responsibly" to achieve the UK's data-related goals.

The DCMS statement said that: "As Information Commissioner and head of the UK regulator responsible for enforcing data protection law, he will be empowered to go beyond the regulator’s traditional role of focusing only on protecting data rights, with a clear mandate to take a balanced approach that promotes further innovation and economic growth."

As part of the government's data-related "shake up", the following measures have also been announced:

  • The DCMS' statement identified the first territories with which the UK intends to achieve data adequacy partnerships. The list of prioritised territories includes the United States, Australia, the Republic of Korea, Singapore, the Dubai International Financial Centre and Colombia. The aim is stated as being "to move quickly and creatively to develop global partnerships which will make it easier for UK organisations to exchange data with important markets and fast-growing economies."
  • In an interview with the Telegraph newspaper, Dowden explained that the plans also included the removal of "endless" cookie pop-ups, which he referred to as pointless.

Speaking more generally (in his interview with the Telegraph newspaper), Dowden stated that: "There's an awful lot of needless bureaucracy and box ticking and actually we should be looking at how we can focus on protecting people's privacy but in as light a touch way as possible." Such inflammatory comments will likely have raised alarm bells for the EU, which is already closely monitoring the UK's actions post-Brexit. Continued action in this area could in fact result in the revocation of the EU's adequacy decisions in respect of the UK.

ICO approves certification scheme criteria in key areas of privacy concern

On 19 August 2021, the ICO approved UK GDPR certification scheme criteria for the first time. Certification has been used to assist organisations in complying with data protection laws by providing a framework to follow.

The ICO has now approved criteria for three schemes: one developed by ADISA, which will be used when IT equipment is re-used or destroyed, and two developed by the Age Check Certification Scheme ("ACCS"), which are an age assurance scheme and a scheme focused on children's online privacy. Meeting the criteria in these schemes can allow relevant organisations to show they have a strong commitment to data protection compliance, appealing to customers, partners and investors.

Anulka Clarke, the ICO's Acting Director of Regulatory Assurance, highlighted in a blog post (accessible here) that the schemes covered by these criteria were in "areas where enhanced trust and accountability in how personal data is protected is vital" and added that organisations being able to achieve certification under these schemes will "raise the bar of data protection".

China adopts new Personal Information Protection Law

On 20 August 2021, China announced the adoption of the Personal Information Protection Law ("PIPL"), which will come into effect on 1 November 2021. The PIPL is noteworthy as it provides rules for the processing of personal information and sensitive personal information, obligations on personal information processors, data subject rights and rules on onward data transfers.

Breach of the provisions of the PIPL can result in penalties which, for serious violations, can amount to up to 5% of an organisation's annual turnover.

ICO releases statement on its updated regulatory approach

On 27 July 2021, the ICO released a statement regarding its updated regulatory approach. This marks the latest in a series of updates intended to provide clarity to organisations facing difficulties and uncertainty as a result of the COVID-19 pandemic.

The ICO's updated regulatory approach continues to take account of the challenges faced by organisations still navigating the pandemic. In fact, the statement contains only minor amendments to the ICO's previous statement, which was released on 24 September 2020.

What is different about this statement, however, is the emphasis placed on the importance of upholding information rights. The ICO has seemingly increased the onus on organisations to respond within a reasonable period of time to the backlogs of information rights complaints that built up as a result of the COVID-19 pandemic. The ICO expects organisations to have "robust recovery plans" in place that will enable them to clear this backlog.

James Dipple-Johnstone, Deputy Commissioner – Chief Regulatory Officer at the ICO, stated in a blog post (accessible here) that: "Data protection has played a central role in the UK’s response to the pandemic, but the effectiveness of data-driven innovation relies in part on public trust. Likewise, people’s trust in decisions made by government and public authorities relies on transparency. A respect for people’s information rights is central to both, and the ICO will continue to work to protect and support those rights."

CDEI publishes guide on adopting Privacy Enhancing Technologies

The Centre for Data Ethics and Innovation ("CDEI") has published a guide on how organisations can adopt Privacy Enhancing Technologies ("PETs"), which are described as any technical method that protects the privacy or confidentiality of sensitive information, such as ad-blocking browser extensions.

The CDEI highlighted two categories of technologies: traditional PETs and emerging PETs. Examples of traditional PETs include encryption schemes and de-identification techniques such as k-anonymity. Emerging PETs are described as providing novel solutions in modern systems, and the CDEI considers five technologies in this regard: homomorphic encryption, trusted execution environments, secure multi-party computation, differential privacy, and systems for federated data processing.
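By way of illustration only (and not taken from the CDEI guide), the short Python sketch below shows simplified versions of two of the techniques mentioned above: a k-anonymity check over a set of quasi-identifiers, and a Laplace-noise mechanism of the kind used in differential privacy. The data, field names and parameters are hypothetical examples.

```python
# Illustrative sketch only: a simplified k-anonymity check and Laplace mechanism.
# All data, field names and parameters below are hypothetical examples.
import random
from collections import Counter


def is_k_anonymous(records, quasi_identifiers, k):
    """Return True if every combination of quasi-identifier values
    appears in at least k records (the basic k-anonymity condition)."""
    groups = Counter(tuple(record[q] for q in quasi_identifiers) for record in records)
    return all(count >= k for count in groups.values())


def laplace_noisy_count(true_count, epsilon):
    """Release a count with Laplace noise scaled to sensitivity 1, a standard
    building block of differential privacy. The difference of two exponential
    random variables with rate epsilon is Laplace-distributed with scale 1/epsilon."""
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise


if __name__ == "__main__":
    records = [
        {"age_band": "30-39", "postcode_area": "EC1", "diagnosis": "A"},
        {"age_band": "30-39", "postcode_area": "EC1", "diagnosis": "B"},
        {"age_band": "40-49", "postcode_area": "SW1", "diagnosis": "A"},
    ]
    # The third record's quasi-identifier combination is unique, so k=2 fails.
    print(is_k_anonymous(records, ["age_band", "postcode_area"], k=2))
    # A noisy count of, say, 128 affected records with a privacy budget of 0.5.
    print(laplace_noisy_count(true_count=128, epsilon=0.5))
```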

The guide aims to increase awareness and understanding of PETs so that the technologies can be used more broadly by technical architects and product owners working on projects that involve the sharing or processing of sensitive information. Users answer questions in a decision tree, which results in suggested PETs for their consideration and signposts them to further information on those PETs.

Adopting PETs requires technical expertise and carries a financial cost, so they can prove challenging to implement in practice. The guide also contains a section on good practice for sharing and processing data, regardless of whether an organisation adopts PETs.

The CDEI has requested feedback on the guide, which is currently a beta version, and therefore an updated version is anticipated later in 2021.

OECD anticipates global democracy data deal will restore trust where governments access personal data

Diplomats from democratic countries anticipate that an agreement will soon be reached on the seven principles to be applied when governments access personal data held by companies. It is hoped that these principles could reignite public trust in privacy standards where data is transferred internationally.

Representatives from 23 countries and the EU met at the Organisation for Economic Co-operation and Development's ("OECD") headquarters in Paris to finalise the principles, according to the US Council for International Business. These will be presented in October as a recommendation to the OECD Council and could soon lead to a ministerial declaration.

Since the European Court of Justice ("ECJ") invalidated the EU-US Privacy Shield in its ruling in Schrems II, the situation has grown more urgent. The European Data Protection Board ("EDPB") has demanded that EU governments "assess and, where necessary, review their international agreements that involve international transfers of personal data". Some EU countries have since demanded that companies stop using US-based service providers in light of that decision.

Negotiators in the OECD's Committee on Digital Economy Policy identified two options: either the seven principles would apply only to government access for law enforcement and national security purposes, or they would have broader application but would take longer to agree.

It is understood that the former option is supported by the US, Canada, Australia, Japan and the UK, while the European Commission is in favour of the latter. The Commission has proposed a compromise, which is expected to be settled in October, in which all forms of access are included. Thereafter, the OECD will continue its work on clarifying certain principles which may be more relevant to different types of government access.

New ICO guidance for public sector organisations

On 4 August 2021, the ICO published new guidance for public sector organisations on sending marketing messages directly to individuals.

Public sector organisations (like other organisations) must comply with the Privacy and Electronic Communications Regulations ("PECR") direct marketing rules where they do engage in direct marketing, in addition to the UK GDPR.

The guidance provides an updated definition of direct marketing which includes commercial marketing (e.g. promotion of products and services provided by or in connection with public sector bodies) as well as the promotion of aims and ideals (e.g. fundraising and campaigning).

Of particular note, however, is the fact that a significant amount of public sector communication has been excluded from the definition of direct marketing, such as service messages and certain public sector promotions to individuals. The guidance states that: "If you are a public authority and your messages are necessary for your task or function, these messages are not direct marketing, even if you decide to rely on consent rather than public task." This is an interesting move from the ICO, as it disapplies the PECR direct marketing rules from such communications and, while the UK GDPR still applies, it means that the absolute right to stop such communications (the right to object to direct marketing under the UK GDPR) no longer applies.

Cyber security

High Court holds claims in privacy and confidentiality to be "ill-founded" in cyber-attack cases

The High Court, in its ruling in Warren v DSG Retail Limited [2021] EWHC 2168 (QB), has clarified the extent to which companies will be liable to individuals when those companies are the victims of cyber-attacks. Claimants will now find it more difficult to claim compensation in these circumstances.

Individual claimants often seek to recover the premiums for after-the-event ("ATE") insurance policies, which are recoverable in "publication and privacy proceedings"; however, that category does not include data protection claims. Claimants have therefore pleaded privacy and breach of confidence claims to overcome this hurdle. The High Court has now ruled that such claims are "ill-founded" and inappropriate.

In this case, the claim originated from a cyber-attack suffered by DSG where the claimant's data was affected. The claimant sought £5,000 in damages for the distress suffered as a result of the lost personal data. All of the claimant's claims were struck out save for the claim under the Data Protection Act 1998 ("DPA"), which was transferred to the County Court.

The ruling makes it clear that a company is not itself liable for the misuse of the claimant's data; that unlawful misuse is carried out by the attacker. The duties of data controllers do not extend to protecting against the actions of third parties; their duties are limited to those set out in data protection legislation.

DCMS requests views on the government's proposal to amend Network and Information Systems legislation

On 26 July 2021, the Department for Digital, Culture, Media and Sport ("DCMS") published a policy paper regarding a deficiency in the Network and Information Systems ("NIS") legislation, namely the effect of the UK's departure from the EU on the suitability of incident reporting thresholds for digital service providers.

This was an open call for views from digital service providers, as well as from Competent Authorities ("CAs") (i.e. the regulators of organisations within the scope of the NIS legislation) and Operators of Essential Services ("OES") (i.e. providers of water, energy, transport, healthcare and digital infrastructure).

The NIS legislation encompasses the Network and Information Systems Regulations 2018 ("NIS Regulations") and Commission Implementing Regulation (EU) 2018/151 ("EC Regulation"), which still applies in the UK by virtue of the European Union (Withdrawal) Act 2018.

Digital service providers currently fall within the EC Regulation and, as such, are required to report "substantial incidents" to CAs, which are responsible for enforcing the NIS Regulations where appropriate. The threshold for what constitutes a "substantial incident" by a digital service provider is set by the EC Regulation and applies across all EU Member States and the UK. Prior to Brexit, this arrangement was convenient for both the UK and the EU: digital service providers often offer cross-jurisdictional services, so a coordinated regulatory approach made sense, whereby the EU Member State in which a digital service provider had its main establishment would regulate the organisation on behalf of, and with the cooperation of, the other Member States. Now that the UK has left the EU, however, the incident reporting thresholds remain those applicable to the entire EU, despite the fact that the UK no longer coordinates its regulatory approach with the EU. As such, the thresholds for the UK are too high, and the Government refers to this as "a clear deficiency arising from our withdrawal which needs to be rectified to reflect the UK's new position".

The Government is now proposing to lay a statutory instrument to amend these regulations to allow the ICO (as the CA for digital service providers) to set thresholds at appropriate levels through issuing guidance, which would emulate what is already being done in relation to OES.

The Government's consultation closed on 27 August 2021 and its proposed legislative changes will be based upon the findings of the review.

Enforcement

WhatsApp hit with fine of €225 million for breaches of GDPR

The Irish Data Protection Commission (the "DPC") has imposed a fine of €225 million on WhatsApp for various breaches of Articles 5(1)(a), 12, 13 and 14 of the GDPR. In addition to the administrative fine, the DPC imposed a reprimand along with an order for WhatsApp Ireland Limited ("WhatsApp IE") to bring its processing into compliance by taking a range of specified remedial actions. In particular, WhatsApp IE must bring its processing activities into compliance within three months and update its privacy notices for both users and non-users to include the information required under Articles 13 and 14 of the GDPR, as identified in the DPC's decision.

Beyond the size of the fine, which is the second largest under the GDPR to date (albeit representing only 0.08% of relevant turnover), the circumstances in which it came to be imposed are of particular interest: this is the third time in the past year that other Supervisory Authorities have challenged the adequacy of the DPC's approach to enforcement by way of a reference to the EDPB (in this case successfully). In summary:

  • The DPC began its investigation into WhatsApp IE in late 2018 after numerous complaints and a request for mutual assistance from the Federal German Supervisory Authority regarding the transparency of WhatsApp IE's processing of personal data (the investigation was limited to these aspects of WhatsApp IE's compliance with data privacy legislation and there are numerous other investigations by the DPC afoot into other aspects thereof);
  • On 24 December 2020, the DPC circulated a Draft Decision to the other concerned Supervisory Authorities ("CSAs") indicating an intention to issue WhatsApp IE with a fine of between €30 million and €50 million. A number of objections were raised by the CSAs pursuant to Article 60(4) GDPR, in particular regarding the scale of the proposed fine;
  • Following further exchanges between the DPC and the CSAs, it became apparent in April 2021 that no single proposed compromise position was agreeable to all of the relevant CSAs;
  • Accordingly, the DPC referred the matter to the EDPB for a binding decision pursuant to Article 65(1)(a) GDPR; and
  • In July 2021, the EDPB's binding decision recommended that the DPC increase its fine in line with that reflected in the DPC's final decision of 1 September 2021, as well as adopt a shorter period for WhatsApp IE to bring its processing into compliance. This represented a more than four-fold increase on the €30-50 million fine proposed in the Draft Decision issued by the DPC in December 2020.

The EDPB's binding decision and the DPC's final decision set out a detailed analysis on various aspects of GDPR which will be of great interest to privacy practitioners. We will be preparing a detailed breakdown in this regard shortly, but in the meantime, some key points to note are the findings that:

Privacy Notices:

  • Processing information (e.g. which categories of data are used) has to be given for each processing purpose and legitimate interest pursued (e.g. by using a table) where this basis for processing is relied on;
  • The description of the legitimate interests pursued must be sufficiently detailed – statements such as "improving our services" are likely to be too vague; and
  • A party can be sanctioned for breaching Articles 13/14 GDPR on the basis of the content of its Privacy Notice as well as under the transparency principle, but not every breach of Articles 13/14 GDPR will necessarily be a breach of Article 5 GDPR;

International transfers:

  • Specific adequacy decisions relied on for overseas safeguards should be listed out; and

Approach to quantum of fines pursuant to GDPR:

  • The ultimate parent company of WhatsApp IE, Facebook Inc.'s, total turnover was taken into account in calculating the maximum fine to which WhatsApp IE could be subject, in line with European case law on the meaning of "undertaking"; and
  • The total turnover of an undertaking can be taken into account in determining whether a penalty decided upon is "effective, proportionate and dissuasive", and is not just relevant as a cap on the fine which can be ordered.

WhatsApp has stated that it will appeal the DPC's decision, noting: "WhatsApp is committed to providing a secure and private service. We have worked to ensure the information we provide is transparent and comprehensive and will continue to do so. We disagree with the decision today regarding the transparency we provided to people in 2018 and the penalties are entirely disproportionate. We will appeal this decision."

Amazon hit with £630 million GDPR fine

In our two previous bulletins (here and here) we reported that the Luxembourg data protection authority (the "CNPD") had proposed a fine against Amazon.com Inc ("Amazon") totalling £630 million, according to an SEC filing made by Amazon. That represented a fine of nearly 15 times the size of the largest fine issued for breaches of GDPR to date (and reportedly more than twice the sum of all previous GDPR fines combined). There is still little public information available about the fine beyond the Bloomberg reports which we noted in our previous update here. Beyond the fact that the decision was issued on 15 July 2021, the CNPD has said that Luxembourg's secrecy laws mean it cannot publish further details before the appeals process has been exhausted.

Amazon has now confirmed that it intends to appeal the ruling, which it says is "without merit", stating: "There has been no data breach, and no customer data has been exposed to any third party. These facts are undisputed".

ICO fines call-blocking company for nuisance calls

The ICO issued a monetary penalty of £170,000 against Yes Consumer Solutions Ltd ("Yes") for a breach of Regulation 21 of the PECR.

Over a 12-month period from October 2018, Yes made unsolicited marketing telephone calls to almost 200,000 people, who were registered with the Telephone Preference Service ("TPS"), without their consent. Yes had acquired the customer database of a separate company as a part of an asset purchase. Despite its due diligence checks, Yes made those calls to subscribers who had been registered with the TPS for not less than 28 days, in breach of PECR. The product which was being marketed in the calls was, ironically, a nuisance call blocking system!

ICO fines home improvement company for nuisance calls

The ICO has issued a monetary penalty notice of £130,000 against ColourCoat Limited ("CCL") for breaches of Regulations 21-24 of PECR arising from a number of unsolicited direct marketing calls made over an eight-month period.

CCL made almost a million connected calls, which included 450,000 calls to numbers registered on the TPS or the Corporate Telephone Preference Service (the equivalent of the TPS for businesses). One aggravating feature of the case was that CCL sought to evade identification by preventing call recipients from contacting it, failing to identify itself on the calls, and providing false company names. In addition, CCL purported to be calling from the Citizens Advice Bureau and the Government.

Italian DPA fines Deliveroo €2.5 million

The Italian Data Protection Authority ("DPA"), the Garante per la protezione dei dati personali ("Garante"), has fined Deliveroo Italy s.r.l. ("Deliveroo") €2.5 million for the unlawful processing of the personal data of approximately 8,000 Deliveroo riders. Garante carried out an on-site inspection of Deliveroo in June 2019 which established that Deliveroo was operating a centralised computer system that scored riders based on several factors (such as weekend availability). Garante was concerned that Deliveroo was not sufficiently transparent about how this personal data was used to distribute work to its riders via its scoring algorithm. Garante also raised concerns that this personal data, which included riders' precise geolocation data captured every 12 seconds, was retained for up to six months for each rider.

As a result, Garante considered that Deliveroo breached the principles of storage limitation, data minimisation, transparency and lawfulness under Article 5 of the GDPR. In addition to fining Deliveroo, Garante required Deliveroo to implement corrective measures including: preparing appropriate documentation about how it processed personal data; preparing data protection impact assessment records; providing transparency about how the data collected on riders would be used; and verifying, on a periodic basis, the accuracy of the results of the algorithm to minimise the risk of distorted or discriminatory effects. We reported in our July bulletin about the fine Garante issued to Foodinho (of a similar size) for its use of algorithms to manage riders undertaking food deliveries.

Spanish DPA fines supermarket chain for unlawful use of facial recognition

The Spanish DPA, the Agencia Española de Protección de Datos (the "AEPD"), has fined Mercadona S.A. ("M") €2.52 million following an investigation into the use of facial recognition systems in M's supermarkets. M had used a form of video surveillance with facial recognition capability. For every person who entered one of M's supermarkets, the system compared a sample taken from an image of that person against a database of samples already associated with known identities. M said that it was using the system in order to detect individuals with criminal convictions or restraining orders (in particular those who had received a restraining order for a store-based assault) and therefore relied on the "public interest" exemption under Article 6 of the GDPR to process this personal data. The public interest which M identified was the safety of people, goods and premises, which it said accorded with a particular provision of Spanish national law.

The AEPD found that the data constituted special category data, as it was biometric data used for the identification of living individuals on a mass scale. M was unable to rely on the public interest exception under Article 9, as the relevant national law must also specify the circumstances, limits, rules and measures for applying the exception and must be proportionate. The national law which M identified did not meet these tests and so was not sufficient. In addition, the AEPD concluded that the processing of the personal data was contrary to Article 5 of the GDPR as it was not consistent with the principles of necessity, proportionality and data minimisation. Finally, the AEPD found that M's Data Protection Impact Assessment did not consider the specific and unique risks posed to its employees by the facial recognition systems.

We reported, in our June update, on the ICO's guidance with respect to live facial recognition technology.

Civil litigation

First-tier Tribunal reduces monetary penalty by two-thirds

In Doorstep Dispensaree Limited v The Information Commissioner [2021] FTT EA/2020/0065 the First-tier Tribunal reduced the size of a monetary penalty issued by the ICO primarily for breaches of Articles 5(1)(f), 24(1), 32, and 5(1)(e) of GDPR. The ICO issued a Monetary Penalty Notice (an "MPN") to Doorstep Dispensaree ("DD") in December 2019 to the tune of £275,000 for failing to ensure the security of special category data.  

DD supplied medicines to customers and care homes and left around 54,000 documents in unlocked containers at the back of its premises in Edgware. Those documents included sensitive personal data, of an unknown number of data subjects, such as: medical information; prescriptions; NHS numbers; names; addresses; and dates of birth.

DD argued that its subcontractor, JPL, which carried out document destruction as a data processor, was the appropriate data controller and so DD should not be subject to the monetary penalty.

Judge MacMillan disagreed, finding that DD was the appropriate data controller (as it "was determining the purposes and means by which any personal data collected by JPL would be processed") and that the breaches identified in the ICO's MPN were established. Amongst other things, Judge MacMillan held that DD did not have adequate policies and procedures in place to ensure the propriety of JPL's data processing, such as retention and destruction policies.

The original finding from the ICO was based on an estimate of 500,000 documents having been seized. On the basis of evidence produced following an audit of the documents by DD's solicitors, Judge MacMillan held that the actual number of documents containing special category data was 53,871 and so the monetary penalty was reduced commensurately to £92,000.

Austrian Supreme Court requests ruling from CJEU on Schrems case

The Supreme Court of Austria (the "SC") has requested a preliminary ruling from the Court of Justice of the European Union (the "CJEU") on a variety of questions connected with Max Schrems' claim against Facebook. The request proceeds on the factual basis that Facebook does not rely on consent to process personal data, but on the basis that the processing is necessary for the performance of a contract pursuant to Article 6(1)(b) GDPR. The SC doubts whether a declaration of consent to processing contained in Facebook's contract can be treated as processing under the condition of "performance of a contract" rather than the condition of "consent". The CJEU has been asked to opine on whether Articles 6(1)(a) and (b) are to be interpreted as meaning that general contractual provisions which involve personal data being processed for personalised advertising, such as Facebook's contract, should more appropriately fall under the requirement to obtain the data subject's consent. In addition, the SC has asked the CJEU to opine on whether the data minimisation principle and the filtering of special category data (such as political opinions or sexual orientation) can be respected if the personal data is aggregated for targeted personalised advertising.

European Data Protection Board makes decision regarding data sharing between WhatsApp and Facebook

The EDPB has adopted an Urgent Binding Decision (No 1/2021) following a provisional measure taken by the Hamburg DPA against WhatsApp IE ("WhatsApp") pursuant to Article 66(2) GDPR. This decision is the first urgent binding decision adopted under the mechanism at Article 66(2) GDPR.

WhatsApp implemented a change to its Privacy Policy in early 2021 and notified its German users, requesting their consent by 15 May 2021. Part of the change to the Privacy Policy permitted the sharing of personal data with Facebook Ireland Ltd ("Facebook"). The Hamburg DPA held that Facebook was already processing the data of WhatsApp users in Germany before the deadline elapsed, in violation of Articles 5, 6 and 12 of the GDPR. The Hamburg DPA, after discussions with the DPC (which is the lead Supervisory Authority for Facebook and WhatsApp pursuant to the "one stop shop" mechanism under the GDPR, and which was already investigating aspects of their businesses), imposed a provisional ban on the relevant processing and requested that the EDPB adopt an urgent binding decision, under Article 66(2) GDPR, requiring the adoption of binding final measures across the EU.

The EDPB shared the Hamburg DPA's concern regarding the sharing of data between WhatsApp and Facebook, considering that "there is a high likelihood that Facebook IE already processes WhatsApp user data as a controller or joint controller for the common purpose of the safety, security and integrity of WhatsApp and the Facebook Companies", and requested that the DPC carry out an investigation to establish whether that is the case. In addition, the EDPB found that there is a high likelihood that Facebook processes WhatsApp users' data as a (joint) controller for the improvement of the Facebook product experience, and has requested that the DPC carry out an investigation to verify this point. The EDPB has also requested that the DPC investigate WhatsApp's cooperation with the other Facebook Companies. However, the EDPB did not consider it appropriate to require the adoption of binding final measures across the EU, as the Hamburg DPA had requested. The Hamburg DPA considered the EDPB's decision "disappointing", questioning whether the response of the Irish DPC and the EDPB was sufficient given the gravity of the issues it had sought to address with its provisional ban.

Zoom settles class action privacy lawsuit for $86 million

Zoom Video Communications Inc ("Zoom") has agreed to settle a lawsuit which claimed that it violated users' privacy rights by sharing personal data with Facebook, Google and LinkedIn, and by allowing hackers to enter Zoom meetings in a practice called "Zoombombing". The lawsuit was filed in March 2020 in the U.S. District Court for the Northern District of California. It was alleged that Zoom failed to ensure the security of its platform despite claiming that meetings were "end-to-end" encrypted. This led, on some occasions, to hackers accessing meetings and displaying inappropriate content to the legitimate participants.

The settlement agreement means that those who subscribed to the class action will be eligible for a 15 per cent refund on their core subscription or $25, whichever is larger. Zoom had collected $1.3 billion in subscriptions from the class members, but the $86 million settlement was reportedly considered reasonable by the lawyers representing the class. Zoom has also said that it will take additional steps to prevent intruders from Zoombombing meetings, including alerting participants when meeting hosts or other participants use third-party apps in meetings, and offering specialised training to employees on privacy and data handling. A hearing is set for 21 October 2021 at which the presiding judge may approve the settlement.