Data Protection update - September 2020

Welcome to our data protection bulletin, covering the key developments in data protection law from September 2020.

Data protection

Cyber security

Enforcement

Regulatory enforcement

Civil litigation


Data protection

The aftermath of Schrems II: Part Two

Following our reports in our July and August bulletins, there have been further developments in the wake of the CJEU decision which invalidated the Privacy Shield as a data transfer mechanism.

Irish regulator takes action against Facebook

Action by regulators was inevitable following the Schrems II ruling in July, which raised doubts about the protection afforded to personal data transferred to the US because of government surveillance. In the first major action from an EU regulator, the Irish Data Protection Commission (the “DPC”) issued a preliminary decision against Facebook requiring the tech giant to suspend all data transfers from the EU to the US.

However, on Monday 14 September 2020, before the preliminary decision could have any effect, a judge of the High Court of Ireland held that Facebook could challenge the decision by way of an application for judicial review. The case is due to be heard later this year and will consider whether the DPC acted too hastily in issuing the draft order without first receiving any guidance on the Schrems II decision from the EDPB.

In the meantime, Facebook can continue to legally transfer data from the EU to the US using standard contractual clauses (“SCCs”) as the mechanism for data transfer. The Schrems II decision saw the CJEU impose requirements on the use of SCCs, meaning that they can only be used as an appropriate mechanism for data transfers when combined with safeguards that prevent the US government from obtaining disproportionate access to exported personal data. These new requirements have led the Irish DPC to cast doubt on whether the SCCs can be used to safeguard transfers to the US by Facebook. Although it looks likely that large tech businesses will be the first and main targets of regulators, other businesses could be caught in the fall-out.

Facebook’s Irish win has given other European privacy authorities and companies some breathing room as they begin to deal with the ever-increasing influx of complaints that similar transfers are illegal under European data protection law. However, the Irish ruling that Facebook is entitled to a judicial review has diminished any chance of a quick and concrete conclusion around EU-US data transfers. This only adds to the confusion felt by companies and authorities alike about the practical consequences of the Schrems II ruling.

Guidance and revised SCCs to be published by the end of the year

The Facebook case further highlights the importance of formal guidance issued by the EDPB and the widespread hope that such guidance will shed light on how to practically address the CJEU’s concerns. On 23 September 2020, the chairwoman of the EDPB, Andrea Jelinek, commented that she hoped the EDPB’s guidance on the impact of the Schrems II decision would be publicly available within the next two months. This announcement will have undoubtedly provided some much needed reassurance to the many EU-based data exporters and data protection authorities that await the guidance with anticipation. The European Commission is also reportedly working on an amended set of SCCs that may address some of the issues raised by the Schrems II decision and will align with the GDPR. The hope is that the new SCCs will be finalised and ready for use by the end of 2020.

Task forces

The EDPB also announced the creation of two task forces last month, in a move to deal with the consequences arising from the Schrems II decision and the challenges it has presented.

The first task force has been established to assist EDPB members to cooperate in handling the hundreds of complaints filed by privacy groups across the EU and EEA. The task force will analyse the complaints and compile guidance to ensure all members of the EDPB act consistently. If European authorities are to tackle these complaints efficiently and coherently, the EDPB’s promised guidance will need to provide comprehensive direction on SCCs and model contracts.

The second task force has been established to explore supplementary measures that might need to be introduced to ensure data subjects are provided adequate protection when data is transferred to third countries. Alongside these supplementary measures, the task force will compile recommendations to assist controllers and processors with their duty to identify and implement any new measures.

Swiss-US Privacy Shield

September also saw the effects of the CJEU’s decision being felt by other Privacy Shield regimes. Following an annual assessment of the Swiss-US Privacy Shield regime, the Swiss Federal Data Protection and Information Commissioner (“FDPIC”) concluded in a statement that the Swiss-US Privacy Shield guarantees protection to data subjects within Switzerland but does not provide adequate protection for data transfers to the US.

Whilst it is important to note that the FDPIC does not have the power to prevent the continued existence and operation of the Swiss-US Privacy Shield in circumstances where the United States has not revoked it, in practice companies may no longer rely on the Swiss-US Privacy Shield framework to transfer data from Switzerland if they are to remain compliant with Swiss law.

It follows that data transfers relying upon the Swiss-US Privacy Shield should now cease, and companies should only restart transfers once a new mechanism is in place. The FDPIC has gone even further than the CJEU in his comments on the use of SCCs as a transfer mechanism. Moving forward, if Swiss companies seek to rely on SCCs, a risk assessment should be carried out on a case-by-case basis and appropriate safeguards and technical measures put in place before data transfers take place.

It is no surprise that there has been further fall-out from the Schrems II decision throughout September, and more can be expected as we move into the final quarter of 2020.

EDPB adopts new guidelines for consultation

On 2 September 2020, the EDPB adopted two sets of draft guidelines (the “Guidelines”) which aim to clarify two grey areas of data protection law: the concepts of controller/processor under the GDPR and social media targeting. Both sets of Guidelines are open for public feedback until 19 October 2020.

Controller/Processor

Applying the concepts of controller and processor seems to have become even more complex since the GDPR came into force two years ago. In a move that is hoped to bring some clarity, the EDPB has published draft Guidelines intended to set out more clearly the meaning and scope of those roles.  

There are various aspects to the definition of a controller, but the Guidelines explain that determining the “essential means” of processing personal data refers to the type of data being processed or the duration of processing, and not to non-essential decisions such as the practicalities or technicalities of how a set of data is processed.

The Guidelines are explicit that if a processor goes beyond a controller’s instructions, it would be infringing the GDPR and acting in the capacity of a controller. Although there is nothing to prevent a processor from offering certain services to the controller, it is clear that the controller must actively approve the way the data processing is carried out. Part of this is the notion that processors cannot change any essential means without the approval of the controller, and the controller must be able to request changes, where necessary.

With regards to the data processing agreement, the Guidelines provide drafting recommendations. They clearly dissuade the parties from simply restating the relevant provisions of the GDPR. Rather, the terms should include specific information as to how the GDPR requirements are going to be met. Recommendations include setting out a procedure and a template by which the controller can give instructions to the processor, a process to allow the controller to object to the appointment of sub-processors and details of exactly how the processor must help the controller fulfil its obligations under Articles 32 to 36 of the GDPR.

The Guidelines also provide helpful clarification on the concept of joint controllers, by explaining that the processing by each controller must be inextricably linked which, in practice, means that a joint controllership may arise where the parties pursue purposes that are closely linked or complementary.

The Guidelines also include a useful flow chart for assisting organisations with assessing the concepts of controller and processor in practice.

Targeting social media users

The EDPB also adopted guidelines on the targeting of social media users in an attempt to provide practical guidance to social media providers and those using targeting services on social media platforms.

These Guidelines set out the different legal bases for justifying the processing of personal data for targeting services. The starting point here is that social media providers and targeters are considered joint controllers when the display of a specific ad via a targeting tool is an outcome of both the provider and targeter co-determining the purpose and essential means of the targeting. In these instances, both joint controllers must be able to demonstrate the existence of a legal basis to justify the processing of personal data.

The Guidelines acknowledge that both consent and legitimate interest may be appropriate legal bases to justify a targeting activity, but they clearly distinguish certain processing activities, such as intrusive profiling and tracking practices for marketing purposes, for which legitimate interest would not be an appropriate legal basis and which would instead require users’ consent. Further, when processing involves social plug-ins, cookies or pixels, the social media provider and the targeter must comply with both the GDPR and the ePrivacy Directive, and as such must obtain users’ valid consent. Lastly, the collection and use of inferred data typically involves profiling activities. As profiling typically constitutes automated decision-making in relation to personal data, controllers may only rely on the user’s explicit consent.

In addition, the Guidelines provide information on the application of key data protection requirements and on the joint agreement social media providers and targeters must implement. Some key obligations include: compliance with transparency requirements by informing users of the processing; compliance with the right of access by designating a single point of contact for data subjects; and carrying out data protection impact assessments, where required. 

The UK government launches the National Data Strategy

On 9 September 2020, the UK government launched an ambitious 'National Data Strategy' over two years after it was first announced (the “Strategy”). The Strategy has been described as a central part of the government’s wider ambition for a thriving, fast-growing digital sector. Intended to overhaul the public sector’s use of data, the Strategy has been designed to enable organisations to use it to drive digital transformation, innovate and boost growth across the economy.

The Strategy considers the necessity of both putting data at the heart of the UK’s recovery from the current Covid-19 pandemic and taking advantage of being an independent sovereign nation after Brexit. The backbone of the Strategy, however, is the five missions it outlines to underpin the broader plan to strengthen the UK’s data policies. These are:

  • unlocking the value of data across the economy;
  • securing a pro-growth and trusted data regime;
  • transforming the government’s use of data to drive efficiency and improve public services;
  • ensuring the security and resilience of the infrastructure on which data relies; and
  • championing the international flow of data.

In order to achieve these missions, the Strategy lays out plans for 500 analysts across the public sector to be trained in data and data science by 2021, as well as the creation of a new government office (Chief Data Officer) designed to lead the transformation of public services. The government has announced a £2.7 million project, as part of the Strategy, aimed at addressing current barriers to data sharing and at preventing harm on online platforms. There are also plans to introduce primary legislation to boost participation in Smart Data initiatives, which can give people the power to use their own data to find better tariffs in areas such as telecoms, energy and pensions (see below).

The government has announced a consultation that is to run alongside the launch of the Strategy to “help shape the core principles of the strategy”. News reports have also noted that European sources consider that the Strategy may have an impact on the UK’s application to the European Commission for an adequacy decision. 

BEIS publishes its response to the Smart Data Review consultation

Currently, the UK’s data protection laws give consumers the right to data portability, which allows consumers to request that businesses provide certain personal data to Third Party Providers (“TPPs”). Despite this, there are concerns that consumers often struggle to stay on top of their essential service contracts and find it difficult to identify the best deal. New Smart Data technologies could have the potential to offer enhanced frameworks for sharing consumer data and, in doing so, address many of the problems consumers face in regulated markets.

In an attempt to gather more information on how data portability can provide positive outcomes for consumers, the government launched the Smart Data Review in September 2018. BEIS published its response to this consultation on 9 September 2020.

The key outcome of that review is an acknowledgement that consumers who regularly switch must put in considerable effort to do so, whilst those who do not switch often pay considerably more.

Another key conclusion of the review was that the majority of responses favoured mandated industry involvement (as was the case with Open Banking). There also appears to be general support for cross-sector participation and coordination. The conclusion was that a focus on Smart Data will improve market competition, offering more choice and lower prices to customers.

As well as highlighting the general conclusions of the review, the response sets out some key next steps to help achieve the benefits of markets reliant on Smart Data. The first step is to develop primary legislation aimed at extending industry powers to mandate participation in Smart Data initiatives. Alongside this will be the launch of a cross-sector Smart Data working group. The purpose of the working group will be to accelerate existing Smart Data initiatives, coordinate the government’s approach and ensure the support, development and delivery of Smart Data infrastructure (with an initial focus on the communications, energy and finance sectors).

ICO updates guidance on Track and Trace

From 4 July, businesses across all sectors were charged with managing reams of customer data in a national effort to curb the spread of Covid-19. As a result, many businesses found themselves subject to the GDPR in their new role as controllers of personal data. In light of these extraordinary developments, the Information Commissioner’s Office (ICO) produced guidance for businesses as well as a step-by-step guide outlining roles and responsibilities in simple terms.

With the change of rules in England coming into force on 18 September and the launch of the NHS Track and Trace app on 24 September, businesses are now required by law to display the official NHS QR code posters and to allow people to check in at premises using the app. Accordingly, the ICO has updated its guidance on the collection of customer information.

Despite the new technology, the new guidance reminds organisations that compliance does not need to be complicated. The ICO maintains the following five steps:

  1. Only ask people for the specific information that has been set out in government guidance;
  2. Be clear, open and honest with people about what is being done with their personal information;
  3. Keep people’s data secure. Organisations should not use open log books, and should ensure their customers’ personal information is kept private;
  4. Do not use the personal information collected for contact tracing for other purposes, such as direct marketing, profiling or data analytics; and
  5. Erase or dispose of the personal information collected after 21 days.

In relation to the Track and Trace app, the ICO notes that, although displaying the NHS Track and Trace QR code might be mandatory for organisations in certain sectors, those organisations should not make the use of the app mandatory for customers. Rather, organisations should give all visitors the option of using the app but also have the means to register details manually. The ICO also makes clear that each visitor should only be asked to register their details by a single method: if a visitor chooses to use the Track and Trace app, staff should not ask that same visitor to complete a different type of track and trace registration.

ICO launches accountability framework

Accountability is a fundamental principle of data protection law, as it requires organisations to comply with their legal requirements under the GDPR and also demonstrate that compliance. On 9 September, the ICO launched its accountability framework (the “Framework”) to provide practical guidance on accountability in the context of data protection. For each of the core areas of the GDPR, the Framework identifies practical ways in which organisations can meet their compliance obligations.

There are two practical tools included in the Framework: one for self-assessment and the other for tracking progress. The first is the accountability self-assessment tool which an organisation can use to estimate their level of compliance across the 10 core areas. Those estimates are then used to generate a report designed to assist organisations in focussing on some areas of improvement. In addition, organisations can use the accountability tracker to measure how their accountability compliance progresses over time.

Dubai Data Protection Law

The Dubai International Financial Centre’s Data Protection Law No.5 of 2020, which came into effect on 1 July 2020, became enforceable on 1 October 2020. This means that businesses operating, conducting, or attempting to conduct business in or from the DIFC could now face enforcement measures if they are not compliant.

For all the answers on the key implications of this new law for organisations, steps organisations should take to ensure compliance, and important changes from the previous data protection legislation, see our Expert Q&A.

Cyber Security

Interpol issues report on cybercrime activity during COVID-19 pandemic

According to an Interpol assessment of cybercrime during the Covid-19 pandemic, cyberattacks are on the rise. One of Interpol’s private sector partners identified some 907,000 spam messages, 737 incidents related to malware and 48,000 malicious URLs in just four months, from January to April. All of these incidents were related to Covid-19 and indicate the alarming rate at which cybercriminals are evolving and multiplying their attacks by taking advantage of the fear and ambiguity caused by the pandemic.

With organisations having to deploy remote working, new opportunities to infiltrate data platforms, steal data and disrupt the status quo have been exploited by criminals since the onset of Covid-19. Further, the assessment highlights a significant target shift from individuals and small businesses to major corporations, governments and critical infrastructure.

There are a few key areas in which the landscape of cybercrime has adapted during the pandemic. Firstly, by adapting existing online scams and phishing to have a Covid-19 focus and, in particular, by impersonating government and health authorities, criminals are enticing victims into providing personal data and downloading malicious content. Secondly, by using COVID-19 related information as a lure, criminals are deploying data harvesting malware to infiltrate systems and compromise networks. Thirdly, there has been a significant increase in domain names being registered using keywords such as ‘coronavirus’ or ‘COVID’. These domains are used by cybercriminals to take advantage of the increased demand for medical supplies and information. Lastly, there has been an increase in the dissemination of misinformation and fake news to the public; the spread of unverified information and conspiracy theories has facilitated the execution of cyberattacks.

The report concludes by noting that further increase in cybercrime is highly likely in the near future as continued remote working and the potential for increased financial benefit will see cybercriminals continue to ramp up their activities and develop more advanced and sophisticated means of committing crimes. Further, cyber attackers are likely to continue proliferating coronavirus-themed angles to leverage public concern about the pandemic.

Welsh data breach exposes information of COVID-19 patients

Public Health Wales has confirmed that on 30 August 2020 the personal data of 18,105 Welsh COVID-19 patients was uploaded to a public server as a result of individual human error. The data was viewed a total of 56 times in the 20 hours that it was online. The risk of identification is low for 16,179 patients as the breach only disclosed their initials, date of birth, geographical area and sex. However, 1,926 patients living in nursing homes or other enclosed settings face a higher risk of being identified as the breach also disclosed their residential settings location. In its statement on the data breach, Public Health Wales has confirmed that there is currently no evidence that the data has been misused.  The ICO and Welsh Government were informed, and an external investigation is expected. This acts as a reminder to readers to ensure that they provide regular training sessions on how to avoid data breaches resulting from human error.

Warner Music group discloses data breach  

In August 2020, Warner Music Group discovered a prolonged skimming attack on a number of its e-commerce stores. An unauthorised third party acquired the personal data of customers by installing data-skimming malware on the sites. The personal data is said to have included customers’ names, email addresses and credit card details. Any customer that may have been affected by the breach has now been notified, and Warner has offered those customers free identity monitoring services for 12 months.

DCMS publishes responses to call for evidence on cyber security incentives

The government ran a call for evidence from 4 November 2019 until 20 December 2019 seeking industry input on the barriers faced by organisations and the economy as a whole in taking effective action to manage cyber risks. In particular, it called on industry to identify the information and assurances that would result in organisations better prioritising and investing in the mitigation of cyber risks as part of their broader organisational resilience and business continuity.

This review outlined three identifiable barriers to cyber security. These are:

  • a range of inabilities that organisations may have, from not knowing what to do, to not having the right skills and resources;
  • a lack of commercial rationale or business drivers that stimulate the prioritisation of and investment in cyber risk management; and
  • a complex and insecure digital environment within which organisations base many business operations in this digital era.

Following this review, the DCMS will continue to progress the development of new interventions, and will set out further detail on policy proposals in this area over the coming months.

Regulatory enforcement

Marriott monetary penalty notice reportedly delayed until 30 October

The deadline for the ICO’s final decision on its proposed monetary penalty notice was delayed until 30 September, due to the coronavirus pandemic. With September having come and gone, there are reports that the deadline has been pushed back further until 30 October. Although this has not been officially confirmed, no notice has yet been published. With the original proposed fine amounting to £99 million, we await news of whether this will be reduced and by how much. Nor is there any news of the ICO’s proposed £183 million fine against BA, for which the revised deadline was also pushed back to 30 September.

H&M fined €35 million by German authorities

The Hamburg Commissioner for Data Protection and Freedom of Information (the “BfDI”) has fined fashion retail giant, H&M, €35.3 million for internal data security breaches at its customer service centre in Nuremberg.

The fine, issued on 1 October, relates to an ongoing breach of employee privacy after the BfDI found that H&M employees in Nuremberg had been subject to illegal collection of personal data since 2014.

After taking any sick or annual leave, employees were required to take part in “Welcome Back Talks” upon their return. Details recorded from these sessions included symptoms of illnesses and diagnoses. Further, the BfDI reports that personal facts learned through general conversation between supervisors and their team led to some details being recorded and digitally stored. Some of this data was available for up to 50 members of management to access freely and was used to evaluate work performance and make employment decisions.

H&M has issued a statement which apologises for the breach and states that affected employees will be offered monetary compensation. The company also stated that it has implemented improvements to internal auditing practices and to leadership and staff education at the service centre in Nuremberg.

This fine is the highest imposed by the BfDI since the GDPR came into force in 2018 and it is the second highest across the continent after the French SA fined Google €50 million last year for a GDPR breach. 

ICO fines company for using data collected for free samples in direct marketing

The ICO has fined Digital Growth Experts Ltd (DGEL) £60,000 for sending direct marketing texts to customers promoting a hand sanitising product.

In the monetary penalty notice, the ICO noted that DGEL appeared to have acquired the data used for direct marketing from a number of sources, one of which was data obtained via social media advertisements purporting to offer free samples. From the evidence provided by DGEL, individuals were asked to provide data in order to receive a voucher code which could be used to apply for a free sample. In signing up for the voucher code, however, the individuals were automatically enrolled for direct marketing.

The ICO found no evidence that the individuals were advised about the automatic enrolment, nor were individuals given a simple means of refusing the use of their contact details for the purpose of direct marketing. In other words, data subjects were told about the free sample but not about the automatic opt-in to receiving direct marketing messages. This led the ICO to conclude that the consent obtained by DGEL was invalid because the ‘consent’ being relied on was not freely given.

Data protection legislation does set out a ‘soft opt-in’ exemption which allows businesses to email or text its own customers, but it does not apply to prospective customers or new contacts. The ‘soft opt-in’ exemption was considered not to apply to the individuals in the DGEL case, in part because the act of applying for a free sample cannot be said to be a ‘sale’ or ‘negotiation of a sale’ in this case. In any event, it appears that DGEL sent direct marketing to all individuals who provided their details, and not just those who may have chosen to redeem the voucher for the free sample.

This decision by the ICO provides a clear message to organisations: if you want to process data obtained by way of an application for free samples for any reason other than providing the free sample, including direct marketing, then the data subject must provide consent through a clear option to opt in to or out of direct marketing.

The ICO issues a fine of £130,000 for making unauthorised cold calls

Swansea-based company CPS Advisory Ltd has been fined £130,000 by the ICO for making 106,987 unauthorised direct marketing calls to people about their pensions.

In an effort to prevent people falling victim to scams, changes were made to the Privacy and Electronic Communications Regulations ("PECR") in 2019. These changes significantly limited the categories of people authorised to make pensions marketing calls. The ICO notes that under this law, companies can only make calls to people about their pensions if:

  • the caller has authorisation from the Financial Conduct Authority to do so, or "is the trustee or manager of an occupational pension scheme or a personal pension scheme"; and
  • the person being called has consented to being called, or already has a relationship with the caller.

In light of the above criteria, the ICO investigation concluded that CPS Advisory Ltd was not authorised to make calls to people about their pensions. In its monetary penalty notice, the ICO stated that the calls represented “a significant intrusion into the privacy of the recipients of such calls”. This case should serve as a reminder of the ICO's consistent focus on taking enforcement action arising out of breaches of PECR, particularly in the financial services context.

Private hire drivers launch legal action against Ola

Two private hire drivers from London have reportedly launched a legal action in the Netherlands against Ola, a ridesharing company. The drivers allege various violations of the GDPR. In particular, it is understood that the drivers have requested their personal data be provided to the data trust of their union. It appears that the drivers' focus is on the algorithm used by Ola to manage its platform, and it is alleged that Ola has failed to provide all of the personal data requested. The case may well provide further insight into the enforcement of alleged GDPR breaches involving the processing of personal data by algorithms.

An Irish Government department is being investigated by the Irish DPC over its collection of personal data in relation to COVID-19 payments

Ireland’s DPC has made clear its intention to launch an investigation into the Department of Employment Affairs and Social Protection (the “DEASP”). The investigation is centred on whether the DEASP illegally collected and retained personal data without reasonable justification. The DEASP collected and used personal data in relation to travel plans as part of its continued effort to stop and prevent abuse of the COVID-19 pandemic unemployment payment system. The collected data was used to stop and question passengers travelling out of Dublin airport. As a result of these airport checks, hundreds of pandemic unemployment payments were stopped. The DPC’s investigation will determine if the DEASP’s collection of such data is lawful. This is yet another clear example of data privacy issues emerging from the fight against COVID-19.

Twitter’s 2018 and 2019 data breaches have been referred to the EDPB

Ireland’s DPC has now concluded its investigation into a number of data breaches suffered by Twitter in late 2018 and early 2019. As part of its investigation, the DPC also considered whether Twitter had satisfied its obligation under the GDPR to make timely disclosure of the breaches.

As readers will be aware, under the powers conferred by the GDPR, the maximum fine Data Protection Authorities (“DPAs”) can impose is the higher of:

  • 4% of the company’s total worldwide annual turnover; or
  • €20,000,000.
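For readers who want to see how this "higher of" cap operates in practice, the calculation can be illustrated with a short Python sketch (the function name and the example turnover figure are illustrative only; the statutory basis is Article 83(5) GDPR):

```python
def gdpr_max_fine_eur(annual_worldwide_turnover_eur: float) -> float:
    """Upper bound on a fine under Article 83(5) GDPR: the higher of
    EUR 20 million or 4% of total worldwide annual turnover."""
    return max(20_000_000.0, 0.04 * annual_worldwide_turnover_eur)

# For a company with EUR 2 billion turnover, 4% of turnover exceeds EUR 20m,
# so the cap is EUR 80 million:
print(gdpr_max_fine_eur(2_000_000_000))  # 80000000.0

# For a company with EUR 100 million turnover, 4% is only EUR 4 million,
# so the EUR 20 million floor applies:
print(gdpr_max_fine_eur(100_000_000))  # 20000000.0
```

The "higher of" structure means the €20 million figure acts as a floor for the cap: for smaller businesses it is the binding ceiling, while for large multinationals the 4% limb can produce a far larger maximum.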

In coming to a decision, the DPC was under an obligation to submit a draft decision to other concerned Supervisory Authorities (“SAs”) and to take due account of their views. Certain objections were made and maintained by other SAs and, as a result, the DPC has made a referral to the EDPB under Article 65 of the GDPR.

This is highly significant as it is the first time that the EDPB has had the opportunity to use one of its key powers in order to resolve a dispute among DPAs in the European Union. Article 65 of the GDPR requires the DPC to adopt its final decision on the basis of the EDPB's binding decision within one month of being notified of it.

The CNIL has commenced investigation into TikTok

The French data protection authority, the CNIL, is the latest data protection authority to commence an investigation into TikTok, the Chinese video-sharing social networking service. As part of its investigation, the CNIL will consider:

  • the processing of TikTok users’ personal data;
  • TikTok users’ rights of access to personal data;
  • the flow of personal data going out of the European Union; and
  • the procedures currently in place to protect children and their personal data.

TikTok is already being investigated by the Committee on Foreign Investment in the United States, the European Data Protection Board and the Dutch Data Protection Authority. The outcome of these investigations will provide crucial guidance on how data privacy violations will continue to be dealt with across the globe.

The CNIL imposes a €250,000 fine on Spartoo

The CNIL has imposed a penalty of €250,000 on Spartoo, a French-based company specialising in online sales of shoes. After inspecting Spartoo in May 2018, the CNIL identified multiple GDPR breaches. One such breach related to Spartoo’s practice of permanently storing recordings of entire telephone conversations and customer bank details for the purposes of training and fraud protection. This practice was found to be excessive and contrary to the principle of data minimisation under Article 5(1)(c) of the GDPR.

This decision acts as a reminder to readers that regulators are committed to undertaking audits of GDPR compliance and if material deficiencies are identified, financial sanctions are likely.

Civil litigation

The High Court has refused permission for judicial review in a data breach case

The High Court recently handed down judgment in R (on the application of AB) v (1) Northumbria Healthcare NHS Foundation Trust (2) Cumbria, Northumberland, Tyne & Wear Foundation Trust [2020] EWHC 2287. In this case, the Defendant had refused to delete inaccurate information relating to a 17-year-old disabled boy’s sexual behaviour from his medical records. The boy’s mother argued that her son’s data protection rights would be infringed if the incorrect information was not deleted. Before seeking permission to judicially review this decision, the mother had complained to the ICO. The High Court found, however, that two alternative remedies should have been pursued before resorting to judicial review. The first was an application for a compliance order under s.167 / s.168 Data Protection Act 2018. The second was a county court claim under s.114 Equality Act 2010. The mother’s application for permission for judicial review of the decision was therefore rejected. This decision serves as a reminder that judicial review applications are only appropriate when all other resolution methods have been exhausted.

Class action against YouTube for unlawful use of children’s data

YouTube has allegedly been unlawfully handling and processing the personal data of up to five million children under the age of 13 in England and Wales. A representative (Mr McCann) is making a claim against YouTube in an effort to prevent any further breaches of UK and European data protection laws. If the claim is successful, every child in England and Wales may be entitled to compensation if they used YouTube after 25 May 2018. As technology becomes ever more embedded at every level of society, it is unsurprising that calls for the protection of children’s data and privacy rights are becoming more frequent. If this claim succeeds, it could result in YouTube facing a liability running into billions.

High Court ruling provides guidance on numerous and repetitive DSARs

The High Court recently handed down judgment in Lees v Lloyds Bank plc [2020] EWHC 2249 (Ch). It was claimed that the Defendant had failed to provide adequate responses to the Claimant’s various Data Subject Access Requests (“DSARs”). The Claimant alleged that the Defendant’s refusal to provide data constituted a breach of the GDPR. The Court disagreed and found that the Defendant had provided adequate responses to the Claimant’s requests.

In these circumstances, the Court noted the following as "good reasons" for refusing to exercise the court's discretion in favour of the Claimant:

  • the Claimant’s numerous and repetitive DSARs were abusive;
  • the real purpose of the DSARs was to obtain documents rather than personal data;
  • the DSARs served collateral purposes; and
  • the data sought would have been of no benefit to the Claimant.

Of particular interest in this case is the apparent inconsistency between Master Marsh's decision and:

  • the Court of Appeal in Dawson-Damer v. Taylor Wessing LLP [2017] EWCA Civ 74, in which the Court of Appeal expressly rejected that the underlying purpose of the DSAR was relevant to the exercise of the Court’s discretion to order compliance with a DSAR; and
  • the ICO's guidance with respect to DSARs, which, based on the Court of Appeal's decision in Dawson-Damer, is that they should be treated as being “motive blind”.

Application of the GDPR and DPA 2018 in the context of internal disciplinary proceedings

The High Court recently handed down judgment in Hopkins v Commissioners for HMRC [2020] EWHC 2355. An individual (the “Claimant”) brought a claim against her employer (“HMRC”) for numerous alleged breaches of the GDPR.

The High Court dismissed the Claimant's claim against HMRC for various alleged data protection legislation breaches. The Claimant had been arrested for very serious sexual offences. Pursuant to her employment contract, the Claimant disclosed this to her line manager. The Claimant was suspended on full pay by HMRC, pending disciplinary proceedings. At the date of judgment (1 September 2020), the Claimant had not been charged with any offences. The claim concerned the processing by HMRC of the Claimant’s personal data, including criminal offence data, and the way in which her ongoing disciplinary proceedings had been handled. In respect of the claim that the processing of the Claimant’s personal data was unlawful under the GDPR and DPA 2018, twenty separate breach allegations were made. The Court held that only one alleged breach had merit, namely that HMRC had failed to respond to a DSAR in the required time period. For the remaining 19 alleged breaches, HMRC was found to have acted lawfully. Some interesting points in relation to the Court’s findings are listed below:

  • Article 10 of the GDPR and section 11(2) of the DPA 2018 do not create a discrete obligation to “acknowledge” that personal data is criminal offence data.
  • HMRC’s processing of the Claimant’s criminal offence data as part of the disciplinary investigation met the requirements of Article 10 of the GDPR and, in accordance with Article 6(1)(b) of the GDPR, the processing was necessary for the performance of the Claimant’s employment contract.
  • HMRC had lawfully shared the Claimant’s personal information internally in relation to the investigation.
  • HMRC was required, under the Revenue and Customs (Complaints and Misconduct) Regulations 2010, to share personal data externally with the Independent Office for Police Conduct. It was therefore shared lawfully.
  • HMRC’s Staff Privacy Notice met the requirements of Article 13 of the GDPR – it was widely available on the staff intranet and had been provided to the Claimant along with HMRC’s conduct and discipline rules.

This case provides a practical analysis of the application of the GDPR and DPA 2018 to internal disciplinary proceedings. Readers are encouraged to ensure that good staff privacy notices and policies are in place.