Data Protection update - May 2020
Welcome to our data protection bulletin, covering the key developments in data protection law from May 2020.
- COVID-19 contact tracing: privacy notice published and data protection expectations on app development
- EU contact tracing apps get new interoperability guidance
- ICO publishes its key priorities during Covid-19
- ICO pauses investigation into adtech, real-time bidding due to Covid-19 developments
- ICO publishes new guidance for employers on workplace testing for coronavirus in anticipation of some sectors returning to work
- EDPB publishes updated guidelines on online consent to process personal data and “consent cookie wall”
- EU Court of Justice to rule on data transfer mechanism in July 2020
- The European Commission set to review the GDPR
- Territorial scope of the right to be forgotten under EU law limited by French administrative court decision
- Cyber-attack on EasyJet exposed 9 million customers’ details
- GCHQ calls on public to report coronavirus-related phishing emails
- Third party supplier of government hacked
- Hackers target Robert Dyas to steal customers’ payment card details
- Data security flaw exposes details of thousands of legal documents
- Atkinson v Equifax class action withdrawn
- Brave alleges that lack of resources is stymying GDPR enforcement
- Dutch company fined for unlawfully processing employees' fingerprints
- Dutch court requires grandmother to delete her Facebook posts of her granddaughter
- Belgian court fined a company for GDPR breach for appointing its Head of Compliance as DPO
- Irish Data Protection Commission's investigation of Google's smartphone tracking challenged
- DPC issues first fine pursuant to GDPR
- Facebook ordered to pay penalty following Canadian Competition Bureau investigation
- Privacy of non-Germans protected under German law following Constitutional Court ruling
COVID-19 contact tracing: privacy notice published and data protection expectations on app development
The UK contact-tracing app is expected to launch in June after being piloted on the Isle of Wight. Unlike other European countries, the UK is taking a centralised approach to contact tracing: data will be sent to the NHS’s central server if the user informs the app that they have Covid-19 symptoms. This contrasts with the decentralised model adopted by countries such as Germany, Italy and Ireland, under which the data is not sent to the health services but instead remains on the individual’s phone. Critics of the centralised approach are calling for transparency on adequate safeguards in order to secure public trust in the project and in the sharing of data.
On 4 May, the ICO appeared before the Joint Committee on Human Rights to discuss and clarify some of the concerns around data privacy arising from the UK’s contact-tracing app. A recording of the meeting can be viewed here. Key issues addressed included the ICO’s potentially conflicting dual role as advisor on, and regulator of, the app; proposals for how data would be used, stored and deleted on the app; the debate over a centralised versus decentralised model; and who would have access to the data on the app.
On 28 May 2020, Public Health England (“PHE”) released the NHS Test and Trace privacy notice which clarifies exactly what personal information is collected by the app; how the information is used; how it is protected; and how long it is kept for. According to the privacy notice, information collected will include names, dates of birth, NHS numbers, postcodes, telephone numbers and Covid-19 symptoms. The information will be held by PHE for 20 years for people who have reported Covid-19 symptoms and 5 years for those who did not report any symptoms but were identified as coming into contact with people with symptoms. The privacy notice explains that this is to help control future outbreaks or provide new treatments. The notice also introduces some of the third parties who are assisting with contact tracing such as NHS professionals, Serco UK and Amazon Web Services who will each act as data processors on the instructions of the Department of Health and Social Care. Interestingly, Amazon Web Services is the only third party not able to see any of the information collected by NHS Test and Trace, whilst the notice explains that all the other third parties will have access to the data but have undergone training to protect the confidentiality of those with Covid-19 and their contacts. The privacy notice will no doubt be scrutinised by those who are concerned about privacy rights being interfered with and it will be interesting to see how the privacy elements of the app evolve over the coming months.
In relation to the ICO’s combined role as expert advisor and enforcer for the contact-tracing app, Elizabeth Denham defended the position explaining that the ICO is not signing off on or approving any of the documents or designs for the app but rather giving NHSX (the UK Government unit with responsibility for driving digital transformation in the NHS) expert advice on some of the key data privacy issues arising in the initial phases. The ICO remained confident that there would not be a conflict when it came to regulating the app and ensuring its use complied with data protection law further down the line.
Matthew Gould, CEO of NHSX, was asked by a committee member for more detail on data usage and how NHSX planned to manage the risk of “mission creep” (the gradual expansion of a project beyond its original scope and focus). Gould explained that any data on the app which had not been passed to the NHS would be deleted (a) whenever the user chooses to delete the app; and/or (b) automatically after a 28-day cycle. Gould suggested that if the data had been passed to the NHS (for example, where the user had reported Covid-19 symptoms), it would be kept by the NHS for an “appropriate public health reason” and associated research only, in compliance with the GDPR. He also confirmed that the NHS would not be sharing any data with Apple or Google, but that NHSX was working closely with both technology companies throughout the development of the app to ensure they could produce and roll out a workable product on their existing platforms. Elizabeth Denham reassured the committee that a centralised system is just as capable of meeting the data protection requirements as a decentralised system, and she felt encouraged that NHSX had approached the ICO early in the development phase of the app to ensure compliance and that data privacy issues were dealt with from the outset.
EU contact tracing apps get new interoperability guidance
Guidelines have been published by the eHealth Network to help developers across the EU achieve interoperability between the various contact tracing apps as lockdowns start to lift. The guidelines promote collaboration between countries to ensure each app will work with the others when citizens travel across borders. The advice includes aligning epidemiological criteria, such as the ‘triggering’ durations of time people spend with each other and the distance between them. It also raises the issue of ensuring the safe and efficient transfer of data between national health authorities, which the eHealth Network has advised must use a “trusted secure mechanism” which safeguards personal data.
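To illustrate the kind of alignment the guidance contemplates, a minimal sketch of a shared “trigger” check is set out below. This is purely hypothetical: the threshold values, names and structure are invented for illustration, and the eHealth Network guidelines do not prescribe specific figures or code.

```python
# Hypothetical illustration only: the eHealth Network guidance calls for
# aligned epidemiological "trigger" criteria (duration of contact and
# distance between phones) but does not publish concrete thresholds.
# The values and names below are invented for this sketch.
from dataclasses import dataclass

@dataclass
class Encounter:
    distance_m: float      # estimated distance between two phones, in metres
    duration_min: float    # time spent within that distance, in minutes

# Assumed, illustrative thresholds that participating countries might agree on
MAX_TRIGGER_DISTANCE_M = 2.0
MIN_TRIGGER_DURATION_MIN = 15.0

def is_exposure(encounter: Encounter) -> bool:
    """Return True if the encounter meets the shared trigger criteria."""
    return (encounter.distance_m <= MAX_TRIGGER_DISTANCE_M
            and encounter.duration_min >= MIN_TRIGGER_DURATION_MIN)

print(is_exposure(Encounter(distance_m=1.5, duration_min=20)))  # True
print(is_exposure(Encounter(distance_m=1.5, duration_min=5)))   # False
```

The point of alignment is that, if every national app applies the same criteria, an exposure registered while a citizen is abroad means the same thing to their home country’s health authority.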
With the app set to launch in the UK at the end of May, there is still some concern around how authorities will manage, share and process personal data in compliance with their data privacy obligations. It is believed the NHS has now completed its DPIA for the contact-tracing app, with assistance from the ICO; however, the document is not yet publicly available. As mentioned above, the centralised design of the app heightens the privacy risks and concerns, which should be detailed in the DPIA. How these risks and concerns will be managed and regulated remains to be seen, but we can expect some clarification once the DPIA is made available to the public.
ICO publishes its key priorities during Covid-19
The ICO continues to monitor the pandemic and adapt its approach accordingly, protecting the public interest while supporting economic growth and innovation. This month, Elizabeth Denham, the UK Information Commissioner, released an updated list of priorities for the ICO over the coming months. The updated list focusses primarily on protecting the public interest, enabling responsible data sharing and monitoring intrusive and disruptive technology. The key priorities comprise:
(a) protecting our vulnerable citizens;
(b) supporting economic growth and digitalisation, including for small businesses;
(c) shaping proportionate surveillance (including with respect to contact tracing and testing);
(d) enabling good practice AI;
(e) supporting organisations to be transparent about decisions affecting individuals; and
(f) maintaining business continuity: developing new ways of working in readiness for recovery.
The ICO makes clear that, while it is narrowing its regulatory focus to some extent to make allowances for the impact of the Covid-19 crisis on organisations’ ability to comply with data protection rules (including with respect to meeting timeframes for breach reporting and responding to investigations), the “impact” of the pandemic must be a genuine cause for any delay.
ICO pauses investigation into adtech, real-time bidding due to Covid-19 developments
In light of its reassessment of priorities during the Covid-19 pandemic, the ICO has issued a statement explaining that it has decided to pause its investigation into real-time bidding and the adtech industry. In June 2019, the ICO issued a report identifying a range of issues permeating the industry, including breaches of the GDPR’s requirements on transparency, the processing of personal data and keeping information secure. The ICO gave the industry six months to work on the points it raised in the report. An ICO spokesperson said: “it is not our intention to put undue pressure on any industry at this time but our concerns about Adtech remain and we aim to restart our work in the coming months, when the time is right.”
ICO publishes new guidance for employers on workplace testing for coronavirus in anticipation of some sectors returning to work
As the UK starts to prepare for an increase in people returning to work in the coming months, the ICO has published guidance for employers to help them comply with data protection laws when managing the Covid-19 health and safety risk in the workplace. The guidance focusses on how to comply with data protection law when testing employees for Covid-19 symptoms or for the virus itself. Given the sensitive nature of health data, there are extra precautions to be aware of when collecting data on employees, such as conducting a Data Protection Impact Assessment (“DPIA”) and ensuring you collect and retain only the minimum amount of information needed to fulfil your purpose. The guidance stresses the need for employers to strike a balance between being transparent with employees and avoiding the disclosure of too much information, such as naming individuals. The guidance also touches on the use of temperature checks or thermal cameras on site as part of the ongoing monitoring of staff, noting that employers will need to give specific thought to the purpose and context of their use, which will need to be necessary and proportionate.
For many organisations this will be a delicate balance to strike, but the ICO’s consistent message should reassure them that data protection law should not be viewed as a barrier to managing the pandemic.
EDPB publishes updated guidelines on online consent to process personal data and “consent cookie wall”
On 6 May 2020, the European Data Protection Board published updated guidelines on consent under the GDPR. The clarifications made to the 2018 guidelines focus on: (a) the validity of consent provided by the data subject when interacting with “cookie walls”; and (b) whether scrolling through a webpage can constitute clear and affirmative consent under the GDPR.
In relation to the “cookie walls”, the guidelines highlight the fact that, in order for consent to be freely given, access to a website or mobile app must not be made conditional on the consent of a user to allow that website and/or mobile app to store information or gain access to information already stored on the user’s equipment. Obtaining consent in this way does not present the user with a genuine choice and is therefore deemed unlawful. Separately, the guidelines indicate that scrolling or swiping through a webpage does not meet the conditions for valid consent as there is no affirmative action. The act of scrolling or swiping is too ambiguous and would make it difficult to enable that user to withdraw consent in a manner that was just as easy as granting it.
These clarifications are in line with the decision of the Court of Justice of the European Union in the case of Planet 49 which we covered in the October 2019 bulletin.
EU Court of Justice to rule on data transfer mechanism in July 2020
The EU Court of Justice (“CJEU”) will hand down the judgment for the case of Facebook Ireland and Schrems (case C-311/18) on 16 July 2020.
The case stems from a preliminary reference to the CJEU by the Irish High Court in 2017, seeking guidance on the validity of the Standard Contractual Clauses ("SCCs") used by Facebook to transfer data between the EU and the USA. SCCs are standard contractual terms to which data exporters and importers adhere when transferring data out of the European Economic Area ("EEA"). As covered in our previous update, in his Opinion of 19 December 2019, Advocate General Henrik Saugmandsgaard Øe opined that the SCC mechanism is valid. However, this Opinion is not legally binding; it was prepared to assist the CJEU in its decision-making.
The CJEU's decision will have a significant impact on many international companies based in the EEA who rely on the SCC mechanism for the data flow from the EEA to third countries. If the SCC mechanism is held to be invalid, alternative mechanisms will need to be explored and implemented, at potentially high cost.
We will, of course, provide a further update as soon as the CJEU’s judgment becomes available.
The European Commission set to review the GDPR
25 May 2020 marked the second anniversary of the GDPR taking full effect, and the European Commission (“EC”) is now set to undertake a full review of the legislation. The EC will specifically report on the international transfer provisions and the consistency mechanisms between supervisory authorities. The EC is currently gathering feedback from citizens and stakeholders in the “roadmap” phase, following which it will develop a summary report setting out how it intends to proceed. We expect the report to be published in the second quarter of 2020.
Territorial scope of the right to be forgotten under EU law limited by French administrative court decision
On 27 March 2020, the Conseil d'Etat (France's highest administrative court) confirmed the limited territorial application of the right to be forgotten under EU data protection law, and, accordingly, annulled a €100,000 fine in the case of Google v CNIL (France's data protection authority).
The case arose from a contested fine of €100,000 imposed by the CNIL on Google, arising out of Google's failure to remove search results containing personal data relating to several individuals on all of its domain name extensions worldwide.
Google challenged the fine before the Conseil d'Etat on the basis that EU data protection legislation merely required it to remove links to search results from Google domains within the EU, and not globally. The challenge relied on the landmark decision of the EU Court of Justice on 24 September 2019 that the right to be forgotten under EU data protection legislation was limited in scope to within the EU territories, and therefore the removal of search results need only be carried out on domains within the EU (Google Inc v CNIL, Case C-507/17, 24 September 2019). In its judgment, the Court of Justice nevertheless concluded that competent national courts can order the de-referencing to be carried out on a global scale if this is provided for under the national law, and provided that this is balanced with the right to freedom of information.
In a statement of 24 September 2019 (available here in French), CNIL interpreted the Court of Justice's judgment as permitting it to impose an obligation on Google to carry out the de-referencing worldwide if this was justified by reference to protecting the individual's right to privacy.
The Conseil d'Etat, in its 27 March 2020 ruling, disagreed, finding that the CNIL had misinterpreted the law, and annulled the fine. The Conseil d'Etat further held that, in any case, the requirement to remove links worldwide failed in this instance to balance the individual's right to privacy against the public's right to freedom of information.
The full judgment can be accessed here (in French).
Cyber-attack on EasyJet exposed 9 million customers’ details
EasyJet has revealed that the personal information of approximately 9 million customers, including names, email addresses and travel details, has been accessed in a “highly sophisticated” cyber-attack. Of the 9 million affected, 2,208 had credit card details stolen. EasyJet has confirmed that all customers affected have been contacted and offered support. EasyJet has also confirmed that, after extensive investigation, it does not appear that any of the personal information accessed has been misused. Whilst it is not clear how the attack came about, EasyJet has confirmed that it took immediate steps to close down access to the data and has now “bolstered [our] defences to further enhance [our] systems security”. EasyJet is working with the NCSC and the ICO to navigate next steps. The breach comes at a time when airlines are under immense financial pressure from the impact of Covid-19, so it will be interesting to see whether and when the ICO fines EasyJet under the GDPR for data security breaches, and the amount of any fine levied. EasyJet’s chief executive has issued an apology to those customers affected and advised customers to be extra vigilant in the current Covid-19 environment, which poses an increased risk of online scams.
A little over a week after EasyJet revealed the details of the cyber-attack, law firm PGMBM issued a representative action in the High Court against the airline seeking damages of up to £18 billion. EasyJet has not yet commented on the action.
GCHQ calls on public to report coronavirus-related phishing emails
Since the start of the Covid-19 crisis, the NCSC, part of GCHQ, has taken down over 2,000 internet scams seeking to target people looking for advice or services related to the pandemic. Scams have included fake online shops selling fraudulent coronavirus-related items, malware distribution sites and phishing sites seeking personal information such as passwords and credit card details. To tackle the issue, the NCSC has now launched a suspicious email reporting service asking the public to report any dubious emails. The NCSC’s automated scanning system will check for scam emails and immediately remove criminal sites.
Third party supplier of government hacked
A company that is one of the UK government's "strategic suppliers", maintaining schools, hospitals and transport networks such as the London Underground, is recovering from a cyber-attack that may have seen the details of up to 100,000 people stolen. Hackers hit Interserve's infrastructure over the weekend, accessing a human resources database on 9 May and stealing information on current and former Interserve employees. Details taken include employee names, addresses, bank details, payroll information, next-of-kin details, HR records, dates of absence and pension information. The company has informed the ICO of the incident.
Hackers target Robert Dyas to steal customers’ payment card details
Robert Dyas has revealed that it suffered a data breach which exposed confidential information, including the names, addresses and credit card numbers of its customers. The hardware retailer discovered card-skimming malware on its e-commerce website in March, which had been infecting the platform for around 23 days. As soon as Robert Dyas discovered the malware, its IT security team took steps to block it from the website. Robert Dyas is working with the relevant authorities and has apologised to its customers, advising them to contact their banks and credit card providers to check for any suspicious activity. The ICO is aware of the incident and, if it finds fault with Robert Dyas’s security processes, could impose a financial penalty.
Data security flaw exposes details of thousands of legal documents
One of the largest software companies in Britain, Advanced Computer Software (“Advanced”), experienced a data security flaw which left more than 10,000 legal documents containing sensitive details of commercial property owners unsecured for years in an online database. The documents had been scanned and uploaded by law firms using Laserform, a product from Advanced. The breach is reported to have affected over 190 law firms. The open database was discovered by security research firm TurgenSec, which then contacted the affected law firms and Advanced.
Advanced has responded that the data relating to commercial property transactions predated 2017 and was largely available to the public; however it conceded that the documents included email addresses and security verification responses such as mothers’ maiden names and passport numbers which were not in the public domain. A spokesperson at Advanced said in a statement: “We discovered some exposed data on one of our historic software platforms and took immediate steps to address the issue, secure the data and make contact with the small number of affected customers.” We understand that the data incident was not reported to the ICO.
Atkinson v Equifax class action withdrawn
A representative action which was being pursued against Equifax has been withdrawn. The action was being brought on behalf of a class of 15 million UK customers whose personal data had been affected by the huge cyber-attack that Equifax suffered in 2017, seeking damages totalling over £100m for “loss of control” of the class members’ personal data (in line with the Court of Appeal’s judgment in Lloyd v Google LLC [2019] EWCA Civ 1599 – indeed, the proceedings were issued a matter of days after that decision).
It is understood that the claim was withdrawn shortly after Equifax filed its Defence which argued that the Court of Appeal’s judgment was wrong in law, and was, in any event, not applicable to claims arising out of cyber-attacks.
Anya Proops QC, who acted for Equifax, said: "[t]his is a major development, and one that carries within it an important cautionary message for all those claimants (and their representatives) who may be inclined to rush precipitously to mount large-scale privacy actions."
There remain several representative actions before the courts arising out of data breaches, including cyber-attacks (e.g. the BA data breach), and this nascent area of law is still developing, so the extent to which this development is a portent of things to come is unclear. The Supreme Court’s approach in Lloyd v Google LLC is likely to be instructive in this regard.
Brave alleges that lack of resources is stymying GDPR enforcement
Brave, the privacy-focussed browser, has filed a complaint with the European Commission regarding the lack of enforcement action since the GDPR came into force.
“If the GDPR is at risk of failing, the fault lies with national governments, not with the data protection authorities”, said Dr Johnny Ryan of Brave.
“Robust, adversarial enforcement is essential. GDPR enforcers must be able to properly investigate ‘big tech’, and act without fear of vexatious appeals. But the national governments of European countries have not given them the resources to do so. The European Commission must intervene.”
It is hard to disagree with Brave’s analysis. Notwithstanding the relative wealth of resources enjoyed by the ICO by contrast to other supervisory authorities, its two flagship enforcement actions to date, arising out of the BA and Marriott data breaches, remain in stasis, as does its investigation into the adtech and real time bidding industry.
Dutch company fined for unlawfully processing employees' fingerprints
On 30 April 2020, the Dutch Data Protection Authority fined an unnamed company €725,000 for unlawfully processing its employees' fingerprints. The company had used fingerprint scanners to register its employees' attendance and for recording the time they spent at work. The Dutch Data Protection Authority found that the company:
- Could not rely on any of the exceptions for processing the employees’ fingerprints, which are special category data (as they are biometric data). In particular, the Dutch Data Protection Authority held that the processing of fingerprint data was not necessary for the company's security.
- Had failed to obtain its employees' express permission to process their fingerprints, in breach of data protection legislation.
The company has appealed the fine.
Readers are reminded that a legal basis is required to process personal data, and should ensure that their policies and procedures do not inadvertently result in employees' personal data being processed without valid consent.
Dutch court requires grandmother to delete her Facebook posts of her granddaughter
A mother of three brought a claim in the Dutch Court of First Instance against the children's grandmother. She sought an order requiring the grandmother to remove photographs of one of the children (a minor) posted by the grandmother on Facebook without the mother's consent. The mother argued that the posts breached the Dutch GDPR Implementation Act, which requires a legal representative's consent for a photo of a minor to be posted on social media platforms. The Dutch Court of First Instance decided that it was uncertain whether the posts fell within the "household exemption" under Dutch law, and since the posts were made without the legal representative's consent, they had to be taken down.
This case is a reminder that caution should be taken when circulating pictures of children. The ICO has previously issued reprimands to two primary schools for wrongly disclosing photos of their pupils without parental consent.
Belgian court fined a company for GDPR breach for appointing its Head of Compliance as DPO
On 28 April 2020, the Litigation Chamber of the Belgian Data Protection Authority fined a company €50,000 for breaching Article 38(6) of the GDPR. Article 38(6) permits a Data Protection Officer ("DPO") to perform tasks and duties other than their DPO role, provided that the company ensures these do not give rise to a conflict of interests.
The company had appointed its Head of Compliance, Risk Management and Audit as the DPO. The Litigation Chamber held that this appointment created a conflict of interest, noting specifically that:
- The Head of Audit has decision-making power with respect to the dismissal of employees. If the DPO had this power through their position as the Head of Audit, this would be incompatible with the DPO's role as a confidential advisor for matters related to data protection.
- As Head of the Compliance, Risk Management and Audit departments, the DPO is responsible for the data processing activities of those departments. A DPO who is also the head of those departments would therefore not be able to exercise independent oversight over the departments' data processing activities.
This case highlights the importance of appointing an independent DPO. It suggests that the question of whether a DPO who is also the head of another department would be in a position of conflict of interest should be assessed on a case-by-case basis. Readers are encouraged to review whether their DPOs may have any conflict of interest issues.
Irish Data Protection Commission's investigation of Google's smartphone tracking challenged
In November 2018, seven European consumer rights groups filed complaints with their respective national data protection authorities against Google, alleging that Google was tracking smartphone users through location history and web and app activities without consent. However, as Google indicated that its EU headquarters was in Dublin, the Irish Data Protection Commission (the “DPC”) was appointed lead supervisory authority in August 2019. By 26 November 2019, the DPC had not issued any decision on whether Google had breached the GDPR.
Since then, on 4 February 2020, the DPC announced that it was commencing, of its own volition, a statutory inquiry into Google's alleged breaches, proposing to assess whether Google had breached the GDPR based on its practices as of February 2020.
On 1 May 2020, three European consumer rights groups initiated a judicial review action against the DPC's decision in the Irish High Court. The groups are Consumentenbond (Dutch), Forbrukerradet (Norwegian) and dTest OPS (Czech), acting together with BEUC (a European umbrella consumer group). They contend that the DPC's investigation should cover Google's practices in 2018, when their complaints were initially filed. The judicial review hearing is currently scheduled for 29 June 2020.
DPC issues first fine pursuant to GDPR
A state agency, Tusla, has become the first entity in Ireland to be fined for a data protection breach under the GDPR. The DPC filed a Circuit Court action, as required under the Irish legislation implementing the GDPR, to confirm a fine of €75,000 against Tusla in respect of three separate data breaches.
Facebook ordered to pay penalty following Canadian Competition Bureau investigation
Following an investigation into Facebook's practices between August 2012 and June 2018, the Canadian Competition Bureau found that Facebook had not limited the sharing of Canadian users' personal information with some third-party developers, despite appearing to give users the option to opt out of this data sharing through the "Privacy Settings" page on its website and Messenger application.
Canadian competition law prohibits companies from making false or misleading claims about a product or service, including claims regarding the collection and processing of information from customers. On this basis, the Canadian Competition Bureau concluded that Facebook had made false or misleading claims about the privacy of Canadians' personal information on its website and its Messenger application.
Subsequently, Facebook entered into a settlement with the Canadian Competition Bureau, under which it would pay a fine of $9 million CAD, and an additional $500,000 CAD in respect of the Canadian Competition Bureau’s costs. Facebook has also agreed that it will not make false or misleading representations on its platform and application with regards to the disclosure of users' personal information, including representations on the extent to which users can control access to their personal information.
Privacy of non-Germans protected under German law following Constitutional Court ruling
The German Constitution enshrines the right to privacy. On 19 May 2020, the German Constitutional Court held that this right also extends to foreigners' online data, even if they are based outside Germany.
This was decided in relation to the Bundesnachrichtendienst's ("BND") ability to gather, evaluate and share communication data generated by non-German nationals based outside the country. This power, granted under a 2016 law, gives Germany's foreign intelligence agency the ability to monitor for security threats. However, following a challenge by a group of journalists and civil liberties organisations, the German Constitutional Court found that this pre-emptive purpose was not a sufficiently clear ground for interfering with non-German nationals' right to privacy, and that the law therefore breached Article 10 of the German Constitution.
The German government was given until 2021 to amend the 2016 law which gives the BND this right, so that the 2016 law reflects the universal right of privacy for both German and non-German nationals. However, the Constitutional Court stopped short of striking down the 2016 law in its entirety.