Data Protection update - December 2019/January 2020
Welcome to a bumper edition of our Data Protection Bulletin, covering the key developments in data protection law from December 2019 and January 2020.
- Advocate General advisory opinion endorses standard contractual clauses for cross-border data transfers
- EDPB publishes guidelines on right to be forgotten in search engine cases
- Danish supervisory authority publishes set of standard form controller-processor clauses
- ICO releases draft direct marketing code for consultation
- ICO publishes detailed right of access guidance
- ICO publishes final Age Appropriate Design Code
- UK to establish big tech regulator
- New Year's honours list blunder
- Kenya enacts new data protection legislation
- ICO delays call on BA and Marriott fines
- Mixcloud data breach exposes over 20 million user records
- JPMorgan to ban fintech apps from using customer passwords
- MoJ to develop cyber security logs and aggregation platform
- Cyber insurance market set to reach $15 billion by 2022
- Cyber attackers target UK nuclear industry
- National Crime Agency brings down Trojan marketplace
- Android flaw leaves apps vulnerable to attack
- Travelex calls in National Cyber Security Centre after virus strike
- Multiple security flaws discovered in TikTok app
- NSA alerts Microsoft to security vulnerability
- Dixons Carphone fined £500,000 after hackers target tills
- First GDPR fine issued by the ICO
- ICO fines 340 organisations for not paying data protection fee
- Former council worker sentenced for unlawfully obtaining personal data
- Former reablement officer sentenced for obtaining data without authorisation
- Another former social worker fined for unlawful disclosure of personal data
- EUR 9.55 million fine for German telecoms provider
- German court holds Facebook user consents obtained during sign-up process invalid
- German court hands down 15,000 EUR fine for failure to comply with Art 15 GDPR
- FTC issues Final Order in respect of Cambridge Analytica
Advocate General advisory opinion endorses standard contractual clauses for cross-border data transfers
The Advocate General (“AG”) has endorsed the use of standard contractual clauses (“SCCs”) for cross-border data transfers in an Opinion on the Schrems II case that will provide comfort to organisations relying on the transfer mechanism. By way of (brief) background, the case concerns a complaint made by Max Schrems to the Irish Data Protection Commissioner about Facebook’s transfers of personal data to the US.
The AG’s crucial point was that the CJEU should not find that Commission Decision 2010/87 – which sets out and approves the SCCs – is invalid. The validity of such clauses “depends only on the soundness of the safeguards which those clauses provide in order to compensate for any inadequacy of the protection afforded in the third country of destination”. The use of SCCs should be assessed on a case-by-case basis, with consideration being given to “all of the circumstances characterising each transfer, which may include the nature of the data and whether they are sensitive, the mechanisms employed by the exporter and/or the importer to ensure its security, the nature and the purpose of the processing by the public authorities of the third country which the data will undergo, the details of such processing and the limitations and safeguards ensured by that third country.” Responsibility is placed firmly on the data controller (and, as a backup, the supervisory authority) for this assessment and any subsequent decision to suspend or prohibit transfers if necessary.
The AG also noted that it would be “premature” for the CJEU to decide upon the validity of the EU-US Privacy Shield in its judgment in this case.
The Opinion is not binding on the CJEU, which is expected to release its judgment in the first half of this year. If the CJEU follows the AG’s Opinion, it will be interesting to note if it gives any commentary on how controllers should carry out their assessment of the safeguards for each transfer.
EDPB publishes guidelines on right to be forgotten in search engine cases
Following an increase in the number of complaints to supervisory authorities concerning the refusal by search engine providers to delist links, the EDPB has published draft guidelines on the criteria of the right to be forgotten in search engine cases. The Guidelines address: (1) the grounds on which a data subject can rely for a delisting request pursuant to Article 17.1 GDPR; and (2) the exceptions to the right to request delisting pursuant to Article 17.3 GDPR which a search engine provider may rely upon in seeking to reject such a request.
As to (1), the Guidelines recognise that an individual will most likely be able to request delisting because: (i) it is no longer necessary for the search engine to process their data (Article 17.1(a)); or (ii) the individual has exercised their right to object to the processing of their personal data on grounds relating to their personal situation under Article 21.1 (Article 17.1(c)). If it is the latter, a balancing exercise will have to be carried out whereby the individual’s situation is set against the search engine’s compelling legitimate grounds to maintain the search listing.
As to (2), the key takeaway is that the exceptions under Article 17.3 “do not appear suitable in [the] case of a delisting request”. Instead, Article 21 – and the balancing exercise touched on above – should be applied in connection with such requests. The Guidelines give an example of a compelling ground which will tend to tip the balance in favour of the search listing being maintained: where retention of the listing is strictly necessary for protecting the freedom of information of internet users.
The EDPB is seeking comments on the draft guidelines by 5 February 2020, and is in the process of developing more general guidelines in respect of Article 17 GDPR.
Danish supervisory authority publishes set of standard form controller-processor clauses
A standard form set of SCCs for contracts between controller and processor – as adopted by the Danish supervisory authority – has been published by the EDPB. The SCCs are aimed at helping organisations to meet the requirements of Article 28(3) and (4) GDPR. A copy can be found here.
ICO releases draft direct marketing code for consultation
The ICO has released its draft Code of Practice on Direct Marketing. Interesting takeaways include:
- Encouraging individuals to tell their friends and family about your services – essentially marketing on your behalf – is not generally allowed, as there is likely to be no valid consent;
- Where consent is required under the Privacy and Electronic Communications Regulations (“PECR”), consent – not legitimate interests – is the most appropriate lawful basis for the related processing under the GDPR; if you have already obtained PECR-compliant consent, relying on legitimate interests instead would be unnecessary;
- Generally speaking, incentivising people to provide consent to direct marketing is allowed; after all, there is usually some inherent benefit to individuals if they consent to marketing. However, the line will be crossed if consent is made a condition of accessing the service or benefit;
- There is a new section on data protection by design and data protection impact assessments (“DPIAs”) for direct marketing;
- If you infer special category data, and use this inference to direct market (for example, by marketing to everyone you assume is pregnant on the basis of their purchases on a loyalty card), you will need an Article 9 GDPR condition to do so;
- A “good practice recommendation” to have a six month time limit on using marketing consents from new customers obtained via a third party; and
- There is a detailed section on online ads. If you advertise on social media using “custom audiences”, you must be upfront and transparent about this and consent is likely to be the legal basis for processing. As for “lookalike audiences”, you are likely to be joint controller with the social network and you will need to satisfy yourself that the social network has provided all necessary transparency information to the relevant individuals.
The consultation is open until 4 March 2020, and a final version is expected to be released later this year.
ICO publishes detailed right of access guidance
The ICO has published draft guidance on data subject access requests (“DSARs”) (the “Guidance”). Evolution, rather than revolution, is the flavour of the day: the Guidance is essentially a more detailed version of the previous iteration, published in April 2018. One key point of difference is worth noting. Under the previous guidance, if you asked the requestor for further information to clarify his/her request, the clock stopped whilst you waited for this to be provided. The new Guidance explicitly states that the clock will no longer be considered to stop whilst you await receipt of any further information. This small – but potentially practically important – change is already reflected in an updated version of the right of access guidance on the ICO website. The website was in fact updated in August 2019, before the consultation started, so it looks like this aspect of the guidance could be here to stay, regardless of the outcome of the consultation process.
The consultation on the Guidance is open until 12 February 2020.
ICO publishes final Age Appropriate Design Code
The ICO has published a final version of the Age Appropriate Design Code - setting out fifteen standards to help protect children’s privacy – following an extensive consultation process which began in April last year. The code must still be laid before Parliament for approval and, assuming that is forthcoming, there will be a further 12 month period in which organisations should update their practices before the code comes into full effect (likely to be autumn 2021). The Code can be found here.
UK to establish big tech regulator
The Financial Times has reported that the UK is set to establish a new regulatory body to monitor big tech companies. The news comes in the wake of the Furman review – led by Jason Furman, chief economic adviser to Barack Obama – which recommended the establishment of a dedicated regulator to monitor the “emergence of powerful new companies” in the tech sector. It is envisaged that any such regulator will be armed with powers to enforce a new set of rules, including the introduction of an enforceable code of conduct. Diane Coyle, who was part of the Furman review, struck a firm note of warning: “A reckoning is coming for the biggest digital platforms”.
New Year's honours list blunder
The Cabinet Office has apologised and referred itself to the ICO after it leaked the personal addresses of 1,097 individuals on the New Year honours list. The addresses were available online for around one hour on Friday 27 December. Among the individuals whose data was compromised were celebrities – such as Elton John – but also, more concerningly, senior diplomatic and military figures, as well as counter-terror police. The leak was described by Jon Trickett, shadow Cabinet Office minister, as “unacceptable”: “If the government can’t get sensitive details right then how can it possibly expect us to believe that it can sort out the big issues facing the country”.
Kenya enacts new data protection legislation
On 25 November 2019 Kenya’s new data protection legislation – the Data Protection Act 2019 – came into force. A useful summary of the legislation, as well as a discussion as to the implications of its enactment on the data privacy environment in Kenya, can be found here.
ICO delays call on BA and Marriott fines
The ICO has seemingly stalled on confirming the monetary penalties which were expected to be handed to Marriott International and British Airways at the start of January, extending at the eleventh hour the period for doing so by a further three months. The regulator had six months from serving a notice of intent (which was given to BA on 8 July 2019 and Marriott a day later) in which to confirm the monetary penalty. It has been reported that the ICO has made the following (non-public) statement: “Under Schedule 16 of the Data Protection Act 2018, [both BA and Marriott] and the ICO have agreed to an extension of the regulatory process until 31 March 2020. As the regulatory process is ongoing we will not be commenting any further at this time”.
Mixcloud data breach exposes over 20 million user records
Mixcloud, a UK music-streaming platform, has admitted that it has suffered a significant data breach affecting more than 20 million people’s personal data. Data including email addresses, usernames and passwords was stolen and put up for sale on the ‘dark web’.
In response, Mixcloud has urged users to change their passwords, while reassuring them that the breach does not affect their credit card or home address details.
Mixcloud is actively investigating the incident and has apologised to those affected.
JPMorgan to ban fintech apps from using customer passwords
JP Morgan Chase has said it will ‘ban’ third-party fintech apps from using customer passwords to access customers’ bank accounts. In an attempt to increase customer security, the bank plans instead to issue ‘tokens’ to these third parties, giving access to a narrower range of data and ultimately offering better protection for the bank’s customers.
The reason behind the move is clear. The interaction between banks and fintech apps has increased with the rise of open banking. Back in 2016, the CEO of JP Morgan Chase, Jamie Dimon, warned against the risks of sharing data with third-party apps, noting that “Many third parties sell or trade information in a way customers may not understand, and the third parties, quite often, are doing it for their own economic benefit – not for the customer’s”. Against that background, a move away from passwords and towards a more secure platform to protect customers’ sensitive data is unsurprising.
Third-party data aggregator Yodlee became the first company to use the token system in all its interactions with JP Morgan Chase, whilst another app – Plaid – is in the pipeline to start using it. Sima Gandhi, head of strategy at Plaid, has said her company is working with JP Morgan Chase to ensure that the tokens allow it to capture all of the data its customers need.
The bank has not yet decided on the exact date it will completely ban password sharing.
MoJ to develop cyber security logs and aggregation platform
The Ministry of Justice (“MoJ”)’s security and privacy team are working to develop a centralised cyber security platform. The decision to do so comes as a result of the strains caused by having its security logs held in multiple systems, making it harder to understand and query the organisation’s cyber position.
The aim of the new centralised system is to create a central store of logs that can be accessed in one place, making it easier to analyse trends to help spot cyber-attacks and track cyber activity.
The MoJ has allocated a budget of £280,000 and invited contractors to bid for a solution that should be implemented in the first quarter this year.
Alongside these system updates, the MoJ plans to draft new policies, standards and guidance to strengthen its cyber security.
Cyber insurance market set to reach $15 billion by 2022
The Financial Times has reported that cyber insurance is becoming one of the fastest-growing sectors of the insurance industry. With the number of cyber-attacks worldwide increasing, demand for cyber protection is on the rise, and the sector is expected to reach $7.5 billion in value by the end of the decade.
The driving factor behind this expansion is simple: cyber-attacks are expensive. For example, the 2014 Marriott data breach cost the hotel chain over $100 million even before fines were issued. The hotel chain’s insurance policies paid out $102 million in respect of the breach.
The insurance market has picked up on the costs and sees this as an opportunity for growth. Not only can insurance cover some of the losses caused, but insurance companies are also offering other services to assist customers, such as forensic investigators and public relations experts to deal with the aftermath of attacks.
Cyber attackers target UK nuclear industry
A freedom of information request has revealed that a nuclear power station in the UK fell victim to a cyber-attack, with the National Cyber Security Centre being called in to help the recovery process.
Board minutes from the power station suggest this was the first time a cyber-attack has successfully infiltrated the organisation. However, the extent of the attack is unknown and it is unclear whether it put the public at risk.
David Lowry, an independent nuclear research consultant, commented: “They are very aware that they only need one incident and it will destroy the reputation of the resilience of the entire system.”
National Crime Agency brings down Trojan marketplace
A cybercriminal group which sold hacking tools online has been found and shut down by the National Crime Agency. The malware, called ‘shockwave’, was a remote access tool that allowed hackers to steal sensitive data from individuals by recording keystrokes, capturing video and accessing log-in details. It is unclear how many individuals were affected by the attacks, but given that the malware was being sold for as little as $25, it is thought that at least tens of thousands may have had their machines broken into.
The NCA made a distinction between those holding a licence to use the malware and those who actively hacked computers. The law recognises the act of breaking into the machine as the criminal offence, whereas merely owning a licence for the malware is not illegal.
In order to find and expose the cybercriminal group, the NCA obtained 85 search warrants across nine countries. Of these, 21 were executed across the UK, including in London, Manchester, Leeds, Somerset, Essex and Merseyside. In total, nine people were arrested in the UK, while 14 were arrested globally.
As Steven Wilson, head of the European Cybercrime Centre, has commented, this is a sharp reminder that “We now live in a world where, for just US$25, a cybercriminal halfway across the world can, with just a click of the mouse, access your personal details or photographs of loved ones or even spy on you.”
Android flaw leaves apps vulnerable to attack
Researchers at the Norwegian app security company Promon have recently discovered an Android flaw, which has been given the name “StrandHogg”. StrandHogg is essentially malware that poses as a legitimate app, enabling hackers to steal – without the user’s knowledge – personal data such as phone numbers, log-in details, and even audio and video recordings captured via the phone’s microphone and camera.
The flaw was discovered when Promon realised apps had been stealing money from bank accounts. In total, 60 financial institutions were targeted by hackers acting through the various apps, which suggests hackers had been exploiting the security hole for some time.
Tom Hansen, chief technology officer at Promon, has said “We’d never seen this behaviour before. As the operating system gets more complex it’s hard to keep track of all its interactions. This looks like the kind of thing that gets lost in that complexity.”
Travelex calls in National Cyber Security Centre after virus strike
Travelex suffered a cyberattack via a software virus on New Year’s Eve, causing it to shut down all its online systems.
Since the attack, the exchange company has been “working closely” with the National Cyber Security Centre to analyse the extent of the occurrence.
Travelex apparently did not report the incident to the ICO, and has claimed that no personal or customer data was taken during the attack.
Multiple security flaws discovered in TikTok app
TikTok, an increasingly popular social media app amongst teenagers which allows individuals to record music videos, has come under scrutiny after the discovery of security flaws.
The app, originally created in China, has been downloaded over 1.5 billion times. The security flaws allowed hackers to conduct multiple attacks, including taking control of users’ accounts, stealing sensitive personal data, sending false requests to other users and sending links that redirect users to malicious websites.
A message went out warning users to update to the latest version of the app to ensure they are protected.
This is not the first time the app has faced security problems, but a spokesman for TikTok’s security team has nonetheless assured the public that it is “committed to protecting user data”.
NSA alerts Microsoft to security vulnerability
The NSA has warned Microsoft that its Windows 10 operating system contains a hazardous flaw that may be exploited by hackers to create and deploy malware. This is the first time the US agency has warned Microsoft about a bug, allowing the company to fix the issue, rather than taking its traditional approach of “weaponising” flaws to use for its own purposes.
The NSA has come under harsh criticism for sitting on malicious bugs in order to develop its own hacking and spying tools, in the knowledge that these same tools may fall into the hands of cyber criminals who exploit them.
Anne Neuberger, NSA Cybersecurity Directorate head, noted that “We wanted to take a new approach to sharing and also really work to build trust with the cyber security community”.
Dixons Carphone fined £500,000 after hackers target tills
The ICO has fined Dixons Carphone £500,000 in relation to a data breach that affected the personal data of 14 million customers.
The breach stemmed from failures in the company’s computer systems. Between July 2017 and April 2018, the attacker installed malware on 5,390 tills at Currys PC World and Dixons Travel stores. Private information was collected for nine months before the attack was uncovered.
The ICO found that the company had poor security arrangements and had failed to take adequate steps to protect customer personal data, in breach of the Data Protection Act 1998. ICO director, Steve Eckersley, commented: “It is very concerning that these failures related to basic, commonplace security measures, showing a complete disregard for the customers whose personal information was stolen”.
First GDPR fine issued by the ICO
On 20 December 2019, Doorstep Dispensaree Ltd, a London based pharmacy that provides medicine to care homes, was fined £275,000 by the ICO, the first time the regulator has handed down a fine under the GDPR.
The company was found to have left around 500,000 documents in a number of unlocked containers at the rear of its Edgware premises. The documents contained personal customer information, including addresses, names, NHS numbers, dates of birth, prescriptions and other medical information. There was no evidence that third parties had accessed the documents, but they were not secure, not marked as confidential waste, and had suffered water damage. The ICO found that the company was in breach of its obligations under Articles 5, 13, 14, 24(1) and 32 of the GDPR.
In addition to the fine, the ICO has ordered the company to update its data handling policies and operating procedures to ensure compliance with the GDPR going forward.
ICO fines 340 organisations for not paying data protection fee
The ICO has reported that between 1 July and 30 September 2019, it issued 340 monetary penalties to organisations that did not pay their data protection fee. 119 of those penalties were handed to organisations in the health sector.
Paul Arnold, deputy chief executive of the ICO, has reminded small businesses that it is illegal not to pay the data protection fee unless an exemption applies.
Former council worker sentenced for unlawfully obtaining personal data
Michelle Shipsey, a former social services support officer at Dorset County Council, has been prosecuted for the unauthorised access of social care records.
An internal investigation by the council found that Ms Shipsey had “inappropriately” accessed and viewed social care records of four individuals known to her “without a business need to do so”, in breach of section 170 of the Data Protection Act 2018 (“DPA 18”). She was sentenced by Poole Magistrates’ Court to a six-month conditional discharge and ordered to pay costs of £700.
Head of Investigations at the ICO, Hazel Padmore, noted that “Individuals accessing social services support are often already in a vulnerable position and have the absolute right to expect their dealings are treated with the utmost respect and in accordance with data protection laws.” It was noted that, in light of Ms Shipsey’s attendance at data protection and cyber security training, she should have had regard to the importance of her personal duty of upholding client confidentiality.
Former reablement officer sentenced for obtaining data without authorisation
In December, shortly after the prosecution of Michelle Shipsey, the ICO also took action against Dannyelle Shaw, a former reablement officer at Walsall Metropolitan Borough Council. She was similarly prosecuted for accessing social care records of seven adults and nine children from a database “without authorisation” and found guilty of breaching section 55 of the Data Protection Act 1998. Appearing before Wolverhampton Magistrates’ Court, Ms Shaw admitted one offence of unlawfully obtaining personal data and was fined £450.
Another former social worker fined for unlawful disclosure of personal data
Continuing in a similar vein, a former social worker, Leo Kirk, has been prosecuted for unlawfully disclosing referrals for residential and foster care placements for vulnerable young people aged between 16 and 18. The information that was disclosed included sensitive personal data relating to the children in care, such as health records and details of the risk of child sexual exploitation and any history of abuse. Mr Kirk admitted two offences of unlawfully disclosing personal data before Stockport Magistrates’ Court and was fined £483.
EUR 9.55 million fine for German telecoms provider
German telecoms provider 1&1 Telecom GmbH has been fined EUR 9.55 million by the Federal Commissioner for Data Protection and Freedom of Information (“BfDI”) for deficiencies in its call centre authentication procedure. The BfDI noted that callers could, simply by providing the name and date of birth of a customer, obtain extensive further personal data in respect of that individual. The authentication flaw was deemed to constitute a violation of Article 32 GDPR, according to which the company is obliged to implement appropriate technical and organisational measures to protect the processing of personal data. The fine was reduced in light of subsequent improvements made to the authentication process. 1&1 has signalled its intention to challenge the fine.
German court holds Facebook user consents obtained during sign-up process invalid
The Berlin Superior Court of Justice (“Kammergericht”) has partially upheld a complaint from the Federation of German Consumer Organisations that various Facebook terms were not GDPR-compliant. In particular, the court noted that Facebook had failed to obtain informed consent in respect of: (i) the use of profile pictures and names for commercial purposes; (ii) location services (which are activated by default and reveal the user’s location to chat partners); (iii) transfers to the US; and (iv) (future) amendments to Facebook’s data policies and provisions.
Facebook has responded by pointing out that the case dated back to 2015 and that the terms cited in the complaint “have long ceased to exist”: “Independent of these German proceedings, we substantially revised our Terms of Service and Data Policy in the spring of 2018”.
A copy of the ruling (in German) can be found here.
Elsewhere, Facebook has been hit with a $1.6m fine from Brazil’s Ministry of Justice for the improper sharing of 443,000 Facebook users’ data with the developers of the App “thisisyourdigitallife”.
German court hands down 15,000 EUR fine for failure to comply with Art 15 GDPR
The Wertheim Local Court in Germany has handed down a fine of 15,000 EUR to a company for a failure to meet its obligations under Art 15(1)(g) GDPR. Art 15(1)(g) provides a data subject with the right to obtain from the controller, “where the personal data are not collected from the data subject, any available information as to their source”. Whilst some information seems to have been provided, it was not done in a “precise, transparent, comprehensible and easily accessible form in clear and simple language” as required under Art 12(1). So, whilst a “company U.P. GmbH” was mentioned as a source, this was qualified by the use of “e.g.”, which was deemed insufficiently precise. The court also noted that the information provided should include not only the type/category of data, but the specific data processed, as well as details as to when the data was transferred to the controller.
FTC issues Final Order in respect of Cambridge Analytica
The Federal Trade Commission has issued a damning Opinion on Cambridge Analytica, stating that the political consulting firm violated the FTC Act through deceptive conduct – both in relation to its participation in the EU-US Privacy Shield framework and in respect of the collection of Facebook users’ personal data for profiling and targeting. The FTC also issued a Final Order, prohibiting Cambridge Analytica from making representations about the extent to which it protects the privacy of personal information and its participation in the EU-US Privacy Shield. It is also required to continue applying Privacy Shield protection to personal data it collected, or return or delete this information. The FTC noted that Cambridge Analytica, which is now defunct, never responded to the agency’s request for a court judgment, nor to its legal complaint.