Data Protection update - May 2019
Welcome to the May 2019 edition of our Data Protection bulletin, our monthly update on key developments in data protection law.
- 1 year of GDPR
- Spotlight on data protection issues in the financial services sector
- ICO issues new guidelines on the processing of children’s data
- Electronic payments subject to two-factor authentication across the EEA
- ICO launches campaign to help people become more data aware
- Landmark judgment reinforcing the supervisory jurisdiction of the High Court
- Prince Harry uses GDPR to win legal battle against paparazzi
- The French Conseil d’Etat reduces fine imposed by the French data protection authority
- Companies' stock value dropped 7.5% after data breaches
- Adtech veteran Quantcast is latest tech giant to face GDPR privacy probe
- Exposed database leaks addresses and income info of millions of Americans
- More than half of British firms report cyberattacks in 2019
- Facebook in the news again over repeated breaches; Turkish watchdog imposes fine
- WhatsApp warns users of 'targeted' surveillance attack
- Uniqlo says 460,000 online accounts accessed in Japan hack
- SNP faces fines for data protection breach after election mailing error
- F&B drop the ball
- Voice data to be wiped
- Three million unlawful spam texts
- Fine and enforcement notice for funeral planners
We thought we should acknowledge the passing of the first anniversary of GDPR by sharing a couple of statistics from the year provided by the supervisory authorities of the European Economic Area (“EEA”) in the recent publication from the European Data Protection Board:
- over 140,000 queries and complaints and over 89,000 data breaches have been logged by the EEA supervisory authorities. 63 per cent. of these have been closed and 37 per cent. are ongoing; and
- 67 per cent. of European citizens polled indicated that they have heard of the GDPR. This is a 20 percentage point increase in the figures from 2015. No doubt the increasing awareness of data protection rights will continue to lead to a rise in queries and complaints.
Like many other industries, the financial services industry has increasingly made use of artificial intelligence (“AI”) to assist several of its working processes, such as risk assessment and fraud detection.
The growth of AI has amplified concerns as to its compliance with data protection laws. As AI becomes more complex and integral to working practices, cybersecurity becomes ever more critical.
The ICO and National Institute for Health Research-funded Greater Manchester Patient Safety Translational Research Centre jointly commissioned a citizens’ jury to investigate whether the public think that: (i) people should always get an explanation for an AI-generated decision, even if that means the AI will not reach such accurate conclusions; and (ii) when and why explanations for AI-generated decisions are important. The findings will help to form the guidance being produced by the ICO and The Alan Turing Institute that will guide companies in their use of AI.
This comes at a time when Lloyd’s of London has also published two reports looking at the potential uses and associated risks of using AI in the insurance sector more widely.
The ICO has released a new code (the "Code") that applies to information society services, provided for remuneration, which may be accessed by children. The Code defines children as individuals under the age of 18. It is important to note that the Code applies to both services targeted at children as well as services that are merely likely to be accessed by children. The Code is therefore likely to apply to a number of online services.
The Code applies to all service providers that are (i) based in the United Kingdom (“UK”); (ii) based outside the UK but have an establishment in the UK (such as an office or branch); or (iii) based outside the EEA and are offering their services to, or monitoring users based in, the UK.
The Code offers 16 principles that service providers may adhere to, including:
- privacy settings being set to “high” by default;
- the collection of geolocation information being set to “off” by default;
- personal data not being shared with third parties unless there is a compelling reason to do so (for example, safeguarding a child’s safety);
- avoiding the collection of identifying information, such as names, email addresses and phone numbers, and making use of avatars and pseudonyms instead; and
- the provision of privacy notices to both children and parents.
Additionally, the Code dictates that service providers should build internal accountability programmes, implement policies and offer appropriate staff training to promote awareness of the Code and demonstrate compliance. The provision of online services relating to children also requires a data protection impact assessment in accordance with Article 35 of the GDPR.
Online service providers wishing to avoid the application of the Code should implement age-verification measures (asking someone to self-declare their age will not be sufficient; the Code does, however, encourage the use of third party age verification services) or provide evidence demonstrating that the site is not being used by anyone under the age of 18.
The Code is open for public consultation until 31 May 2019.
As of 14 September 2019, certain electronic and remote payments in the EEA must be subjected to two-factor or “strong customer authentication” (“SCA”), under regulatory standards relating to the second EU Payment Services Directive (“PSD2”), which took effect in January 2018. SCA requires the customer to be authenticated using at least two independent factors, such as something the customer possesses (for example, a mobile device), something the customer knows (a password) or something the customer is (biometric information), and will typically take the form of an extra check in the online checkout process. This will not only challenge online retailers but also change the online consumer experience. The payments affected may include online card payments as well as payments in e-money and online bank transfers.
Whether a transaction must have SCA is contingent on the application of PSD2. SCA will therefore apply to retailers accepting online payments from consumers based in the EEA even if the retailer itself is not based in the EEA. Some payment services and transactions may be completely out of scope of PSD2, due to the currency and/or the geographic location of the participants, in which case SCA will not apply. Other payment activities may be within the general scope of PSD2 but specifically excluded under it; for example, transactions using commercial agents acting only for the payer or the payee. Again, in this instance, SCA would not apply. Even if a transaction is in scope and not excluded under PSD2, it may be subject to certain exemptions under the SCA standards; for example, low-value transactions (up to €30 per transaction, subject to a cumulative limit of five consecutive transactions or €100) and recurring transactions (such as subscriptions) are exempt.
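The scoping logic above can be sketched as a simple decision procedure. The following is a hypothetical illustration only, not legal advice or a compliant implementation: the function name, parameters and simplified counter rules are assumptions based solely on the description above.

```python
# Hypothetical sketch of the SCA scoping and low-value exemption
# logic described above: up to EUR 30 per transaction, with SCA
# required once five consecutive exempt transactions or a
# cumulative EUR 100 of exempt spend is reached.

LOW_VALUE_LIMIT = 30.00        # per-transaction ceiling (EUR)
CUMULATIVE_LIMIT = 100.00      # cumulative exempt spend ceiling (EUR)
MAX_CONSECUTIVE_EXEMPT = 5     # consecutive exempt transactions allowed

def sca_required(amount_eur, in_scope_of_psd2, excluded_under_psd2,
                 is_recurring_after_first, exempt_count, exempt_total_eur):
    """Return True if the transaction needs strong customer authentication."""
    # Out-of-scope or specifically excluded transactions never need SCA.
    if not in_scope_of_psd2 or excluded_under_psd2:
        return False
    # Recurring transactions (after the initial payment) are exempt.
    if is_recurring_after_first:
        return False
    # Low-value exemption, subject to the cumulative counters.
    if (amount_eur <= LOW_VALUE_LIMIT
            and exempt_count < MAX_CONSECUTIVE_EXEMPT
            and exempt_total_eur + amount_eur <= CUMULATIVE_LIMIT):
        return False
    return True

# A EUR 25 payment with no prior exempt spend is exempt...
print(sca_required(25.0, True, False, False, 0, 0.0))   # False
# ...but the same payment after EUR 90 of exempt spend requires SCA.
print(sca_required(25.0, True, False, False, 3, 90.0))  # True
```

In practice the counters would be maintained by the card issuer, and further exemptions (such as transaction risk analysis) exist beyond those mentioned here.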
These rules will continue to apply to the UK after Brexit (whether the UK leaves with or without a deal).
The ICO is launching the ‘Be Data Aware’ campaign to raise awareness of how companies might use data to target individuals and how those individuals can control who targets them. The campaign also aims to teach individuals about how they are targeted with social media adverts and political marketing.
The campaign offers downloadable factsheets on privacy and advertising settings and explanations of how online microtargeting works in order to educate people about their rights under the GDPR and how to exercise them.
- Google’s failure to adequately inform its users about the purposes of the data processing and the recipients of the data;
- Google’s failure to specifically list the countries outside of the European Union to which it transferred data; and
- Google’s failure to appropriately provide its users with information about cookies (Google had offered some explanation of how to object to cookies but users had to click on several hyperlinks before they could get the information).
The decision is notable as the court’s decision would undoubtedly have been the same had it been judged against the French data protection law as it now stands (i.e. in line with the GDPR).
The Supreme Court has found that the decisions of the Investigatory Powers Tribunal (“IPT”) may be subject to judicial review by the High Court.
The IPT is a specialist tribunal that hears complaints relating to the use of investigatory and other powers by the intelligence services. The Supreme Court’s decision arose from Privacy International’s claim that the Secretary of State’s power to authorise the hacking of computers, including mobiles and network infrastructure, was subject to judicial review.
Prior to this latest decision, the Divisional Court and the Court of Appeal found that the effect of section 67(8) of the Regulation of Investigatory Powers Act 2000 was to prevent judicial review of the decisions of the IPT.
Lord Lloyd-Jones in his judgment stated that it is a “necessary corollary of the sovereignty of Parliament that there should exist an authoritative and independent body which can interpret and mediate legislation made by Parliament”.
Lord Carnwath also considered that it was consistent with the rule of law that a clause that attempts to exclude the supervisory jurisdiction of the High Court to review a decision of an inferior tribunal should not be binding.
The judgment is available here.
Prince Harry has won a legal dispute against Splash News, a photo agency which used a helicopter to take pictures inside his home. According to a statement delivered to London’s High Court, the Duke of Sussex not only argued that the photographers had invaded his privacy but that they had handled his data inappropriately and illegally under the GDPR. This is a new approach to the use of the GDPR and highlights the fact that a photograph of your home can constitute personal data in the form of your home address.
The details of Prince Harry’s argument are unclear as the case was settled out of court before a trial could take place. Splash News apologised to Prince Harry in a statement but did not admit wrongdoing.
In a decision dated 17 April 2019, the Conseil d'Etat (the Supreme Administrative Court) confirmed the sanction issued by the French data protection authority (known as the "CNIL") on an optical company relating to a data breach. The Conseil d'Etat did, however, reduce the fine imposed from €250,000 to €200,000.
The CNIL received information regarding the breach in July 2017. The CNIL's investigation revealed that the breach exposed customers' personal data, including their invoices, names, addresses, health data and social security numbers. The breach was remedied in August 2017.
Despite the remedy, the CNIL imposed a sanction on the company. It held that:
- the company had failed to comply with its legal security requirements;
- restricting access to clients' personal documents was an essential measure that should have been implemented by the company. It had failed to do so, resulting in personal information being available without logging in; and
- the company had already been fined for a security breach on its website in 2015 and thus was fully aware of the security requirements.
The company challenged the decision of the CNIL and the case came before the Conseil d'Etat to be re-examined. The court confirmed the decision of the CNIL, but reduced the financial sanction (which was held to be disproportionate), taking into consideration the speed and good faith with which the company had rectified the breach.
Bitglass, a cloud access security broker specialising in securing data moved to the cloud, has analysed the top three data breaches of each of the last three years. Its analysis reveals that a decrease in share value was a notable consequence of breaches for publicly traded companies.
For such companies, there was an average drop of 7.5 per cent. in their stock values and a mean market cap loss of $5.4 billion per company following a data breach. Research also showed that these breaches cost an average of $347 million in legal fees, penalties and remediation costs.
This is a stark warning of the importance of tight data security, and the emphasis placed on secure storage of data when judging the value of a public company. More information on the study can be found here.
The Irish Data Protection Commission (“IDC”) has opened an enquiry into the AI technology provider, Quantcast. The IDC is the lead regulator for many of Europe's biggest technology companies, and thus its interest is significant.
The investigation arose from a complaint submitted by Privacy International, who raised concerns about Quantcast's products, including technology that produces targeted advertising. Privacy International was also concerned about Quantcast's consent management tools, and questioned the company's legal authority to hold people's data. In a statement to TechCrunch, the IDC said:
“The purpose of the inquiry is to establish whether the company’s processing and aggregating of personal data for the purposes of profiling and utilising the profiles generated for targeted advertising is in compliance with the relevant provisions of the GDPR. The GDPR principle of transparency and retention practices will also be examined”.
If found in breach of the GDPR, Quantcast could be liable for a fine, and even forced to stop collecting data altogether. The full article can be found here.
A public database, which contains 24 GB worth of data about roughly 80 million American households, has been discovered on a Microsoft cloud server. The owners of the cloud have yet to be identified, but startlingly, the data set contained member numbers and "scores" – giving an indication of how it was potentially being utilised.
The data identified subjects by 'household' rather than as individuals, but the leaked information includes full addresses, exact longitude and latitude, ages and full names. The numerical values given to the data related to such things as gender, marital status and income. This could indicate that the data belonged to a mortgage or insurance company, or even a scam operation, but its discovery on a cloud server remains a mystery.
More information can be found here.
A survey by Hiscox has revealed that around 55 per cent. of UK firms reported cyberattacks in 2019, up from 40 per cent. in 2018. The survey covered more than 5,400 small, medium and large businesses across seven countries: the UK, Germany, the US, Belgium, France, the Netherlands and Spain.
Another finding from the survey was that Hiscox deemed almost 75 per cent. of the firms to be "novices" in terms of their readiness to face a cyberattack, with many incorrectly believing they were not at risk. The insurer reported that UK firms fared particularly badly in the survey and were, alongside American firms, joint-least likely to have staff appointed to designated "cybersecurity" roles. The takeaway for businesses is to create a specialised team to deal with and prepare for such attacks, which have experienced a "sharp increase" (as borne out by the reporting data).
Facebook has garnered publicity for several cybersecurity related issues this month.
Turkey’s Personal Data Protection Authority (KVKK) issued a 1,650,000 Turkish lira ($270,000) administrative fine to the technology giant over data breaches and a failure to report those breaches to authorities. The investigation into Facebook was launched in December 2018 and found that the data breach “may have affected up to 6.8 million users and up to 1,500 apps built by 876 developers”, and that Facebook had failed to take prompt action to rectify it. This was compounded by Facebook failing to report the breach to the KVKK.
It was also announced that Facebook harvested the email contacts of 1.5 million users without their knowledge or consent when they opened their accounts. Facebook failed to tell users that it was harvesting the data in their contact books when they signed up and verified their accounts using email, leading to the improper collection of those contacts' data. An in-depth analysis of the functionality and its operation can be found here. Facebook now plans to inform the 1.5 million users whose data was improperly harvested and delete the data from its systems.
In further damning coverage, NBC broke the news that Facebook CEO Mark Zuckerberg used the data of the social network's users as a bargaining chip when negotiating with competitors and business contacts. Leaked documents revealed that Zuckerberg and the board of Facebook utilised the data of users as leverage in meetings and pitches, finding ways to access and manipulate that data to gain the upper hand. This included denying some companies access to the data, while rewarding favoured companies by letting them use it. Facebook denies using the data as leverage and denies breaking the law.
In its quarterly financial reports, Facebook disclosed that it is expecting to pay $5 billion to the US Federal Trade Commission (“FTC”). Facebook has consistently been on the wrong side of the FTC; the company was accused of not protecting its users’ data from Cambridge Analytica, a British political consulting firm that was harvesting data without consent (for more information please refer to our October update), and suffered a data breach that exposed the personal information of nearly 50 million users.
On 14 May 2019, the messaging app WhatsApp warned users of a potential cyber hack and urged them to download an updated version of the app immediately.
The supposedly 'secure' and ‘hack-proof’ software, which has been both lauded and criticised for its end-to-end encrypted service, reported a targeted attack on a "select number" of users, orchestrated by "an advanced cyber-actor". Once a user's device is infected with the spyware, the attackers are able to conduct surveillance on that device, potentially allowing them to read messages. The limited targets of the attack are suspected to be journalists, lawyers, activists and human rights defenders, although WhatsApp issued the warning to all users. Some news outlets are linking the attack software to an Israel-based security developer called NSO Group. For anyone concerned about their own device, the ICO offers further advice here.
Uniqlo, a high street clothing brand owned by Fast Retailing Co., Asia's largest retailer, may have been subject to a cyber hack. Fast Retailing says almost half a million customers' data in Japan may have been accessed by hackers.
The data that might have been accessed includes personal information, some parts of credit card numbers and purchase history. The breach is believed to have occurred between 23 April and 10 May 2019, leaving many customers' data at risk of being stolen.
Fast Retailing Co. has advised customers to change their passwords and usernames for the Uniqlo and GU websites in Japan. The breach was a 'list-based' attack (also known as credential stuffing), in which attackers try username and password combinations obtained from other websites; it succeeds where customers have reused the same credentials across multiple sites. Those completing purchases online are advised to use different username and password combinations for different websites in order to mitigate the risk of such attacks.
The SNP reported itself to the ICO after sending out tens of thousands of European Election leaflets to the incorrect addresses. The Scottish Conservatives have suggested that one woman received up to 30 letters from the SNP, all addressed to people not living at that address.
The SNP has identified this as a "clerical error" and issued an apology to any affected voters; if the ICO finds that a data breach occurred, the SNP may be handed a fine. However, the party was keen to reassure voters that the error was purely administrative and not connected to the electoral registers. The ICO confirmed that an investigation was underway.
Farrow & Ball (“F&B”) was one of the first companies to receive a fine for failure to pay the statutory fee for data controllers. Failure to cough up the £40, £60 or £2,900 fee (dependent on the size of the controller entity) resulted in a £4,000 fine for F&B. F&B appealed for discretion to be exercised because it did not receive a reminder from the ICO about the fee after the first notification was overlooked internally; the fee was paid as soon as the error was recognised. The Information Rights Tribunal adopted a "reasonable excuse" approach, considered the "expected standards" of a reasonable data controller, and concluded that F&B's failure to comply with the fee legislation was sufficient to attract a penalty.
The ICO has issued an enforcement notice requiring HMRC to delete all of the biometric data it holds under its Voice ID system for which it did not have explicit consent.
Following a watchdog complaint, the ICO found that HMRC was collecting and processing biometric data using its voice recognition service. Callers were subject to a brief recorded message about the benefits of the Voice ID service before being asked to repeat: "my voice is my password". There was no clear option to opt out, nor details of how to seek further information. This resulted in HMRC holding the data of roughly 5.5 million customers for which it did not have explicit consent.
The ICO has fined Hall and Hanley Ltd (“HH”) £120,000 for sending over 3.5 million direct marketing text messages about PPI compensation claims. HH fell foul of a catalogue of infringements. It had contracted the job out to a third party, but the ICO found there was no valid consent. HH claimed consent had been collected via any one of four websites, yet the third party was listed in the privacy notices of only two of those websites. Further, people were required to consent to receiving marketing materials from third parties as a condition of subscribing.
Between March and November 2017, Avalon Direct Limited ("Avalon") made 52,000 calls to people who were registered with the Telephone Preference Service ("TPS"). The numbers had been purchased from a third party provider but Avalon failed to carry out any due diligence, check the numbers against the TPS register or obtain consent. Without specific consent, it is illegal to call any number registered with TPS.
Two of Avalon's directors had already been involved in an unconnected ICO investigation, and so in addition to the fine, Avalon was served with an enforcement notice ordering it to improve its practices.