Data Protection update - November 2019
Welcome to the November 2019 edition of our Data Protection bulletin, our monthly update covering key developments in data protection law.
- EUR 18 million fine for Austrian postal operator
- Facebook agrees to pay £500,000 fine over Cambridge Analytica data leak
- ICO urges caution in police use of live facial recognition technology
- Berlin DPA imposes EUR 14.5m fine on Deutsche Wohnen SE
- ICO issues guidance on DPIAs and Artificial Intelligence (“AI”)
- Spanish DPA fines Vueling
- ECtHR rules on whether covert surveillance of employees violates their human rights
- ICO investigates Brexit Party for handling of DSARs
- FT investigation reveals health websites sharing sensitive data with advertisers
- EDPB adopts final version of guidelines on territorial scope
- EDPB guidance on data protection by design and default
- ePrivacy Regulation impasse continues
- ICO submits Age Appropriate Design Code of Practice to the government
- Home Group customers warned after data breach
- UniCredit lifts lid on 2015 data breach
- Websites and national TV station compromised in major Georgia cyber attack
- WhatsApp sues Israeli surveillance firm
- Hackers target UK political parties
- North Korean hackers reportedly target Indian space agency
- ICO consults on proposed Proceeds of Crime Act powers
The Austrian Data Protection Authority has imposed a fine of EUR 18 million on the Austrian national postal service (“Post AG”) for improper use of customer data. Post AG used customer data (including names, addresses, ages and gender details) to predict political party affinities, and used these findings for marketing purposes by selling them to businesses and political parties. The penalty is not yet final and may be challenged in the Austrian Federal Administrative Court.
Facebook has agreed to pay a £500,000 fine that was imposed by the ICO in October 2018 for the inappropriate sharing of personal data in the wake of the Cambridge Analytica data leak. The agreement reached between the social media giant and the ICO sees both parties withdraw their appeals in respect of the ICO’s decision-making process. Facebook has made no admission of liability in relation to the penalty notice.
The ICO’s investigation into the misuse of personal data in political campaigns came to a head on 24 October 2018 when it issued a monetary penalty notice to Facebook under section 55 of the Data Protection Act 1998. Around a month later, Facebook filed an appeal with the First Tier Tribunal (General Regulatory Chamber) against the penalty notice. An interim decision was issued by the First Tier Tribunal in June 2019 to the effect that materials in connection with the ICO’s decision-making process should be disclosed in order to consider allegations of bias and a lack of procedural fairness. The ICO appealed this decision in September 2019.
The agreement also paves the way for Facebook to continue its own internal investigation into issues concerning Cambridge Analytica, which had previously been put on hold at the ICO’s request.
The Information Commissioner has issued the first Opinion under the Data Protection Act 2018 (“DPA 2018”), in connection with the use of live facial recognition (“LFR”) technology by law enforcement authorities. The Opinion is published in the wake of the ICO’s announcement in August this year that it was launching an investigation into the use of facial recognition technology in the King’s Cross area (discussed in a previous bulletin here) and the recent decision of the High Court in Bridges (discussed here) in relation to the use of the technology by South Wales Police.
The overarching message is one of caution. A number of comments are worth highlighting:
- The Commissioner “does not consider that the decision of the High Court [in Bridges] should be seen as a blanket authorisation to use LFR in all circumstances”.
- In order to promote public confidence in police use of LFR, more detail is required in data protection impact assessments (“DPIAs”), which controllers must ensure are in place prior to each LFR deployment.
- The “strict necessity” requirement under section 35(5) of the DPA 2018, which governs the circumstances in which sensitive processing may take place for law enforcement purposes, calls for a detailed proportionality assessment. Such an assessment should consider how the benefits of the use of LFR balance against the invasive nature of the technology, and the findings should be detailed in all DPIAs dealing with LFR.
The Commissioner has signalled her intention to issue more detailed guidance (potentially in addition to a government-issued statutory and binding code of practice) on the obligations in respect of the use of LFR in the near future.
The Berlin Commissioner for Data Protection and Freedom of Information (“Berlin DPA”) has imposed a EUR 14.5m fine on real estate company Deutsche Wohnen SE for improper data storage and retention. Following an on-site audit in June 2017, the Berlin DPA found that the company had an extensive archive of tenant personal data (including personal and financial details) and the system was such that data was not removed after a set period. Historic data continued to be stored without any examination as to whether the retention was legitimate or necessary.
Despite Deutsche Wohnen implementing a remedial project to improve its archiving system, a subsequent inspection in March 2019 revealed that the company was still unable to establish that the data was being stored lawfully. The Berlin DPA subsequently found there to have been a breach of Article 25(1) of the GDPR (data protection by design and by default) and Article 5(1) of the GDPR (including the obligations in respect of data minimisation and storage limitation). Whilst the remediation project was taken into account as a mitigating factor, the deliberate setting-up of the archive and the significant length of time over which data was processed in an inadmissible manner were aggravating factors in determining the amount of the fine.
Maja Smoltczyk, the head of the Berlin DPA, warned against the mass hoarding of data and the creation of “data cemeteries”, signalling an appetite on the part of authorities to sanction “such structural deficiencies under the General Data Protection Regulation before the worst-case scenario data breach occurs”.
Simon Reader, Senior Policy Officer at the ICO, has published a blog post containing guidance for organisations when carrying out a DPIA for the processing of personal data in AI systems.
The post sets out the key components that should feature in an AI-related DPIA:
- A systematic description of the processing
* Include detail as to when AI processes and automated decisions may affect individuals, the scope of human intervention or review, and details of external providers in the event that AI systems are partly or wholly outsourced.
- An assessment of necessity and proportionality
* In conducting the proportionality assessment, organisations should consider “any detriment to data subjects that could follow from bias or inaccuracy in the algorithms and data sets being used”, whether “data subjects would reasonably expect the processing to be conducted by an AI system” and “how the project might compare human and algorithmic accuracy side-by-side to better justify its use”.
- Identification of risks to rights and freedoms
* Consider impact beyond individuals’ data protection rights. Equalities legislation may be breached if, for example, machine learning systems reproduce discrimination from historical data patterns.
- Measures to address the risks
* Organisations should involve DPOs from the earliest stages of AI projects and should ensure that individuals responsible for developing, testing, deploying and monitoring AI systems are trained and appreciate data protection implications of the processing. Technical measures should be implemented to minimise risks to the accuracy and security of AI systems.
- A “living” document
* Ensure that the DPIA is subject to regular review.
The Spanish Data Protection Authority (AEPD) has fined the airline Vueling EUR 30,000 for failing to give users of its website the option to refuse cookies. The fine comes just a month after the CJEU provided useful clarification on cookie consent in its ruling in relation to Planet49 GmbH, a summary of which can be found in our previous bulletin.
The ECtHR has held that a Spanish supermarket’s installation of covert surveillance to monitor employees suspected of stealing did not violate the employees’ right to respect for their private and family life, in a ruling which provides useful guidance for employers. A detailed summary of the case can be found here.
The ICO has launched an investigation in relation to the Brexit Party’s alleged failure to respond to DSARs submitted during the European elections in May this year. A spokesperson for the party suggested that at the time there was “a coordinated attempt by campaigners to flood the Brexit Party with Subject Access Requests”, prompted by “inaccurate claims circulated on social media…claiming we had acquired people’s addresses improperly”.
The Brexit Party has been given until 22 November to answer the requests, with a spokesperson confirming that whilst the party has “responded to the vast majority of letters…around 0.2% are currently being dealt with and we will meet the deadline agreed with the ICO”.
An FT investigation has found that some of the largest healthcare websites, including Healthline, WebMD and Bupa, have been sharing sensitive personal data with technology platforms such as Google, Amazon and Facebook. The analysis revealed that 79% of the 100 health websites investigated used cookies without users’ consent. Examples of data shared with third parties included: symptoms entered into WebMD (subsequently shared with Facebook), menstrual and ovulation cycle information from BabyCentre (shared with Amazon Marketing), and drug names entered into Drugs.com (sent to DoubleClick, Google’s advertising arm). Further investigation into 10 health websites showed that eight of these transmitted specific identifiers linked to the web browser, opening up the possibility of the information being linked to specific individuals.
The ICO’s executive director for technology policy and innovation, Simon McDougall, stated that the investigation “further highlights the ICO’s concerns about the processing of special category data in online advertising, as well as the role that site owners and publishers play in this ecosystem” and that “we will be assessing the information provided by the FT before considering our next steps”.
The EDPB has released the final version of Guidelines 3/2018 on the territorial scope of the GDPR (the “Guidelines”). The finalised Guidelines incorporate contributions and feedback received in respect of the draft version published just over a year ago.
Clarifications of note include the following:
- Article 3 of the GDPR is aimed at determining whether a particular processing activity falls within the scope of the GDPR, not whether a controller or processor is subject to it.
- The mere presence of an employee in the EU is not in itself sufficient to trigger the application of the GDPR under the “establishment” test.
- The mere fact that a controller instructs a processor not established in the EU to carry out the processing on its behalf will not mean that the processing falls outside the scope of the GDPR.
- The incidental or inadvertent offering of goods or services to individuals in the EU will not trigger the application of the GDPR.
- Non-EU processors who target/monitor the behaviour of EU individuals on behalf of a controller will fall within the scope of the GDPR by virtue of Article 3(2).
The EDPB has published draft Guidelines 4/2019 (the “DPbDD Guidelines”) on the data protection by design and by default (“DPbDD”) provisions in Article 25 of the GDPR, setting out useful practical steps that controllers can take to comply with these obligations. The publication is particularly timely in light of the Berlin DPA’s decision in respect of Deutsche Wohnen SE (discussed above).
The following points are worth highlighting:
- Data protection by design:
* Controllers may demonstrate the effectiveness of measures by using key performance indicators (quantitative or qualitative metrics, or the provision of the rationale behind the assessment of effectiveness).
* Safeguards may include enabling data subjects to intervene in the processing, having a retention reminder in a data repository, implementation of a malware detection system and basic “cyber hygiene”.
* Controllers must take account of current technological progress to comply with the “state of the art” obligation.
- Data protection by default:
* Data protection by default applies to the following elements of processing: the volume of personal data collected, the period of storage, the extent of processing and accessibility.
- Practical application of data protection principles:
* The DPbDD Guidelines set out key design and default elements for each of the data protection principles, together with practical examples.
- Certification regime:
* Certification pursuant to Article 42 GDPR may be used to demonstrate compliance with DPbDD, as well as providing a competitive advantage for both technology providers and controllers.
The Permanent Representatives Committee of the Council of the European Union has rejected a fresh proposal for the new ePrivacy Regulation. The proposal was issued by the Finnish presidency on 15 November, with a view to it being addressed at the Council meeting on 3 December. The Croatian presidency, incoming in January 2020, will now have to decide whether to withdraw the proposal or redraft it in an attempt to garner increased support.
The ICO has submitted a final version of the Age Appropriate Design Code of Practice to the government, following an extensive consultation process which began in April this year. The Code will not be published until a new government is formed.
A data breach at one of the UK’s largest housing associations has compromised the personal data of 4,000 customers. Home Group warned that the breach involved contact information, customer names and addresses, but no financial data. The issue was resolved within 90 minutes, according to the Newcastle-based company, and all affected customers were informed.
UniCredit has revealed that a 2015 data breach affected the personal data of three million of its Italian customers. A statement by the bank confirmed that the compromised data included names, telephone numbers and email addresses, and that “no other personal data or any bank details permitting access to customer accounts or allowing for unauthorised transactions have been compromised”. The cause of the breach has not been disclosed.
Two television stations and thousands of websites have been hacked in Georgia in a major cyber attack. Websites targeted included those of local municipality offices and the president’s office, with home pages replaced with an image of the former Georgian president Mikheil Saakashvili holding a poster captioned “I’ll be back”. Georgia’s interior minister speculated that the attack “may have been carried out from both inside and outside the country” and confirmed that Georgian police are “actively co-operating with the law-enforcement agencies of partner countries”.
WhatsApp has filed a lawsuit against Israeli surveillance company NSO Group amidst accusations of facilitating government hacking in 20 countries, including the UAE, Mexico and Bahrain. The instant messaging company alleges that NSO exploited its video calling system to send malware to around 1,400 users’ mobile devices, allowing NSO to “access messages and other communications after they were decrypted on target devices”. NSO has denied the allegations “in the strongest possible terms”.
The websites of both the Conservative and Labour parties have been hit by back-to-back cyber attacks. The National Cyber Security Centre confirmed that the first attack on the Labour Party website was a distributed denial of service attack, in which hackers attempt to disrupt websites by overwhelming them with traffic. The Labour Party confirmed that it had “experienced a sophisticated and large-scale” attack but that no data was compromised.
The Indian Space Research Organisation (“ISRO”) has been warned of a suspected cyber attack by North Korean hackers during the Chandrayaan-2 moon mission. ISRO’s systems were unaffected by the attempted attack, which was said to have involved phishing emails which, when opened, caused malware to be installed. The news comes hot on the heels of a report that India’s Kudankulam nuclear power plant was also subject to a North Korean cyber attack.
The ICO has invited views on a proposal for the regulator to be granted access to investigation and associated powers under the Proceeds of Crime Act 2002 (“POCA”).
The following powers are being sought:
- To apply to the court for Restraint Orders;
- To apply to the court for Confiscation Orders;
- Cash seizure, detention and forfeiture from premises;
- Asset seizure and forfeiture from premises;
- To undertake investigations (including search and seizure warrants) to support the proceedings sought above; and
- Access to information relevant to the investigation of money laundering offences.
The proposal is driven by a recognition that personal data has a monetary value, and that it is increasingly being viewed as a commodity which can be stolen and sold for financial gain. The granting of powers to the ICO would enable it “to assist the court in the identification of assets and to determine the value of a criminal’s proceeds from crime”.
The consultation closes on 6 December 2019.