Data Protection update - March 2018

Welcome to the latest edition of our Data Protection update, our review of key developments in Data Protection law covering March 2018.

In this issue:

  • Data protection
  • Cyber security
  • ICO enforcement

Data protection

Cambridge Analytica and Facebook – the Data Protection Implications

Data protection has been headline news in recent days following Facebook's failure to protect the personal data of 50 million of its users in the Cambridge Analytica scandal. The allegations that Cambridge Analytica played a role in Trump's election victory and in the Brexit referendum result make this the most high-profile and politically significant data protection investigation the ICO has carried out so far.

On 23 March 2018, the ICO executed a warrant to inspect the premises of Cambridge Analytica as part of its investigation into the improper use of Facebook data. The ICO is investigating the matter as part of a larger investigation into the use of personal data and analytics by political campaigns, parties, social media companies and other commercial actors, and has stated that it will “need to assess and consider the evidence before deciding the next steps and coming to any conclusions”.

Under the Data Protection Act 1998 (DPA), the maximum fine that Facebook could face is £500,000, whereas from 25 May 2018, under the GDPR, the maximum fine could be the greater of 4% of Facebook's annual worldwide turnover or €20 million.
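
For illustration only, the GDPR's "greater of" cap under Article 83(5) can be expressed as a simple comparison; the turnover figure used below is hypothetical:

```typescript
// Illustrative sketch only: maximum GDPR fine under Article 83(5),
// i.e. the greater of EUR 20 million or 4% of annual worldwide turnover.
function maxGdprFine(annualWorldwideTurnoverEur: number): number {
  const fixedCapEur = 20_000_000;
  const turnoverCapEur = 0.04 * annualWorldwideTurnoverEur;
  return Math.max(fixedCapEur, turnoverCapEur);
}

// Hypothetical example: a turnover of EUR 40bn gives a ceiling of EUR 1.6bn.
console.log(maxGdprFine(40_000_000_000)); // 1600000000
```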

Personal data was harvested from Facebook through a personality test app called thisisyourdigitallife (the App), which captured the personal information of both the test taker and their circle of Facebook friends. This data was then passed on to Cambridge Analytica, in breach of Facebook’s Terms of Service. Facebook has stated that it became aware of the improper use of the data in 2015 but was satisfied with assurances from the developers of the App and from Cambridge Analytica that the data had been deleted (it is now investigating whether this was in fact the case).

As well as the ICO, Facebook is facing pressure from a number of other regulators around the world, which are asking whether it should have informed them, and potentially Facebook users, about the privacy breach when it happened. These regulators include data protection supervisory authorities, financial regulators and law enforcement agencies. The ICO’s swift request that Facebook cease its own search of Cambridge Analytica’s premises suggests that it does not approve of internal investigations that can contaminate or even destroy evidence. Ahead of the GDPR, data controllers should be aware that internal investigations of potential data breaches for reporting purposes may be closely monitored or swiftly cut short by the ICO.

The ICO's intervention is a timely reminder to data controllers of the risks of providing data to third parties, and may be a prelude to an increasingly assertive approach from the ICO once the GDPR applies. The Information Commissioner, Elizabeth Denham, has had difficulty obtaining appropriate engagement from the organisations involved; in particular, Cambridge Analytica failed to respond to her demand of 7 March 2018 for records and data. She has asked Parliament for stronger enforcement powers, including the power to compel testimony from individuals and to impose criminal penalties for failure to comply with a compulsory audit. Notably, there is cross-party support for strengthening the ICO’s powers in the Data Protection Bill, which is currently making its way through Parliament. Provided the ICO obtains resources to match its new powers, it will pose a considerable enforcement threat to any organisation controlling or processing personal data.

See here for the "ICO's statement: investigation into data analytics for political purposes" which contains a series of statements relating to the Cambridge Analytica case and regular updates.

ICO publishes introduction to Data Protection Bill 2017-19

On 3 March 2018, the ICO published an introduction to the Data Protection Bill 2017-19 (the "Bill"). The document is intended as an introduction to the content and structure of the Bill for organisations and individuals who are already familiar with data protection law and the GDPR. It seeks to help readers to navigate the Bill and focus on the most relevant sections. The introduction covers:

  • general processing, and processing by law enforcement and the intelligence services;
  • the role of the Information Commissioner;
  • the enforcement of the Bill's provisions; and
  • a discussion of the background to the introduction of the Bill and its passage through Parliament.

The ICO intends to publish further detailed guidance on the Bill once it has been enacted.

See here for the full Introduction to the Data Protection Bill.

Spanish data protection guidance 'should help with GDPR compliance'

Spain's data protection authority, La Agencia Española de Protección de Datos ("AEPD"), has issued guidance that will help businesses determine what security measures they need in order to comply with new EU data protection laws. The guidance deals with how to assess the risks involved in personal data processing operations. It also updates existing guidance on conducting data protection impact assessments. The guidelines are aimed at defining the security measures that must be applied in relation to each processing operation.

The AEPD makes a number of recommendations and includes templates which can be used for the risk assessment procedure. These deal with questions about the type of data processed and the purpose of the processing. The AEPD suggests using a registry of processing operations as a starting point for the risk assessment. Additionally, there is a template that assesses the life-cycle of personal data from the moment the data is collected to the moment it is destroyed.
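
By way of illustration only, an entry in such a registry of processing operations might capture fields along the following lines; the field names are our own assumptions rather than the AEPD's templates:

```typescript
// Hypothetical sketch of a registry-of-processing-operations entry that could
// serve as the starting point for the risk assessment the AEPD describes.
interface ProcessingOperationRecord {
  name: string;              // e.g. "Employee payroll"
  purpose: string;           // why the personal data is processed
  dataCategories: string[];  // type of personal data processed
  dataSubjects: string[];    // whose data it is
  recipients: string[];      // third parties the data is shared with
  retentionPeriod: string;   // how long the data is kept
  lifecycleStages: string[]; // from collection through to destruction
}
```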

See here for a link to the guidance.

Government publishes technical note on Brexit separation issues

The UK government has published a technical note on the "Other Separation Issues", i.e. the issues relating to the UK's separation from the EU which require further discussion and which do not relate to citizens' rights, Northern Ireland or the financial settlement. These "Other Separation Issues" include, amongst others, the use of data and the protection of information obtained or processed before withdrawal from the EU. The government’s approach to this issue is as follows:

  • The UK and the EU should agree to continue to provide appropriate protections for data and information exchanged before exit and pursuant to the Withdrawal Agreement. The EU will process all UK data and information received prior to the date of withdrawal in accordance with Union law;
  • The UK has strong domestic standards, and will continue to play a leading global role in the development and promotion of high data protection standards and cross-border data flows. The UK will be aligned with the EU at the point of exit and the new Data Protection Bill will give effect to the GDPR; and
  • The UK’s objective is to agree early in the process a basis for the continued free flow of data between the EU (and other EU-adequate countries) and the UK from the point of exit until such time as new and more permanent arrangements come into force.

See here for the full technical note.

Metadata privacy regulation at heart of broader telecoms market debate

The debate over whether and to what extent electronic communication service providers should be allowed to process metadata reflects the broader debate about how regulation should apply to different businesses in the communications market.

There are different views across national governments in the EU on the rules that should apply to metadata processing. Metadata is information that is connected to communications but which does not include the content of those communications. Such information can include numbers called, websites visited, geographical location or the time and date a call was made. The EU has previously determined that such metadata can be considered to be just as sensitive as the actual contents of communications because of insights that the data can offer into people's private lives.
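
To illustrate the distinction, communications metadata might be modelled along the following lines; this is a simplified sketch and the field names are our own, not drawn from the draft Regulation:

```typescript
// Hypothetical sketch: metadata describes a communication without revealing
// its content, yet can still be highly revealing about private life.
interface CallMetadata {
  callerNumber: string;
  calledNumber: string;
  startTime: Date;         // time and date the call was made
  durationSeconds: number;
  cellLocation?: string;   // approximate geographical location
}

interface CallContent {
  audioRecording: ArrayBuffer; // the actual conversation: not metadata
}
```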

MEPs previously endorsed plans to treat metadata as confidential information that should never be passed on to third parties. However, such a move would impose stricter conditions on the processing of metadata under the e-Privacy rules than apply under the GDPR.

Telecommunications providers are particularly worried that stricter controls on processing metadata might put them at a disadvantage compared to Over-the-Top ("OTT") service providers (e.g. WhatsApp and Netflix). OTT service providers offer functionally similar services to telecommunications providers and also have access to large swathes of consumer data. This includes valuable data such as GPS location, which may escape the definition of metadata under the draft e-Privacy Regulation, whereas the functionally equivalent data generated by a mobile network could fall within the scope of the law.

A coalition of telecoms industry bodies last year called for the revised e-Privacy framework to "be fully coherent with the overarching aim of increasing network investment, allowing more space for innovation, boosting the competitiveness of Europe’s vertical industries and creating further choice for European consumers".

For more information on the e-Privacy Regulation, please see our October 2017 edition of the Bulletin here.

IAB consults on new framework for online behavioural advertising

Online behavioural advertising ("OBA") is the collection (via cookies) of information about an individual's online browsing habits over time (such as websites visited, search terms entered and adverts clicked on) and its use to serve advertising targeted at that individual's interests, likes and dislikes.

Most OBA technologies track user behaviour by placing a tracking cookie on the user's terminal equipment when they access a website.
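
Purely as an illustrative sketch (and not any particular ad network's actual implementation), a tracking cookie of this kind amounts to little more than a persistent identifier written to the browser and read back on later visits:

```typescript
// Illustrative sketch: a persistent identifier stored in a browser cookie,
// which an ad network could read back on subsequent visits to build an
// interest profile. The cookie name and lifetime are assumptions.
function getOrSetTrackingId(): string {
  const match = document.cookie.match(/(?:^|; )oba_id=([^;]+)/);
  if (match) {
    return match[1]; // returning visitor: reuse the existing identifier
  }
  const newId = crypto.randomUUID();
  // Persist for roughly one year so browsing can be linked across visits.
  document.cookie = `oba_id=${newId}; max-age=${60 * 60 * 24 * 365}; path=/`;
  return newId;
}
```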

The Interactive Advertising Bureau ("IAB"), a broad coalition of advertising, marketing and online businesses in Europe, has published proposals for its new GDPR Transparency & Consent Framework (the "Framework") for public comment. The Framework is intended to allow website owners to:

  • Control which third parties (online advertising providers and others) access their users’ browsers and devices and process their users’ personal data;
  • Seek user consent to process personal data under the E-Privacy Directive (for setting cookies or similar technical applications that access information on a device) and the GDPR; and
  • Share information about consent status with others involved in online advertising.

The Framework envisages a "Global Vendor List" of entities that wish to collect personal data from website users. Such entities would be required to sign up to various commitments, including not processing personal data without a consent signal from the system. Website owners would be free to choose which vendors on the global list to work with.

When consumers visit a website, they would be asked to consent to purpose-specific processing by the website’s selected vendors. The website owner could choose whether to seek consent just for the vendor’s collection of data via that website owner’s site, via all sites operated by its affiliates, or via any site. Requests for consent would have to comply with minimum standards, and suggested wording is provided. A consumer’s consent or refusal to consent would be recorded and consulted before processing could take place.
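
The Framework's core idea of recording consent and consulting it before any processing takes place can be sketched, in simplified and hypothetical form, as a gate in front of each vendor's processing; the structures below are our own illustration and not the IAB's specification:

```typescript
// Simplified, hypothetical sketch of "consult consent before processing".
// The IAB Framework defines its own signal format; these types are
// assumptions for illustration only.
type Purpose = "personalised-ads" | "measurement" | "storage-access";

interface ConsentRecord {
  // vendorId -> set of purposes the user has consented to
  consents: Map<string, Set<Purpose>>;
}

function mayProcess(record: ConsentRecord, vendorId: string, purpose: Purpose): boolean {
  return record.consents.get(vendorId)?.has(purpose) ?? false;
}

// Usage: the website only passes data to a vendor if a positive consent
// signal has been recorded for that vendor and purpose.
function serveAd(record: ConsentRecord, vendorId: string): void {
  if (!mayProcess(record, vendorId, "personalised-ads")) {
    return; // no consent signal: do not process personal data
  }
  // ... pass the request (and any personal data) to the vendor ...
}
```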

It will be interesting to see the reaction of the Article 29 Working Party, given its criticism of the "youronlinechoices" scheme previously established by the IAB.

See here for the IAB's website providing information about the Framework. Organisations wishing to provide feedback have until 8 April 2018 to do so.

Cyber security

Security by Design - Cybersecurity should be embedded into the way 'smart' consumer devices are made

In collaboration with industry and the National Cyber Security Centre, the government has developed a draft code of practice designed to improve security in consumer "Internet of Things" ("IoT") products and associated services.

The code's number one recommendation is that each consumer IoT device should have its own unique password that cannot be reset to a "universal factory default value".

"Whilst much work has been done to eliminate reliance on passwords and providing alternative methods of authenticating users and systems, some IoT products are still being brought to market with default usernames and passwords from user interfaces through to network protocols," the government said. "This is not an acceptable practice and it should be discontinued."

Further recommendations contained in the code address issues such as the disclosure of security vulnerabilities, maintenance of up-to-date software, and the secure storage of security credentials. Consumers should also be able to delete their personal data from devices.
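
As a purely illustrative sketch of the "no universal default password" recommendation, device firmware could refuse to accept a known factory-default credential; the checks below reflect our own assumptions rather than the text of the draft code:

```typescript
// Hypothetical sketch: reject credentials that match a universal factory
// default, forcing each device to carry (or be given) a unique password.
const KNOWN_DEFAULTS = new Set(["admin", "password", "12345678", "root"]);

function isAcceptableDevicePassword(password: string): boolean {
  if (KNOWN_DEFAULTS.has(password.toLowerCase())) {
    return false; // universal default values are not acceptable
  }
  return password.length >= 12; // assumed minimum strength requirement
}
```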

Different parts of the code are aimed at different stakeholders, including the manufacturers of IoT devices (e.g. smart watches, CCTV cameras and children’s dolls), as well as the app developers, businesses that provide related IoT services, and retailers that sell the finished products.

The draft code (which is open to consultation until 25 April 2018) has been developed as part of a broader review into the cybersecurity of consumer IoT devices and services that the government has undertaken. The government estimates that there will be more than 420 million IoT devices in use across the UK within three years.

According to Margot James, Minister for Digital, the recent cyber-attacks (notably the 'WannaCry' attack) highlight the need to ensure that IoT devices are secure and that consumers' privacy is protected when using them. James called for "a fundamental shift in approach to moving the burden away from consumers having to secure their internet connected devices and instead ensure strong cyber security is built into consumer IoT products and associated services by design".

The new security by design initiative is aimed at supporting the government’s stated ambitions to make the UK "the most secure place in the world to live and do business online".

See here for the Government report.

ICO enforcement

WhatsApp Inc has signed an undertaking with the Information Commissioner

The UK’s data protection watchdog has concluded that WhatsApp’s sharing of user data with its parent company Facebook would have been illegal.

The messaging app was forced to pause its sharing of personal data with Facebook in November 2016, after the ICO said it had cause for concern; the ICO had opened a full investigation into the matter in August 2016.

The ICO’s investigation found that WhatsApp had not identified a lawful basis for any such sharing of personal data and that, had it shared the data, it would have been in contravention of the first and second data protection principles of the DPA: it would have failed to provide adequate information to users explaining the processing and sharing of their data, and it would have carried out a processing activity that was "incompatible with the purpose for which such data was obtained".

In response, WhatsApp has signed an undertaking declaring that it will not share any EU user data with Facebook until the GDPR comes into force on 25 May 2018, and that after that date it will share data only in accordance with the GDPR's requirements.

A number of other European countries have also raised concerns over the data sharing, including France, which ordered WhatsApp to stop sharing data in December 2017. Separately, the European Commission fined Facebook £94m for providing misleading information about its technical ability to share user data during the review of its 2014 acquisition of WhatsApp.

See here for WhatsApp's signed undertaking.