Data Protection update - August 2025

Welcome to the latest edition of the Stephenson Harwood Data Protection update, covering the key developments in data protection and cyber security law in July and August 2025.
In data protection headline news, the ICO launches two consultations on its guidance on the Data (Use and Access) Act 2025; the ICO publishes a reminder on how data protection laws apply to facial recognition technology; and the ICO confirms that its new IoT guidance will not come into effect immediately upon publication.
In cyber security news, Apple and the UK Government appear to be moving towards a resolution regarding the Government’s request for a backdoor to access encrypted data.
In enforcement and civil litigation news, the ICO reprimands South Yorkshire Police for accidentally deleting original body worn video footage; the ICO fines a charity for destroying irreplaceable personal records; and Instagram's "Map" feature leads to a class-action.
Data Protection
- Consultations on amendments to UK’s Data (Use and Access) Act 2025 launched
- How data protection laws apply to facial recognition technology
- ICO's IoT rules will not come into immediate effect
Cyber Security
- Apple and UK Government appear to be moving toward resolution
Enforcement and Civil Litigation
- ICO reprimands South Yorkshire Police
- ICO fines a charity after personal records destroyed
- Instagram's "Map" feature leads to class-action
- Round up of enforcement actions
- Key US updates
Data Protection
Consultations on amendments to UK’s Data (Use and Access) Act 2025 launched
Following the enactment of the Data (Use and Access) Act 2025 ("DUAA") in the UK, the Information Commissioner’s Office ("ICO") has initiated public consultations to inform and refine its guidance on the DUAA. These consultations include one on its draft guidance about "recognised legitimate interests" and another on its draft guidance about "data protection complaints".
The recognised legitimate interest guidance explains that a "recognised legitimate interest" is a new lawful basis that will give organisations greater confidence to use personal data for specific purposes listed in the DUAA (a list which may be amended and updated). Currently, the DUAA lists the following as recognised legitimate interests: crime prevention, public security, safeguarding, and emergency response. The draft guidance explains how to apply this basis and includes practical examples.
The guidance is aimed at large organisations and data protection officers, and is divided into the following sections: what is the recognised legitimate interest basis; when can it be used; what are the current recognised legitimate interest purposes; and other things to consider. The consultation is open until 30 October 2025 and responses can be submitted here.
The data protection complaints guidance has been drafted to assist organisations with implementing a process to handle data protection complaints from individuals, which they must do from June 2026.
Under this new requirement, organisations must:
- provide a clear process for individuals to make data protection complaints;
- acknowledge complaints within 30 days;
- respond promptly and keep complainants informed throughout; and
- communicate the outcome without undue delay.
The draft guidance offers practical advice for each stage of the complaints handling process and includes sections on: how to prepare for data protection complaints; what to do when you receive a complaint; and what to do after you’ve finished your investigation. The consultation is open until 19 October 2025 and responses can be submitted here.
You can find our article series on the DUAA, where we deep-dive into its various provisions, here.
How data protection laws apply to facial recognition technology
Facial recognition technology ("FRT") operates within the scope and framework of data protection law – it is a technology that uses biometric data to identify people from their facial features. As it involves the processing of personal data, its deployment must be lawful, fair, and proportionate. When used by law enforcement, it is essential that FRT is implemented in a manner that upholds individuals' rights and freedoms, with robust safeguards in place.
The ICO continues to prioritise its efforts on FRT due to its significant potential benefits and associated risks and is committed to ensuring that police use of this technology complies with data protection law, and that the public's rights are respected. This includes issuing clear guidance and conducting regular audits of police forces to promote transparency and accountability.
According to a new statement released by the ICO, the key data protection themes for using live FRT for law enforcement are as follows:
- It must be strictly necessary
- There must be a clear purpose for using it
- Controllers must implement key data protection documentation surrounding its use
- Its use must be effective
- Watchlists must be in keeping with data protection principles
- Controllers must inform the public about its use
- Bias must be managed
FRT offers benefits such as crime prevention and faster identification processes. However, it also raises concerns about false identifications, privacy violations, and potential bias against minorities; the ICO's guidance provides a foundation for safeguarding against these risks. Balancing FRT's advantages with robust safeguards and transparency is essential to protecting individual rights and maintaining public trust.
ICO's IoT rules will not come into immediate effect
The ICO is currently consulting on new guidance for privacy in the Internet of Things ("IoT") sector. However, Slavka Bielikova, a principal policy adviser at the ICO, recently confirmed that companies developing smart home products will not be required to comply with the guidance immediately on publication.
A draft of the guidance was published in June and outlines how UK data protection laws apply to devices such as fitness trackers, speakers and children's toys. The draft guidance addresses key issues such as transparency, consent, and individuals' rights over their data. We reported on this in a previous edition of this bulletin. The consultation is open until 7 September. In particular, the ICO is seeking industry feedback on issues like processor relationships and the economic impact that the guidance will have on device manufacturers.
Cyber Security
Apple and UK Government appear to be moving toward resolution
The US director of national intelligence, Tulsi Gabbard, announced on the social media platform X that the UK Government has agreed to drop its requirement for tech giant Apple to provide a "backdoor" allowing the UK Government to access fully encrypted files.
The UK Government issued Apple with a formal notice demanding this "backdoor" back in December last year. In response, Apple withdrew its Advanced Data Protection security tool from the UK market. Additionally, the company had begun proceedings to challenge the Government's order, with the case due to be heard at the tribunal early next year. We recently reported that Apple's challenge against the UK Government's request to access encrypted data would not be held in secret.
Despite Gabbard's claims, Apple has not officially confirmed whether it has received any formal communication on the matter from either the UK or US Governments. The UK Government has also not confirmed whether Gabbard's announcement is accurate, with a spokesperson telling the BBC that "we do not comment on operational matters, including confirming or denying the existence of such notices".
Enforcement and Civil Litigation
ICO reprimands South Yorkshire Police
The ICO has issued a formal reprimand to South Yorkshire Police ("SYP") following the deletion of over 96,000 items of body-worn video ("BWV") footage in breach of Sections 34(3) and 40 of the Data Protection Act 2018. The incident emphasises the need for organisations to strike a balance between not retaining data for too long and not deleting it prematurely.
An ICO investigation found that SYP failed to implement appropriate security measures to protect BWV data. Key failures included:
- delayed development of IT backup policies;
- failure to escalate known flaws to senior management as early as 2019;
- poor record keeping; and
- a lack of risk assessment during the transfer of personal data (some of which included special category data) between IT systems.
The deletion occurred in July 2023 after system upgrades led to storage issues and a workaround was created that involved transferring data to local drives. The loss came to light in August 2023, when an IT manager noticed that file storage usage was low, indicating that files had been deleted. The incident spanned 126 criminal cases, with three cases directly impacted by the data loss. In one instance, the absence of BWV footage may have hindered progression to a first court hearing.
Although more than 95,000 files had been copied to a new system before the deletion, SYP was unable to confirm exactly how many files were lost without copies, due to insufficient documentation.
The Head of Investigations at the ICO emphasised that the case demonstrates the critical need for clear procedures and accountability when managing digital evidence. She called on all police forces and other organisations using BWV technology to assess and improve their data protection controls. In addition, the ICO has recommended several actions for SYP which include: implementing robust storage and backup solutions; defining third-party roles and responsibilities; and conducting risk assessments before allowing third-party access to police IT systems.
This incident serves as a broader reminder to all organisations handling sensitive digital evidence of the need to manage that evidence in line with data protection law.
ICO fines a charity after personal records destroyed
The ICO has fined Scottish charity Birthlink £18,000 after it destroyed around 4,800 personal records, including irreplaceable handwritten letters and photographs from birth parents. Birthlink supports people affected by adoption in Scotland. One of the charity’s key tasks is maintaining the Adoption Contact Register to help link and reunite adopted individuals and their birth families. These records were part of the charity's adoption support work and represented vital pieces of individuals' family histories and identities, some of which may now be lost forever.
The ICO's investigation found that Birthlink, among other things, lacked adequate policies and failed to properly train staff. In early 2021, due to limited storage space, the charity began destroying files from cases where people had already been linked with the person they were seeking. It wasn't until August 2023, after an inspection by the Care Inspectorate, that Birthlink realised irreplaceable records had been destroyed and reported the breach to the ICO.
The ICO highlighted the profound impact such breaches have on people's lives, emphasising that these records held information and memories that are now lost. Following the breach, Birthlink has improved its data protection practices by digitising records, appointing a Data Protection Officer, and training staff.
The ICO's action serves as a reminder of how seriously charities must take their data protection responsibilities.
Instagram's "Map" feature leads to class-action
In the US, Instagram is facing class-action privacy litigation over its newly launched "Map" feature. This addition to the platform allows users to share their real-time location, as well as the location of any posts that they share.
An individual from Maryland filed a suit alleging that Instagram and its parent company Meta violated the California Unfair Competition Law, and committed a series of torts, by opting him into sharing his location data without his consent.
In his suit, the plaintiff described Meta's privacy controls as "illusory" and claimed that the "Defendants deceptively and unconscionably deprived and continue to deprive Plaintiff and Class Members of one of the most fundamental autonomy and privacy rights: the ability to control access to and knowledge of their whereabouts and their movements over time". The plaintiff is seeking to represent a class of Instagram users in the US whose location data has appeared on the Maps feature despite them not opting in.
In a statement, Meta explained that Instagram Maps is disabled by default, and live location sharing only happens if users choose to enable it.
Round up of enforcement actions
Company | Authority | Fine/enforcement action | Comment |
Zoo Loro Parque (the "Zoo") | Spanish Data Protection Agency ("DPA") | €250,000 | The Spanish DPA imposed a fine on the Zoo due to its use of visitors' fingerprints as part of its entry system, which was deemed to be the unlawful processing of biometric data as consent was not properly obtained for this. |
CaixaBank | Spanish DPA | €200,000 | The Spanish DPA found that CaixaBank violated Article 5.1(e) of the GDPR, which requires personal data to be kept only as long as necessary. CaixaBank unlawfully kept a customer's data after their mortgage was cancelled in 2008. The customer only realised this when they were sent marketing communications from CaixaBank in 2022 - some 14 years after their mortgage was cancelled. CaixaBank argued the contract was still valid due to its structure as a credit line, but this argument was rejected by the Spanish DPA. |
An association for people with autism spectrum disorder | Greek DPA | €10,000 | A complaint to the Greek DPA revealed that the association "Shield of David" denied parents access to CCTV footage involving their child and unlawfully shared the child's medical and social records with a company without consent or prior notice. Additionally, a court of first instance decision was improperly disclosed to multiple recipients. The Greek DPA imposed fines totalling €10,000 for violations of GDPR Articles 5(1)(a), 12(2), 13, 15, 24, and 31, citing failure to uphold access rights, unauthorised data transmission (of both the personal data and the court decision), and not cooperating with the supervisory authority. This case underscores the importance of safeguarding minors' data and respecting parental rights. |
MediaAlpha | Federal Trade Commission ("FTC") | $45 million | The FTC announced a proposed order against MediaAlpha, Inc. ("MediaAlpha") for violating the FTC Act and the Telemarketing Sales Rule by flooding consumers, including those on the Do Not Call Registry, with robocalls promoting misleading healthcare offers. The FTC found that MediaAlpha initiated or facilitated unlawful telemarketing calls and failed to obtain proper consent. As a result, the company faces a $45 million fine and must now secure consumers' express informed consent before collecting, selling, or disclosing any personal information. This case underscores strict enforcement of consumer consent and telemarketing compliance under US privacy law. |
Key US updates
Several US state privacy laws enacted in 2023 and 2024 have either recently come into effect or are coming into effect this autumn.
State | Date | Law |
Maryland | 1 Oct 2025 | Maryland Online Data Privacy Act 2024. This Act applies to anyone doing business in the state or offering products or services to state residents, if, in the previous calendar year, they met certain thresholds. An example of one such threshold is where personal data of at least 35,000 consumers is controlled or processed (excluding where it is processed solely for the purpose of completing a payment transaction). |
Minnesota | 31 July 2025 | Minnesota Consumer Data Privacy Act. This Act applies to anyone doing business in the state or offering products or services to state residents, if, in the previous calendar year, they met certain thresholds. An example of one such threshold is where personal data of at least 100,000 consumers is controlled or processed (excluding where it is processed solely for the purpose of completing a payment transaction). |
Tennessee | 1 July 2025 | Tennessee Information Protection Act 2023 ("TIPA"). The TIPA applies to a person that conducts business in Tennessee by producing products or services that target Tennessee residents, and that (i) exceeds $25,000,000 in revenue and (ii) either (A) controls or processes personal information of at least 25,000 Tennessee residents and derives over fifty percent (50%) of gross revenue from the "sale" of any personal information, or (B) during a calendar year, controls or processes personal information of at least 175,000 consumers. |