Clifford Chance

Talking Tech

Reflections from IAPP DPI: UK 2025

Data Privacy | Artificial Intelligence | Cyber Security | 7 April 2025

Clifford Chance recently attended the International Association of Privacy Professionals (IAPP) Data Protection Intensive: UK 2025, in London. As part of a packed and varied conference programme, Tech//Digital Counsel Arnav Joshi chaired a panel on the complexities of integrating AI governance into vendor management, with Jyoti Campbell of Morgan Stanley and Kai Zenner, Head of Office and Digital Policy for Axel Voss MEP. Senior associate Laia Bertran Manye and associate James Wong, who also attended the event, share their takeaways and reflections.

Amid what was a busy month for data protection practitioners, delegates from around the world assembled in central London at the beginning of March 2025 for the IAPP Data Protection Intensive: UK 2025.

The opening keynotes, delivered by Sir Chris Bryant, Minister of State for Data Protection and Telecoms, and John Edwards, UK Information Commissioner, reminded the hundreds of professionals in attendance of the dual importance of their discipline: on the one hand, data continues to act as a foundation for transformational technologies that create a better world, while on the other, the misuse of data leads to significant real-world harms, particularly for the most vulnerable in society.

The opening speakers illustrated the opportunities by reference to new possibilities that the Data (Use and Access) Bill, which is currently making its way through the House of Commons and is expected to pass into law in April this year, is intended to open up for improving public services and removing barriers to the sharing and use of (primarily, non-personal) data. They also highlighted the extraordinary ways that new generations of AI technology are augmenting our ability to make sense of large and complex datasets. Meanwhile, as to risks, the Information Commissioner commented on the ICO's recent Ripple Effect campaign – an attempt to focus 'hearts and minds' on the very real and personal impact that a data breach can have on an affected individual.

Other key areas commented on by the Information Commissioner included: 

  • children's privacy, which is an ongoing focus area for the ICO
  • AI and biometrics (which we note has been a topic of spirited debate in Europe in recent years)
  • cookies and online tracking.

The reflections below cover a sample of this year's key themes, based on the sessions we attended.

Transparency

An overarching theme across the key areas mentioned by the Information Commissioner was transparency: (i) transparency playing a vital role in children's privacy regulatory work, ensuring that any choices made by children with regard to their personal data (or by parents on their behalf) are informed; and (ii) transparency in online tracking enabling a fair online world, with an emphasis on the option to easily reject all non-essential cookies. The ICO will be looking at an expanded list of top UK websites from a transparency perspective. The transparency theme carried over into other sessions too, which raised issues such as the importance of making transparency effective in practice, e.g. transparency that results in the individual being able to understand the data processing, as opposed to being confused by the explanation of it.

Making the best use of regulatory resources

Another key theme we identified was data protection authorities focusing their resources on areas they consider to be "higher stakes", without this meaning that smaller companies will escape scrutiny. Different factors could influence how a regulator prioritises its resources, including the risk of harm to data subjects, the number of data subjects affected, the effect of intervention for vulnerable individuals, whether another regulator might be better placed to investigate an issue, whether regulatory intervention would lead to more compliance and/or more growth, and the impact on the regulator's strategy and resources. On enforcement, the ICO appears to be drafting new guidance on its enforcement process (further to its fining guidance), which it will likely consult on.

Possible disruptions to cross-border data transfers

Asked what he expected to see in the coming five years, the Information Commissioner took the opportunity to say what he most wanted to see: agreement on fundamental global principles. This may have been an allusion to recent developments that could see a reversal or weakening of multilateral and bilateral mechanisms enabling cross-border data flows. He went on to describe international data transfers as a "bane of [the ICO's] existence". Clifford Chance's data protection experts are tracking global developments on international data transfers and can help clients navigate the changing landscape. For example, we recently wrote on concerns surrounding the future of transatlantic data transfers in light of changes to the composition of the Privacy and Civil Liberties Oversight Board in the United States.

Value chains to be reimagined by AI

Today's digital value chains were shaped in large part by cloud services, and data protection lawyers are familiar with the mechanisms commonly found in cloud contracts. How will AI services change the shape of these value chains? The EU AI Act is product safety-style legislation: rather than the controller-processor distinction, it imposes obligations based on the roles of provider, importer, distributor, authorised representative and deployer, which will be familiar to trade and commerce lawyers. Unlike physical goods, though, AI value chains are complex and non-linear. While the design and functionality of a physical product are finalised once it is shipped, AI systems can and do evolve over time. A deployer can modify an AI system in a way that makes the deployer itself a provider, and such changes can trickle downstream, creating a complex web of providers. This heightens the challenge of allocating liability across a value chain. These challenges resonated with us: as we help AI vendors and customers prepare for emerging regulatory complexity, it has been necessary to connect the dots between new regulations and emerging contractual norms.

No one-size-fits-all in online safety

In recent years, online platforms have been preparing for the EU Digital Services Act and UK Online Safety Act. These introduce obligations for content moderation (balanced against the protection of fundamental rights), transparency and accountability in decision-making, and age assurance to protect minors from harmful and illegal content. Online platforms and their regulators, however, face challenges in pursuing robust protection and promoting trust and safety while responding to advancements in technology, the evolving nature of online harms, the expanding range of online services and the need to safeguard free speech. Regulators in attendance at the conference recognised that there is no one-size-fits-all solution to enforcement and supervision. Online services should continually evaluate the standard to which they are held under law against evolving harms and the availability of new tooling to support content moderation processes.

Get in touch with us

Our data protection experts are happy to discuss current trends, what we are seeing in the market and how we are helping our clients to navigate emerging challenges in data protection, cybersecurity, online safety and AI regulation across markets. If you missed out on the IAPP Data Protection Intensive: UK 2025 but would like to discuss any of its themes, please reach out to us.