The decision handed down by the Court of Justice of the European Union on 4 September 2025 in the case between the European Data Protection Supervisor (EDPS) and the Single Resolution Board (SRB) (Case C-413/23 P) marks an important development in how pseudonymised data should be handled under the GDPR. While the context of the case is very specific, the lessons drawn by the CJEU are much broader in scope. They directly concern companies that share pseudonymised data with external partners, in particular for marketing, analytics or the training of artificial intelligence models.
The dispute originated in a resolution procedure carried out by the SRB, the resolution authority of the European Banking Union, which had put in place a mechanism for consulting shareholders and creditors. They could submit comments, which were pseudonymised (removal of direct identifiers and assignment of an alphanumeric code) and then sent to Deloitte for analysis. The SRB had not informed the persons concerned of this transmission, taking the view that the data thus shared were no longer personal data. The EDPS, however, considered that this transmission breached the transparency rules laid down in Regulation 2018/1725, which applies to the European institutions. In a 2023 judgment, the General Court of the EU ruled in favour of the SRB, holding that, from Deloitte's point of view, the data were no longer personal, for lack of any means of identification.
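To make the mechanism concrete, here is a minimal illustrative sketch, in Python and with purely hypothetical field names, of the kind of keyed pseudonymisation described above: direct identifiers are stripped, each respondent is assigned a random alphanumeric code, and the correspondence table linking codes to identities stays with the controller and is never shared with the recipient.

```python
import secrets

# Hypothetical input: comments with direct identifiers, as collected by the controller.
raw_comments = [
    {"name": "Alice Martin", "email": "alice@example.com", "comment": "I disagree with the valuation."},
    {"name": "Bob Dupont", "email": "bob@example.com", "comment": "The procedure lacked transparency."},
]

correspondence_key = {}   # code -> identity; retained by the controller only
pseudonymised_batch = []  # what is actually transmitted to the external analyst

for record in raw_comments:
    code = secrets.token_hex(8)  # random alphanumeric code, e.g. 'a3f9c2...'
    # The table linking the code back to the person never leaves the controller.
    correspondence_key[code] = {"name": record["name"], "email": record["email"]}
    # Direct identifiers are removed before transmission; only the code and the content remain.
    pseudonymised_batch.append({"code": code, "comment": record["comment"]})

# The recipient receives pseudonymised_batch only; without correspondence_key it has
# no reasonable means of re-identifying the authors of the comments.
print(pseudonymised_batch)
```

In the configuration at issue, the controller (here, the SRB) retains `correspondence_key`, which is precisely why the data remain personal from its point of view even though the recipient sees only coded comments.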
The CJEU sets aside that judgment and draws from the case several principles of general application, reaching well beyond the public sector alone.
Although the judgment concerns Regulation 2018/1725, which applies to the institutions of the Union, the Court underlines that the definition of personal data it contains is, in essence, identical to that of the GDPR, and that a uniform interpretation of the two texts is necessary to ensure the consistency of Union law. The lessons of this decision therefore apply fully to private actors subject to the GDPR.
First, the Court recalls that the concept of personal data does not depend solely on the recipient's capacity to identify a person, but also on the intrinsic link between the information and the person concerned. In other words, an opinion, even a pseudonymised one, can still “concern” its author, insofar as it reflects that person's beliefs, experience or position. The Court adds that it is not necessary to show that the information has an effect on the person concerned for it to be classified as personal data.
Next, the CJEU confirms that whether data qualify as personal must be assessed contextually. For the SRB, which retained the means of re-identification, the data clearly remained personal data. For Deloitte, which did not have access to those means, the data could, under certain conditions, fall outside the scope of the GDPR. In other words, the same data may be personal for one party and not for another, depending on the means reasonably likely to be used for identification. This relative test is not new (cf. the Breyer case), but it is refined here in the context of sharing pseudonymised data between independent actors.
Finally, and this is the decisive point in the CJEU's analysis, the fact that the recipient cannot re-identify individuals does not exempt the original controller from complying with the transparency obligations laid down by the texts. In this case, the SRB should have informed the participants that their comments, even pseudonymised, would be sent to a third party for analysis. This obligation follows directly from the fact that the SRB remained in possession of the correspondence key.
However, this decision leaves some important questions open, in particular concerning the exact role of the data recipient and the impact of the legal basis chosen by the original controller.
On the one hand, although the decision does not resolve the issue, the exact role of the data recipient (independent controller or processor) could have an important practical impact. In this case, Deloitte acted as an independent service provider, responsible for analysing the responses as part of an assessment exercise. The Court insists that the classification of the data depends on the recipient's point of view and on its reasonable means of identification. However, this contextual approach cannot be applied in the same way to a processor. A processor acts on behalf of the controller, on its documented instructions and under its authority; it has no autonomy in determining the purposes or means of the processing. In legal terms, it is therefore treated as an extension of the controller. This means that, even without direct access to the means of re-identification, the processor remains subject to all the obligations applicable to the processing of personal data, as long as the controller is in a position to identify the persons concerned. In practice, a data processing agreement is still necessary, and all the safeguards imposed by the GDPR apply, subject to possible adaptations reflecting the low level of risk resulting from pseudonymisation.
On the other hand, the decision also raises the question of the impact of the legal basis chosen by the controller when sharing the pseudonymised data. In this case, the comments came from participants in a public consultation, and the SRB relied on their consent to process the data. The CJEU's main criticism is that the SRB failed to inform the persons concerned of the transmission of their data, even pseudonymised, to a third party.
But what if the processing were based on another legal basis, such as legitimate interest? In principle, the obligation to provide information applies regardless of the basis relied upon. However, where the data are transmitted to a third party unable to identify the persons concerned, the concrete scope of that information may be questioned. Should people really be informed of processing carried out by an actor that does not know them, cannot recognise them and is not in a position to act on them individually? And what about the exercise of rights in that context?
The Court does not rule on these issues. As a precaution, the transparency obligation should be regarded as fully applicable, even when the data are pseudonymised. This approach is all the more justified since, from the controller's point of view, the data remain personal, and it falls to the controller to justify the lawfulness of the processing and to guarantee the rights of individuals. On the other hand, the way the information is worded and structured could evolve to better reflect the low exposure to risk on the recipient's side, in a logic of proportionate transparency. In such a case, it could be envisaged to mention only the categories of recipients in the information notices, without necessarily detailing the entities concerned.
Other practical questions also remain open. For example, what happens if pseudonymised data are transferred outside the European Union? The GDPR imposes specific safeguards for transfers of personal data to third countries. If the foreign recipient does not have access to the means of re-identification, can it be considered that it does not, in practice, receive personal data? And should the rules on international transfers apply in that case?
Beyond the legal reasoning, this decision has significant practical consequences for businesses. In many sectors, it invites them to reconsider their data governance strategies, adjust their contractual frameworks and adapt their operational practices according to the role played by each actor and the level of effective identifiability.
In digital marketing, it paves the way for more flexible use of certain data sets when they have been robustly pseudonymised. For example, an actor receiving pseudonymised audience segments (without direct identifiers or re-identification keys) could, under certain conditions, fall outside the scope of the GDPR, provided it acts as an independent controller and never attempts to enrich the data to uncover the identity of individuals.
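As a purely illustrative sketch of such a segment-sharing scenario (the names, secret and segments below are hypothetical), a controller might derive stable but non-reversible identifiers with a keyed hash whose secret stays on its side, so that the recipient of the audience segments holds neither direct identifiers nor any re-identification key:

```python
import hashlib
import hmac

# Secret kept exclusively by the original controller; never shared with the recipient.
CONTROLLER_SECRET = b"rotate-and-store-this-in-a-vault"

def pseudonymous_segment_id(email: str) -> str:
    """Derive a stable, non-reversible identifier from a direct identifier.

    Without CONTROLLER_SECRET, the recipient cannot recompute or reverse the
    mapping, so it receives no direct identifiers and no re-identification key.
    """
    return hmac.new(CONTROLLER_SECRET, email.lower().encode(), hashlib.sha256).hexdigest()

# Hypothetical audience segment shared with a partner: coded IDs plus segment labels only.
shared_segment = [
    {"id": pseudonymous_segment_id("alice@example.com"), "segment": "sports_enthusiasts"},
    {"id": pseudonymous_segment_id("bob@example.com"), "segment": "frequent_travellers"},
]
print(shared_segment)
```

Whether such a setup actually takes the recipient outside the scope of the GDPR remains a case-by-case assessment of the means of identification reasonably available to it, in line with the contextual test described above.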
In the field of artificial intelligence, the decision may facilitate the training of models on pseudonymised data where the service provider is not in a position to re-identify individuals. This requires rigorous technical governance and explicit contractual clauses governing, in particular, authorised uses, non-re-identification commitments and the allocation of responsibilities.
For SaaS providers, the decision makes it possible to adjust contractual obligations according to the effective level of identifiability. If the service provider acts as a processor, a data processing agreement is still required, but some safeguards can be calibrated more proportionately if the data received are robustly pseudonymised.
In any event, businesses have every interest in mapping their pseudonymised data flows, documenting their identification-risk analysis, and adjusting their GDPR documentation and contracts accordingly. Pseudonymisation should not be seen merely as a security measure: properly applied, it becomes a genuine lever for legal flexibility, provided it remains under control.
This judgment comes amid a broader transformation of European data law. It is consistent with the logic of the Data Act (applicable as of 12 September 2025), which strengthens portability and security obligations in cloud services. It also complements the CNIL's latest recommendations on the use of legitimate interest for training AI models. Above all, it gives companies levers to adjust their level of legal commitment according to their real capacity to identify, or not, the persons concerned.
For SaaS providers, technical platforms, AI teams and marketing departments, this decision is an opportunity for a strategic review. Properly understood and applied, it makes it possible to align GDPR obligations with operational realities, without giving up on the protection of individuals.
For any questions or support, you can contact us at cchance@squairlaw.com.
Caroline Chancé, Partner at Squair
This analysis follows on from an earlier article by Jeannie Mongouachon and Juliette Lobstein on the Advocate General's Opinion, available here.