Each month, we deliver the essential data protection news in our newsletter Data4Coffee. Don't miss out on key information!
To receive it, please fill in this form.
[May 13] In May, the Bordeaux Court of Appeal upheld the judicial termination of a license agreement for the operation of a website on the grounds of substantial breaches of personal data protection obligations. In this case, the licensee operated a site that collected and processed data without complying with the security and information rules imposed by the GDPR, in particular lacking compliant legal notices and a record of processing activities. The contract expressly required compliance with personal data regulations, described as an essential condition. The licensee contended that these breaches were minor and could be remedied, but the court considered them serious enough to warrant immediate termination. It noted that the non-compliance damaged the image and exposed the liability of the licensor, who remained identified as joint data controller. This decision underlines that compliance with GDPR obligations is not a mere formality but a decisive condition in digital exploitation contracts. The judgment is part of a demanding line of case law that places data processing compliance at the heart of contractual relationships in the digital economy.
Source: Bordeaux Court of Appeal, 4th Commercial Chamber, May 13, 2025, no. 23/02044 | Doctrine
[June 5] While IT security is treated as a business investment decision, that decision generally does not take into account the impact of the investment on the rest of society, nor the market failures arising from the interdependence of businesses and individuals. In its analysis, the CNIL identifies three externalities that the GDPR helps mitigate: externalities affecting other companies, externalities benefiting cybercriminals, and externalities affecting customers. GDPR compliance, which requires actors to take a number of actions and in particular to report data breaches, helps combat under-investment in cybersecurity. Breach notifications are thus estimated to reduce identity theft by 2.5% to 6.1%, and the GDPR is estimated to have prevented between 90 and 219 million euros in cyber damage in France stemming from this crime.
Source: Cybersecurity: the economic benefits of the GDPR | CNIL
[June 10] Although the Constitutional Council's decision of 15 November 2007 prohibits collecting the real or presumed ethno-racial origin of persons in studies, surveys measuring diversity at work are permitted as long as they are accompanied by sufficient guarantees to protect participants' right to private life and do not result in discrimination. The CNIL recommendations published this month highlight the importance of respondents' free and informed consent to the collection of their sensitive data, the optional nature of the survey, and informing the persons concerned. In addition, the CNIL recommends giving priority to anonymous surveys and limiting the data collected by using closed questions.
Source: Surveys to measure diversity at work: the CNIL publishes its recommendations | CNIL
[June 12] An alternative tracking method to cookies and other trackers, a tracking pixel is a 1×1 image containing a user identifier, embedded in a website or an email and invisible to the user, which makes it possible to know whether the tracked user visited the site or read the email concerned. The CNIL's draft recommendation is limited to the use of these pixels in emails and aims to complement its guidelines and recommendation on cookies and other trackers. It identifies the many actors whose role in the processing must be determined (email sender, emailing service provider, mailing-list rental provider, tracking technology provider, email service provider). A public consultation and a call for additional contributions on economic issues are open until 24 July 2025.
Source: Tracking pixels: the CNIL launches a public consultation on its draft recommendation | CNIL
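The mechanism described above can be sketched in a few lines. This is an illustrative example only, assuming a hypothetical tracker domain; it shows how a per-recipient identifier embedded in an invisible image URL reveals an email open when the recipient's client fetches remote images.

```python
# Illustrative sketch of the email tracking-pixel mechanism.
# The domain "tracker.example" and the URL layout are hypothetical.
import uuid

def tracking_pixel_html(recipient_id: str) -> str:
    """Return an invisible 1x1 <img> tag carrying a per-recipient identifier.

    When the email client loads remote images, the HTTP request sent to the
    tracker's server reveals that this specific recipient opened the email.
    """
    return (
        f'<img src="https://tracker.example/open.gif?uid={recipient_id}" '
        'width="1" height="1" alt="" style="display:none">'
    )

uid = str(uuid.uuid4())  # unique identifier assigned to one recipient
snippet = tracking_pixel_html(uid)
print(snippet)
```

Because the identifier is unique per recipient, the server-side access log alone is enough to link the request to one person, which is why the CNIL treats these pixels under the same regime as other trackers.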
[June 12] The Constitutional Council partially struck down the law aimed at strengthening the fight against drug trafficking. Although the objective of maintaining public order is legitimate, the Council considered that several of the technical measures adopted disproportionately infringed fundamental rights, in particular respect for private life. Article 15, which allowed algorithmic processing of connection data to detect suspicious behavior, was struck down. The Council emphasized that this mechanism established generalized and indiscriminate surveillance, contrary to the principle of proportionality guaranteed by the Constitution. Article 5, which allowed intelligence services direct access to administrative files containing tax, asset and banking data, was also censured. In the absence of sufficient guarantees on the traceability of access and the deletion of data no longer needed, this measure excessively infringed the right to privacy. This decision reiterates that the effectiveness of the fight against crime cannot justify surveillance tools without a rigorous framework protecting citizens' rights.
Source: Decision No. 2025-885 DC of 12 June 2025 | Law to get France out of the drug trafficking trap
[June 18] After a precautionary suspension, an associate director of Publicis Sapient France was dismissed for misconduct on 30 March 2018 for acts of sexual or moral harassment, or acts with a sexist or sexual connotation, against female employees. As part of his labour court action, the employee exercised his right of access under Article 15 of the GDPR to obtain a copy of emails sent or received by him in the performance of his employment contract. The employer merely sent various contractual, health and financial documents, without responding to the email request. The Court of Cassation, hearing the case, recalled that emails sent or received by an employee through his professional mailbox are personal data to which the employee has a right of access. The employer is thus required to provide both the metadata (timestamps, recipients) and the content of these emails, provided the rights and freedoms of others are not infringed. The High Court held the employer's failure to respond to be wrongful. An employer therefore cannot ignore requests for access to professional emails, and may refuse to communicate them only by justifying an infringement of the rights of others, such as trade secrets, intellectual property rights or the privacy of third parties.
Source: Court of Cassation, Social Chamber, June 18, 2025, No. 23-19.022
To find out more about the relationship between the GDPR and employment law, read our articles on the right of access and the law of evidence.
[June 19] The CNIL has published recommendations on the use of legitimate interest as a legal basis for training artificial intelligence models. The document builds on work undertaken since April 2024, including a first series of fact sheets on purpose, minimization, data retention and DPIAs. A second public consultation, launched in June 2024, gathered 62 contributions from public, private and academic actors. The published summary focuses on two fact sheets: one devoted to the conditions for relying on legitimate interest in the development of AI systems, the other on the scraping of online data. It recalls that this legal basis requires a rigorous balancing of interests and respect for the rights of the persons concerned. A specific fact sheet on the open-source distribution of AI models is announced for publication soon. These recommendations provide a practical framework for reconciling innovation and GDPR compliance.
Source: Development of AI systems: the CNIL publishes its recommendations on legitimate interest | CNIL
To find out more about using legitimate interest to train an AI, see our article on Meta's recent position.
[June 20] The CNIL warns about the publication of a single database containing 16 billion identifiers and passwords from previous data leaks, according to the outlet Cybernews. While this aggregation is not a new leak, it increases the risk of online accounts being hijacked, in particular through credential reuse or automated attacks. The CNIL recommends simple but effective measures: check for suspicious logins, immediately change compromised passwords, favor strong, unique passwords stored in a password manager, and systematically activate multi-factor authentication on sensitive accounts. It warns against third-party sites offering to check whether you are affected, which it considers unreliable and potentially dangerous.
Source: Exposure of 16 billion logins and passwords — what to do? | CNIL
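The "strong, unique password" recommendation above can be illustrated with a minimal sketch using Python's standard `secrets` module, which provides cryptographically secure randomness (the length and character set chosen here are illustrative, not a CNIL prescription):

```python
# Minimal sketch: generating a strong random password with the standard
# "secrets" module. Length and alphabet are illustrative choices.
import secrets
import string

def generate_password(length: int = 16) -> str:
    """Generate a random password mixing letters, digits and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

pwd = generate_password()
print(pwd)
```

Generating a distinct password per account, and storing each one in a password manager rather than reusing it, is precisely what defeats the credential-reuse attacks the CNIL describes.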
[June 20] The CNIL recently published two practical FAQs to guide the use of artificial intelligence systems in schools. One is aimed at teachers, to help them integrate AI tools into the classroom safely: choosing appropriate tools, protecting data entered by students, and handling monitoring and assisted grading, while stressing that these systems can produce plausible-sounding errors and must be used judiciously. The other is aimed at data controllers (school heads, academies, etc.) and details GDPR obligations: identification of the controller, role of the DPO, documentation of processing operations, data security, and clear information for students and parents.
Source: The CNIL publishes two FAQs on the use of AI systems in schools | CNIL
[June 26] As artificial intelligence becomes increasingly present and the European Union works to regulate its development and use, the CNIL and its partners are launching the PANAME project (Privacy Auditing of AI Models) to address GDPR compliance issues. Since work on the risks of breaches of AI model confidentiality has been mainly academic and ill-suited to industrial deployments, the PANAME project aims to produce an evaluation tool released wholly or partly as open source. Over 18 months, the PEReN (digital regulation expertise center), the ANSSI (National Agency for Information Systems Security), the PEPR iPOP project (priority research programs and equipment) and the CNIL will collaborate to develop a software library intended to unify the way model confidentiality is tested. The PANAME project aims to offer actors in the AI ecosystem an effective, low-cost technical privacy assessment.
Sources:
[June 4] Between January and September 2023, Carrefour notified the Spanish Data Protection Authority (AEPD) of five data breaches affecting 119,000 data subjects, all linked to illegitimate access to customer accounts via credential stuffing, without the original source of the stolen identifiers being identified. Although Carrefour had been aware of the first breach since October 2022, it was only reported in January 2023. In May 2023, the AEPD opened an investigation into these breaches, which revealed that Carrefour had not implemented appropriate security measures (e.g. no multi-factor authentication) and that the persons concerned had not been properly informed of the data breaches, which nevertheless presented a high risk to their rights and freedoms. The AEPD imposed a total penalty of 3.2 million euros for the various breaches. This sanction reiterates the importance of implementing preventive security measures and of analysing each data breach in terms of the risks for the persons concerned.
Source: AEPD (Spain) — EXP202305979 | GDPRhub
[June 5] In 2021, Hungary adopted a law on the protection of minors requiring, in particular, public bodies to provide “authorized persons” (any adult who requests it, or a parent of a person under 18 or someone providing their education, custody or maintenance) with direct access to the criminal records of persons convicted of sexual offenses against minors. The European Commission tried, in vain, to obtain an amendment to this law, which it considered incompatible with Union law as violating Article 10 of the GDPR and the EU Charter of Fundamental Rights in the field of data protection. Hungary argued that the concept of “authorized person” was sufficiently precise when read in light of the Hungarian Civil Code, and that the access criteria were sufficiently limited: disclosure had to be likely necessary to ensure the safety of the minor concerned, and it had to be excessively difficult for the authorized person to access the data by other means. The case was referred to the Court of Justice of the European Union (CJEU), and Advocate General Ćapeta concluded that the concept of “authorized persons” was too broad and poorly defined, even when interpreted in light of national civil law, and that the additional criteria were too generic and had to be assessed by the authorized person themselves, thus relying on a self-declaration regime open to abuse and depriving public bodies of any control over their data.
Sources:
[June 5] In December 2022, a data subject was informed by his electricity supplier of the interruption of service at the request of the data controller, the electricity company Naturgy. Naturgy argued that the interruption was the consequence of unpaid bills under an electricity supply contract allegedly signed by the data subject via a confirmation SMS and their IP address. The data subject argued that the contract had been signed using incorrect information and had therefore refused to pay Naturgy's invoices. The Spanish Data Protection Authority (AEPD) considered that Naturgy had processed the data subject's data without a legal basis, since it was Naturgy's responsibility to verify that the telephone number receiving the confirmation SMS was accurate, and since Naturgy had continued sending invoices despite being informed by the data subject that they were not a signatory of the contract. With this sanction, the AEPD recalls the central importance of the legal basis for any data processing.
Source: AEPD (Spain) — EXP202304821 | GDPRhub
[June 10] During an investigation, the Norwegian Data Protection Authority (Datatilsynet) found 17 cookies, together with Meta and Snap pixels, on a website providing assistance to minors who are victims of abuse. The data controller's privacy policy initially did not mention the presence of these trackers, and subsequent amendments remained incomplete: the legal basis for the processing and the categories of data processed by the trackers were still missing, and the terms used were not easily understood by minors. Datatilsynet's review focused on the pixels and found the processing unlawful, since no legal basis could justify it: the website did not collect any consent, commercial tracking is not a task in the public interest, and legitimate interest could not be retained since the declared objective of evaluating a marketing campaign did not outweigh the rights and freedoms of children. Datatilsynet imposed a fine of NOK 250,000 (€21,600) on the data controller, who acted negligently by sharing children's personal data. The controller did, however, cooperate with the authority during the procedure and remove the pixels before the decision was issued.
Source: Datatilsynet (Norway) — NO — DPA — 24/01055-10 | GDPRhub
[June 25] On 13 May 2025, the European Commission launched a public consultation on its guidelines on the protection of minors online under the Digital Services Act (DSA). With this step, the Commission aims to create a safer online environment for children by supporting online platforms accessible to minors in ensuring a high level of privacy, safety and security for children. The European Data Protection Board (EDPB) welcomes this initiative and recalls that children deserve special protection online, that it is essential to avoid deceiving or manipulating minors, and that the principle of privacy by design and by default must be implemented. The EDPB notes that the DSA and the GDPR pursue different but complementary goals, and that the European Commission's draft guidelines provide clear and practical recommendations on the steps platform providers should take to improve security, safety and privacy. As part of its 2024-2027 strategy, the EDPB intends to provide additional guidance on protecting minors' data and on the interaction between the DSA and the GDPR in its “Guidelines for Children”.
Source: EDPB comments on European Commission's Guidelines on Art.28 DSA | EDPB
[June 27] Meike Kamp, the Berlin Commissioner for Data Protection, called on Apple and Google to remove the DeepSeek app from their stores in Germany. She accuses the Chinese startup of unlawfully transferring German users' personal data to China, without guaranteeing a level of protection equivalent to that required by the GDPR. The data in question includes not only AI request histories and uploaded files, but also sensitive information such as IP addresses and typing patterns, potentially accessible to the Chinese authorities. The request is part of a wider European vigilance movement: Italy has already banned DeepSeek from its App Store, the Netherlands has excluded it from government devices, and the United States is considering a ban for official use. The case highlights the GDPR's requirements for international data transfers (Chapter V) and the need, for any application relying on servers outside the EU, to demonstrate protection equivalent to that of the Union.
Sources:
[June 5] Following a data breach resulting from a cyberattack on Medibank and its subsidiary, the Australian privacy regulator, the Office of the Australian Information Commissioner (OAIC), launched an investigation that revealed breaches of the national personal data protection law (Privacy Act 1988). As a result of this cyberattack, information on millions of current and former Medibank customers was disseminated on the dark web. The OAIC considered that between March 2021 and October 2022, Medibank seriously interfered with the privacy of 9.7 million Australians by failing to take reasonable steps to protect their personal data from misuse and unauthorized access or disclosure. The authority found that a large number of people were exposed to a likelihood of serious harm, including emotional distress, and to a significant risk of identity theft, extortion and financial crime. As part of this procedure, the Federal Court could impose a civil penalty of up to AUD 2,220,000 for each violation of section 13G of the Privacy Act 1988 (serious interference with an individual's privacy).
Source: OAIC takes civil penalty action against Medibank | Office of the Australian Information Commissioner
[June 19] The United Kingdom has adopted the Data (Use and Access) Act (DUAA), substantially reforming its personal data protection regime. The law amends the UK GDPR, the Data Protection Act 2018 and the Privacy and Electronic Communications Regulations, without repealing them, in order to better reconcile the economic value of data with alignment to European standards. The text facilitates the use of data for research purposes, including commercial ones, and introduces flexibilities on informing individuals, allowing notification to be omitted where it would represent a disproportionate effort, subject to equivalent safeguards. At the same time, the text limits data subjects' access requests to those considered reasonable and proportionate. On automated decision-making, the DUAA expands the possibilities of relying on legitimate interest for such processing, while maintaining reinforced protection for sensitive data. The law creates a new legal basis, “recognized legitimate interests”, for certain common processing operations and relaxes data sharing between public bodies. It also allows, under certain conditions, the use of cookies without prior consent and facilitates direct marketing by charities via a “soft opt-in”.
Source: Data (Use and Access) Act 2025
Caroline Chancé, Jeannie Mongouachon, Clémentine Beaussier, Victoire Grosjean and Juliette Lobstein