
Each month, we bring you the key data news in our Data4Coffee newsletter. Don't miss out on essential information!
To receive it, please fill in this form.
[October 1] The CNIL and the National Institute for Research in Digital Sciences and Technologies (Inria) have signed a renewed agreement to conduct joint work on the protection of personal data, privacy and the evaluation of artificial intelligence systems. This partnership aims to pool the expertise of the supervisory authority and the research institute in order to anticipate the legal and technological challenges raised by the General Data Protection Regulation (GDPR) and the European AI Act. In particular, the agreement provides for algorithm audits, the development of educational tools and awareness-raising among the scientific community on transparency and ethical requirements. This initiative marks an important step: the automated processing of personal data is no longer just a technical matter, but a genuine question of legal compliance. For innovative players, it is a reminder that integrating data protection and “privacy by design” from the design stage onwards is no longer an option but a requirement.
[October 3] Following the entry into force of the Data Act, ARCEP published an important recommendation on cloud services, setting out best practices for interoperability and data portability in accordance with the SREN law. This initiative aims to make it easier to switch from one provider to another by increasing transparency, requiring stable and documented APIs, and documenting migration procedures.
For more information on the Data Act, please see our dedicated article.
Source: Interoperability and portability: Arcep sets its guidelines for the cloud
[October 6] The Paris prosecutor's office has opened a criminal investigation into Apple Inc. over its voice assistant Siri, following a report from the Human Rights League about recordings potentially made without consent. The complaint refers to audio conversations, some of them particularly sensitive, analyzed by an Irish subcontractor of Apple in order to improve Siri, processing that could violate the GDPR provisions on privacy. The judicial investigation has been entrusted to the Central Office for Combating Crime Related to Information Technology (OCLCTIC), which specializes in cybercrime.
Source: The Paris prosecutor's office opens an investigation into Siri, Apple's voice assistant — POLITICO
[October 10] The CNIL recalls that, when renting out an apartment, only the documents strictly listed and governed by Decree No. 2015-1437 may be requested from the candidate tenant and their guarantor. Prohibited documents include, in particular, a copy of the Carte Vitale, the family record book, bank statements and criminal record extracts. Agencies and landlords must also inform the persons concerned of their rights over their data (access, rectification, erasure) at the time of collection. The CNIL underlines the need to limit the retention period of supporting documents and to guarantee the security of the data collected, relying on its “rental management” framework applicable to professionals.
Source: Renting a property: what supporting documents can you ask for from candidates? | CNIL
[October 13] Since May 2025, the CNIL has imposed 16 sanctions under its simplified procedure, including 14 following complaints. The breaches mainly concern video surveillance, commercial prospecting without consent and a lack of cooperation with the supervisory authority. This faster, streamlined procedure makes it possible to act on apparent breaches of the GDPR without a lengthy investigation. Legally, it shows that even so-called “simple” breaches can lead to a sanction and that rigor is required with regard to the obligations of transparency, minimization and accountability. For businesses, these sanctions are a reminder of the importance of anticipating inspections and taking even routine risks seriously.
Source: The CNIL pronounces 16 new sanctions under the simplified procedure | CNIL
[October 14] The CNIL has specified that, in a loyalty program, the bar code of a purchased product or the amount of a promotion granted to the customer do constitute personal data and therefore fall within the scope of the right to portability. On the other hand, the method used to calculate a targeted promotion or the algorithm distributing offers are not personal data covered by the right to portability. This position implies that retailers must allow customers to extract such personal data in order to transfer it to a third party.
From a legal standpoint, this clarifies that the right to portability applies not only to data explicitly provided by the persons concerned, but also to data generated about them, even where identification is indirect. The companies concerned must adapt their procedures to comply with these portability obligations while ensuring the proportionality and security of the processing.
Source: Loyalty programs: the CNIL specifies the application of the right to data portability | CNIL
[October 15] The CNIL has published Cahier IP no. 10, “Our data after us - From death to digital immortality”, which analyzes practices surrounding post-mortem data and the new technological uses associated with it. Although the GDPR does not apply to the data of deceased persons, the CNIL emphasizes that national frameworks nevertheless make it possible to regulate the use and retention of this data.
In particular, the cahier looks at the emergence of “deadbots”, conversational agents fed with the data of deceased persons, and highlights the legal gray areas they can create. The CNIL stresses the need to anticipate the management of one's digital legacy: choices made before death, the role of heirs, and the responsibilities of platforms. Legally, the focus is on transparency, the governance of post-mortem rights and the handling of persistent data. This forward-looking work also invites us to take into account the environmental footprint of data and the way in which “posthumous privacy” could become a field to be regulated.
Source: IP10 notebook - Our data after us | Linc
[October 15] On December 29, 2022, the CNIL imposed a fine of 8 million euros on Apple for violating Article 82 of the French Data Protection Act by placing advertising trackers without users' consent. The Council of State rejected Apple's appeal against this decision, considering that the financial penalty was not vitiated by any error of law or of assessment. In setting the amount of the penalty, the CNIL legitimately took into account the scope of the breach in light of the following elements: users' dependence on the App Store to download applications, the number of people affected by these trackers, and the principles of data protection by design and by default.
Source: Council of State, 10th and 9th chambers combined, October 15, 2025, no. 473833
[October 16] In a dedicated article, the CNIL explains how to object to the reuse of personal data for the training of artificial intelligence models. It details the steps to follow depending on the platform, for example via the “Activity” settings for Gemini or the objection forms provided by Meta Platforms, in order to refuse the use of one's data in a chatbot or conversational agent. Legally, this right to object is based on Article 21 of the GDPR for processing whose legal basis is legitimate interest. The CNIL emphasizes that objecting does not affect access to the service, but prevents shared data from being used for training purposes.
For more information on the legal bases for training an AI with personal data, please consult our dedicated article.
[October 21] The CNIL has published six fact sheets intended to guide candidates, political parties and service providers in protecting personal data during election periods. The documents cover political canvassing tools, the files that may be used, communication by telephone or email, and voters' rights. Each sheet recalls how the GDPR principles (lawfulness, purpose limitation, transparency, minimization) apply to political communication. The authority emphasizes that the legal basis, whether consent or the data controller's legitimate interest, must be clearly identified for any processing, and that voters must be able to easily object to such canvassing. These guides make the obligations of political actors easier to grasp, while recalling that data protection should not be an obstacle but a foundation of trust.
Source: Elections and political communication: the CNIL publishes 6 fact sheets to support actors | CNIL
[October 22] While the 2026 finance bill is under consideration by the National Assembly, the Finance Committee adopted an amendment raising the tax rate on digital services from 3% to 15% while increasing the worldwide revenue threshold for liability from 750 million to 2 billion euros. The deputies behind the amendment consider that this measure aims to ensure a fairer contribution by major digital platforms to the financing of French public policies. Under this new version of the tax, only companies with more than 2 billion euros in turnover would be affected, so as to avoid taxing French players such as Leboncoin. First adopted in 2019, this so-called GAFAM tax is being reshaped in what is presented as an act of fiscal sovereignty. The text still has to be voted on in plenary session in the National Assembly, then in the Senate, before being definitively adopted.
Sources:
[October 23] According to a survey published by the CNIL, 24% to 33% of Internet users would be ready to pay to access digital services without targeted advertising, at an average monthly price estimated at between €5.50 and €9. More specifically, 48% of users would consider this option for online music listening, 42% for video on demand and 30% for video games, while press and AI subscriptions appeal to 31% of respondents.
The study reveals that 51% of respondents consider data protection to be one of the three main criteria when choosing a digital service, underlining the real value placed on privacy. Against this background, the “consent or pay” model, which forces a choice between accepting targeted advertising and paying for a subscription, is gaining ground but remains difficult to reconcile with the requirement of free and informed consent.
For more information on the compliance of the “Consent or Pay” system with the GDPR, please consult our dedicated article.
Source: Are French people ready to pay for online services without targeted advertising? | CNIL
[October 2] In September 2025, Bits of Freedom (BoF), a Dutch non-profit advocacy group, argued that Meta was restricting users' freedom of choice and autonomy through the way it designed the recommendation systems on its Instagram and Facebook platforms. BoF considered that Meta violated Articles 25, 27 and 38 of the Digital Services Act (DSA) by making the option to choose a non-profiled recommendation system difficult to access, by automatically applying a profiled recommendation system to the home page, and by making it impossible for users to make their choice persistent in the apps or on the websites.
The Dutch court found that Meta had violated the DSA, holding that the choice functionality was not easily accessible and that reverting to a profiled system, despite the user's choice, constituted a “dark pattern”. The judges therefore ordered Meta to make the choice expressed by users persistent and to make the choice page directly and easily accessible from the home page and the Reels section of the platforms.
Source: Rb. Amsterdam — C/13/774725/KG ZA 25-687 MK/JD | GDPRhub
[October 9] The European Data Protection Board (EDPB) will hold an online event to gather the views of stakeholders on data anonymization and pseudonymization, in light of the recent decision of the Court of Justice of the European Union (CJEU) clarifying the concept of “personal data”. This consultation is intended to feed into guidelines that will specify, in particular, when pseudonymization can rule out the application of the GDPR. Discussions will also focus on the technical, organizational and contextual criteria to be taken into account when assessing the risk of re-identification of a person.
For more information on the decision of the CJEU in question, please consult our two dedicated articles:
[October 14] The EDPB announced that its coordinated action for 2026, within the Coordinated Enforcement Framework, will focus on compliance with the information and transparency obligations provided for in Articles 12, 13 and 14 of the GDPR. During this operation, national data protection authorities will examine how organizations inform data subjects about the use of their data. This initiative involves strengthening cross-border controls and potentially harmonizing enforcement approaches among the various authorities within the European Union.
Source: Coordinated Enforcement Framework: EDPB selects topic for 2026 | European Data Protection Board
[October 16] The EDPB adopted its first joint guidelines with the European Commission explaining how the GDPR and the Digital Markets Act (DMA) interact, in particular with regard to the processing of personal data by “gatekeepers” (Alphabet, Amazon, Apple, ByteDance, Meta and Microsoft). These guidelines aim to help gatekeepers comply simultaneously with the information and transparency obligations imposed by Articles 12 to 14 of the GDPR and with the provisions of the DMA. The EDPB has also launched a new Coordinated Enforcement Framework (CEF) action on the transparency of processing operations, in order to harmonize controls among data protection authorities at EU level.
For more information, please see our article XXX.
[October 16] In a dispute between the company NTH and one of its former employees, the company unlawfully collected and stored data showing that the former employee had sold company-owned equipment on eBay. In this context, the Higher Labour Court of Lower Saxony referred the case to the CJEU for a preliminary ruling, asking whether a national court may use personal data that was originally collected unlawfully. Advocate General Dean Spielmann considers that the principles of purpose limitation and storage limitation do not preclude such use when it occurs in the exercise of the judicial function. However, he states that such processing must meet the following requirements: be based on applicable national law, comply with the principles of equivalence and effectiveness, pursue a public interest objective, and respect the principles of necessity and proportionality. The CJEU's forthcoming decision should be monitored, as it could shape how the right to evidence is balanced against data protection.
Sources:
[October 20] The EDPB adopted two opinions on the European Commission's draft adequacy decisions for the United Kingdom: one under the GDPR, the other under the “Police-Justice” Directive. The aim is to extend, by mutual agreement, the United Kingdom's adequacy decisions until December 2031. The EDPB nevertheless notes several points to monitor, in particular the handling of automated decision-making systems and the impact of the UK-US Cloud Act agreement on transfers of data concerning European nationals. European organizations must therefore remain vigilant both about their data flows to the United Kingdom and about the mechanisms allowing access to their data.
[October 21] The Borgarting Court of Appeal in Norway upheld a fine of 5.5 million euros against Grindr for sharing sensitive personal data of its users with several business partners, who reserved the right to share this data with thousands of other companies for the purposes of surveillance-based targeted advertising. This decision marks Grindr's third unsuccessful challenge to the original 2021 decision of the Norwegian Data Protection Authority. Grindr could still try to bring the case before the Norwegian Supreme Court. Finn Lützow-Holm Myrstad, Director of Digital Policy at the Norwegian Consumer Council, welcomed the Court of Appeal's decision, recalling the seriousness of such violations for consumer privacy.
Sources:
[October 24] In its preliminary findings, the European Commission concluded that TikTok and Meta have breached the Digital Services Act (DSA). First, it considered that the procedures and tools put in place by Meta and TikTok breach their obligation to grant researchers adequate access to public data, since researchers frequently end up with partial or unreliable data. It then concluded that Instagram's and Facebook's mechanisms for reporting illegal content and challenging content moderation decisions do not allow users to provide explanations or evidence in support of their appeals, limiting the effectiveness of these remedies. Meta and TikTok can now review the European Commission's investigation files and respond to its preliminary findings. If the Commission confirms its position, these digital players face a fine of up to 6% of their total annual worldwide turnover.
[October 13] California enacted SB 243, which regulates “companion chatbots”, artificial intelligence systems that maintain a social, human-like relationship with users. In particular, it requires operators to clearly indicate that the user is interacting with an AI and not a human. A series of safety protocols is also required: preventing the generation of content relating to suicide or self-harm, prohibiting sexually explicit content for minor users, and publishing these procedures transparently. Finally, starting in July 2027, platforms will have to submit an annual report to the Office of Suicide Prevention, setting out the number of referrals made to crisis services and the protocols in place.
[October 15] On March 22, 2023, a cyberattack, initiated by the download of a malicious JavaScript file onto the device of an employee of Capita PLC, a British company specializing in financial services, resulted in access to the personal and sensitive data of more than 6 million data subjects. At the end of its investigation, the UK data protection authority (ICO) found that Capita PLC, as data controller, had implemented neither the technical and organizational measures necessary to ensure data security and prevent privilege escalation and unauthorized lateral movement, nor the measures needed to respond effectively to detected security alerts. The ICO found the same shortcomings at Capita Pension Solutions Limited, a processor acting for Capita PLC. In view of the high degree of seriousness of the infringements, the ICO fined Capita PLC 8 million pounds and Capita Pension Solutions Limited 6 million pounds, for a total of 14 million pounds.
Sources:
[October 22] The British manufacturer Jaguar Land Rover has suffered a major cyberattack which, according to a think tank's estimate, alone cost the British economy 2.19 billion euros. The prolonged stoppage of its production lines led to direct production losses, but also to cascading effects for suppliers and subcontractors.
This economic impact confirms that cybersecurity is now a major industrial risk, with financial and strategic repercussions for the entire supply chain. Organizations must factor this reality into their governance by strengthening their defenses, ensuring operational crisis management and anticipating the legal consequences of a prolonged production stoppage.
Source: The Jaguar Land Rover hack cost the British economy 2.19 billion euros
[October 23] Reddit, the American company behind the well-known discussion and community website, has filed a lawsuit against Perplexity AI and several “scraping” companies, accusing them of harvesting its users' data on an “industrial scale” without authorization. The complaint targets both the circumvention of the website's protection measures and the exploitation of this content to train Perplexity's AI, without a license agreement of the kind concluded with Google or OpenAI to govern the use of this data.
Source: Reddit is suing Perplexity for siphoning its data
Caroline Chancé, Jeannie Mongouachon, Clémentine Beaussier, Victoire Grosjean and Juliette Lobstein