Each month, we round up the latest data protection news in our newsletter Data4Coffee. Don't miss out on key information!
To receive it, please fill in this form.
[July 3rd] On 15 October 2025, Regulation (EU) 2024/900 on the transparency and targeting of political advertising will come into force. To ensure the proper implementation of this regulation, which complements the GDPR and regulates the use of digital tools in political communication, the CNIL has been designated as the competent authority for the articles relating to the processing of personal data. As such, the CNIL will be responsible for ensuring the compliance of data processing for the purposes of political communication, which implies: explicit consent of the persons concerned, collection of data directly from them, a ban on profiling based on the personal data of minors or on sensitive data, and an obligation to keep a processing register. The CNIL will publish recommendations this autumn.
Source: Framework for targeted political advertising: the CNIL updates its doctrine | CNIL
[July 4th] The CNIL is offering providers of audience measurement solutions a self-assessment tool allowing them to check whether their solutions can be used without collecting the consent of Internet users, in accordance with Article 82 of the French Data Protection Act. This tool lists the precise technical criteria to be met: a single purpose (audience measurement), collection limited to essential data, deactivation of marketing-related processing, and no reuse of the collected data for the provider's own benefit. If all the criteria are met, the provider can declare to its customers that its solution, properly configured, benefits from the consent exemption. Publishers, for their part, must ensure that their provider supplies clear and operational documentation allowing the tool to be configured in a compliant manner; otherwise they will be liable in the event of an inspection.
Source: Cookies: solutions for audience measurement tools | CNIL
[July 11] Operating with an artificial intelligence algorithm, “augmented” cameras are used by some tobacco shops as a decision-making aid to determine whether their customers are of legal age. While the CNIL indicated in July 2022 that the use of such devices had to be necessary and proportionate, it now considers that using them to control the sale of products prohibited to minors is neither necessary nor proportionate. Since these devices cannot replace the systematic verification of proof of age by tobacconists, they present unjustified risks to the fundamental rights and freedoms of the persons concerned. Because they are activated by default and operate continuously, they result in permanent surveillance and an obstacle to the right to object of the persons concerned.
Source: “Augmented” cameras to estimate age in tobacco shops: the CNIL specifies its position | CNIL
To find out more about the legal and regulatory framework for smart cameras in public spaces, consult our article on this subject.
[July 11] After receiving two reports alleging the use of the X platform's algorithm for the purposes of foreign interference, the cybercrime section of the Paris prosecutor's office opened an investigation against the platform for altering the functioning of, and fraudulently extracting data from, an automated data processing system. The reports denounce a lack of transparency on the criteria for changing algorithms and moderation policies, as well as a significant change in the algorithm leading to more political, hateful, racist, anti-LGBT+ and homophobic content. In a publication, X denounces an investigation motivated by political considerations, questions its impartiality, categorically denies the allegations and states that it will not comply with the requests of the French authorities.
Sources:
[July 13] A cyberattack targeting the Kairos application, a training management service for job seekers intended for France Travail partners, allowed potential access to the identity, contact details, identifiers and France Travail statuses of 340,000 job seekers. The hacker allegedly stole the password of a training organization located in Isère to gain illegal access to the Kairos database. This new incident, which comes a year and a half after a cyberattack that compromised the identification data and social security numbers of 43 million job seekers, illustrates the recurring risks to data processing systems and underlines the need for strengthened security measures. France Travail states that it has filed a complaint and reported the incident to the CNIL.
Sources:
[July 22] Since April 2024, the CNIL has published numerous recommendations on the relationship between data protection and artificial intelligence. Continuing this work, on July 22, 2025, the CNIL published three new recommendations specifying the conditions under which the GDPR applies to AI models, as well as the requirements for the security and annotation of data used to train AI systems. In them, the CNIL details the methodological approaches to adopt, the information to give to the persons concerned, the appropriate security measures, the documentation to keep, etc. This publication is part of the CNIL's 2025-2028 strategic plan, which aims to oversee the development of AI systems that respect data protection, without hampering innovation.
To learn more about the key steps to bring an AI tool into GDPR compliance, check out our article on this subject.
[July 28] Filtering web gateways, also called filtering web proxies, make it possible to control and monitor Internet access in order to block access to certain sites or content for security and compliance reasons. Faced with the digitization of economic activity and the marked increase in cyber threats, these devices have evolved and now rely on technologies combining artificial intelligence, the sharing and use of various information, automated decision-making, and the analysis of user behavior. While these solutions help meet the security obligation of Article 32 of the GDPR, the CNIL is working on a draft recommendation aimed at supporting professionals and securing their approach to privacy by design. This draft is open to public consultation until September 30, 2025.
Source: Web filtering gateway: the CNIL launches a public consultation on its draft recommendation | CNIL
[July 31] In 2021, the CNIL rejected a complaint by the Mousse association against SNCF over the unnecessary collection of its customers' titles (Madame/Monsieur). The Council of State, hearing an application for annulment of that decision, stayed the proceedings pending the answer of the Court of Justice of the European Union (CJEU) to its preliminary questions. On January 9, 2025, the CJEU ruled that collecting customers' titles in order to personalize commercial communication was not necessary for the performance of the contract between the user and the company, except in special cases (e.g. bunk compartments reserved for single women). Taking note of this decision, the Council of State ruled on 31 July 2025 that SNCF's systematic processing of personal data relating to its customers' titles was unnecessary, especially as the absence of this information would not make it more difficult to identify passengers. The Council of State concluded that SNCF's legitimate interest in personalizing its communications can be achieved by offering customers the option of indicating their title on an optional basis. This decision reiterates the importance of the principle of data minimization.
Sources:
For more information, see our decision analysis article.
[July 31] The Bordeaux Court of Appeal ruled that a geolocation device attached to objects in stock constituted processing of personal data, since it made it possible to identify an employee. The employer, who had neither informed the employee nor declared the measure, had the disciplinary warning cancelled and was convicted of psychological harassment. This decision is a reminder that geolocation, even indirect, may fall under the GDPR if it targets an identifiable person. In the absence of transparency and proportionality, the processing is unlawful.
Source: Decision - Bordeaux Court of Appeal: RG n°22/05581 | Cour de cassation
[August 6] The jewelry brand Pandora has confirmed that it suffered a cyberattack linked to a third-party platform, presumably resulting from an intrusion into its Salesforce database, compromising the names and email addresses of customers. According to the company, no “highly personal” data such as passwords, bank details or logins was exposed. Pandora said it immediately strengthened its security measures and that, to date, no fraudulent use of the data has been detected. Nevertheless, the brand encouraged its customers to remain vigilant against phishing attempts that exploit stolen information.
Source: Pandora jeweler victim of a personal data leak
[August 7th] Air France-KLM detected a data leak on a third-party platform used by its service provider in charge of customer relations, resulting in the exposure of names, contact details, Flying Blue loyalty numbers and statuses, and the subject of customer requests. According to the airline, internal systems were not affected and “highly personal” data (passwords, bank cards, passports, miles) was not compromised. The incident was reported to the competent authorities — the CNIL for Air France and the Autoriteit Persoonsgegevens for KLM — and the customers concerned were informed and invited to remain vigilant against phishing attempts. This breach once again illustrates the vulnerability of outsourced CRM tools, which are popular targets for cyberattacks.
Source: Air France-KLM warns its customers of a personal data leak
[August 8] The Constitutional Council ruled that Article 22 of the French Data Protection Act, in its version resulting from the law of 21 May 2024, infringes the Constitution by not providing that the person implicated be informed by the CNIL of their right to remain silent. This right, protected by Article 9 of the Declaration of the Rights of Man and of the Citizen, applies to any punitive sanction procedure, including those conducted by the CNIL. Although the repeal of the provision is postponed to 1 October 2026, the authority must adapt its practices immediately; otherwise its decisions could be overturned by the administrative judge. This decision establishes a new procedural safeguard in the field of data protection, strengthening the balance between supervisory power and respect for fundamental rights.
Source: Decision no. 2025-1154 QPC of 8 August 2025 | Constitutional Council
[July 3rd] At a high-level meeting held in Helsinki on July 1 and 2, 2025, the European Data Protection Board (EDPB) adopted a landmark statement on strengthening clarity, support and engagement. This statement describes new initiatives to facilitate GDPR compliance, especially for small and medium-sized organizations, and to strengthen the consistency of interpretations and cooperation between European data protection authorities. The EDPB said it was launching a series of direct and practical resources to simplify the application of the GDPR, including the development of practices, methods, tools and guidelines for joint review actions by national authorities. This statement reflects the EDPB's recognition of the increasing complexity of the digital regulatory landscape.
Sources:
[July 9th] After the publication of hateful and antisemitic remarks, aimed in particular at Polish politicians and glorifying Adolf Hitler, Poland formally requested the European Commission to investigate Grok, the artificial intelligence chatbot developed by Elon Musk's xAI and integrated into the X platform. In a letter, the Polish Deputy Prime Minister, Krzysztof Gawkowski, invokes a possible major violation of the Digital Services Act and considers that there are sufficient reasons to think that Grok's negative effects on the rights and fundamental freedoms of individuals are not the result of chance but of deliberate intent, by design. This report by Poland, which follows a Turkish decision recently blocking access to certain Grok content, raises crucial legal issues concerning the responsibility of platforms and their obligation to proactively monitor AI systems and protect against automated hate speech.
Sources:
[July 10] On 10 July 2025, the European Commission published its General Purpose AI Code of Practice, along with Guidelines on the scope of obligations for providers of general-purpose AI models. This documentation clarifies the new obligations applying to general-purpose AI providers from August 2, 2025, and aims to help the digital industry comply with them. Adherence to this Code of Practice, composed of three chapters on Transparency, Copyright, and Safety and Security, allows signatory providers to demonstrate their compliance with the AI Act and benefit from increased legal certainty. To date, 26 providers of general-purpose AI models, including Amazon, Google, Microsoft, and OpenAI, have fully adhered to the Code. By contrast, xAI adhered only to the last chapter, while Meta refuses to adhere at all, considering that the Code introduces legal uncertainties beyond the AI Act. These actors will have to demonstrate compliance with their obligations by other appropriate means.
Sources:
[July 10] The Spanish data protection authority (AEPD) has fined Repsol 1.38 million euros for mistakenly attributing another customer's gas supply contract to a data subject and invoicing them fees based on the erroneous contract. Although Repsol argued that there was no violation of the GDPR because it was only a human error resulting from the similarity between the names of the persons concerned, the AEPD recalled that other data (identity card, bank details) differed and that the error could easily have been avoided by a simple verification of the data by the data controller. In view of the extent and regularity of the processing, the authority considered that the risk of human error is a factor Repsol should have taken into account and concluded that Articles 5(1) (accuracy) and 32 (security) of the GDPR had been violated.
Sources:
[July 14] On 2 May 2025, the Irish Data Protection Commission (DPC) imposed a fine of 530 million euros and an injunction to stop transfers against TikTok for having transferred the personal data of European users to China, in violation of Article 46 of the GDPR. TikTok now maintains that this sanction is criminal in nature, and goes further by arguing that the DPC exceeded its limited judicial functions and powers in violation of the Irish Constitution. TikTok argues that the decision to impose such a fine, and the lack of an effective right of appeal, constitute a violation of the right to a fair trial under the European Convention on Human Rights. As a result, the Irish High Court allowed judicial review of the DPC decision. The case will be heard in October.
Source: TikTok granted permission to challenge €530m DPC fine | RTE
[July 16th] In May 2018, Lisa Ballmann filed a complaint against Meta for violating Articles 6 and 9 of the GDPR, which resulted in December 2022 in a 210 million euro fine against Meta for unlawful processing of personal data for the purposes of targeted advertising on Facebook, following a binding decision of the European Data Protection Board (EDPB). At the end of the procedure, Lisa Ballmann sent the EDPB a request for access to the file relating to her complaint, relying in particular on the right to good administration under the Charter of Fundamental Rights of the European Union. Since the EDPB only gave access to certain parts of the file, Lisa Ballmann brought an action before the General Court of the European Union. The General Court granted Lisa Ballmann's requests, considering that the refusal to grant access to the complaint file affected the applicant's legal situation immediately and irreversibly. This decision highlights the direct interest of complainants in the outcome of a procedure which, as here, directly concerns the processing of their personal data.
[July 16th] The European Commission organized a dialogue on the implementation of the GDPR in Brussels, bringing together companies, NGOs and academics. All welcomed the overall balance of the regulation and its effectiveness, but expressed strong expectations: more clarity on certain concepts, more practical guides and adapted support, especially for SMEs. While civil society is strongly opposed to any revision of the text, some economic actors have suggested targeted adjustments, in particular to take into account challenges related to AI. All agree on the need for harmonized application and better coordination with other European texts, such as the AI Act.
[July 24] Between 2008 and 2020, the personal and financial data of a Life Assurance policyholder was mistakenly sent to a third party. In 2021, the person concerned brought an action against the data controller to obtain compensation for the distress, upset, anxiety, inconvenience and losses caused by Life Assurance's negligence and failure to comply with its obligations under the GDPR. At the end of proceedings before the various Irish courts, the Supreme Court held that the insured's claim had been wrongly characterized as a claim for compensation for “personal injury”, adding that claims for compensation for non-material damage related to distress, upset or anxiety may fall directly under the GDPR.
Sources:
[August 1st] Following an investigation, the Swedish data protection authority fined AB Storstockholms Lokaltrafik, the Stockholm public transport operator, 1.42 million euros, including 355,500 euros for violating Article 13 of the GDPR. The authority considered that by failing to inform passengers that they were being filmed by ticket inspectors equipped with body-worn cameras, AB Storstockholms Lokaltrafik had breached its obligations. After several appeals, the Swedish Supreme Court referred the case to the Court of Justice of the European Union (CJEU) for a preliminary ruling on whether Article 13 or Article 14 of the GDPR applies when data is collected by a body-worn camera. In her Opinion, the Advocate General of the CJEU supports the application of Article 13 of the GDPR, in accordance with the general principle of transparency: collecting data with a body-worn camera requires informing the persons concerned directly, at the time of collection, since they are the source of the data, and a lack of information could undermine the practical effect of Article 13 and give rise to hidden surveillance practices. The future decision of the CJEU is one to watch.
[August 12] On June 24, 2025, the Greek data protection authority (DPA) imposed several fines on the “Shield of David” association for having transmitted sensitive data concerning a child to third parties without parental consent, refused access to video surveillance recordings, and refused to cooperate with the DPA. The infringements concern basic principles of the GDPR (Articles 5, 12, 13, 15, 24 and 31), in particular transparency, the right of access, the responsibility of the data controller and cooperation with the supervisory authority. In total, the penalty amounts to €10,000, divided into four separate administrative fines. The decision is a reminder that associations are fully subject to the GDPR and that minors' data requires increased vigilance.
[August 18] The Austrian Federal Administrative Court has confirmed the national data protection authority's decision that derstandard.at's so-called “Pay or Okay” model violates the GDPR. This system required users either to accept ad targeting or to pay a monthly subscription, without offering them the opportunity to consent specifically to each purpose of the processing. Barely 1 to 7% of Internet users actually want to be tracked, whereas this system obtained an artificially high consent rate of 99.9%, demonstrating the absence of genuinely free consent. The newspaper can appeal to the Supreme Administrative Court of Austria, which could refer the case to the Court of Justice of the European Union. This decision is a reminder that consent mechanisms must be granular to allow for freely given, non-coerced consent.
Source: Court decides “Pay or Okay” on derstandard.at is illegal
[August 26th] The Polish data protection authority (UODO) has fined ING Bank Śląski 18.4 million zlotys (around €4.3 million), one of the highest amounts ever imposed in the country. The penalty concerns the systematic scanning of identity documents of customers and prospects between April 2019 and September 2020, without a sufficient legal basis and beyond the requirements of anti-money laundering duties. UODO specified that such extensive collection should have been preceded by a necessity assessment, which was not done, and that the volume of data processed calls for increased vigilance. ING cooperated with the authority and revised its procedures (limiting scanning to new customers or changes of data), but announced its intention to appeal.
Source: Polish Bank Fined Over GDPR... | Vitallaw.com
[July 22] Microsoft has faced a large-scale campaign of cyberattacks targeting on-premises SharePoint servers, after hackers exploited a zero-day vulnerability that allowed them to bypass protection measures, take full control of an exposed server and install malicious files. Among the attackers, Microsoft's security teams identified at least three actors, all suspected of operating from China, including two state-sponsored groups, Linen Typhoon and Violet Typhoon. These groups are known to target government organizations in particular and to carry out espionage campaigns involving the theft and resale of data. The Cybersecurity and Infrastructure Security Agency, a US authority, reacted quickly by asking all relevant federal agencies to install a patch or disconnect the affected servers.
Sources:
[July 28] On July 28, 2025, the British data protection authority (ICO) published its decision, issued on June 24, 2025, against Birthlink, a company providing post-adoption services and managing Scotland's adoption contact register, which allows biological and adoptive parents to register their contact details in order to be connected. Birthlink kept manual files, stored in filing cabinets, containing documents relating to the individual circumstances of adopted persons. In 2021, 4,800 of these files, containing sensitive and irreplaceable documents, were destroyed without the explicit agreement of the board of directors. In September 2023, after internal investigations, Birthlink reported this data breach to the ICO. Given the absence of appropriate security and organizational measures (internal destruction and approval policies, staff training, etc.), the ICO concluded that the principles of integrity, confidentiality and accountability had been breached and imposed a fine of £18,000.
Sources:
[August 19] Google and YouTube have agreed to pay $30 million to settle a class action alleging that they collected personal data from children under 13 on YouTube without parental consent in order to serve targeted advertising. The settlement, subject to the approval of a federal court in California, potentially covers between 35 and 45 million people, with compensation estimated at between $30 and $60 per person for those who file a claim. It comes after Google had already paid 170 million dollars in 2019 over similar facts. The group denies any liability but wants to avoid a trial. The case illustrates the growing pressure on platforms targeting young audiences, an increasingly risky area legally.
Source: Google Settles YouTube Children's Privacy Lawsuit | Reuters
[August 20] The Data Use and Access Act 2025 (DUAA), which comes into force gradually between August 2025 and June 2026, modernises several key pieces of UK digital information law, including the UK GDPR, the Data Protection Act 2018, and the Privacy and Electronic Communications Regulations. It clarifies the rules on scientific research data, authorises broad consent, and broadens the possibility of relying on recognised legitimate interests, without systematically requiring a balancing test in certain contexts. The DUAA also regulates data subject access requests (DSARs) more precisely: organisations are only required to conduct “reasonable and proportionate” searches, with the possibility of pausing deadlines pending further information. The British data protection authority, the ICO, has already published guidance to support organisations through this transition, in particular on research, automated decision-making, cookies and procedural obligations.
Source: The Data Use and Access Act 2025 (DUAA) - what does it mean for organizations? | HERE
[August 28] The Privacy Commissioner of Canada, Philippe Dufresne, has concluded, after an investigation into Google, that individuals can ask for certain information about them to be de-listed (“de-referenced”) when the risk of serious harm outweighs the public interest in keeping that data accessible through a name search. The criteria for such a request include not being a public figure, and information that falls outside public debate, is inaccurate or outdated, or concerns a minor. The approach is based on the PIPEDA law and draws on the principles of the European GDPR, underlining the need to balance privacy protection and freedom of expression. Google has refused to implement this recommendation immediately, despite court decisions confirming that its search engine falls under Canadian law.
Caroline Chancé, Jeannie Mongouachon, Clémentine Beaussier, Victoire Grosjean and Juliette Lobstein