Universal Opt-Out Mechanisms (UOOMs) spread across US states, Yahoo and Uber receive fines, and analytics cookies escape ePrivacy consent requirements in Spain.

ePrivacy and regulatory updates


The US Senate Judiciary Committee summoned the CEOs of Meta, TikTok, Snap, X, and Discord for the usual one-directional grilling, seeking to better understand the role of these social media platforms in the harm suffered by children – and possibly to put an end to the Section 230 liability protection that shields platforms from content published not by them but by their users.

On December 29th, Google agreed to settle a $5bn class-action lawsuit in California over the tracking of internet users (via Google Analytics, cookies, and other methods) while they browsed in “incognito” mode. It was alleged that the search giant had entered into a legally binding promise not to collect users’ data when they browsed in private mode, and had then broken it.

The CNIL (France’s supervisory authority for both the GDPR and the ePrivacy Directive) imposed a 10 million euro fine on Yahoo for setting a large number of cookies regardless of the choices users made on its websites. For its part, the Dutch DPA imposed an identical fine on Uber for disregarding the privacy rights of its drivers, failing both to honor access requests and to specify a retention period for their personal data.

On January 9th, the FTC ordered data broker X-Mode to stop selling location data, which it considered “sensitive”. The company had failed to obtain consent across its multiple data collection partners and to provide sufficient guarantees (such as the ability to withdraw that consent). The FTC also made it clear that consent alone will not justify unfair or disproportionate data collection practices or the commoditization of personal information.

On January 22nd, the FTC told antivirus maker Avast to stop selling consumer browsing history to third parties, a prohibition accompanied by a $16.5m fine. Avast’s product was promoted as a means to protect users against trackers as well as viruses, yet its browser extensions collected massive amounts of information – including religious beliefs, health concerns, political leanings, location, financial status, visits to child-directed content and other sensitive data. Through data onboarding partnerships with large advertising groups such as Omnicom, Avast promised to enrich user- and click-level data with its own individual profiles.

On February 2nd, at the request of the Italian DPA (Garante), Spain’s AEPD issued a 550k EUR fine to Glovo (a delivery app) for GDPR breaches in its handling of the personal information of its “riders”: disregard for the principles of transparency (art. 13 – information to be provided at the outset) and privacy by design (art. 25), the use of automated decisions through a scoring system that determined the assignment of jobs (art. 22), and the exposure of riders’ personal information to third parties inside and outside the countries in which it operates.

DoorDash was found in breach of both the CCPA and CalOPPA, with California’s AG announcing a settlement on February 21st that included changes to its practice of selling or sharing customer data (through a “marketing cooperative”), updates to its privacy notices, a $375k fine, and a full review of its contracts with analytics and marketing technology vendors.

On March 5th, Spain’s AEPD used an urgent procedure to order Worldcoin (which has headquartered its EU operations in Bavaria, Germany) to stop scanning people’s irises in exchange for cryptocurrency in the country, pointing to the severe risks of consenting so lightly (often by teenagers) to the processing of highly sensitive biometric information without a clearly defined purpose – other than, theoretically, being able to tell real humans apart from AI-driven bots at some point in the future. Individual unique identifiers are merely hashed and tied to mobile phone numbers and mobile app instances. Some 400,000 people in Spain had their eyes scanned in less than a year, one of the highest uptake rates for the initiative anywhere in the world.

On March 7th, the CJEU agreed with the Belgian DPA on most of its findings against IAB Europe regarding the validity of its Transparency and Consent Framework (TCF).

On March 11th, the European Data Protection Supervisor found that the EU Commission’s use of Microsoft 365 products infringed the GDPR and ordered the EU executive to revisit its contractual framework with the US-based giant in order to guarantee that all appropriate safeguards are in place.

The Dutch DPA will start enforcing against misleading cookie banners, and so will the UK’s ICO.

Legal updates and guidelines

The EU Cyber Resilience Act was approved on December 1st. Hardware products with a digital component (able to connect to other devices or the internet) will have to comply with a framework of cybersecurity requirements as of January 2027.

On December 11th, California approved a new proposal to require browsers to respect opt-out preference signals. If adopted, this will finally force Chrome (Google), Safari (Apple), and Edge (Microsoft) to join Brave, DuckDuckGo and Firefox in supporting browser-level privacy settings rather than website-specific pop-ups. Prior to this, the California Privacy Protection Agency had released its own proposed framework to regulate automated decision-making.
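The best-known opt-out preference signal today is the Global Privacy Control (GPC): supporting browsers send a `Sec-GPC: 1` request header and expose `navigator.globalPrivacyControl` to page scripts. As a minimal sketch (the helper name and logic below are illustrative, not from any statute or the GPC specification), a site could honor the signal like this:

```javascript
// Minimal sketch of honoring the Global Privacy Control signal.
// Browsers that enable GPC send the request header "Sec-GPC: 1";
// header names are assumed lowercased, as Node.js normalizes them.
function isOptedOut(requestHeaders) {
  // Treat the request as an opt-out of sale/sharing of personal data.
  return requestHeaders['sec-gpc'] === '1';
}

// Client side (browser only), the same preference is readable in JS:
// if (navigator.globalPrivacyControl) { /* suppress data sharing */ }

console.log(isOptedOut({ 'sec-gpc': '1' })); // true
console.log(isOptedOut({}));                 // false
```

Under the CCPA, the California AG has already treated ignoring GPC as an enforcement issue (as in the Sephora settlement), which is why browser-level support matters.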

Spain’s AEPD released its first Guidelines for the use of cookies for digital analytics (statistical and high-level audience measurement purposes), at long last excluding them from consent requirements under Spain’s implementation of the ePrivacy Directive. Such cookies will have a maximum lifespan of thirteen months, and traffic history will not be kept for longer than two years. Any integration with campaign management or CRM data will trigger a consent requirement. A Data Processing Agreement must also be in place, together with compliance with international data transfer rules and an impact assessment covering the adequate configuration of the analytics setup. These Guidelines coincide with the end of a grace period for including a “Reject All” option (and avoiding dark patterns or “nudging” techniques) on the first layer of every website’s consent management pop-up.
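In practice, the thirteen-month cap is enforced through the cookie’s expiry. The sketch below illustrates one way to build such a `Set-Cookie` header value; the cookie name, the attributes, and the 395-day approximation of thirteen months are assumptions for the example, not values taken from the AEPD Guidelines:

```javascript
// Illustrative first-party analytics cookie capped at roughly
// thirteen months, per the lifespan limit described above.
// 395 days is an assumed approximation of thirteen months.
const THIRTEEN_MONTHS_SECONDS = 395 * 24 * 60 * 60; // 34,128,000 seconds

function buildAnalyticsCookie(name, value) {
  // Max-Age keeps the browser from retaining the cookie past the cap.
  return `${name}=${value}; Max-Age=${THIRTEEN_MONTHS_SECONDS}; Path=/; SameSite=Lax; Secure`;
}

console.log(buildAnalyticsCookie('site_stats_id', 'abc123'));
// → "site_stats_id=abc123; Max-Age=34128000; Path=/; SameSite=Lax; Secure"
```

The two-year limit on traffic history, by contrast, is a server-side retention rule and cannot be expressed in the cookie itself.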

US lawmakers have introduced a bill that could ban TikTok in the country unless its Chinese parent company (ByteDance) sells it to a US buyer. Besides national security and the threat to US democracy posed by the potential manipulation of individual feeds, privacy is a major justification for the bill – and this could be the first time that a law of this nature enjoys bipartisan support at the federal level.

MarTech and AdTech

In yet another twist in the “cookieless” future, the UK’s Competition and Markets Authority stated that Google must do much more to resolve the competition concerns raised by widespread use of its Privacy Sandbox before entirely discarding third-party cookies at the end of this year. Google insisted that the rollout would continue according to plan.

LiveRamp acquired Data Clean Room provider Habu. This comes a few weeks after Snowflake’s acquisition of Samooha (as reported in our latest Newsroom). 

Composable CDPs, often known as Reverse ETL solutions, kept on gaining ground as a data activation solution that sits on top of modern data warehouses rather than building an entirely separate copy of all customer records. We interviewed Tejas Manohar on Masters of Privacy to discuss their differences, privacy concerns, and potential overlaps with Data Clean Rooms.

AI, Competition and Digital Markets 

Preparing for the arrival of the EU’s Digital Markets Act (and having been designated as a “gatekeeper”), Google has changed how its search engine works in the EU, combining some of its “direct responses” or built-in services (such as Google Flights) with third party alternatives and results. 

After releasing its brand new GenAI tool, Gemini, in early February, Google came under intense public scrutiny when the tool showed considerable “over-compensating” bias in the generation of historical images, with the company being accused of pushing a “political agenda” through the manipulation of weights in its AI models in favor of minorities or traditionally “liberal” views in entirely inconsistent historical contexts. Some raised a wider, more concerning question: can a complete lack of bias be assured in a future of internet search where engines return a single valid result, rather than letting end users work their way through different sources and around SEO tricks?

In mid-February, OpenAI introduced a “memory” feature in ChatGPT, raising privacy concerns despite the various individual controls provided for the deletion of such memory. Shortly after, the same company introduced a “text-to-video” GenAI tool called Sora. To counterbalance the increased risk of copyright infringement, misinformation, and “deep fakes”, OpenAI announced that it had adopted the Coalition for Content Provenance and Authenticity (C2PA) standard, which many experts deemed insufficient.

PETs and Zero-Party Data

Signal introduced usernames to provide an additional layer of privacy to 1:1 and group messaging. 

Bluesky continued to attract X/Twitter defectors, adding individual controls and composability features that could turn the open social media platform into the most attractive alternative to Elon Musk’s private playground.

We hosted an interview with Tumult Labs’ Damien Desfontaines on Masters of Privacy, discussing the most common Privacy Enhancing Technologies deployed by leading Data Clean Rooms in order to provide a privacy-safe alternative to granular data sharing or one-to-one user deduplication. 

Future of Media

Microsoft is embracing Chrome’s Privacy Sandbox (or at least part of it) in its Edge browser.

In a separate interview, with Peter Craddock, we explored a possible alternative to the “consent walls” (where payment is the only alternative to accepting cookies) used by major publishers across Europe, hoping for the EDPB (or the courts) to update its interpretation of the “technical necessity” exemption in article 5.3 of the ePrivacy Directive so as to include basic advertising components. This would clear the way for GDPR-specific scrutiny of the most appropriate legal basis, looking at the different levels of intrusiveness and personal data use across the many varieties of tailored advertising now available to publishers and advertisers.

With Nina Müller and Sergio Maldonado.
