ePrivacy and regulatory updates
This was a busy season for regulatory updates, with a steady stream of decisions and guidelines coming out of supervisory bodies.
Spain’s AEPD issued a decision on the use of Google Analytics by the Royal Spanish Academy (“RAE”), becoming the first EU Data Protection Agency to see the glass half full on the widespread digital data collection service (which had previously been deemed high-risk in Denmark, Italy, France, the Netherlands and Austria). It must however be noted that the RAE was only using the most basic version of the tool, without any AdTech integrations or individual user profiling – a setup that aligns with the CNIL’s long-standing guidelines for valid use of the tool.
At EU level, the Artificial Intelligence Act (which we have covered this quarter in a couple of Masters of Privacy interviews) made fast progress, with the Council adopting its final position. At the same time, new common rules on cybersecurity became a reality with the approval of the NIS2 Directive (the second version of the Network and Information Security Directive) on November 28th. The updated framework covers incident response, supply chain security and encryption, among other things, leaving less wiggle room for Member States to get creative when it comes to “essential sectors” (such as energy, banking, health, or digital infrastructure).
Across the Channel, the UK’s Data Protection Agency (ICO) issued brand new guidance on international data transfers, providing a practical tool for businesses to properly carry out Transfer Risk Assessments and making it clear that either this tool or the guidelines provided by the European Data Protection Board will be considered valid.
Already into the new year, the European Data Protection Board (EDPB) issued two important reports: one on valid consent in the context of cookie banners (in the hope of agreeing on a common approach in the face of multiple NOYB complaints across the EU), and one on the use of cloud-based services by the public sector. The former concluded that the vast majority of DPAs (Supervisory Authorities) did not accept hiding the “Reject All” button in a second layer – which most notably leaves Spain’s AEPD as the odd one out. They did all agree on the non-conformity of: a) pre-ticked consent checkboxes on the second layer; b) reliance on legitimate interest; c) the use of dark patterns in link design or deceptive button colors/contrast; and d) the inaccurate classification of essential cookies.
The latter concluded that public bodies across the EU may find it hard to provide supplementary measures when sending personal data to a US-based cloud (as per Schrems II requirements) in the context of some Software as a Service (SaaS) implementations, suggesting that switching to an EEA-sovereign Cloud Service Provider (CSP) would solve the problem. This has left many wondering whether the recommendation also extends to US-owned CSPs, which would leave few options on the table – and none able to compete on features or scale.
All of which can easily lead us to the latest update on the EU-US Data Privacy Framework:
The EDPB released its non-binding opinion on the status of the EU-US Data Privacy Framework (voicing concerns about proportionality, the Data Protection Review Court, and bulk data collection by national security agencies). The EU Commission will now proceed to ask EU Member States to approve it, with the hope of issuing an adequacy decision by July 2023. This would do away with all the headaches derived from the Schrems II ECJ decision (including growing pressure to store personal data in EU-based data centers), were it not for the general impression that a Schrems III challenge looms on the horizon.
In the United States, long-awaited new privacy rules in California (CPRA) and Virginia (CDPA) entered into force on January 1st. Although both provide a set of rights to ensure individual control over personal data being collected across the Internet (opt-out, access, deletion, correction, portability…), California’s law creates a private right of action that could pave the way for a new avalanche of privacy-related lawsuits. In any case, only companies meeting a minimum threshold in terms of revenue or the number of consumers affected by their data collection practices (both thresholds varying across the two states) will have to comply with the new rules.
Lastly, Privacy by Design will become ISO standard 31700 on February 8th, finally introducing an auditable process to conform to the seven principles originally laid out by Ann Cavoukian during her tenure as Ontario (Canada)’s Information and Privacy Commissioner.
It’s been interesting to see how continental Data Protection Agencies (“DPAs”) keep milking the cow of the ePrivacy Directive’s lack of a one-stop-shop for US or China-based Big Tech giants. The long-awaited ePrivacy Regulation never arrived to keep this framework in sync with the GDPR (which does have a one-stop-shop), and this leaves an opening for any DPA to avoid referring large enforcement cases involving such players to the Irish Data Protection Commissioner (“DPC”) whenever cookie consent is involved. This criterion has been further strengthened by the recent conclusions of the EDPB cookie banner task force.
Microsoft was the latest major victim of this particular gap (following Meta and Google), receiving a 60-million euro fine from France’s DPA (CNIL). Shortly after, the CNIL honored TikTok with a 5-million euro fine (once again, due to the absence of a “Reject All” button on its first layer – or “not being as easy to reject cookies as it is to accept them”) and, not having had enough, went on to hand Apple an 8-million euro fine for collecting unique device identifiers of visitors to its App Store without prior consent or notice, in order to serve its own ads (a practice akin to a cookie or local storage mechanism for the purposes of article 5.3 of the ePrivacy Directive).
The CNIL’s ePrivacy-related enforcement spree did not stop at Big Tech. Voodoo, a leader in hyper-casual mobile games, was also a target, receiving a 3-million euro fine for lack of proper consent when setting an IDFV (the unique identifier “for vendors” that Apple does allow app publishers to use when IDFA, or cross-app identifiers, have been declined via the App Tracking Transparency prompt).
Putting the ePrivacy Directive aside, and well into pure GDPR territory, Discord received an 800,000 euro fine (again, at the hands of the CNIL) on the basis of: a) a failure to properly determine and enforce a concrete data retention period; b) a failure to consider Privacy by Design requirements in the development of its products; c) accepting very low security levels for user-created passwords; and d) failing to carry out a Data Protection Impact Assessment (given the volume of data it processes and the fact that the tool has become popular among minors).
And yet, one particular piece of news outshone almost everything else in this category: Ireland’s DPC imposed a 390-million euro fine on Meta, following considerable pressure from the EDPB, for relying on the contractual legal basis in order to serve personalized advertising – itself the core business model of both of its social networks (Facebook and Instagram). We had a debate on the matter with Tim Walters (English) and Alonso Hurtado (Spanish) on Masters of Privacy, and published an opinion piece on our blog.
This last affair is a good segue into Twitter’s latest troubles. Its new owner, Elon Musk, not content with having fired key senior executives in charge of EU privacy compliance (including its Chief Privacy Officer and DPO), has suggested that he will oblige Twitter’s non-paying users to consent to personalized advertising. The Irish DPC (once again in charge of its supervision under the one-stop-shop rule) has asked Twitter for a meeting in the hope of drawing a few red lines.
Meanwhile, the Spanish AEPD, still breaking all records in terms of monthly fines, sanctioned UPS (70,000 euros) for handing a MediaMarkt (consumer electronics) delivery to a neighbor, thus breaching confidentiality duties. This will have a serious impact on the regular practices of courier services in the country.
Back in the United States, Epic Games and the FTC agreed to a $520m settlement over Fortnite’s direct targeting of children under the age of 13 (made worse by a default setting that allowed them to engage in voice and text communications with strangers), as well as its use of “dark patterns” in in-game purchases.
Separately, in what we believe is a first case of its kind – even in the EU, with the ECJ FashionID case possibly being the closest precedent – BetterHelp has received a $7.8m FTC fine for using the Facebook Lookalike Audiences feature (and alternative offerings in the programmatic advertising space, including those of Criteo, Snapchat or Pinterest) to find potential customers on the basis of their similarity to the online mental health service’s current user base. This involved sensitive data, and came despite BetterHelp’s repeated assurances that data would in no case be shared with third parties.
On the private lawsuits front (especially important in the US), Meta agreed to pay $725m after a class action was brought in California against Facebook on the back of the ever-present Cambridge Analytica scandal. Also, the Illinois Biometric Information Privacy Act (BIPA) kept putting money into the pockets of claimants and class action lawyers, in this case forcing Whole Foods (an upscale organic food supermarket chain owned by Amazon) to settle for $300,000. We have previously covered cases against TikTok, Facebook or Snapchat, although this time it was the monitoring, via “voiceprints”, of the chain’s own employees (rather than its customers) that triggered the lawsuit.
Legitimate Interest strikes back
To close this section, very recent developments justify turning our eyes back to the UK and the EU, where there is growing momentum for the acceptance of legitimate interest as a legal basis for purely commercial or direct marketing purposes:
While the CJEU decides on a question referred by a Dutch court in January – in a case where the Dutch DPA fined a tennis association for relying on legitimate interest to share member details with its sponsors, who then sent commercial offers to those members – a UK court (First-tier Tribunal) has ruled against the ICO (the UK’s DPA) and in favor of Experian (a well-known data broker), which had collected data about 5.3m people from publicly available sources, including the electoral register, to build customer profiles it subsequently sold to advertisers. Experian relied on legitimate interest and found it too burdensome to properly inform every single individual (this being the ICO’s main point of contention). The decision does appear to indicate that relying on legitimate interest would not be possible if the original data collection had been based on consent, but even this is not entirely clear.
So, just to make it all even clearer and simpler, the UK Government presented a new draft of the UK Data Protection Bill on March 8th that includes a pre-built shortcut for relying on legitimate interest without the so-called three-part test (purpose, necessity, balancing): data controllers can now go ahead with this legal basis if their purpose appears on a non-exhaustive list provided in the Bill – which includes direct marketing.
Competition and Digital Markets
Google was sued by the Department of Justice for anti-competitive behavior in its dominance of the AdTech stack across the open market (the ads shown across the web, beyond its own “walled gardens”). The DOJ alleges that Google used its dominance of the publisher ad server market (supply side) to further strengthen its stranglehold on the demand side (advertisers, many of them already glued to its Google Ads or DV360 platforms in order to invest in search keywords or YouTube inventory) and, worse, artificially manipulated its own ad exchange to favor publishers at the expense of advertisers – thereby reinforcing the flywheel, as digital media publishers found themselves with even fewer incentives to work with competing ad servers.
Zero-Party Data and Future of Media
(The piece of news below obliges us to combine both categories this season)
The BBC has rolled out its own version of SOLID pods to allow its customers to leverage their own data (exported from Netflix, Spotify, and the BBC) to obtain relevant recommendations while staying in full control of that data. Perhaps a small step for individual agency, but a giant one for a digital media ecosystem mostly butchered by the untenable notice-and-consent approach derived from the current legal framework – which takes us back full circle to Elizabeth Renieris’ new book.