Google Continues to Promise Its Bid to End Cookies Isn't an Enormous Power Grab

Photo: Spencer Platt (Getty Images)

On Monday, Google released a few more details on its proposed tracking alternative to third-party cookies, a “privacy-first” technology that, from any angle, seems like just another way for the company to maintain its stranglehold on digital ad sales.

Google calls its new creation “Federated Learning of Cohorts” (FLoC, for short), and promises that it’s not only a less-creepy alternative to the third-party cookies and trackers that we’ve come to know and loathe over the years, but one that won’t cut into its advertisers’ profits. Like most things in adtech, the full proposal is both complicated and technical as hell, but in a nutshell: while cookies allow advertisers to target people based on their individual web-browsing behavior, FLoC would essentially plop people into specific groups (called “flocks”) based on their inferred interests. Any data generated on an individual basis would be kept in-browser, and the only thing advertisers could track and target would be a “flock” containing an aggregated group of semi-anonymized people.
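
If you want to see where that difference actually shows up, it’s in what a page can ask the browser for. Under the proposal’s origin-trial draft, a site calls document.interestCohort() and gets back a flock ID and nothing else; the TypeScript sketch below shows roughly how an ad script might use it, with the ad endpoint and error handling made up for illustration.

```typescript
// Sketch of an ad script reading a FLoC cohort under the draft origin-trial API.
// The interestCohort() call and its { id, version } shape come from the public
// FLoC explainer; the endpoint name and fallback logic are illustrative.
async function requestAdsForCohort(): Promise<void> {
  // Feature-detect: browsers without FLoC simply won't expose the method.
  if (!("interestCohort" in document)) {
    console.log("FLoC not available; falling back to contextual ads");
    return;
  }

  try {
    // Resolves to something like { id: "43A7", version: "chrome.2.1" }.
    // The id identifies the flock, not the individual user.
    const cohort = await (document as any).interestCohort();

    // An ad server targets the whole flock, never a single person.
    await fetch(`https://ads.example/serve?cohort=${cohort.id}&v=${cohort.version}`);
  } catch {
    // The browser can refuse (e.g. the user opted out, or the site blocks FLoC
    // with a Permissions-Policy header), in which case the promise rejects.
    console.log("Cohort unavailable for this page");
  }
}
```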


As an example, I can tell you that I recently became the proud owner of an Instant Pot, and have spent the past few days visiting countless sites with Instant Pot recipes, hacks, and how-tos that invariably drop third-party cookies on my Chrome browser, labeling me as a potential Instant Pot fanatic. The way digital ads work right now, these sorts of cookies can be used to target me with Instant Pot-adjacent ads across the web, even if it kinda skeeves me out. Because these cookies are held within the browser I’m using to surf the web—Chrome, in this case—the only way I’d be able to flush out that data is through Chrome’s specific settings.
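
To make that mechanic concrete, here’s a rough sketch of the pattern: a made-up tracker domain that every one of those recipe sites happens to embed, tagging my browser with a cross-site cookie. The domain and cookie contents are invented, but the SameSite=None; Secure dance is how third-party cookies ride along on cross-site requests in general, not anything specific to Google.

```typescript
// Illustrative sketch of the third-party tracking pattern described above.
// "tracker.example" and the cookie contents are made up.
import { createServer } from "node:http";

// Imagine every Instant Pot recipe site embeds a 1x1 pixel or script served
// from this tracker domain. Each page view hits this endpoint cross-site.
createServer((req, res) => {
  // Read back whatever cookie this browser already carries for the tracker.
  const existing = req.headers.cookie ?? "(first visit)";
  const page = req.headers.referer ?? "(unknown page)";
  console.log(`Seen before as: ${existing}, now visiting: ${page}`);

  // Tag the browser with an interest label it will present on every other
  // site that embeds the same tracker. SameSite=None; Secure is what lets
  // the cookie travel on cross-site (third-party) requests.
  res.setHeader(
    "Set-Cookie",
    "interest=instant-pot-fanatic; SameSite=None; Secure; Max-Age=31536000"
  );
  res.end();
}).listen(8080);
```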


With FLoC, what would happen instead is that my Chrome browser would keep watch over the websites I visit and, over time, lump me into a so-called “flock” along with thousands of other Chrome users. In this particular case, my browser might catch on to the myriad slow-cooking sites I’m visiting every day, and assign me to a specific slow-cooking flock. Google’s advertisers could target these groups the same way they targeted their cookie-based groups beforehand—a tactic that, as Google’s latest blog puts it, “effectively hides individuals ‘in the crowd’.”
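
Google hasn’t spelled out the final clustering math, but the early FLoC prototype reportedly leaned on a locality-sensitive hash (SimHash) computed over the domains in your browsing history. The sketch below is an illustration of that general idea only, with an arbitrary hash, bit width, and example domains, not anything lifted from Chrome’s actual code.

```typescript
import { createHash } from "node:crypto";

// Illustrative sketch of in-browser cohort assignment: hash each visited
// domain, tally the bits (SimHash-style), and read off a short cohort ID so
// that people with similar browsing land in the same "flock". The 16-bit
// width and SHA-256 basis are arbitrary choices for this example.
function assignFlock(visitedDomains: string[], bits = 16): string {
  const tally = new Array<number>(bits).fill(0);

  for (const domain of visitedDomains) {
    const digest = createHash("sha256").update(domain).digest();
    for (let i = 0; i < bits; i++) {
      // +1 if the i-th bit of the domain's hash is set, -1 otherwise.
      const bit = (digest[i >> 3] >> (i & 7)) & 1;
      tally[i] += bit ? 1 : -1;
    }
  }

  // Majority vote per bit: similar browsing histories produce similar IDs.
  const cohortBits = tally.map((t) => (t > 0 ? 1 : 0)).join("");
  return parseInt(cohortBits, 2).toString(16);
}

// A week of slow-cooking obsession lands me in the same flock as everyone
// with a similar mix of sites; only this short ID is exposed to advertisers.
console.log(assignFlock(["instantpotrecipes.example", "pressureluck.example", "slowcookerhacks.example"]));
```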


In and of itself, FLoC doesn’t kill off third-party browser cookies—though Google has threatened to make that a reality for Chrome users before the end of the year—but the company hopes this new paradigm will supplant them. (Don’t worry, the useful kind of cookies, like tokens that remember your logins for frequently visited sites, aren’t being sent to the great beyond just yet.)

FLoC is just one of the proposals that comprise the Privacy Sandbox project Google kicked off towards the end of 2019. Much like those other proposals, it’s an idea that sounds decent until you start asking questions. As the EFF pointed out in its own breakdown of the Privacy Sandbox, being part of a flock isn’t unlike being branded with a “behavioral credit score”: one that remembers your interests, your purchase history, and a lot of what makes you you, and puts it all in the hands of one extremely powerful, largely unaccountable corporation.


Plus, as Google’s own technical documentation points out, it’s impossible to promise that the machine-learning algorithm that creates these groups won’t inadvertently end up creating flocks based on seriously sensitive information. As we’ve written before, different types of data are considered “sensitive” to different people, which means even if FLoC tries to mitigate some of these issues, there are still going to be users left at risk. As the documentation states:

A cohort might reveal sensitive information. As a first mitigation, the browser should remove sensitive categories from its data collection. But this does not mean sensitive information can’t be leaked [...] It should be clear that FLoC will never be able to prevent all misuse.


Aside from that huge honking issue, it’s also worth remembering that FLoC only works if Google can keep its unfettered access to all of our juicy user data. This wrinkle has led advocates and academics in the digital privacy sphere to call bullshit on the company again and again, pointing out that Google’s privacy ploy is actually a shittily veiled attempt to kill off part of the digital ad market while controlling everything built upon its ashes. At the start of this year, the UK’s Competition and Markets Authority opened a formal investigation to probe some of these claims for itself.

But this ongoing investigation in the UK (or any of the many other cases currently building against the company in the U.S.) hasn’t stopped Google from experimenting with FLoC. In the new blog, Google Product Manager Chetna Bindra claimed that, by the company’s estimation, an audience targeted by its “flock” tends to offer advertisers virtually the same bang for their buck. Based on Google’s internal testing, Bindra claimed that ad targeting via flocks generated 95% of the same “conversions”—digital ad lingo describing clicks on an ad or purchases on a site, among other actions—that cookie-based targeting did.


In other words, as Bindra told CNBC, using FLoC for advertising “is literally nearly as effective as third-party cookies.” The only difference is that Google goes from controlling a giant chunk of the ad-targeting ecosystem to controlling virtually all of it.
