I’ve been disappointed by how little interest the remedies portion of this antitrust trial has received within the privacy community. Privacy, data protection, and information governance are at the heart of much of what is being proposed, and for all of the stakeholders calling for a digital regulator, the government’s proposal to put an independent technical committee in place over Google strikes me as a tremendous source of learnings. There is the occasional exception, but privacy is poised to be an important consideration for the federal courts as they work to address Google’s anticompetitive behavior in online search.

Since a federal judge declared Google an unlawful monopoly last summer, many different ideas have been put forward to remedy the search giant’s dominance. On one side, more modest proposals have called for loosening Google’s lucrative agreements to be the default search engine on browsers and devices. Meanwhile, breaking up the company has more frequently made headlines. Yet in between these extremes sits a bundle of remedies that deal more directly with the vast troves of data Google has collected and benefits from. As Julia Angwin writes, “[c]ompetitors need access to something else that Google monopolizes: data about our searches.”

Data Access Remedies

Search access remedies are important because neither merely stopping Google from outspending rivals nor breaking up the company clearly makes online search better for anyone. DuckDuckGo, my employer, has explicitly called for access to technical data held by Google, as well as requiring Google to provide competitors with access to search results via real-time APIs at fair, reasonable, and non-discriminatory rates.

There are multiple ways to go about this, and the Department of Justice has proposed several different data access remedies. One, akin to Google’s existing relationship with Yahoo Japan, is mandated syndication agreements that would let rivals build on top of Google’s infrastructure to tailor and improve their results for users. A second option is to leverage what is known as click and query data, as the EU Digital Markets Act already requires, to help mitigate the huge scale advantage Google enjoys from the volume of search queries it sees. Both offer a pathway to more competition in search over time, but it’s also easy to conflate the two proposals.
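
One rough way to see the difference is as two very different interfaces. The stubs below are purely illustrative, with hypothetical names; nothing here comes from the actual proposals. Syndication means calling Google’s engine live and re-ranking the results for your own users, while click-and-query access means receiving periodic, anonymized aggregates about what was searched, shown, and clicked.

```python
# Hypothetical interfaces, for illustration only; neither the DOJ nor
# the court has specified anything like these signatures.

def syndicated_search(query: str) -> dict:
    """Syndication model: call Google's engine in real time, then tailor
    and re-rank the returned results for your own users."""
    ...

def fetch_click_and_query_data(reporting_period: str) -> list[dict]:
    """Click-and-query model (DMA-style): receive periodic, anonymized
    aggregates of ranking, query, click, and view data, with no live
    dependence on Google's systems."""
    ...
```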

Google conflates the two, and it does so because these data access remedies raise different privacy issues and offer different benefits for competition. Data sharing can sound scary, and unsurprisingly, Google has leaned into the idea that the government will make it share “your queries.”

In a corporate blog post, the company called the proposed search access remedies an extreme idea, one endangering “the security and privacy of millions of Americans” and requiring “disclosure to unknown foreign and domestic companies of not just Google’s innovations and results, but even more troublingly, Americans’ personal search queries.” That isn’t what the Department of Justice has proposed, but the company twists kernels of valid concern into self-serving rhetoric.

Privacy As Pretext

We should hesitate to give Google’s protestations too much credence. Even the Department of Justice recognizes the need to be “mindful of potential user privacy concerns in the context of data sharing,” but cautions that “genuine privacy concerns must be distinguished from pretextual arguments to maintain market position or deny scale to rivals.” In an amicus brief, the Federal Trade Commission explained that the government’s proposal is “well designed to protect user privacy as it seeks to ‘pry open’ long monopolized markets,” flagging “Google’s past privacy lapses and the potential for a remedy in this case to spur newfound competition between search providers on their privacy tools and policies.”

The unfortunate reality is that Google is quick to use privacy as a pretext when it suits them. As legal scholar Rory Van Loo explains, companies including Google have frequently cited concern for privacy as a reason not to share data with other entities, and in nearly every instance, “lack of data has the potential to weaken markets, whether by cutting off data from competitors or digital helpers.” 

Trial testimony demonstrates how Google has used privacy as cover for sloppy technical work that undermines the DMA’s requirement that online gatekeepers provide anonymized “ranking, query, click and view data” about search results. Everyone agrees on the need to protect privacy here, but the result is that Google has suddenly become a privacy maximalist. As DuckDuckGo explains, Google’s proposed anonymization only includes data from queries that have been searched more than 30 times in the last 13 months by 30 separate signed-in users. Google may believe this is a thoughtful way to protect its users’ privacy, but it also conveniently omits over 99% of all search queries.
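
To make the scale of that filter concrete, here is a minimal sketch in Python of a thresholding rule like the one DuckDuckGo describes. The data model, field names, and exact comparison operators are my assumptions, not Google’s actual pipeline. Because query frequency follows a long-tail distribution, a rule like this discards the overwhelming majority of unique queries.

```python
from collections import defaultdict
from datetime import timedelta

# Illustrative only: a threshold-based anonymization filter in the spirit
# of the 30-query / 30-user / 13-month rule described above. The data
# model and exact comparisons are assumptions for this sketch.

WINDOW = timedelta(days=13 * 30)   # roughly thirteen months
MIN_OCCURRENCES = 30               # query must appear more than 30 times
MIN_DISTINCT_USERS = 30            # ...across 30 separate signed-in users

def eligible_queries(log, now):
    """Return the queries that survive the threshold filter.

    `log` is an iterable of (query, user_id, timestamp) tuples for
    signed-in searches; everything that fails the thresholds is dropped
    before any data is shared.
    """
    counts = defaultdict(int)
    users = defaultdict(set)
    cutoff = now - WINDOW
    for query, user_id, ts in log:
        if ts >= cutoff:
            counts[query] += 1
            users[query].add(user_id)
    return {
        q for q in counts
        if counts[q] > MIN_OCCURRENCES and len(users[q]) >= MIN_DISTINCT_USERS
    }

# With a long-tail query distribution, only head terms ("weather",
# "news", ...) clear both thresholds; rare and one-off queries, which
# make up the vast majority of unique queries, are filtered out.
```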

Analyzing Google’s methodology, Prof. David Evans concluded that Google avoided trying any methods “to satisfy the privacy requirement while making the data more useful . . . including the one that Google describes on its own privacy policy page.” What Google called significant engineering effort amounted to tasks Prof. Evans has students do in introductory computer science classes for non-majors.
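
For contrast, there are well-established techniques for sharing more of this data while still protecting individuals, such as releasing aggregate counts with calibrated statistical noise. The sketch below is purely my own illustration, not a method specified by the court, the DOJ, or Google: it adds Laplace noise to per-query counts, a basic differential-privacy-style release that can keep far more of the long tail usable than wholesale suppression.

```python
import math
import random

# Illustrative only: release aggregate per-query counts with Laplace
# noise instead of suppressing rare queries entirely. Epsilon, the data
# model, and the contribution bound below are hypothetical choices.

def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) noise via the inverse-CDF method."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def noisy_counts(counts: dict[str, int], epsilon: float = 1.0) -> dict[str, float]:
    """Return counts with noise calibrated to a per-user sensitivity of 1.

    Assuming each user contributes at most one increment per query,
    adding Laplace(1/epsilon) noise gives a basic epsilon-differentially-
    private release of the aggregates; clamping at zero afterwards is
    harmless post-processing.
    """
    scale = 1.0 / epsilon
    return {q: max(0.0, c + laplace_noise(scale)) for q, c in counts.items()}

print(noisy_counts({"weather boston": 1200, "joe jerome": 3}))
```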

This isn’t the first time Google has suggested offering up scraps in a way that doesn’t threaten its dominance in search. Google told UK regulators in 2020 that one way to improve competition in search would be to provide Google Trends to its competitors, but no one who has built a search engine would find these sorts of informational trends to be technically useful. 

Even if we assume Google is partly looking out for its users, it’s clear the company has ulterior motives. Note that at the same time as Google expresses concern about privacy, it highlights the need to protect its own “innovations” and bemoans that search access remedies could create “copycats” and “decrease incentives” for innovation. These aren’t privacy issues at all; they seem like sideshows meant to avoid dealing with the fact that Google was found to have an unlawful monopoly. This has come out in the remedy trial, too, where Google has been quick to conflate its own interest in hoarding its ill-gotten IP gains with nobly protecting privacy.

Enter a Technical Committee

So what should be done to protect people’s privacy interests here? The easiest solution is for more privacy experts to weigh in. The Department of Justice has called for the creation of an independent technical committee to oversee how competitors access Google data, staffed by members with expertise in “software engineering, information retrieval, artificial intelligence, economics, and behavioral science.” We should require privacy expertise within this entity as well. An oversight body that can conduct privacy audits of both Google and the DuckDuckGos of the world, and establish protocols for safeguarding and anonymizing search data, is a no-brainer.

While more research is needed, it’s not clear how much privacy risk is inherent in sharing access to search results. Everyone – Julia Angwin, Google, and even DuckDuckGo – fixates on AOL’s infamous 2006 public release of millions of search query logs. The New York Times wrote headlines about how AOL Searcher No. 4417749 had been successfully re-identified by combing through those logs, but neither the Department of Justice nor the European Commission is asking Google to release search histories tied to user IDs. (In other words, what would be shared is a query for “Joe Jerome” divorced from a Google account, a Tampa-based IP address, or anything else I’ve searched for.)

Indeed, wholesale sharing of search histories might be a major privacy breach, but that isn’t what’s been proposed, as Google well knows. It’s also not clear such raw, detailed information would even be useful to another search engine. What search engines need from Google is not detailed search histories, but rather information about what gets shown to people making a given query. Should a knowledge panel be shown? Do people click on the third or fourth link on the page? You only need detailed user profiles if your goal is to profile people, but Google has made it so people think online search is synonymous with this sort of invasive tracking.
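
To make that distinction concrete, here is a rough sketch of the kind of aggregated, user-free record a competing search engine could actually learn from. The schema and field names are my own hypothetical illustration, not anything proposed in the case: for a given query, what was shown and how often each position was clicked, with no account, IP address, or search history attached.

```python
from dataclasses import dataclass, field

# Hypothetical schema, for illustration only: the kind of aggregate
# "what was shown and what was clicked" record a rival engine could use,
# with no user IDs, accounts, IP addresses, or search histories attached.

@dataclass
class AggregateQueryRecord:
    query: str                      # e.g. "weather boston"
    results_shown: list[str]        # URLs in ranked order
    knowledge_panel_shown: bool     # was a knowledge panel displayed?
    click_rate_by_position: dict[int, float] = field(default_factory=dict)
    # e.g. {1: 0.42, 2: 0.17}: share of impressions that led to a click
    # on each result position, aggregated across many users

record = AggregateQueryRecord(
    query="weather boston",
    results_shown=["https://forecast.example/boston", "https://weather.example"],
    knowledge_panel_shown=True,
    click_rate_by_position={1: 0.42, 2: 0.17},
)
print(record)
```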

The other distinction here is that AOL shared that information with the public, while neither the Department of Justice nor European regulators have suggested making any of this data available to just anyone. The technical committee described above would also be charged with establishing technical controls, data-sharing policies, and designating “qualified competitors” who can take advantage of these data access remedies. Limiting which entities qualify for these search access remedies also shrinks the pool of potential bad actors that could gain access. It also suggests that non-technical protections could be useful, including internal controls like security policies, access limits, employee training, data segregation guidelines, and data deletion practices that aim to stop confidential information from being exploited or leaked to the public. Contractual restrictions on data use should also be required.

If these considerations and protections are layered on top of any data access, the ultimate privacy risk goes down. An independent technical committee to regulate both Google’s and other search engines’ use of search data also addresses the fact that most data-driven companies face a trust deficit. But here, too, we see evidence that Google’s privacy protestations may be less about protecting users and more about avoiding oversight of its practices. Google has lambasted the very idea of an independent body to regulate it as “government micromanagement,” but an independent technical committee was viewed by all parties involved in the Microsoft antitrust action 20 years ago as a major success story.

Google’s resistance echoes its longstanding opposition to strong privacy enforcement and its lobbying to weaken privacy protections. As the FTC notes, Google is hardly an innocent actor here. We should all be just as concerned with how Google uses the data it has amassed, and remedies that boost competition could be vital to making search more private for everyone.
