
Average Folks and Retailer Tracking

Yesterday evening, I found myself at the Mansion on O Street, whose eccentric interior, filled with hidden doors, secret passages, and bizarrely themed rooms, seemed as good a place as any to hold a privacy-related reception. The event marked the beta launch of my organization’s mobile location tracking opt-out. Mobile location tracking, which is being implemented across the country by major retailers, fast food companies, malls, and the odd airport, first came to the public’s attention last year when Nordstrom informed its customers that it was tracking their phones in order to learn more about their shopping habits.

Today, the Federal Trade Commission hosted a morning workshop to discuss the issue, featuring representatives from analytics companies, consumer education firms, and privacy advocates. The workshop presented some of the same predictable arguments about lack of consumer awareness and ever-present worries about stifling innovation, but I think a contemporaneous conversation I had with a friend better highlights some of the privacy challenges mobile analytics presents. Names removed to protect privacy, of course!

Whose Hypothetical Horribles?

Released last fall, Rick Smolan and Jennifer Erwitt’s The Human Face of Big Data is a gorgeous coffee-table book that details page after page of projects that are using Big Data to reshape human society. In a later interview, Smolan suggested that Big Data was “akin to the planet suddenly developing a nervous system” with “the potential to have a bigger impact on civilization than the Internet.” A bold claim if ever there was one, but the advent of Big Data raises the question: what sort of ramifications will this new data nervous system have on society?

Since I began reading about Big Data in earnest last year, I’ve noticed that much of the discussion seems to be focused on the extremes, hyping either the tremendous benefits or the terrific fears of Big Data.

Proponents tend to look at data analytics with wide-eyed optimism. In their recent book, Big Data: A Revolution That Will Transform How We Live, Work, and Think, Viktor Mayer-Schoenberger and Ken Cukier suggest that Big Data will “extract new insights or create new forms of value, in ways that change markets, organizations, the relationship between citizens and governments, and more.”

On the other side of the coin, scholars like Paul Ohm argue that “Big Data’s touted benefits are often less significant than claimed and less necessary than assumed.” It is very easy to see Big Data as a giant engine designed explicitly to discriminate, to track, to profile, and ultimately, to exclude.  Technologist Alistair Croll has declared Big Data to be the “civil rights issue” of our generation.

Adam Thierer, who puts his faith in the market to sort these issues out, has derisively suggested these worries are little more than boogeyman scenarios and hypothetical horribles, but I take his point that much of the worry surrounding Big Data is a kind of abstract doom and gloom. The discussion could benefit from actually describing what consumers, what individuals, are facing on the ground.

For example, in the paper I noticed two interesting stories in the span of a few weeks. First, that noted Judge Alex Kozinski had declared that he would be willing to spend $2,400 a year in order to protect his privacy from marketers and miscreants. Second, that individuals were data-mining themselves on Kickstarter to the tune of $2,700. One was an established legal figure; the other a poor graduate student. One could pay. The other could only sell.

More of the Big Data discussion should center on how consumers are actually being impacted. Instead, we’re still talking about Fair Information Practice Principles with the strong conviction that a few tweaks here and there and a renewed dedication to some long-standing principles will “solve” the privacy challenge we face. In regulatory circles, there is much discussion about offering “meaningful” user choice, but as the “Do Not Track” process has demonstrated, no one really knows what that means.

I would love to pay for my privacy, but that’s a cost I’m not prepared to cover.  I’d love to make meaningful choices about my privacy, but I’m not sure what any of my choices will actually accomplish.  Perhaps Thierer has a point, that I’m worried about hypothetical horribles, but I’m not sure our planet’s new data nervous system has my best interests in mind.
