The Future of Privacy: More Data and More Choices

As I wrapped up my time at the Future of Privacy Forum, I prepared the following essay in advance of participating in a plenary discussion on the “future of privacy” at the Privacy & Access 20/20 conference in Vancouver on November 13, 2015 — my final outing in think tankery.

Alan Westin famously described privacy as the ability of individuals “to determine for themselves when, how, and to what extent information about them is communicated to others.” Today, the challenge of controlling, let alone managing, our information has strained this definition of privacy to the breaking point. As one former European consumer protection commissioner put it, personal information is not just “the new oil of the Internet” but also “the new currency of the digital world.” Information, much of it personal and much of it sensitive, is now everywhere, and any individual’s ability to control it is limited.

Early debates over consumer privacy focused on the role of cookies and other identifiers on web browsers. Technologies that feature unique identifiers have since expanded to include wearable devices, home thermostats, smart lighting, and every type of device in the Internet of Things. As a result, digital data trails will flow from a broad range of sensors and will paint a more detailed portrait of users than previously imagined. If privacy was once about controlling who knew your home address and what you might be doing inside, our understanding of the word requires revision in a world where every device has a digital address and ceaselessly broadcasts information.

The complexity of our digital world makes explaining all of this data collection and sharing a huge challenge. Privacy policies must either be high-level and generic or technical and detailed, and each option proves of limited value to the average consumer. Many connected devices have little capacity to communicate anything to consumers or passersby. Without meaningful insight, it makes sense to argue that our activities are now subject to the determinations of a giant digital black box. Privacy conversations are accordingly shifting to discussions of fairness, equity, power imbalances, and discrimination.

No one can put the data genie back in the bottle. No one would want to. At a recent convening of privacy advocates, attendees discussed the social impact of being surrounded by an endless array of “always on” devices, yet no one was willing to silence their smartphones for even an hour. It has become difficult, if not impossible, to opt out of our digital world, so the challenge moving forward is how to reconcile this reality with Westin’s understanding of privacy.

Yes, consumers may grow more comfortable with our increasingly transparent society over time, but survey after survey suggests that the vast majority of consumers feel powerless when it comes to controlling their personal information. Moreover, they want to do more to protect their privacy. This dynamic should be viewed as an opportunity. Rather than dour information management, we need better ways to express our desire for privacy. It is true that “privacy management” and “user empowerment” have been at the heart of efforts to improve privacy for years, and many companies already offer consumers an array of helpful controls, but one would be hard-pressed to convince the average consumer of this. The proliferation of opt-outs and plug-ins has done little to actually give consumers any feeling of control.

The problem is that few of these tools actually help individuals engage with their information in a practical, transparent, or easy way. The notion of privacy as clinging to control of our information against faceless entities leaves consumers feeling powerless and frustrated. Privacy needs some rebranding. Privacy must be “appified” and made more engaging. There is a business opportunity in marrying privacy and control in an experience that is simple and functional. Start-ups are working to answer that challenge, and ephemeral messaging apps, if not perfect implementations, are a sure sign that consumers want privacy when they can get it easily. For Westin’s view of privacy to have a future, we need to do a better job of embracing creative, outside-the-box ways to get consumers thinking about and engaging with how their data is being used, secured, and ultimately kept private.

The How/What/When/Where of Location Data

Location data is one of the most coveted and sensitive data points in the digital ecosystem, and combined with an array of new context-based services, it promises a future of ultra-personalization. In this post for Privacy Perspectives, I discuss the state of location information and the unclear rules and controls around the collection and use of this information. As companies build more and more advertising and user profiling on top of different types of location data, we need to provide more effective controls. // More on IAPP Privacy Perspectives.

Voter Privacy and the Future of Democracy

As the election season gets into full swing, I teamed up with Evan Selinger (and an otherwise off-the-grid coworker) to discuss some of the privacy challenges facing the campaigns. A recent study by the Online Trust Alliance found major failings in the campaigns’ privacy policies, and beyond the nuts and bolts of having an online privacy notice, the political hunger for data presents very real challenges for voters and, perhaps more provocatively, for democracy. // More at the Christian Science Monitor’s Passcode.

Practical Privacy in 60 Minutes: Computers, Freedom & Privacy 2015

After doing a roundtable on cookie tracking, Justin Brookman, formerly of the Center for Democracy & Technology, suggested that, with the proliferation of privacy tools, good and bad, he’d love to see what he could really do to protect his privacy in an hour. Inspired by this, I put together a panel conversation with Meghan Land, Staff Attorney, Privacy Rights Clearinghouse; Rainey Reitman, Activism Director, EFF; and Jay Stanley, Senior Policy Analyst, ACLU.

My aim was to have a high-level conversation and debate about what average folks could reasonably do to protect their privacy. Coming up with a set of basic privacy (and security) practices is tough: encrypted email is probably too hard, and a complex password system is probably too annoying. For sixty minutes, I prodded the panelists and the audience at Computers, Freedom & Privacy 2015 on what we might actually suggest to consumers and citizens.

Ethics and Privacy in the Data-Driven World

As part of the U.S. Chamber of Commerce’s “Internet of Everything” project, my boss and I co-authored a short essay on the growing need for companies to have a “data ethics” policy:

Formalizing an ethical review process will give companies an outlet to weigh the benefits of data use against a larger array of risks. It provides a mechanism to formalize data stewardship and move away from a world where companies are largely forced to rely on the “gut instinct” of marketers or the C-Suite. By establishing an ethics policy, one can also capture issues that go beyond privacy and data protection, and ensure that the benefits of a future of smart devices outweigh any risks.

// Read more at the U.S. Chamber Foundation.

Social Listening and Monitoring of Students

The line between monitoring consumer sentiment in general and tracking individual customers is blurry and ill-defined. Companies need to understand public perceptions of different types of online tracking and the different sorts of concerns consumers hold. Monitoring by schools appears to be even more complex. In an opinion piece in Education Week, Jules Polonetsky and I discuss the recent revelation that Pearson—the educational testing and publishing company—was monitoring social media for any discussion by students of a national standardized test it was charged with administering. // Read more on Education Week.

Plunging Into the Black Box Society

Frank Pasquale’s The Black Box Society has been steadily moving up my reading list since it came out, but after Monday’s morning-long workshop on the topic of impenetrable algorithms, the book looks to be this weekend’s reading project. Professor Pasquale has been making the rounds for a while now, but when his presentation was combined with devastating real-world examples of opaque credit scores harming consumers and of regulators ill-equipped to address these challenges, the U.S. PIRG Education Fund and the Center for Digital Democracy were largely successful in putting the algorithmic fear of God into me.

A few takeaways: first, my professional interest in privacy only occasionally intersects with credit reporting and the proliferation of credit scores, so it was alarming to learn that 25% of consumers have serious errors in their credit reports, errors large enough to impact their credit ratings. (PIRG famously concluded in 2004 that 79% of credit reports have significant errors.)

That’s appalling, particularly as credit scores are increasingly essential, as economic justice advocate Alexis Goldstein put it, “to avail yourself of basic opportunity.” Pasquale described the situation as a data collection architecture that is “defective by design.” Comparing the situation to malfunctioning toasters, he noted that basic consumer protection laws (and tort liability) would functionally prohibit toasters with a 20% chance of blowing up on toast delivery, but we’ve become far more cavalier when it comes to data-based products. More problematic are the byzantine procedures for contesting credit scores and resolving errors.

Or even realizing your report has errors in the first place. I have taken to using up one of my free annual credit reports every four months with a different major credit reporting bureau (staggering the three bureaus’ free annual reports across the year), and while this routine makes me feel like a responsible credit risk, I’m not sure what good I’m doing. It also strikes me as disheartening that the credit bureaus have turned around and made “free” credit reports into both a business segment and something of a joke — who can forget the FreeCreditReport.com “band”?

Second, the Fair Credit Reporting Act, the first “big data” law, came out of the event looking utterly broken. At one point, advocates described how individuals in New York City had lost out on job opportunities due to bad or missing credit reports — and had frequently never received the adverse action notices required by the FCRA. Peggy Twohig from the Consumer Financial Protection Bureau then discussed how her agency had expected most consumer reporting agencies to have compliance programs, with basic training and monitoring, and instead quickly found that many lacked adequate oversight or even the capacity to track consumer complaints.

And this is the law regulators frequently point to as strongly protective of consumers? Maybe some combination of spotty enforcement, lack of understanding, or data run amok is to blame for the problems discussed, but the FCRA is a forty-five-year-old law. I’m not sure ignorance and unfamiliarity are adequate explanations.

Jessica Rich, the Director of the FTC’s Bureau of Consumer Protection, conceded that there were “significant gaps” in existing law and, moreover, that in some respects consumers have limited ability to control information about them. This wasn’t news to me, but no one seemed to have any realistic notion of how to resolve the problem. A few ideas were bandied back and forth, including an interesting exchange about competitive self-regulation, but Pasquale’s larger argument seemed to be that many of these proposals were band-aids on a much larger problem.

The opacity of big data, he argued, allows firms to “magically arbitrage…or rather mathematically arbitrage around all the laws.” He lamented “big data boosters” who believe data will be able to predict everything. If that’s the case, he argued, it is no longer possible to sincerely support sectoral data privacy regulation where financial data is somehow separate from health data, from educational data, from consumer data. “If big data works the way they claim it works, that calls for a rethink of regulation.” Or a black box over our heads?
