Plunging Into the Black Box Society

Frank Pasquale’s The Black Box Society has been steadily moving up my reading list since it came out, but after Monday’s morning-long workshop on the topic of impenetrable algorithms, the book looks to be this weekend’s reading project. Professor Pasquale has been making the rounds for a while now, but when his presentation was combined with devastating real-world examples of how opaque credit scores are harming consumers and of regulators ill-equipped to address these challenges, U.S. PIRG Education Fund and the Center for Digital Democracy were largely successful in putting the algorithmic fear of God into me.

A few takeaways: first, my professional interest in privacy only occasionally intersects with credit reporting and the proliferation of credit scores, so it was alarming to learn that 25% of consumers have serious errors in their credit reports, errors large enough to impact their credit ratings. (PIRG famously concluded in 2004 that 79% of credit reports have significant errors.)

That’s appalling, particularly as credit scores are increasingly essential, as economic justice advocate Alexis Goldstein put it, “to avail yourself of basic opportunity.” Pasquale described the situation as a data collection architecture that is “defective by design.” Comparing the situation to malfunctioning toasters, he noted that basic consumer protection laws (and tort liability) would functionally prohibit toasters with a 20% chance of blowing up on toast-delivery, but we’ve become far more cavalier when it comes to data-based products. More problematic still are the byzantine procedures for contesting credit scores and resolving errors.

Or even realizing your report has errors. I have taken to using up one of my free annual credit reports every three months with a different major credit reporting bureau, and while this routine makes me feel like a responsible credit risk, I’m not sure what good I’m doing. It also strikes me as disheartening that the credit bureaus have turned around and made “free” credit reports into both a business segment and something of a joke — who can forget the FreeCreditReport.com “band”?

Second, the Fair Credit Reporting Act, the first “big data” law, came out of the event looking utterly broken. At one point, advocates were describing how individuals in New York City had lost out on job opportunities due to bad or missing credit reports — and had frequently never received adverse action notices as required by FCRA. Peggy Twohig from the Consumer Financial Protection Bureau then discussed how her agency expected most consumer reporting agencies to have compliance programs, with basic training and monitoring, and quickly found many lacked adequate oversight or capacity to track consumer complaints.

And this is the law regulators frequently point to as strongly protective of consumers? Maybe there’s some combination of spotty enforcement, lack of understanding, or data run amok to blame for the problems discussed, but the FCRA is a forty-five-year-old law. I’m not sure ignorance and unfamiliarity are adequate explanations.

Jessica Rich, the Director of the FTC’s Bureau of Consumer Protection, conceded that there were “significant gaps” in existing law and, moreover, that in some respects consumers have limited ability to control information about them. This wasn’t news to me, but no one seemed to have any realistic notion of how to resolve the problem. There were a few ideas bandied back and forth, including an interesting exchange about competitive self-regulation, but Pasquale’s larger argument seemed to be that many of these proposals were band-aids on a much larger problem.

The opacity of big data, he argued, allows firms to “magically arbitrage…or rather mathematically arbitrage around all the laws.” He lamented “big data boosters” who believe data will be able to predict everything. If that’s the case, he argued, it is no longer possible to sincerely support sectoral data privacy regulation where financial data is somehow separate from health data, from educational data, from consumer data. “If big data works the way they claim it works, that calls for a rethink of regulation.” Or a black box over our heads?

Hate the Consumer Privacy Bill of Rights, but Love the Privacy Review Boards

Considering the criticism on all sides, it’s not a bold prediction to suggest the White House’s Consumer Privacy Bill of Rights is unlikely to go far in the current Congress. Yet while actual legislation may not be in the cards, the ideas raised by the proposed bill will impact the privacy debate. One of the bill’s biggest ideas is the creation of a new governance institution, the Privacy Review Board.

The bill envisions that Privacy Review Boards will provide a safety valve for innovative uses of information that strain existing privacy protections but could provide big benefits. In particular, when notice and choice are impractical and data analysis would be “not reasonable in light of context,” Privacy Review Boards could still permit data uses when “the likely benefits of the analysis outweigh the likely privacy risks.” This approach provides a middle-ground between calls for permissionless innovation, on one hand, and blanket prohibitions on innovative uses of information on the other.

Instead, Privacy Review Boards embrace the idea that ongoing review processes, whether external or internal, are important and are a better way to address amorphous benefits and privacy risks. Whatever they ultimately look like, these boards can begin the challenging task of specifically confronting the ethical qualms being raised by the benefits of “big data” and the Internet of Things.

This isn’t a novel idea. After all, the creation of formal review panels was one of the primary responses to ethical concerns with biomedical research. Institutional review boards, or IRBs, have now existed as a fundamental part of the human research approval process for decades. IRBs are not without their flaws. They can become overburdened and bureaucratic, and the larger ethical questions can be replaced by a rigid process of checking-off boxes and filling out paperwork. Yet IRBs have become an important mechanism by which society has come to trust researchers.

At their foundation, IRBs reflect an effort to infuse research with several overarching ethical principles identified in the Belmont Report, which serves as a foundational document in ethical research. The report’s principles of respect for persons, beneficence, and justice embody the ideas that researchers (1) should respect individual autonomy, (2) maximize benefits to the research project while minimizing risks to research subjects, and (3) ensure that costs and benefits of research are distributed fairly and equitably.

Formalizing a process of considering these principles, warts and all, went a long way toward alleviating fears that medical researchers lacked rules. Privacy Review Boards could do the same today for consumer data in the digital space. Consumers feel like they lack control over their own information, and they want reassurances that their personal data is only being used in ways that ultimately benefit them. Moreover, calls to develop these sorts of mechanisms in the consumer space are also not new. In response to privacy headaches, companies like Facebook and Google have already instituted review panels that are designed to reflect different viewpoints and encourage careful consideration.

Establishing the exact requirements for Privacy Review Boards will demand flexibility. The White House’s proposal offers a litany of different factors to consider. Specifically, Privacy Review Boards will need to have a degree of independence and also possess subject-matter expertise. They will need to take the sizes, experiences, and resources of a given company into account. Perhaps most challenging, Privacy Review Boards will need to balance transparency and confidentiality. Controversially, the proposed bill places the Federal Trade Commission in the role of arbiter of a board’s validity. While it would be interesting to imagine how the FTC could approach such a task, the larger project of having more ethical conversations about innovative data use is worth pursuing, and perhaps the principles put forward in the Belmont Report can provide a good foundation once more.

The principles in the Belmont Report already reflect ideas that exist in debates surrounding privacy. For example, the notion of respect for persons echoes privacy law’s emphasis on fair notice and informed choice. Beneficence stresses the need to maximize benefits and minimize harms, much like the FTC’s test for unfair business practices, and justice raises questions about the equity of data use and considerations about unfair or illegal disparate impacts. If the Consumer Privacy Bill of Rights accomplishes nothing else, it will have reaffirmed the importance of a considered review process. Privacy Review Boards might not have all the answers – but they are in a position to monitor data uses for problems, promote trust, and ultimately, better protect privacy.

Cyberlaw and Business Report: Discussing the Internet of Things


On Wednesday, February 4th, I joined Bennet Kelley on the Cyberlaw and Business Report to discuss the FTC’s recent staff report on the Internet of Things. We discussed some of the privacy implications of new wearables and smart technologies, as well as how traditional Fair Information Practices are under strain.

Overcoming the Privacy Management Buzzkill: Make Privacy Fun?

Over the past two years, protecting our personal privacy has come to feel like a daily struggle and an oppressive burden. With the holiday shopping season in full swing, get ready for more stories about major data security breaches at retailers, our devices being detected through every shopping aisle and checkout line, and the usual array of Grinch-like hackers and spammers looking to steal our most intimate photos.

Bah! Humbug!

So it should come as no surprise that average Americans feel completely insecure when it comes to protecting their privacy. A recent Pew survey revealed 91% of Americans now believe they have lost control over how companies gather and use their personal information.

This anxiety should provide an opportunity for companies to win with consumers simply by providing them with more control. Fully 61% of those surveyed by Pew expressed a strong desire “to do more” to protect their privacy online.

Still, “user empowerment” and “privacy management” have been at the heart of efforts to improve privacy for years, and the 2012 White House Consumer Privacy Bill of Rights repeatedly stresses the importance of providing individuals with controls over the collection and use of data. Many companies already offer consumers an array of meaningful controls, but one would be hard-pressed to convince the average consumer of that. Further, the proliferation of digital opt-out mechanisms has done little to actually give consumers any feeling of control.

The problem is few of these tools actually help individuals engage with their information in a practical, transparent way. Instead, privacy becomes an overwhelming chore, and something that takes too much time and energy for the average person to process.

What we need are tools that make privacy fun.

Nico Sell, CEO of Wickr, an app that provides encrypted self-destructing messages, argues that privacy has developed something of a dour image problem. Instead, she suggests online privacy tools need to be marketed more like snowboarding – as something cool. While it is easy to dismiss her suggestion as the sales-pitch of a Silicon Valley entrepreneur, privacy could benefit from being less like paperwork, particularly if the goal is to alleviate consumer insecurities.

Conversations abound about how privacy should be “baked into” consumer products and services, or how to offer features to control personal data. Most public privacy debates focus on what sort of check-boxes should be clicked by default. While understandable, these debates sidestep simple usability issues. We need to do a better job of embracing creative, outside-the-box ways to get consumers thinking about how their data can be used, secured, and kept private online. Advances in web design and, more recently, app development have made everything from tracking personal finances to reading the text-heavy Harvard Law Review more enjoyable. There’s no reason design and functionality can’t also be used to make privacy more engaging.

Even small tweaks can go a long way. Facebook, for example, recently featured a blue privacy dinosaur to help its users with a “privacy check-up.” More than 86% of Facebook users who saw the tool completed the entire privacy check-up, and Facebook suggested that the dinosaur “helped make the experience a little more approachable and a little more engaging.” Presenting users with a privacy check-up is easier than asking them to wade through a myriad of privacy settings of their own volition. Putting these simple tools right in front of users’ eyeballs makes privacy not only more approachable but more salient.

How privacy tools are marketed and presented to consumers is important. Mozilla recently announced a “Forget” button right on Firefox’s toolbar, allowing users to wipe clean portions of their browsing history with two clicks. Clearing history is hardly a new feature, but it’s not something anyone thinks to do regularly. Placing privacy tools front and center can change that equation, and the “Forget” button is a much more user-friendly concept than a privacy tutorial that asks someone to plow through menus and preference panels.

Different companies have different business models and incentives to stress privacy, but everyone should agree that longstanding, widespread public anxiety – even apathy – about privacy is something that needs to be addressed. At some point, someone will find a way to marry privacy and simplicity in a cool, fun, and more importantly, widely embraced experience. A host of start-ups are working to answer that challenge, and the rise of ephemeral messaging apps like Snapchat is, if not a perfect implementation, a sure sign that consumers will flock to tools that give them privacy peace of mind.

Big tech players may be in the best position to help privacy go mainstream, which is why it’s a positive step when a company like Apple can make privacy features a centerpiece of its rollout of iOS 8. Apple has always excelled at getting people engaged with its products, and at the very least, privacy needs a marketing makeover.

After all, when it comes to privacy, getting consumers engaged is half the battle. Making fun privacy tools shouldn’t be that hard. The challenge will be to make them more widespread.

Big Data: Catalyst for a Privacy Conversation

This week, the Indiana Law Review released my short article on privacy and big data that I prepared after the journal’s spring symposium. Law and policy appear on the verge of redefining how they understand privacy, and data collectors and privacy advocates alike are trying to chart a path forward. The article discusses the rise of big data and the role of privacy in both the Fourth Amendment and consumer contexts. It explores how the dominant conceptions of privacy as secrecy and as control are increasingly untenable, leading to calls to focus on data use or to respect the context of collection. I briefly argue that the future of privacy will have to be built upon a foundation of trust—between individuals and the technologies that will be watching and listening. I was especially thrilled to see the article highlighted by The New York Times’ Technology Section Scuttlebot.

No Privacy/No Control

This week, the Pew Research Center released a new report detailing Americans’ attitudes about their privacy. I wrote up a few thoughts, but my big takeaway is that Americans both want and need more control over their personal information. Of course, the challenge is helping users engage with their privacy, i.e., making privacy “fun,” which anyone will tell you is easier said than done. Then again, considering we’ve found ways to make everything from budgeting to health tracking “fun,” I’m unsure what’s stopping industry from finding some way to do it. // More on the Future of Privacy Forum blog.

Big Data Conversations Need a Big Data Definition

As part of my day job, I recently recapped the Federal Trade Commission’s workshop on “Big Data” and discrimination. My two key takeaways were, first, that regulators and the advocacy community want more “transparency” into how industry is using big data, particularly in positive ways, and second, that there is a pressing need for industry to take affirmative steps to implement governance systems and stronger “institutional review board”-type mechanisms to overcome the transparency hurdles that big data’s opacity presents.

But if I’m being candid, I think we really need to start narrowing our definitions of big data. Big data has become a term that gets attached to a wide array of different technologies and tools that really ought to be addressed separately. We just don’t have a standard definition. The Berkeley School of Information recently asked forty different thought leaders what they thought of big data, and basically got forty different definitions. While there’s a common understanding of big data as more volume, more variety, and greater velocity, I’m not sure any of these terms provides a foundation for talking about practices or rules, let alone ethics.

At the FTC’s workshop, big data was discussed in the context of machine learning and data mining, the activities of data brokers and scoring profiles, wearable technologies, and the greater Internet of Things. No one ever set ground rules as to what “Big Data” meant as a tool for inclusion or exclusion. At one point, a member of the civil rights community was focused on big data largely as the volume of communications being produced by social media, while another panelist was discussing consumer loyalty cards. Maybe there’s some overlap, but the risks and rewards can be very different.
