
Two Points and a Lesson from PrivacyCon, FTC's Digital-Privacy Conference

Last week, on January 14, 2016, the Federal Trade Commission (FTC) convened PrivacyCon, a first-of-its-kind conference bringing together policymakers, academics, and technology researchers to discuss the challenges surrounding online privacy as we navigate among a fixed-internet world, a mobile one, and the growing “internet of things.” I was in the audience, and I came away with two major points and a lesson for white collar defense lawyers and their clients.

Online privacy is important to the FTC

First, the FTC is now the top federal cybersecurity enforcer, and it has turned its sights on data privacy in a serious way. (Data privacy is distinguished from its close cousin, data security, because privacy deals with the collection and intentional use of web users’ personal information, while security is about protecting such information from malicious third parties.) The simple fact that the FTC organized PrivacyCon – a full day of presentations of nearly twenty academic papers followed by panel discussions involving the authors, FTC personnel, and other thought leaders – demonstrates this. Additionally, certain recent FTC hires, who were given the stage at the conference, shed light on this redoubled focus. Two of those new hires – Justin Brookman, policy director of the newly created FTC Office of Technology Research and Investigation (“OTech”), and Lorrie Cranor, the FTC’s Chief Technologist – bookended the day, with Brookman moderating the first panel and Cranor providing brief closing remarks.

Brookman, a prominent online privacy advocate, came to the FTC in August 2015 from the Center for Democracy and Technology, where he served as director of the Consumer Privacy Project. Just months before his transition, Brookman wrote a blog post criticizing two then-current FTC commissioners for their stance that the FTC should step in to punish privacy missteps only when consumer harm can be demonstrated. (One of the two commissioners, Joshua Wright, stepped down days before Brookman announced his move, and has since been critical of the Commission’s privacy work.)

For Cranor, PrivacyCon marked only her fourth day at the FTC. She is on leave from Carnegie Mellon, where she has been director of the Usable Privacy and Security Laboratory and co-director of the school’s privacy engineering master’s program; she has also served on the board of directors of the digital advocacy group Electronic Frontier Foundation, which counts digital privacy among its core focuses, and on the Future of Privacy Forum advisory board.

PrivacyCon, and the embrace of online privacy advocates like Brookman and Cranor, come shortly after the FTC dipped its toe into the privacy waters: in April 2015 it settled with retail tracking firm Nomi Technologies over charges that the firm did not fulfill promises to consumers regarding opting out of tracking technology, and a year earlier it settled with Snapchat in a matter involving allegedly inaccurate statements about the impermanent nature of messages sent on the platform. In November of last year, the Associate Director of the FTC’s Division of Privacy and Identity Protection told an industry audience that companies “need to be careful” about honoring consumers’ requests to opt out of data collection and tracking programs, adding, “And if they are unclear, or deceptive, [about] creating the opt out, or communicating the opt out in a way that conflicts with consumers’ understanding, there may be room for a Section 5 deception action in that case.” All of these signs point to increased FTC focus on online privacy, and increased potential for enforcement in this area.

FTC wants consumers to have more control over their data

Second, the FTC’s focus appears to center on giving consumers increased control over what companies do with their information, and on aligning consumers’ expectations and hopes in that regard with companies’ actual practices. Much of the research presented at PrivacyCon dealt with one of two issues: (1) consumers’ understanding of what is being done with their data, or (2) what data collection and tracking is in fact taking place.

Neither question is necessarily straightforward. As to the first, one theme articulated in various ways throughout the day was that, while consumers appear happy to hand over data about themselves as they navigate the digital world, the reality is that many are unhappy about it but have become “resigned” or “habituated” to the perception that data collection and tracking are taking place. Believing that they have no real control over what is done with their data, they allow collection and tracking to happen despite misgivings. Commissioner Julie Brill, reacting to these findings in mid-day remarks, said that “individuals have to be in the loop regarding decisions about what data is collected about them and how it is used.”

As to the second question, the audience was told repeatedly that online data collection and use is in some ways a black box into which the public – including policymakers and academics – cannot see. At least four of the papers presented at PrivacyCon showcased painstaking research aimed at understanding what data online applications collect, and what they do with it. Indeed, several commentators noted that, even if the world were allowed to look behind the curtain, the algorithms are often so complex that their outcomes can be unexpected and not predictable based purely on viewing the code. (As Cranor noted, “[I]n order to have algorithmic transparency it’s not enough to just know what the algorithms are, because that doesn’t really tell us very much. What we need is systems that help interpret the results of the algorithms and show us the impact of those algorithms.”) Finally, and possibly most troublingly for companies, one speaker told the audience that even companies themselves frequently do not understand the data collection and tracking techniques they are using: according to Chris Hoofnagle of the University of California, Berkeley, “What my team has found over and over, when we discover things like HTML5 or flash cookie respawning [technologies that can be used to frustrate users’ attempts to avoid tracking], we go to the companies and say ‘we think you’re doing this,’ and they say ‘we are not doing it’ – and they don’t actually know what they’re doing.”

Lesson: Know what your online applications are doing with users’ data, and strive to make it consistent with consumers’ expectations

With the above in mind, companies should take care to ensure that the data collection, tracking, and use practices of their websites, mobile apps, and connected devices are consistent with what users are being told.

There was palpable frustration at PrivacyCon over the failure of online privacy policies to give users effective notice of what was being done with their data; this was a theme of at least five presentations. We can expect the FTC to work behind the scenes to develop guidance for more effectively notifying users of their options with respect to their data, and to put users in the driver’s seat in this area. And when that guidance is published, companies would be well served to make every effort to follow it: we have observed, in the cybersecurity realm, that failures to follow FTC guidance can lead to enforcement actions.

The FTC has tested its privacy enforcement powers with the Snapchat and Nomi Technologies cases, instances in which companies allegedly promised to treat users’ personal data in ways that were inconsistent with the companies’ actual practices. The FTC is likely to continue aggressively pursuing enforcement actions wherever it finds gaps between companies’ actual data collection practices and the promises they make to users. More than once, PrivacyCon presenters suggested that their research had uncovered instances of companies using consumers’ data in ways the companies had promised not to – something that, as noted above, the FTC believes “may [create] room for a Section 5 deception action.” Now is the time to get on top of this.

Disclaimer: This post does not offer specific legal advice, nor does it create an attorney-client relationship. You should not reach any legal conclusions based on the information contained in this post without first seeking the advice of counsel.

About the Author

Abraham J. Rein is a Principal in the Firm's Internal Investigations & White Collar Defense Group, Co-Chair of its Information Privacy & Security Group, and a member of the Firm's Diversity and Inclusion Committee. He focuses particularly on the intersection of technology and the law, advising clients on legal aspects of data security, social media compliance, electronic discovery, the application of certain constitutional rights in a digital era, and related topics.
