Experts provide an overview of the debate in the United States over online privacy, and offer their views on what the United States should do to overcome disagreements between the government and private sector about privacy rights. The panel will identify what the next administration could do to improve online privacy in the United States and repair relations between Silicon Valley and Washington, DC over encryption policy.
Michael McCaul, Chairman, House Committee on Homeland Security (R-TX)
Amie Stepanovich, U.S. Policy Manager and Global Policy Counsel, Access Now
Christopher Wolf, Partner, Hogan Lovells
Adam Segal, Ira A. Lipman Chair in Emerging Technologies and National Security and Director of the Digital and Cyberspace Policy Program, Council on Foreign Relations
SEGAL: Good morning, everyone. As the panel is situating itself, I’m just going to welcome everybody. I am Adam Segal. I’m the Ira A. Lipman chair in Emerging Technologies and National Security, and I direct the Council’s Digital and Cyberspace Policy Program.
Today’s event is the third in our annual symposium on digital and cyber issues. The first one was on internet governance. The second was on security. And now we are touching on privacy and data in the age of surveillance.
It has been an eventful eight years on these topics: the debate over encryption, the issue of data localization and access to data, a kind of growing data populism or data nationalism that we’ve seen around the world, and issues about how to promote consumer privacy as well as access to data from individuals, companies, and countries.
I suspect that the next four years will be as eventful, if not more. And so I’m really looking forward to the discussions today on the domestic situation, what’s happening in Europe. And then I think we’re going to end on a slightly perhaps more optimistic tone about some of the advantages and progress we can make on technology in big data. And there may be some technology solutions to some of these privacy issues that we’re debating.
Please all of you check in with our blog, Net Politics, where we cover a lot of these issues. We have a series of cyber briefs that come out that have dealt with privacy and security and internet governance. Please read those. And please reach out to me about your concerns or interests in the program.
Before I ask the panelists to come up, I just want to thank Alex Grigsby, who works with me in the program, for helping to organize, and in the D.C. office, Dexter Ndengabaganizi and Sam Dunderdale, who helped arrange everything and made sure that we all got here smoothly.
So we’re waiting. One panelist is probably going to be rushing on stage, but we’re going to start off anyways because she’s stuck in traffic. So why don’t you guys come up and we’ll kick it off?
MODERATOR: All right, is my mic live? OK. Welcome to the Council on Foreign Relations session on the State of Online Privacy in the United States. As Adam said, we have one of our panelists who will come in midstream. So let me introduce her phantom presence, which will assume corporeal form presently. That’s Joan O’Hara, who’s general counsel of the House of Representatives Committee on Homeland Security.
To her left in current corporeal form is Jules Polonetsky, who’s the CEO of the Future of Privacy Forum. And to his left and my immediate right is Amie Stepanovich, the U.S. policy manager and global policy counsel for Access Now.
And so I want to start off just with a broad question for the two of you. How is the state of privacy online in the United States different on January 21st than on January 20th? What’s the consequence to this discussion of a transition of power between President Obama and President-elect Trump? Amie, do you want to get us started?
STEPANOVICH: Sure. And I think one of the things that is really important to remember is that there are a lot of things that don’t change between those two dates. Section 702 still sunsets at the end of next year, no matter what. So we still are going to have this big discussion about national-security surveillance.
Around the same time, Privacy Shield is still going to be up for discussion about whether or not that continues. We still have two lawsuits ongoing in the EU, though I believe the way the courts and the EU look at privacy in the United States under a different administration will probably be fairly drastically different from how they would have looked at it under an Obama administration.
We still have this rising tide of Internet of Things devices that haven’t been secured yet—I don’t know exactly if you want to say an inability to properly secure them or just that the manufacturers haven’t really done it yet. And that’s a huge weapon. We’ve seen that—not necessarily weaponized, if you want to call it that—but we’ve seen it used in really significant ways to shut down speech and to shut down internet service in meaningful areas. And so I think that is something that’s going to have to be addressed no matter what.
How is that addressed? That is an interesting question, because we’re not really sure right now what expertise we’re going to see in the proper agencies and departments in the government to be able to move forward on these questions. There’s a hearing today on the Internet of Things in the Energy and Commerce Committee. And I believe Rose Schooler is testifying and is going to call for regulation. What does regulation look like in order to require security, updateability, privacy protections? Where does the conversation on privacy legislation in the United States go? Probably not very far, which might again impact the Privacy Shield negotiations.
The Section 702 fight, it’s coming. That’s not changing. But it looks really different in that we knew the Obama administration was willing to engage on that. We knew that they had talked a lot about U.S. surveillance, had talked a lot about what needs to be done and how you protect privacy. They had taken this unprecedented step of passing a policy directive that actually recognized that people outside of the United States have privacy interests. They didn’t call them privacy rights, but they said there were privacy interests there.
That was huge, and I don’t think we should underestimate it. And we don’t know if those movements are going to continue under a Trump administration. I think it looks a lot more complicated.
POLONETSKY: Well, I’ll add this. And I focus primarily on the consumer side of the world, but obviously there’s this great intersection with the free flow of data for companies that are doing business globally. And obviously the Privacy Shield agreement was a tenuous one and ended up being very dependent on letters and commitments made by the U.S. government, which some European critics said, well, that’s a letter. And we said, no, this is a formal representation. They said, well, but what if the administration changes? And we said, no, these things are taken seriously when there are commitments that are made. The Privacy Shield gets a review in a year. And if those commitments are seen as, you know, wiped away or very tenuous, we’ll have a real challenge.
Now, some have argued, well, we’ll have forceful foreign policy, especially when it comes to trade, things like this. But at the end of the day, it’s the privacy authorities who have incredible independence, have always had incredible independence in Europe, and now, because of the Schrems decision, have, you know, almost unfettered independence in terms of saying, no, we don’t think this meets the parameters of the court. And that ends up, I think, being challenging not only for global data flows, but for consumers or employees who need data to move easily—to be paid, for example.
On the consumer side of the world, I’d say it’s going to be an interesting play. And I wonder whether some of the trends we have seen in consumer privacy are likely to continue. And here’s what I mean by that. I was a chief privacy officer perhaps 16 years ago. And when I started out, it was all about cookies and tick boxes and email marketing—technical nuts and bolts of how to handle, you know, data in a respectful way.
And clearly, in the last couple of years, the dynamic that has been significant is we’re talking about algorithmic discrimination and fairness and whether we are treating different classes of people differently. The civil rights community—obviously always engaged on government surveillance and concerns such as that—has become very focused on whether data is the next civil right.
One of the foundation funders I was talking to the other day said to me we helped support and fund the first civil rights revolution, and now we’re supporting the next, which is how data can be used to deny opportunities and benefits, whether it’s a marketing thing or whether it’s, you know, a necessary opportunity. And so we need to have an empowered civil rights community. And you’ve seen, you know, media story after media story of, hey, is this algorithm treating women differently, offering them different jobs? Is this algorithm being racist? And so on and so forth.
We may see a different dynamic in a Trump administration. But the Sanders and Warren and perhaps Brexit and Trump voters—those who are concerned about big entities, whether it’s government or big companies or Silicon Valley tech folks, having data and tech that rigs the system, that mistreats people, that doesn’t give them a fair shot—that notion of fairness, I think, is likely to continue, though it may have a little bit of a different—no pun intended—complexion about it.
And, look, even on the Hillary Clinton side of the campaign in the tech policy world, if you read closely the tech and innovation agenda, it had privacy. But every time privacy was in there, we had fairness. And so that notion of let’s focus on whether we have transparency or we have accountability for algorithms and whether people are getting, you know, advantages because of data, as opposed to sort of one-sided limitations, I think may continue to be a very central focus. We’re certainly betting on that, and I think we’re likely to see that from both constituencies on either side of the political equation.
MODERATOR: OK, so you guys have put a huge amount—a huge range of stuff on the table here. And I want to try to pick different threads of it apart.
Amie, I actually just wrote a transition memo for—a Brookings transition memo on the 702 question, which I haven’t published yet. So I’m particularly interested in your thoughts on this. It seems to me that a huge amount of the 702 discussion is predicated on the idea, A, that the intelligence community, in fact, behaves responsibly; B, that despite all the Snowden revelations, there’s really no suggestion of serious abuse of 702. There are all these accountability mechanisms. And one of those major accountability mechanisms is the serious Justice Department oversight of NSA and the FBI.
We now have a president-elect who has threatened to remove the FBI director for purely political reasons, who has talked quite publicly about surveillance uses that most people would regard as abusive, and who surrounded himself with people who have a certain degree of open hostility to, you know, minority groups and religious minorities domestically.
And so my question is to what extent, A, does that shape congressional willingness to reauthorize a program of this power and scope, and one that I should add that I’ve always supported very energetically and believe is a really important national-security tool? And the second question is, to what extent should it inflect a reauthorization discussion of that program?
STEPANOVICH: Sure. I mean, I think one of the big things that you’ve touched on is, I mean, both the FISA court and many of its opinions that we have gotten to see and the Privacy and Civil Liberties Oversight Board really delved into the administration’s policies, procedures, processes for how they deal with data in the 702 program on the back end, outside of the actual language and the four corners of the law.
And a lot of those procedures were critical to them improving the programs, allowing them to move forward without necessarily calling for significant reforms—it’s all in these documents. Some of them we’ve gotten to see. Some of them have been declassified. Others have not been declassified and we’ve still gotten to see them. And still others we have no idea what’s in them whatsoever.
The idea that those things are not codified, I think, raises new issues. 702 and national-security surveillance have all these checks and balances that we’ve built in, from putting the FISA court in place—recognizing in the 1970s that we need some sort of court overseeing national-security surveillance—but it requires secrecy in order to maintain the confidentiality of sensitive national-security programs. So we have this secret court.
We have these secret committees, the intelligence committees in the Senate and the House. They primarily hold very secretive, closed, classified briefings that the public does not have an eye into. The operation of these programs within the NSA, and within the FBI or the CIA, which have access to and a role to play in this program, is very closed off. It’s a lot of non-public accountability. They talk about accountability, but it’s secret accountability from these secret agencies to one another.
I think that that is going to be a structure that needs to shift. We either need to start codifying some of these protections that are in the minimization and targeting procedures, or the standards that the FISA court has set, into the law itself, to ensure that these are not things that can be swept away by future administrations, immediate or further into the future.
I think that also the Privacy Shield conversation is really, really central to this. It’s not only how people in the United States feel about these authorities that is going to dictate what their future looks like. If Privacy Shield is struck down—and not only are they considering it, but there are two active lawsuits challenging it at the moment—if that goes away, that’s a huge economic blow. That’s a huge problem for companies, both in the U.S. and the EU. It’s probably going to harm user rights, human rights, for people in both places, because data portability helps users. And that’s going to be a big problem.
And so if there are going to need to be more protections, not only for people in the U.S. but for people overseas, to preserve the ability for data to flow between those two regions, the U.S. and the EU, I think that’s going to be something that we have to look at, because the EU is going to be significantly skeptical of how these authorities are being used.
MODERATOR: So let me ask the sort of same question to you, Jules, in the context of sort of these Privacy Shield questions. You mentioned earlier that a—you know, that a lot rides on the sort of assurances that the administration has given as to, you know, what we will and will not do. And so my question is to what extent does the—I’m trying to think of a delicate way to say this—the instability of the words of the president-elect across a wide range of areas inflect European willingness? Or do you expect it to inflect European willingness to take the assurances from this administration seriously, carrying into the future, but also to accept assurances from Commerce and State in the next administration as stable reflections of U.S. policy?
POLONETSKY: Well, let me maybe state it like this, because who knows whether and how the statements and activities will play out? I will say this. We are already in a fairly deep hole with regard to the rest of the world’s appreciation and understanding of the structure and the controls and the systems we have in place, and—
MODERATOR: And actually, you know, for those people—for those members here who are not conversant in the European—the sort of transatlantic privacy divide—unpack that statement a little bit. What is the hole that we’re in?
POLONETSKY: You know, we don’t understand each other really well. And let’s take our share of the guilt as well. You know, we show up and we maybe don’t appreciate that people wake up every day—and, you know, maybe just substitute any foreign government’s name for your email service, right? So let’s take—I don’t want to take China, but let’s take—I don’t know—let’s take France.
Let’s say you woke up every day and you logged onto French mail and you turned on your French PC and your French operating system and you went to French Book and you, you know, tweeted on French Twitter, right? Like, just realize the world they live in because of the success of these products—and they love the products, right; they’re using them. And then you searched on your, you know, French search engine, right.
We have this sort of love-hate relationship with the day-to-day technologies we use, right. And then later on, the fact that there is very little understanding of the system of checks and balances that we have today, the fact that we don’t have perhaps comprehensive privacy legislation is inconceivable to some folks, right.
Imagine today we said we don’t need laws that protect against discrimination in terms of race or religion. You know, the free market—innovation—will work it out. Companies will do the right thing, because that’s the way they’ll get the right employees, right. We kind of recognize that in some significant way there are certain rights that need to be protected.
And when the Europeans look at us, they think that we are in this wild, unprotected world, and they therefore don’t appreciate the reality. Now, the reality is, right, a giant chunk of our economy is regulated—almost everything that’s sensitive—whether or not we need to cover, you know, more areas. And we have an FTC, and it’s got a significant amount of independence, right. And we’ve got this system of attorneys general that isn’t so visible to them. And, you know, the studies that have looked at how companies actually comply with the nuts and bolts of privacy law show that the U.S. actually does very well. There’s uncertainty, for better or worse—some AG will bring an action against you, you’d better, you know, give some (money ?), there could be a class-action lawsuit. And so, at the end of the day, our system, as imperfect as it is, actually delivers—let alone their understanding of our surveillance. And when we debate them, they live in many countries in a court system where the judge also has an investigative role, right. The role of our judiciary as having this sort of independence isn’t always fully appreciated.
So we’ve argued that, frankly, even under an Obama administration, you’d show up and there’d be 30 privacy commissioners and European Commission officials and, you know, ministers of the interior with privacy responsibility. And then we’d have, you know, somebody from Commerce and maybe somebody from State, and then they couldn’t talk for the law-enforcement community.
So we’ve got this incredible clash where even the structures we have in place were completely, you know, unappreciated. And what we need, whoever the next administration is—I guess we know who it is—but whatever the structure looks like, we need to figure out how to have—I don’t want to call it a privacy czar, but today we don’t have the ability, even when we have the desire, to talk to Europe or the rest of the world in a way that makes it clear that the U.S. has strategic interests and strategic protections.
And so here we are, sort of begging for the Privacy Shield, which primarily is used for U.S. companies to pay their European employees. When you look at, you know, the overall numbers, a significant number of the companies in there are in there for employee data, right. The goal was to give stock options and bonuses and pay your European employees by having their data come back and forth, right? Most of the U.S. companies that do business in Europe are already subject to European law. Yes, they want to be able to move data back and forth, but the regulators have jurisdiction over them. These edge cases, which are significant—like, don’t get me wrong; everybody was upset about the Privacy Shield because of the chaos—but the kinds of chaos it would have created, you know, were the kinds of practical things that certainly would have hurt European employees and European consumers as much as anything.
So we need to figure out how—whether it’s Trump or, frankly, for the future—to have an empowered voice. We’ve luckily had Cam Kerry. He was interested. He took a leadership role. We had Danny Weitzner as a deputy CTO. It’s been very personality-dependent. Do we have somebody there who kind of cares enough? Julie Brill, you know, praise her soul, decided, look, I really want to take my role at the FTC as helping represent and, you know, advocate for the U.S. system. But, you know, if she wasn’t there, would somebody else have stepped up?
So we need to figure out how to formalize that role and have, whether it’s at the White House or in some appropriate interagency way, somebody who can bring together law-enforcement questions as well, because that’s—you know, that’s 90 percent of the debate. And that was hard. Until the president called Angela Merkel and so forth, you know, to kind of get this deal closed, who knows where it would have ended? So hopefully that sets the stage a little bit.
MODERATOR: Thank you.
So, Joan, I want to tack back to the question that we started off, because I think you probably have a bit of a different perspective than the other panelists. Maybe you won’t. How is the general state of online privacy in the United States different on January 21st than it is on January 20th? What does the change in administration likely mean, in your judgment, for the constellation of issues, both on the law-enforcement and intelligence and also on the consumer side, that are encompassed by this panel?
O’HARA: It’ll be a total disaster, really. Brace yourselves. No, I’m kidding. (Laughs.)
Quite frankly, I don’t see there being a big difference on January 21st. We have yet, I think, to really hear what Donald Trump and his administration think in terms of privacy protections and relationship between, like, the U.S. and the EU, for example, and how he wants to move forward on all of that. They haven’t, I don’t think, fleshed out those plans to the extent that perhaps the Clinton camp had.
I am hopeful that they will be in listening mode when it comes to listening to representatives from Congress. I know that they are reaching out to various committees to get their input on questions like that, where we are in terms of online privacy, one of those questions, and hopefully to the private sector as well.
Again, I can’t speak for the Trump administration, but I’m confident that they are in receiving mode and trying to really learn what the state of play is, what the various political interests are, and where we need to go over the next four years. So I am cautiously optimistic that things will move in the right direction. I do think this is an area that’s very much in flux.
Even at the most basic level, the question of what is the expectation of privacy in today’s digital age, what does that mean, I think that’s still being shaped. So I think there are some basic values that would always be in place. But putting those values in the context of the digital reality and the tools that we all use as a very intimate part of our everyday lives—that is still being developed. So I think this administration will play a role in shaping those expectations and those realities. But as I said, I’m cautiously optimistic that they will be in receiving mode and getting input from as many stakeholders as they can.
MODERATOR: So I’m interested sort of in—you know, being in receiving mode is something that certainly all incoming administrations should be to some degree, particularly in the areas that are not what they campaigned on. But the way you formulated that makes it sound a little bit like a blank slate. And my impression is that there are certainly noises that the Trump campaign made, and Trump himself made, about things like aggressive surveillance, things like, you know, surveillance based on religious identification, as well as terms like extreme vetting, which certainly have implications of, you know, gathering and processing very large amounts of information in order to make judgments about individuals.
And so I guess my question is to what extent are we talking about being in receiving mode in the presence of a preexisting mood? And to what extent are we really talking about a blank slate? And furthermore, to what extent is, when we’re talking about unified Republican government, is there a mood on these issues that flows from the center of gravity of the Republican caucus in the House and Senate, as well as the president? And to what extent is this really just an area where we don’t know where we’re going yet?
O’HARA: I think you hit the nail on the head with saying sort of the Republican center of gravity. I don’t necessarily think that there is one on these issues. Within the Republican Party you see someone like Senator Burr, who has advocated for mandating back doors into encrypted devices. And on the other hand you see Senator Lee, who, on Rule 41, is looking to delay that rule from going forward.
So it’s very interesting, actually, when it comes to privacy, because you sort of have the full range of opinions within the Republican Party. So you have those who are very hawkish on national security and intelligence, and you have those who are very much in the civil-libertarian camp and want to see privacy protected, even if it potentially is at the expense of some security.
So I don’t feel like there is necessarily a unified Republican opinion on this issue. And Donald Trump and his administration are going to have to listen to everyone across that spectrum.
MODERATOR: So the—I’m going to ask you the same question I asked Amie. The first major necessary discussion, I think, that the administration is going to have to have is over reauthorization of 702, which, you know, expires at the end of 2017. And do you have any reason to think that the substantive positions of the incoming administration would be materially different in that regard from what we could have expected from the Obama administration, number one? And number two, do you have any reason to think that the same arguments might be received differently coming from a president who openly contemplates the removal of the FBI director, for example, for political reasons, openly contemplates, you know, spying on Muslim communities—that there may be a sort of different inflection to the way the 702 debate takes place that has very little to do with differences in the substantive positions that they may take?
O’HARA: I think open contemplation is probably a good way to describe it, a lot of what was sort of said on the campaign trail. And at the risk of sort of being speculative, you know, I wouldn’t be too surprised if some of those most extreme positions are walked back a little bit when other input is received from people on the Hill and from the private sector. So I wouldn’t expect that the conversation would be terribly different than it has been over the past couple of years. But from the little bit of interaction I’ve had with the Trump folks, I do—I really do believe that they’re going to be listening with an open mind and trying to take in various ways of looking at things.
MODERATOR: One last question, and then we’ll—and this is for all three of you—before we go to questions from members. In any of the areas that we’re thinking about with respect to online privacy, are there any areas where it’s just reasonable to expect or obvious to you that there will be change—you know, where you look at it and you say either the center of gravity of Republican opinion really just dissents from the things the Obama administration has done, or that Trump himself has clearly articulated a different sort of privacy balance and vision than that reflected by current policy? Or is it really just all up in the air? Let’s just start with Amie and go down. You know, what’s the one area that you look at, if any, and say, I know this is going to be different?
STEPANOVICH: So I think you hit the nail on the head, unfortunately, earlier where—between you and Joan—saying we’ve heard things that he has said that are going to majorly impact privacy, like surveillance of Muslim communities. But we haven’t heard a specific privacy position that we can delve into and say: This is going to be a big deal. And that’s really important, because we’re still reading tea leaves into what priorities this administration is going to take when it comes to the digital world.
I think one of the things—and Jules hinted at this. And it’s maybe more on the commercial side, but I think it’ll have interesting implications in the administration—is the idea of algorithms, and this concept of algorithmic transparency or, as some people say, algorithmic accountability that has been taking place over the last year. When you talk about that sort of surveillance of Muslim communities or trying to do this extreme vetting, inevitably algorithms are going to come into that. And they’re going to be analyzing large amounts of data. They’re going to be making decisions based on that data. What do those algorithms look like? And should we have some sort of view into what they can do?
And I can’t imagine a world where he moves in any direction forward on those statements that he has made—and while he has walked back some statements, he has still, I think, consistently maintained a direction toward these promises and statements he’s made. So we can expect that direction to continue. Algorithms are going to play a key role at the heart of that conversation. And it’s going to look really different when you talk about the trust factor of how people imagine the Obama administration used algorithms versus how the Trump administration and his intelligence community might seek to implement algorithms under current authorities.
POLONETSKY: So I’ll say this: I was the consumer affairs commissioner in the Giuliani administration. I was a Democrat, so I was always a little suspect. Was I going to be too regulatory in a Republican administration? But what I found was that on the areas that I knew the mayor cared about—quality of life in New York and crime and so forth—I knew what my agenda was and had to support it. Kids getting access to box cutters or spray paint for graffiti—I knew I had an issue.
On everything else, this was not on the mayor’s radar screen. It was not—you know, he had a hundred other things that he cared about. He didn’t care about it. And I got to drive my agenda, which I think was kind of a good, strong consumer enforcement agenda. And so I think you see that happening when you don’t have a clear vision, some strategy. We knew where the Obama administration was, right? Supporting data, wanting the right rules, you know, supporting all types of tech innovation. Every agency knew it was their job to, you know, help support that agenda.
I don’t think any agency today will likely be able to look at, you know, the Trump campaign and say, ah, here’s the agenda and how to take it—other than where there are really clear agendas. So I think the people who come into place are going to have much more room to operate and set an agenda than we’ve previously seen. So where do we maybe already see some shape taking place? Obviously at the FCC—both because of prior statements and because of, you know, people who are involved in the transition—you certainly see a likely very different orientation than under Wheeler.
But I’d argue we may end up being very surprised by the people who end up embracing a little bit of the mission of their agency because the career staff, you know, do a lot to set the agenda. And they may have more room to run than there is a clear sort of direction managed out of sort of a central drive of a particular agenda on these issues. I’d like to think these are the most important issues, but having been involved now in a couple presidential campaigns to kind of work on the privacy issues we’re always, like, great, when are you going to put this out? And, you know, it’s what does this do for middle class jobs in Ohio?
So it’s not, all right. You had the iPhone issue kind of pop, but most of these issues, despite, I think, the importance to us, are not the top five, seven, you know, 10 issues on the White House plate. So I think power devolves and the key people are going to be making a much broader range of decisions about a lot of the issues that we care about.
O’HARA: I would agree. I don’t think there’s necessarily a completely clear and fully developed agenda developed at this point. But I do think that they will probably, as I said, try and get up to speed quickly by taking input from various stakeholders, including members and Senators on the Hill. And as I said earlier also, even within the Republican Party, there’s a broad range of opinions on how surveillance and privacy and counterterrorism should be dealt with. And for whatever direction that is coming to us from the White House, Congress still has to act in a lot of these areas. So there will continue to be deliberation. There’s not going to be some sort of just heavy hand that comes down and says: This is the way we’re doing things. But that deliberative process will continue.
Obviously we’re dealing with a reality of, for example, home-grown violent extremism. This is a real problem. It’s something that we’re seeing increasing across the nation with self-radicalization and attacks on innocent people in unexpected places. That has to be dealt with. And it’s very difficult. It’s not the same paradigm of the U.S. looking at nation-state actors. Some of these people are Americans, born and raised right here in the United States, who for whatever reason are gravitating towards a particular ideology and taking things into their own hands.
It is something that we have a responsibility to try and deal with. I don’t think that we need to erode civil liberties and privacy in the process of doing that, but it is an issue that we’re going to have to look at in creative ways. And I’m confident that the administration will work with Congress to try to shape that. And it’s not going to be an easy task. I am sure that some of the battles that have been fought over the last four and even eight years will continue, because you’re going to have a lot of—notwithstanding a new administration—you’re going to have a lot of the same players and a lot of the same voices engaged in the conversation.
MODERATOR: So at this time I’m going to invite members to join the conversation with questions. So I want to—I’m admonished to remind you all that this session is on the record. So there is accountability—as Amie would say, public accountability for your remarks. When I call on you, please wait for the microphone, stand and identify yourself with your name and affiliation—more public accountability. And please limit yourself to one question. And if you could direct it to the person that you want an answer from, that will keep things moving and keep us—give as many people a chance to participate as possible.
Q: I’m Mark MacCarthy. I’m with Georgetown University and also with the Software and Information Industry Association. Great panel.
I wanted to go back to the point that both Jules and Amie raised on the development of privacy from sort of an older way of thinking about consent and so on to this new idea of fairness, especially algorithmic fairness. My trade association just put out an issue brief on the topic, which is on our website. And I urge all of you to go take a look. It’s a pretty balanced piece on the issue. But I wanted to raise the question of the way forward. You know, this is one area where surveillance does get involved. I think most of the issues are on the commercial side. There was a nice event at Georgetown earlier this year, which talked about the concerns of the minority and civil rights community about the use of algorithms for surveillance. And I think it’s a major issue, and an important issue.
But most of the issues that have been brought to public attention through books like Cathy O’Neil’s “Weapons of Math Destruction,” are on the commercial side, where the consequential decisions for people are made using these algorithms. And the question is: Are the algorithms fair, are they biased, do they give us the results we want? We think that some sort of responsible-use framework developed by industry is a good idea. What’s the way forward on this complex set of issues?
POLONETSKY: So I think this is real turning point moment for industry. You know, one of the challenging issues in privacy has always been data retention, right? Companies believe I need data because I may come up with valuable purposes for it. Don’t make me throw it out. And, you know, the Europeans and privacy advocates and folks who focus on fair information practices say, no, limited data, get rid of it, bad things will happen, right? And that’s always been a tension point between sort of the innovation and so forth.
I’m hearing folks across the country—in the school, K-12 sort of community, and certainly some in the corporate community—saying, do I want to have this data because, for the first time, I’m actually aware that maybe bad things might happen. Do I want—for school leaders saying, well, do we need immigration status? We had it maybe for some useful purpose, to help serve, you know, part of our student body. Do we want it? And so some of the issues that advocates have raised, which I think commercial folks, unless you are, you know, really the folks responding to surveillance requests, but the broader commercial community, I think, has sort of downplayed because, hey, we’re just doing great things with data, I think you’re seeing a new sensitivity.
Mark’s group’s paper calls for a disparate impact, you know, test, which, you know, I’d say a couple of years ago companies would say, no, no, that’s for credit. That’s housing. You know, that’s—we get that, you know, even if I don’t intend to discriminate I have to do extra work to see whether I’m ending up with an offer or a loan that is being extended in some way that has a disparate impact on different classes. But I don’t want to go further than that. And now you’re seeing companies sort of progressively saying, you know what? Maybe we do need to do more to figure out how to vet ourselves, whether it’s the PR risk or whether it’s the legal risk.
So I hate to say that it takes the fear of government overreach—and hopefully these things will all be backed off and so forth—but certainly the rhetoric and the claims and the hate crimes that people are seeing around the country I think are sensitizing industry in a way that I hope will end up being productive in the long run, because we’ll take some concerns and we’ll imbue them, and you’ll see much more careful work done to worry about unfair treatment, and you’ll see people worrying about retention, in a way that hopefully comes up with an appropriate balance that supports appropriate law enforcement and appropriate innovation, and so forth.
STEPANOVICH: So I’m a little more comfortable in the surveillance space than I am in the commercial space. (Laughs.) But just to add a little bit to what Jules said from that perspective, Joan brought up reasonable expectation of privacy. And I think that at least from the government perspective, one of the things that I think we are going to need to see is a revisiting of our notion of reasonable expectation of privacy in light of how much can be done with data. We have—we have advocated, along with some other groups, for a movement toward this notion of protected information. You don’t have public information and private information when you collect a lot of public data and it yields private information. And so that amount of public data, either combined with analysis or combined with derivative results—those results need to also be considered private information.
And they’re not right now. And so, for example, Customs and Border Protection wants to be able to collect social media information for people entering the country. A lot of people are like, why do you find this objectionable? Shouldn’t they be able to look at your Twitter feed? And it’s not—this notion—this weird notion that they’re just going to pull up your account and look at your top 20 tweets and be like, oh, this isn’t somebody who’s a threat to national security, let them in—it’s going to be putting this account into a huge algorithmically driven program and trying to make decisions about somebody based on tons of information, and then move that forward. And I don’t think we can say that the output of a program that can analyze huge data feeds, even public data feeds, about a person in a matter of seconds shouldn’t be protected in some way. And I think we need to move to an understanding of that information as protected.
MODERATOR: Who else has a—yes, sir.
Q: Hi, Larry Clinton, Internet Security Alliance. I have a question for Joan.
So one of the initiatives that Chairman McCaul and Senator Warner had put forward in the last Congress was the notion of a commission that would be brought together to deal with, in particular, the tensions between encryption, privacy, and security. I’m wondering if you have any thoughts as to what the prospects for that sort of entity might be in the new world. And is that possibly a vehicle for dealing with some of these broader issues that are raised between the privacy, security, and intelligence communities?
MODERATOR: Brilliant question.
O’HARA: Thank you for remembering the commission legislation. It lives! (Laughter.) I’m still trying to convince people to move it this Congress, but if not we will certainly be reintroducing. And I believe Senator Warner will be reintroducing the companion in the Senate as well.
Yes, our idea—the idea behind having a commission to look at the very complex issue of encryption is supposed to sort of spur a larger conversation as well. So it’s not just limited to the encryption issue, but privacy and digital security. So I think it would be an excellent type of forum for all of the questions that we’re discussing. And that’s actually written into the bill itself, that you don’t just stop at encryption, but look at the challenges of the digital paradigm going forward.
For law enforcement, for intelligence, for privacy and civil liberties advocates the—for those of you who are not familiar with the famous encryption commission bill, this is a commission that would be stood up that would incorporate representatives from all of the major stakeholder groups. So in doing our due diligence, coming up with this idea, and developing the bill, our committee, the Homeland Security Committee, met with literally more than a hundred various stakeholder groups and representatives from state and local law enforcement, federal law enforcement, the intelligence community, the civil liberties community, from academia, actually experts in cryptography, and I’m probably forgetting a few. But we really tried to meet with—oh, I’m sorry—district attorneys, the legal community.
We tried to meet with everyone that we could think of that would have an interesting and valuable perspective on this problem. So the commission would be comprised of representatives from all of those various stakeholder groups, some of which, quite frankly, are in conflict with one another in terms of their opinions. But to have a concentrated effort on really taking an honest look at this issue, and having an honest conversation about it. And trying to come up—not necessarily with bullet-proof answers, but an honest assessment and some recommendations on how we can move forward I think would be very valuable to us, not just on the encryption issue, but on digital security and privacy more broadly.
We were looking at a commission that would put out potentially an interim report in six months or so, with hopefully some more definitive analysis over the course of a year. So we realize this is an issue that is urgent in some respects. And we do need to move on it. But we also understand that this isn’t something that you can just solve by having a couple of hours of conversation on it. So we were hoping to try and find something that was reasonable. And we thought a year might be an appropriate timeframe to come up with some recommendations for Congress and for the public more broadly. But at least—at the very least—an analysis and honest look at this issue and what the questions are that we need to be asking.
It seems that over the course of the past year especially it’s been much more of just a public argument rather than a discussion. And we would like to see that turn into a productive conversation. So, again, thank you for bringing it up. We’re hopeful that at some point the stars will align and the bill will move. On the House side, at least, it’s unfortunate, but it’s really just jurisdictional conflicts that are holding up the movement of the bill. But fingers crossed that we can get something done in the next Congress.
SEGAL: This is for the entire panel. Since the—it’s hard to predict where the administration is going to be, given statements from the campaign, maybe you could all try to predict what the first crisis is going to be that’s going to push the privacy issue to the forefront. So, is it going to be the sun-setting of 702? Is it going to be the failure of privacy shield? Is it going to be some terrorist attack? What do you think is the first thing that they’re going to be confronted with?
MODERATOR: Let’s just go down the—(laughter)—
O’HARA: From my perspective, and it might just be where I work, on the Homeland Security Committee, we tend to think that attacks on U.S. soil sort of spur this conversation, again, trying to find out what’s the proper relationship between law enforcement and privacy protection. As I mentioned earlier, the threat of homegrown terrorism of self-radicalized individuals here is very real. I’m not basing this on any sort of intelligence whatsoever, just purely speculative. But, you know, I wouldn’t be surprised if there’s an uptick after the new administration comes in, just because it’s a major shift in the U.S. So sometimes individuals gravitate towards making a statement during those periods of time.
I think that that has brought this issue to the fore in the past. And so in answer to your question, I think if there is a significant terrorist attack here in the U.S., that would probably prompt this conversation.
POLONETSKY: Well, I think it’s likely to be something perhaps sort of trivial and sort of fortuitous. Assuming Mr. Trump continues to sort of monitor the media stories of the day and respond to them, someone will tweet at him or a reporter will say, what about that? And it’ll either be something some tech company did, or it’ll be about some headline. And he’ll say: I’m against that and we’re going to stop that. And who knows what that is and whether the repercussions or the practicality of it will make any sense or not. But my bet is he’ll respond to some question about isn’t this a bad thing. And they’ll say, it is. And we’re going to stop it. And it may or may not be something that has any logic or any reality to it. And then we’ll be figuring out how exactly to deal with his directive to some agency to stop that particular thing.
STEPANOVICH: I think Jules might be right. And it might be him coming up with some—like, there’s this story that people in the privacy community like to tell, like, why do you want a backdoor on your phone? Would you have a camera in your house that can be activated by the government? And it’s this really funny story and everybody’s like, no, of course, I don’t want a camera in my house. But Trump has a history of going for really easy answers to things like, oh, we need to stop immigration from Mexico. Build a wall. Oh, there are Muslim extremists in the country. Kick all Muslims out. The idea that we could get one of these easy answers that is horrifying in the privacy space is a threat that exists.
I actually think the first crisis that we’re going to deal with nobody is going to know about. I think some private company is going to be asked to do something awful, and it’s going to be classified, and it’s going to be a lot of internal private crises that punctuate the year, that are not public and never make headlines. And I think that means that responsibility is shifting from regulators to private industry to be vigilant and to stand up for rights and to make sure that they continue to fight in a way that we have seen a history of them fighting, even when they have to fight in secret against these overbroad orders. I think that we’re going to see a lot of—a lot of classified crises. Or maybe not: we’re not going to see a lot of classified crises pop up throughout the course of the year, precisely because they’re classified.
MODERATOR: At the risk of violating my moderator’s non-substantive involvement, I’m going to propose a variant of Jules’ theme, which is one possibility is the impulsive late-night Twitter disclosure of a highly sensitive either program or activity that then produces a sudden eruption of controversy over that, and that the government is not well-positioned to talk about or explain what it is and is not in fact doing because the president—of course, who has the absolute authority to do that tweet and disclose whatever he wishes—but nobody else has the authority to talk about it.
Yes. Wait for the mic, please.
Q: Oh, sorry. Hi. Thank you. Astri Kimball from Google.
On all of those prospects, are you thinking about ideal White House staffing on this issue, on privacy, and whether that’s—I hear you, that a lot of this will be at the agencies. And I think that’s a really interesting point. But if a lot of this policy or direction might be determined from the West Wing, are you thinking is it the CTO, is it the National Economic Council, the NSC where you would direct your pleas for influence? Which part of the White House? Is it OSTP? Have you thought about that?
POLONETSKY: Well, Cam Kerry did a phenomenal paper for Brookings, sort of really an internal who gets what roles and what the interagency process looks like and doesn’t look like inside the administration, that I think walked through a lot of the options. And, you know, I think—I don’t want to speak for his conclusions—but, you know, I’d argue that if we’re talking about bringing together commercial and consumer at the end of the day to represent, you know, a bigger picture, that ends up taking it to the White House. And today, you know, you have a staffer at NEC who can’t leave the White House because of the 24/7 kind of chaos there, who ideally you kind of want also representing you around the world, who doesn’t have extensive staff, and everything’s always in sort of crisis mode.
So whether it’s realistic to think that you build these things up in the White House, or whether, at the end of the day, you could empower an interagency thing. You know, I think Commerce made great strides under some of the leadership there in really pulling together things across government. And to the extent that some of those gains, which I think were hard-fought and valuable at the end of the day, have leaders playing sort of a broader role in helping coordinate that process, I hope that some of those end up sort of being part of the future infrastructure.
But I think I’d argue, and I think even some of the business community, have sort of looked to see whether, you know, NEC could do longer-reach planning as well. Right now, it’s all, you know, day-to-day, you know, activity. And so if one’s thinking about the U.S. role in a digital world where you’ve got this incredible scope of challenging issues—from surveillance to free speech, right? We’re dealing with the right to be forgotten, you know, around the world. We’re dealing with data localization around the world. You know, companies are being told to keep their data, and not keep their data, in multiple locations so that banking regulators can have it and privacy regulators can have it. And so we have this sort of tension tearing apart just about everybody around the world.
That seems to need some real, you know, empowered role. So I’d argue it’s at NEC. I don’t know that this current White House is going to be thinking about these issues in the way the previous was, but.
STEPANOVICH: I’d just—there’s a gap there. I think that in order to think about who gets the role, you also have to first consider if the role is going to exist. You know, we had the first CTO in the Obama White House. I hope that role continues to exist. I think it’s really important. We currently don’t have a chairman for the Privacy and Civil Liberties Oversight Board. We have three commissioners at the Federal Trade Commission.
These are roles that need to be filled and respected. And so I think pushing for filling out these bodies becomes really important, for the preservation of the USDS, which has played a really pivotal role in these conversations, for 18F, which is under attack right now and I think has been very important in the transparency conversation. I think we need to make sure that these bodies continue, let alone who gets to be appointed to serve in those roles.
MODERATOR: We have time for a couple more questions. Wow, stunned silence.
POLONETSKY: Well, let me throw out sort of one example of why I think sort of the fortuitous thing is what sort of ends up driving the agenda. On my first day in office as consumer affairs commissioner in my Giuliani days—I think it was, you know, January 2nd or 3rd; the mayor was inaugurated on New Year’s Day in New York. So the mayor was on his—I had just shown up. I didn’t know anything. I didn’t know who or what. And somebody called into the mayor’s weekly radio show and complained that they had bought a TV for Christmas and it was the wrong type, or the wrong size. And they tried to take it back, and Circuit City was going to hit them with a 15 percent restocking fee, and wasn’t that outrageous?
And the mayor’s saying, what? You’re going to have to pay $200 to bring back your TV because it wasn’t the right one? That is terrible. I’m going to tell my consumer affairs commissioner to do something about it. And so my first directive was do something about that. And I’m, like, well, how do we do something about that? They were allowed to have restocking fees. And I talked to the agency senior staff and they said, well, no it’s a—I said, I don’t know. The mayor, I just started, and I’m—again, I want to make my name here. Go figure out what we can do. Go to those stores, right?
Luckily, when we showed up at the stores, where was the restocking fee disclosed? On the receipt. So you didn’t learn about it until after you purchased. And so we said, a-ha! (Laughter.) That’s, like, an essential term. You need to display that, like, on the register, on the front door. You didn’t, so, bam, right? We brought an action, and Circuit City had to give back, you know, 8,000 people their restocking fees. It turns out the reason people were buying these TVs, why this was a new thing, is people would buy big flat-screen TVs, which were really expensive in those days, to, like, watch the Super Bowl or something and have a great frat party, and then they’d return it. So Circuit City had imposed this fee to sort of deal with that. So, you know, it worked out, but that was my first mission. And it was completely fortuitous, so. (Laughter.)
MODERATOR: So I want to—I want to close with a broad question. I mean, there’s been a lot of uncertainty in this conversation about direction and also about intention in some respect. But I’d like to close with a different kind of uncertainty, which is: Where is the place where each of you sees real opportunity? You know, one thing that a transition always does is it gets rid of a sort of stale preexisting way of thinking and injects a whole lot of new people and new thought into something. So where’s the area of opportunity here, where a group of people in the sort of blank-slate department goes on a listening tour, really is in listening mode, and hears and comes out with, wow, there’s a real opportunity to do something valuable in this space?
Amie, let’s just go down—get us started.
STEPANOVICH: I think Jules—(laughs)—
POLONETSKY: So, look, here’s what I think. As data opportunities become incredibly exciting, right? Health care, I just rode in an autonomous Uber, you know, in Pittsburgh yesterday, and you can see—
MODERATOR: Still alive.
POLONETSKY: It drove like my grandma. (Laughter.) You can see the opportunities, you know, for the disabled, for folks that are blind. In almost every area you see the excitement. And it’s not just about marketing, right? It’s not just about picking up more advertising. It’s about, you know, truly game-changing opportunities for smart cities, for government, for education. But in every area, you’re seeing that risk, right? And we’re today just sort of talking about, oh my God, what if the new administration doesn’t get it, doesn’t understand, and so forth, right? But around the world, we’ve got folks who don’t understand or who are skeptical inside and outside government and industry, who worry about the Orwellian or the negative.
And you’ve got this—it’s getting so much more complicated. It’s artificial intelligence. And companies are saying, look, this is really, really hard. We’re still trying to figure it out. Leave us room, give us space, we’ll get it right. And nobody trusts that, right? So how do we build those processes? How do we build those ethics review processes, where respected processes can exist internally, where folks show that they’re minding their Ps and Qs and they’re actually thinking hard, bringing in sort of outsiders and doing it in some way that is respected, right?
The academic world has often had these institutional review boards, so we trust that if there’s a review and a scientist wants to do some sort of research that, you know, there’s appropriate oversight. Increasingly companies, and a lot of the ones I deal with, are saying, you know what? I guess I need something like that as well. Maybe, you know, in a better way. Maybe without the bureaucracy. But I’d like to discover that there’s a discriminating sort of issue on my own and fix it, or I’d like to think these through with the appropriate, you know, thought leaders.
So institutionalizing credible ethical review processes, because this stuff is just too darn hard today to be subject to sort of the to-and-fro of the latest political statement, the latest media story. But if you don’t have that process, and it’s a regulatory one and it’s an adversarial one, it’s hard to have really hard thinking about things like what decisions does a car make about safety. If I have data that says you won’t succeed in this major, the big data says not, do I tell you and tell Einstein not to, you know, major in physics? How can I use these things in ways that are inclusive, right? If I want to nudge you to be better and healthier, should my Fitbit warn me that I’m coming down with Alzheimer’s, right? I’ve got smart devices in my home. Should they report child abuse, right? Or should they ignore it, right?
We have these really incredibly hard questions. I’ll close with this. A researcher using Microsoft Bing search data showed that he could correlate pancreatic cancer, right, terrible killer, we don’t learn about it until it’s too late, with people’s searches for symptoms that they probably had no clue were related, right? So, wow, that’s important. But do I want a popup when I’m searching saying, hey, you may want to see your doctor based on your last—or, on the other hand, could we ethically ignore the fact that we might be able to intercede?
So these are darn hard questions. And we’re going to need those sort of sophisticated conversations inside companies to come up with the right thing to do. And then I hope the rest of the policy world, you know, can play from that. But if we don’t know what the right thing is, we can’t really easily, you know, expect lobbying outsiders to make policy around it.
MODERATOR: Joan, one big area of opportunity.
O’HARA: Can I copy? (Laughter.)
MODERATOR: Sure. Sign onto Jules’?
O’HARA: No, I think that the digital space and this whole new paradigm is the area. I mean, it’s evolving quickly. Even, you know, the Internet of Things is a whole new concept. Is that going to be regulated? What are the responsibilities? What are the expectations? I think—and particularly as a lawyer myself, it’s a really exciting area from a legal perspective, because a lot of the law that we currently have was not developed with this in mind at all. It’s a new reality. So I think that presents challenges, but it also presents really exciting opportunities.
My hope would be that with the incoming administration—and some people are very happy that we have now Congress and the administration being from the same party, some are not—but hopefully there can be some unity there and some cooperation between the Hill and the White House. And also I would love to see a really good collaboration between the public sector and private sector, where we get an exchange of ideas based on common values, and are able to see the United States take a lead in this area, rather than sort of continuing to fight among ourselves and letting other parts of the world be the leader in this space.
MODERATOR: On that note, we are going to have to close—
STEPANOVICH: Actually, oh sorry, can I come back? Because you asked about opportunity and that was really hard.
MODERATOR: Sure. Very quickly, because Jules has to catch an airplane.
STEPANOVICH: Just accountability. I think that this administration has shown a tendency toward thinking about accountability and transparency, a mistrust of government. I think if there’s going to be opportunity, that that might be where it comes from in this administration, where we should push.
MODERATOR: Thank you very much to you all. (Applause.)
This is an uncorrected transcript.