The General Data Protection Regulation (GDPR) went into effect in the European Union last month, providing more protection for consumers’ personal data on the internet. Panelists discuss implications of the GDPR for U.S. businesses and the future of federal privacy regulation in the United States.
KIRKPATRICK: So, OK. Oh, wow, powerful. Good morning, everybody. I’m David Kirkpatrick. I am a journalist and the head of—founder of Techonomy Media, which is a technology—
GOLDSTEIN: Can you hand me my glasses out of my purse?
KIRKPATRICK: A conference and publishing business focused on how technology changes everything, of which this is a great example today. We are now at a session on the future of U.S. data privacy after GDPR. This is the Council on Foreign Relations, for those watching on the livestream. Our panelists are Lynn Goldstein, Marc Groman, and Karen Kornbluh. I’ll introduce them in a moment. And it is on the record, just for everybody to keep that in mind. And we are both here and in Washington, which I’ve never moderated before, but OK. Oh, terrific, Karen is now with us. Terrific.
So well, let me quickly introduce the panelists. Here in New York with me, Lynn Goldstein is our sort of official business representative, a long-time lawyer and privacy officer for major financial institutions, including ten years as chief privacy officer at JPMorgan Chase. She has done a lot of work in civic space as well in recent years and is involved with a number of very interesting nonprofit operations today, trying to figure out what to do about privacy. It’s something that she’s deeply engaged in, and she is currently running a business called GDPRSimple, which helps small- and medium-sized businesses implement what GDPR requires of them.
In Washington, on the left on your screen, Marc Groman is one of our two Washingtonian government veterans. He’s currently a principal at Groman Consulting Group, but he was in the later years of the Obama administration the senior advisor for privacy in the OMB. Prior to that he was chairman of the Federal Privacy Council that was established by President Obama, and the privacy lead on the president’s cybersecurity national action plan. He previously had a long career in the FTC, in Congress, as a privacy expert, and has done a lot of work outside of government as well.
Karen Kornbluh, on the right—did I say that right, Karen?
KIRKPATRICK: OK, thanks. She’s senior fellow for digital policy here at the Council, based in Washington. Prior to that, she was an executive vice president for Nielsen, responsible for global public policy, privacy, strategy, and corporate social responsibility. But prior to that, and something that I think about when I think of her, she was U.S. ambassador to the OECD and has been very deeply involved in big-picture policy issues around how technology’s changing the world, both in and out of government. She was deputy chief of staff at the Treasury Department, the FCC’s director of the Office of Legislative and Intergovernmental Affairs, et cetera, et cetera.
And you know, it’s odd from my standpoint as a technology journalist that governments generally don’t understand, or act accordingly on, what’s happening in a technologized society. So in that sense, the EU has taken an unusual stance. The country that, of course, is the outlier there is China, where they deeply understand exactly what’s happening with technology and are doing things their own way in response to it, not what we would do, of course. But they certainly understand what’s happening.
In terms of the United States, you know, I see Esther there. I think it might have even been at one of her conferences where Scott McNealy said, you know: You have no privacy, get over it. Was it at your conference? Or it was right—I remember I first heard about it at one of your conferences, probably twenty years ago. But the reason I mention it is, you know, this was twenty years ago Scott McNealy of Sun Microsystems said: You have no privacy, get over it. And at the time, it sort of seemed like, that’s probably right. We’ll get over it, right? But we didn’t think of things like IoT, and The New York Times article the day before yesterday about people, you know, committing domestic abuse using IoT devices. The range of ways that privacy can be invaded and disregarded and cause harm I think is so much wider than we would have even imagined as the world becomes truly an interconnected mesh of devices and data.
And that is really, really scary, no matter what you think, in my opinion. And I’m not—I’m not particularly worried about my personal data, generally speaking. And not to keep ambushing Esther, but she’s even put her entire genome up for public inspection. So some people are pretty willing to let their data get out there. But we are in a very strange place. And I think what to do is extremely hard to know. So I think what we’re going to hear about especially on this panel is what government should do, and whether it’s likely they will do anything.
But let’s start with Lynn here in New York. Just give us your big picture view on how bad or how good the situation is. Maybe there’s some good to say. And where you see it going.
GOLDSTEIN: Yeah. So the first thing I think it’s important for companies here to understand is that even if you’re not present in Europe, the GDPR will still apply to you if you’re doing business there—if you’re offering goods and services to individuals that are present in Europe, or if you’re monitoring the activities of individuals that are in Europe. So that’s the one thing that I think is very important. That’s different from the law that existed prior to the GDPR, which was the directive. So if you are thinking that things are the same as they were before, they’re not. So if you weren’t paying attention to the GDPR but you have an internet business offering goods and services to individuals in Europe, or monitoring individuals in Europe, you need to pay attention.
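The territorial-scope test Goldstein outlines can be sketched as a simple predicate—a deliberate simplification, not legal advice, and the function name and boolean inputs are illustrative assumptions; real determinations under GDPR Article 3 are fact-specific:

```python
def gdpr_applies(established_in_eu: bool,
                 offers_goods_or_services_in_eu: bool,
                 monitors_individuals_in_eu: bool) -> bool:
    """Simplified territorial-scope check, per the rule described above.

    The GDPR applies if the business is established in the EU, or if it
    targets or monitors individuals who are *in* the EU -- regardless of
    those individuals' citizenship or residency.
    """
    return (established_in_eu
            or offers_goods_or_services_in_eu
            or monitors_individuals_in_eu)

# A U.S.-only company with no EU establishment, but an internet storefront
# shipping to customers in France, would still fall in scope.
print(gdpr_applies(False, True, False))   # True
print(gdpr_applies(False, False, False))  # False
```

The point of the sketch is the disjunction: any one of the three prongs is enough, which is why "we have no office in Europe" is not a safe harbor.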
Secondly, a lot of companies—both in Europe and outside Europe—did not pay attention to the previous law because it didn’t really have any teeth to it. The fines were not significant and the data protection authorities, the regulators there didn’t have a lot of staff. And so the likelihood of anything happening to you if you’re a company that was established in Europe under the old law was pretty slim. They paid attention to the big technology companies. You saw a lot of press about actions being brought against them. But if you’re a smaller company, or even if you’re a bigger company with a smaller footprint in Europe, the likelihood of any—you know, any action being brought against you was small.
That’s changed. The fines are significant. They’re antitrust-type fines. It’s—and I always get this lingo wrong, so I’m going to—I’m going to repeat it.
KIRKPATRICK: It’s a gigantic number for most companies.
GOLDSTEIN: Yeah. It’s 2 percent of total worldwide annual turnover or ten million euros, whichever is greater. So if you’re a big company, it’s a big number. That’s the lesser of the fines. Or—
KIRKPATRICK: This is for even one violation involving one individual.
GOLDSTEIN: That’s right. One violation. Or 4 percent for a bigger-type violation. Four percent of total worldwide annual turnover or twenty million euros, whichever is greater. So we’re talking about big numbers, particularly for big companies. But even for smaller companies, you shouldn’t think that, eh, the numbers aren’t so great for me, because the other thing that’s changed is that individuals can now bring private cause of action against companies. So they now have—they didn’t used to be able to bring—but they can now bring actions like what we call in the United States class actions. So they can band together. They can bring actions against companies like you can in the United States.
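The two fine tiers the panelists describe reduce to a one-line calculation—greater of a fixed floor or a percentage of worldwide annual turnover. This is a rough sketch of the arithmetic only (the function name and the `severe` flag are my shorthand; actual fines are discretionary and capped, not automatic):

```python
def gdpr_max_fine(worldwide_annual_turnover_eur: float,
                  severe: bool = False) -> float:
    """Upper bound on a GDPR administrative fine, per the tiers above.

    Lower tier: the greater of EUR 10 million or 2% of worldwide
    annual turnover. Upper tier (more serious violations): the greater
    of EUR 20 million or 4% of worldwide annual turnover.
    """
    if severe:
        return max(20_000_000, 0.04 * worldwide_annual_turnover_eur)
    return max(10_000_000, 0.02 * worldwide_annual_turnover_eur)

# A company with EUR 5 billion in turnover faces up to EUR 200 million
# for a severe violation; a small firm hits the fixed floor instead.
print(gdpr_max_fine(5_000_000_000, severe=True))  # 200000000.0
print(gdpr_max_fine(50_000_000))                  # 10000000 (floor applies)
```

Note the "whichever is greater" structure: the fixed floors mean even a small company with modest turnover faces an eight-figure maximum exposure.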
And even if you’re a smaller company you’re going to think, oh, they’re not going to sue me because I don’t have very big pockets. Well, the other thing they can do is they can lodge complaints against companies, because individuals now have a whole slew of individual rights that they didn’t have before. And there’s a list of them. And I don’t want to forget one, so I’m going to list them off, because it’s important to understand what they are if you’re a company that’s doing business, as I just defined it. So you have the right to be informed, which includes what personal data was collected and how it’s being used. You have the right of access to personal data. You have the right of rectification, which means you have the right to have your data corrected or, if it’s incomplete, completed. You have the right of erasure of personal data when certain grounds apply. A lot of you, I’m sure, have heard about this right to be forgotten. That’s what that is.
You have the right to restrict processing of your personal data under certain conditions. You have the right to data portability when processing is based on consent or contract, and it’s to be carried out by—and if it’s carried out by automated means. You have the right to object to processing of personal data under certain circumstances. And you have the right not to be subject to automated decision-making, and that includes processing unless certain exceptions apply. That’s the big one that’s gotten a lot of attention by technology companies because that’s a lot of just the automated processing that occurs that a lot of our technology depends on these days. So if one of those rights is violated, any one of them, you have the right to lodge a complaint with a supervisory authority. You have the right to compensation, even for immaterial violations. And you have the right to bring a claim for nonpecuniary loss, as I said, for group or class actions.
So just coincidentally yesterday the International Association of Privacy Professionals reached out to all of the data protection authorities in the European Union and asked them how many complaints they’ve gotten since GDPR was put into effect. And they weren’t able to isolate whether the complaints the data protection authorities received were specifically GDPR-related. And not all of them responded, but among the ones that responded, the numbers ranged widely: the U.K. got over one thousand in less than a month—and these are just complaints lodged with them. So it’s the first right that I told you about—if your rights were violated, you have the right to lodge a complaint with the supervisory authority. So the U.K. got one thousand, France got something like five hundred—like, four hundred, something like that. That’s the impact in less than a month since GDPR was implemented.
So that—if you think that these new rights—and that individuals aren’t asserting them, and that that’s not going to have an impact on a company that’s doing business in Europe, then you’re sorely mistaken.
KIRKPATRICK: Hey, Lynn, one thing that I think also is surprising, when I’ve heard it explained—and tell me if this is your understanding—the data that it applies to is any individual’s data that is in the EU, regardless of whether it’s an EU citizen, is that right?
GOLDSTEIN: That’s exactly right.
KIRKPATRICK: Yeah. So if any of our personal data happened to be in the EU, we could say, hey, you know what, I don’t want data-based targeting of me either. So go ahead and don’t do any automated processing on me.
GOLDSTEIN: That’s exactly right.
KIRKPATRICK: So it’s just an interesting—it’s one of the reasons why, by the way, Facebook has withdrawn all the data of non-EU citizens from all of its data centers in the EU, just so you know.
GOLDSTEIN: Yeah. So whether you think this is good or bad, when you go on vacation in Europe you have the protection of the GDPR. When European citizens come here, they do not have the protection of the GDPR. So it’s not—it’s not based upon where you’re a citizen of. It’s not based on residency. It’s based upon where this personal data is located and where the company is doing business. So I’ll stop there.
KIRKPATRICK: Great. OK, thank you, Lynn. Well, we’ll come back to many of those issues.
Marc, so chime in. Say what you think is going on with the future of U.S. data privacy.
GROMAN: Good morning. And thank you very much. It’s really an honor to be here and to have an opportunity to speak with you and the Council on Foreign Relations about an issue that I’ve been working on for about eighteen years, one that I am passionate about and have been deeply involved in. I want to maybe take a slightly different perspective than Lynn, only in that I am glad that GDPR is a driver of dialogue here, and I’m glad that you’re all here and it’s forcing this very important conversation. I also think there’s a tremendous amount of misinformation out there about GDPR, on the internet and otherwise. And I think that I’m not quite as alarmed about it as maybe some others are. And I’m happy to speak about that at length.
More importantly, and currently I work with large companies, Fortune 500. I’m also advising a wide range of startups and also working with the World Economic Forum to try and develop a sort of more global approach to data policy, so we have an interoperable framework and so that, regardless of what country you’re in, your government and so forth, we understand different economies will take different approaches. But we’d like it to be interoperable and to have principles that we can all recognize. And so when I work with companies, I don’t work with them specifically about GDPR. I certainly can, but I actually encourage companies to take a global perspective. Don’t invest significant resources in a GDPR compliance program. Invest your resources in a strategic, comprehensive, forward-looking and global data program. And GDPR should be part of it.
That is how I view it. I think that GDPR has some positives to offer, particularly around ideas of data governance, which I have always believed is fundamental to risk, to security, cybersecurity, privacy, and even your business development—data governance meaning understanding where your data is. And I would assert that there isn’t an entity in the private sector or public sector that knows where all their data is. And so anything that drives that, as well as drives innovation—meaning encourages the investment and development in new technologies to identify data and to address some of the issues created by new technology, those are really great positives.
Now, looking here at United States and what it means for American entities, a few observations. First of all, it is my belief—and you see this play out—that privacy should be and ought to be a nonpartisan or bipartisan issue. People come at it from the full range of the political spectrum with different philosophies around being a libertarian, or civil rights, or issues you care about. But it really is nonpartisan. We can have very robust and respectful debates about the appropriate role of regulation and legislation. I think those are important discussions to have. I certainly believe, though, and have believed for a long time that the United States must have—absolutely must have a general comprehensive privacy law, and we’re long overdue.
And what that looks like, I think, is where we ought—you know, the discussion comes in about the scope of any regulation or legislation. But I think that having a comprehensive omnibus privacy law would be good for business and good for the United States. Certainty around the rules of the road, clarity, obviously preemption that matches the scope of the bill—doesn’t exceed it, but matches it, to give that level of certainty—will benefit consumers and business alike and allow us to have a much better narrative about the U.S. approach to privacy overseas, and hopefully encourage trust in American companies, American products, and what we do. I really think that is very important. And I think that we will see going forward less resistance from industry as various proposals are teed up around legislation and regulation. And we’ve seen a wide range of American executives calling for some type of federal privacy legislation. And I think that momentum will continue. And I think that debate will also continue here in Washington, D.C.
I don’t envision it to be even GDPR-lite, whatever that might mean. I don’t believe that we need to import that entire framework here. I do believe that what we generally call fair information principles around notice, choice, transparency, use, security, minimization, accountability—I think those remain relevant in the digital age, in the fourth industrial revolution. It’s how we tweak them and modify them to work in this era of rapidly emerging new technology, in this era of big data machine learning, artificial intelligence. But I think the core principles, which we see across the world stand the test of time. It’s how we think about them and apply them. And I think it will be part of our discussions going forward.
I think those are really the high points that I want to make today, which is that, you know, GDPR is driving a global conversation. Over one hundred countries across the world have some type of privacy law. And dozens now, from Brazil to India to Japan, are all working on their own frameworks. And many of them are now looking towards GDPR and the EU model and that framework. And in part, in my own opinion, it’s because the United States has lost our role as a leader in this area. We have given up that role. We are not seen as a trusted entity around the globe when it comes to data, data security, and data protection. And therefore, I don’t think we have much credibility or integrity when we speak around the world on this topic. And so others are looking towards the European model, and less so at our model.
Every entity that I’m aware of wants innovation. They want to adopt new technologies to better serve their citizens. They want their own Silicon Valley. But as they think about a data framework, most entities now are looking more towards a European model than what we have here in the U.S. I won’t go through all the things that I would observe today that are problems in this space. We could have that discussion, but I think we have lost that leadership position. And that’s very concerning to me personally. And I think it should be very concerning to American industry that we are no longer leading these discussions.
KIRKPATRICK: And Marc, I think—
GROMAN: So I’ll close there—yeah.
KIRKPATRICK: We’re going to get, I think procedurally, what—either when I’m asking questions or when the audience is—what the U.S. should do in more detail, because I think that’s a central part of what we ought to discuss. But it’s worth noting—you know, Lynn was mentioning the data protection authorities of all the EU countries. We don’t have a data protection authority. Canada has a national privacy commissioner, or something like that. We don’t have that either. We basically have no privacy infrastructure at the national level to speak of. So, anyway, it’s worth just underscoring the points Marc was making.
KORNBLUH: Yes, sir. So I’ve been working on internet policy since Al Gore was laying out the framework for the information superhighway, along with many people here. And I want to address—start by addressing something that you brought up, David. You know, we thought that privacy was dead—or, a lot of people thought that privacy was dead. And I think what’s been so interesting about recent news items, especially Cambridge Analytica, is it shows a different kind of harm that can come from lack of privacy.
So I think people are pretty sanguine about their shoe size being known, and the same advertisement for that pair of shoes annoyingly following you around the web. But it’s really different when you get the idea that people are learning or inferring your political or philosophical views, and they’re using that to micro-target and spread disinformation. So I think what we’ve learned in recent—in recent weeks is some of the frightening things that can happen. And people are a lot more attuned in the U.S. to some of the privacy concerns that people in Europe were already aware of because of recent experiences both with the—you know, what life was like behind the Iron Curtain and before that in the Second World War. People there just have much more recent experiences.
So I just wanted to make a couple points. One is, May 25, when GDPR went into effect, was not, as Churchill would have said, the end, or even the beginning of the end—it was just the end of the beginning. There’s a whole bunch of stuff that has to happen to figure out what this is all going to mean. It’s going to be interpreted and enforced by the national data protection authorities in coordination with the European Court of Justice. There’s another regulation, the EU’s ePrivacy Regulation, that is incredibly important in terms of behavioral advertising and was supposed to go into effect right afterwards, but there’s been a tremendous amount of lobbying, so that’s delayed. That will focus on tracking and profiling.
Companies are in some respects playing a game of chicken with the regulators, and they’re playing around with whether or not they’re really going to do some—especially some of the things that have to do with asking for particularized consent. You’re not supposed to ask people to consent to use the entire service, but just that one use of the data. And you see that people are actually trying to get full-on consent. There’s this use of dark patterns, where it looks like you’re being given the choice to opt in, but the way the web interaction is designed you’re sort of forced into giving your consent. So there’s a game of chicken going on between companies and regulators.
Different companies, obviously, have completely different interests. You know, the platforms have very different interests than, you know, your hotel chain that uses data; and a Microsoft, you know, and an Apple have a very different interest than a Facebook and a Google. So we’re going to see how that plays out. And then, as you were mentioning, the activists have been challenging the business models as well. So, you know, they’re using this law to challenge the entire business model. And the whole idea of these companies being so big—do you really have choice if they’re asking you to consent to the whole service, and there’s no other Facebook or another Google? So some of the subtext behind some of the challenges really gets at the entire business model.
One thing I want to say about—so that’s GDPR. We’re going to see a lot of activity going forward. The second point I want to make is that things are going to happen quickly here, I think, right now. And it’s not just because of GDPR, but it’s in part because of GDPR. California is right now forcing tech’s hand and the country’s hand. There was a referendum on the ballot there, and now the legislature has stepped in—it looks like they’re going to get a bill through by the 28th, this week. If not, the referendum will go through. And it has some of these things that we’ve seen in GDPR. It has a right to access your information from those collecting it, a right to deletion, a right to know. There are various levels of opt-out, and opt-in for kids. And, importantly, there’s a private right of action on a data breach, with big damages.
So that’s really interesting. They’re going to have a one-year delayed implementation. And I think what people suspect is that this will create a lot of impetus for a federal action to preempt California. So I think we’re in a much more interesting phase. And there have been some recent reports that the White House is starting to talk to folks about what would technology—what would privacy regulation look like. And there was a meeting of tech companies in California yesterday to talk about some of this.
So the other point I wanted to make is that—and Marc made this point—that the U.S. kind of forfeited leadership. That there were various efforts under the Obama administration, some people in this room worked on them, to get a privacy bill through. When I was at the OECD we got this set of internet policymaking principles through. That was an administration-wide effort that was trying to say to countries: Look, go ahead, you can make policies, but it’s got to be within this framework we’ve all agreed to of free flow of information and respect for human rights. But within your country, go ahead and protect privacy, protect consumer protections, cybersecurity.
But we weren’t able to move forward on some of the initiatives here in the U.S. And we’re now seeing other countries pick up the slack. It’s not just Europe with this GDPR effort that may be a little more bureaucratic than many companies here would like, but we’re also seeing a different approach, as you mentioned, from China and Russia. And then a bunch of smaller countries are really frustrated and upset. And so, you know, not just with privacy. Papua New Guinea just shut down Facebook, which for all intents and purposes in many countries is the internet, because they couldn’t control disinformation. So on a lot of fronts, the U.S. has let down leadership.
And then one last thing I just want to mention is this issue of the FIPPs, and what are the principles that we have going forward. I’ve been working with a bunch of really smart people on privacy. I’m not a privacy expert. But they’ve been raising questions about whether consumers can really exercise consent. This is really coming into question when you’re in a situation where things are so complex you can’t really understand how your data is being used—when you’re talking about the Internet of Things. So this whole issue of consent I think is a really interesting one now. Do you have a choice when there’s no competition in some of these markets because of network effects? And there’s a German case that’s taking a look at that. And, again, is there really a choice when use is conditioned on consent, or with these dark patterns? And this is one of the things Schrems is raising.
So I think we’re in a really interesting space. I think there’s going to be a big debate in the U.S. And I hope it really gets at some of these next 21st century questions, as opposed to looking backwards.
KIRKPATRICK: Thank you, Karen. And just for anyone who doesn’t know, Schrems, who Karen just mentioned, is this guy, Max Schrems, a young Austrian who, when he was still just a student, started suing Facebook in Europe, and having a huge impact. And he has gotten some very high-powered lawyers in Brussels now working for him. And he’s got a new—essentially, a class action suit under the rules that Lynn described earlier, which has quite a bit of promise. This guy has had an amazing impact on European law, starting when he was, like, a graduate student who decided—I think what happened is he requested his data from Facebook, and found they had way more data about him than he realized. And then he just, you know, went on a jihad.
But, you know, I wanted to make another observation, Karen, about something you said: Consumers may not be able to give consent. But there’s an irony on the other side too, or a complication. You know, when you hear all the things that companies have to do, or might have to do, many of those are things they may not actually be able to do. You know, all these subtle variations of how they ought to be able to manage an individual’s data within their ecosystem of algorithms and data centers, et cetera—some of those fine-tuning points might be possible only with an extremely expensive programming effort, you know, and aren’t necessarily something that they could just do on the face of it.
So we’re in a really weird moment. And to me, it raises a bigger question, which is the failure, which I tried to allude to at the outset, of industry and government to work together, both in individual countries and globally. Because, weirdly, what we now essentially require—and have no pathway to get, on this as well as many other issues—is some kind of global collaboration between governments and especially the bigger internet companies, but data-based companies generally, because governments are not capable on their own, and companies are not capable. Right now, a bunch of the biggest tech companies are meeting on privacy in California this week under the Information Technology Industry Council to try to say what their strategy should be. But they’re not going to be able to do it on their own either.
So it’s—so, actually, that goes to—leaving that in the background, I wanted to ask Lynn and Marc both to comment on something before we go to the audience, because our rules say that we should go there. But we’ll take just one more second. You made a great point to me in the green room, Lynn, about why we haven’t made progress thus far. And I’d like you to say that and then have Marc comment on it. Could you just quickly make that point?
GOLDSTEIN: Sure. And the Washington people should comment on it, because they understand it better than I do. But Marc alluded to the fact in his comments that we had a federal privacy law proposed during the Obama administration. And I think most people would agree that privacy is not a blue or a red issue. Democrats and Republicans universally can agree that we need a federal privacy law. But we’ve been unable to pass one for a variety of different reasons. A fair amount of lobbying was done when we had a proposal.
But there’s also the way our Congress is organized—the committees are organized around the various sectors—that makes it very difficult to get a comprehensive privacy bill passed. And so I think that had a lot to do with the fact that we didn’t get one done before. And I think it has a lot to do with why it’s going to be very difficult for us to get a federal-level privacy bill. And one of the things that many of us have talked about before is that it might have to get left to the states. And Karen mentioned the California bill that’s making its way through. And California’s very influential in the privacy area. It’s a very large state. It’s the fifth-largest economy in the world. And it’s been a leader in privacy in the U.S. since privacy became, you know, an area of law to pay attention to.
So I think what happens in California will make a difference. It probably won’t have the impact that the California Online Privacy Protection Act had, because when you impact the internet the effect is much broader than just giving California residents rights. But I think—and the Washingtonians will be better able to explain why it hasn’t passed in Congress—there’s a lot in the way Congress approaches legislation that explains why we don’t have a federal bill and may not get one in the future.
KIRKPATRICK: Marc, would you comment on that? Do you agree with that?
GROMAN: I think that is one element that has been a challenge in passing privacy legislation. I will return to that. But I need to make a threshold point first, because privacy is a rather multifaceted and complex issue. And I want to make sure, at least in this room and in New York, we’re on the same page. When I speak about privacy, I break it down or separate out what I will call commercial sector privacy, which is generally what our conversation has been about right now—collection and use of data by companies—versus the public sector, meaning collection and use of data by the U.S. government. And then when you get to the U.S. government, it’s important to further segregate that out into our civilian government, law enforcement, and our intelligence and military community, because they have very different regimes as well.
And in the context of interoperability around the world, those are all relevant. And I’m teeing that up because you mentioned Schrems. And it is not generally understood—and this could be a much bigger conversation—but that the issues that struck down our safe harbor, that then led us to renegotiate for privacy shield, were not about the commercial sector, but about the intelligence community in a post-Snowden world. Whole separate panel or discussion one morning, but I need to make sure people understand that.
Now, let’s talk about federal legislation in the U.S., which brings us back to the commercial sector, because those general bills were about the commercial sector only. So for those who don’t know me, I was privacy counsel for two years—2009 and ’10—on the Energy and Commerce Committee when Henry Waxman was chair. And I wrote the Best Practices Act, from the first word to the last word, which then became Kerry-McCain. And I was working with the White House at the time. And actually, we worked at that moment in time in a strikingly bipartisan way. Some of the most supportive, engaged members—like Joe Barton, Cliff Stearns, and Mary Bono Mack—were Republican members.
One of the issues—or, I’ll say this—the most challenging issues in drafting a framework were not sort of substantive requirements. Few people came up to fight about that. The battles were around preemption of state law and how the laws would intersect with current laws. Those were really the big issues. And I know my friends here in Washington know this, but for me—our listeners in New York, I’d like you to just understand the perspective of the guy sitting in Congress or the White House and how this plays out. It plays out this way: So I met with, I don’t know, seventy stakeholders over the course of six months and five hundred hours of my time. And here’s how this works. And I’m not picking on any industry, but I will select some for the moment. So in one meeting, the telecom industry shows up and talks all about the evils of Google, Microsoft, Facebook, Apple, and Amazon, and why the law needs to focus there, and they need to be exempt. And then they leave.
And then the financial sector comes up and talks about how great they are because they have GLBA, which is barely a privacy law. And they talk about the ISPs and the pipes and Silicon Valley. Then you have the big publishers come in and talk about the benefits they provide—the problem is the third parties. The third parties come in and talk about their responsible self-regulation—the problem is the pipes. And then they all come with slide decks and explain the technology and the industry in completely different ways. And so we talk about this as industry and government not being able to work together, but the fact is it’s a difficult process as we’ve set it up here, because we meet with fifty stakeholders who all offer a different view and approach. And it’s up to policymakers to navigate that.
And in part, it’s because of our current sectoral approach to privacy. If we had a clean slate, like many countries do today, I think this would not be a challenge. I think we could craft really sensible, practical, scalable, and implementable provisions for privacy. That is not the world we’re in. We have HIPAA. We have GLBA. We have FCRA. We have COPPA. We have CAN-SPAM. The Video Privacy Protection Act. The Privacy Act. The Cable Act, and I could go on and on. At the federal level, the standards and definitions are different in every single law. Which means that in any debate there will be some competitive effect. Under some laws, like GLBA, which basically requires nothing, that community will suddenly be subject to higher standards. Other areas won’t be touched, like FCRA and probably HIPAA.
But we’re not starting from a—
KIRKPATRICK: I’m not going to interrupt you to define every acronym, even though I don’t understand what they all are. But keep going anyway. I don’t think everybody knows—
KORNBLUH: I just want to tell people in New York that the policymakers here in D.C. all got PTSD as Marc was talking about meeting with all those different groups, because we’ve all been through it. That was a pretty accurate description.
GROMAN: Right. So it really does play out that way. And then at the state level, what I mentioned was the preemption debate. And so I believe that we need regulation and legislation. I also believe we need federal preemption. I think that will benefit everyone. But I spent more time on what that means and how to define preemption in a bill than you can possibly imagine. Seven years ago when I looked at this, a broad preemption could wipe out five thousand state laws in a day. Today, it’s far more than that. I know some industries would cheer, but the impact of that is dramatic when different laws are all, you know, allegedly wiped out in a day, some of which—many of which were not covered by the actual law, right?
So we had cases where we had excluded health, but people wanted the federal order to preempt any state laws on genomes, right? So you didn’t have parity. And the breadth of preemption was a challenge. And so the committee approach certainly plays into that. I found often dealing with members of the same party on two committees was more difficult than dealing with members of different parties on the same committee. So we’ve got that issue. But we have to deal with preemption and we have to deal with how any new framework will map to our current sectoral framework in a way that will be practical, implementable, and have the least disruption. But there has to be some disruption.
KIRKPATRICK: You know, I’m so glad you went through that. It was fascinating to have someone with your depth of experience explain it. But we’ve got to go to questions. So, to reiterate the way we’re going to do this: we are on the record. Wait for the microphone. Speak directly into it. Stand and state your name and affiliation. Limit yourself to one question and keep it concise. And I’m going to start here. Karen has agreed to be our sort of quasi-moderator in Washington. I saw a hand jump up over here, so let’s give the mic to this guy—please, one question. We’ve got a lot of hands.
Q: Hi. Oh, sorry. My name is—oh yeah. My name is Alex Yergin. I work at Datawallet. It’s a consumer-to-business data exchange.
My question is: Do you think that this push towards privacy around the world can create new business models, such as where sort of individuals themselves own their own data and then permission it, as opposed to data just being collected on people and given to corporations?
KIRKPATRICK: What a relevant question. Really quickly address that. And maybe all three could quickly touch on that, because that’s a hugely important issue.
GOLDSTEIN: The answer is yes, and it’s already being done. And there are questions about who owns the data. And there are businesses, primarily in Boston, that are actually exploring that. Yeah.
KIRKPATRICK: There are businesses all over the place.
KIRKPATRICK: Marc, quickly on that, do you agree that that could be a big deal?
GROMAN: Well, what I expect to see is a wide variety of innovation and investment, and a full range of new technologies around control, transparency, data governance, managing your data. I think that’s one that we’ll see. And it is this new investment in technologies that can offer different ways to manage data that I really think is a tremendous benefit of regulation, and particularly exciting.
KORNBLUH: Yeah, I think one thing to keep in mind is that that’s one reason for the U.S. to get its act in order and for the EU to move forward in sending some clear signals about how this is going to be interpreted and enforced, that we’re going to have a bunch of companies, as I said, playing chicken, trying to get us to the least—keep us at the least common denominator. And we just have to figure out where we’re going to be so that individuals and companies can start making investments and hopefully, you know, bringing this up to a higher level instead of at a lower level. But until we have some certainty, I think it’s going to be hard for people to place bets.
KIRKPATRICK: But, just quickly, it can’t be overstated how much innovation is happening around this issue all over the world. Tons of money is flowing into companies that think they can solve the problem. The Cambridge Analytica and Russian electoral manipulation scandals both, I think, prompted more of that than was already happening. But a lot of it was happening anyway. There’s tons of talk about blockchain, encapsulation of personal data, how could it work. Nobody has found it yet, the holy grail, but it’s huge.
Karen, pick somebody in Washington, please.
KORNBLUH: Mark. And identify yourself, please.
Q: So I’m Mark MacCarthy with Georgetown. I’m with the technology trade association SIIA.
And I want to thank Marc and Lynn for pointing out the practicalities of getting a new piece of legislation. But I want to go to Karen’s question about the role of consent. It’s been really the focus of what GDPR is all about. And in many ways that’s a missed opportunity, because privacy law scholars have known for a decade or more that consent isn’t fully protective of individual rights in this area. If you want a good book on that, Woody Hartzog’s new book on privacy blueprints is a great resource. So the question is: What else, if we’re not going to do that?
And here, I want to take exception to David’s point that there’s nothing at the federal level in the United States. The Federal Trade Commission operates as an effective privacy regulator in the United States, and it takes a risk-based approach. Its approach is to think about consent, of course, but the real point is to protect people from the risk of harm—to prevent that kind of substantial risk of real harm. Could that be a framework for people to think about new privacy legislation? Instead of consent as the way of protecting people, think about how you can prevent the risk of harm to people, and what measures might be useful in that area.
KIRKPATRICK: OK. One or two quick comments that any of the panelists have on that? Anybody—
GROMAN: So it has to be the basis for legislation. That is my strong belief, and it was the belief of those who worked with me when I was at the White House as we redid the federal government’s approaches to privacy, in A-130 and NIST. Going forward, given the volume and velocity of data, the number of sensors, and how data touches everything, we’re going to have to take a risk-based approach. That does not mean, though, that consent is no longer relevant. And I want to point out that notice and choice—which should be challenged—are two of the FIPPs, the Fair Information Practice Principles. So when I speak about the value of the FIPPs, I mean the full range, meaning we must have accountability, purpose and use limitations, data security, transparency, and consumer control. All those together can form a framework. At the core of any framework must be risk.
And in fact, it’s the core of GDPR. It is. The word “risk” comes up seventy-five times in GDPR. However, they present it differently. They present it as the risk to fundamental rights. When we think about a risk of harm, it’s a very different way to think about risk. But risk is at the core of it. I think the extent to which we focus disproportionately on consent here is a mistake. I don’t think it’s what they anticipated either. I really don’t. I mean, there’s legitimate interest. There are a lot of bases for using information that my clients are relying on. And it’s not consent-based, except in certain categories.
And then finally, on Woody’s book, which you pointed out—which I read, and I also agree with—I’m going to be controversial: I don’t think we’ve ever really robustly tried consent in the U.S. Companies—and I was in industry for four years, to be clear—know how to build a great UI. We know how to build a user interface. We have the studies. And we know how to get a consumer to make choices and do certain behaviors. We’re not that complicated. So when we build a choice mechanism, we could do it better. We could make it clearer if we wanted to. But for many reasons, we’ve made intentional decisions not to. So I think it touches all of those.
KIRKPATRICK: They’ve intentionally made them complicated, is what I would argue. But quick, there’s so many hands—
KORNBLUH: David, let me just—can I just—David, can I just weigh in on one thing?
KIRKPATRICK: Yeah, please. Please.
KORNBLUH: On the FTC point. We have a roundtable here at the commission—I mean, at the Council—on some of these issues, these digital politics issues. And one of the things we’ve been discussing is the FTC and, you know, how much its hands are tied in terms of having the ability to pay and hire experts who really have the skills to look at this stuff. You know, it has no rulemaking authority. It can only look at things after the fact. It doesn’t really have fining authority, except through a consent decree. So I mean, I think it’s really interesting to talk about the FTC. I think the U.S. has forgotten how to regulate.
You know, when you saw that hearing where the senators were questioning Mark Zuckerberg, and they were—even the Republicans were saying, well, we clearly need regulation, but we can’t possibly do that. How would we do—Mr. Zuckerberg, how should we regulate you? (Laughter.) We’ve just completely forgotten that there are these things called expert agencies that we usually rely on. I mean, when you’re regulating drugs, you don’t have to know organic chemistry to use the FDA. So I think—I think exploring, you know, the FTC and its role, or God forbid a new expert agency, or let’s think in the 21st century, you know, do we really need that kind of approach, or is there a more 21st century approach? But certainly let’s think about how we regulate in a smart way.
KIRKPATRICK: Well, we used to have a congressional technology office that was abolished by Congress. And the European Union has, like, a twelve hundred-person technology evaluative authority that has a huge range of expertise. That’s worth noting.
GOLDSTEIN: Just briefly, David. So the European regulators have understood that even GDPR isn’t enough, because consent has its limitations and legitimate interest has some bureaucratic aspects to it. And they’ve started talking about how organizations need to take an ethical approach to managing their data. And there’s an organization that I’m a senior strategist for, called the Information Accountability Foundation, which has done a lot of work on ethical data impact assessments and on how, when consent is not available—and this is particularly important for organizations in the United States and other countries where consent is the primary form of governance around data—you need to look at other ways of managing your data responsibly.
And so one of the things you need to do is start thinking about this—the data protection authority in the U.K. and the European Data Protection Supervisor in the EU have both talked about this ethical data approach. This is what’s coming next: ethical data impact assessments for when consent and other means of demonstrating that you’re responsibly managing your data aren’t available. So I urge you, as an organization, to start thinking down that road. And you need it for artificial intelligence and machine learning, where there is no means of getting consent. We talked about the Internet of Things—there is no means to get consent when you’re using advanced data processing activities these days. So you have to start thinking about things other than consent. Legitimate interest is a way you can start doing it in Europe, but you have to start thinking about alternate means.
KIRKPATRICK: Thank you. OK. Esther. Since I ambushed you before, you can at least speak for yourself.
Q: Yeah. So Esther Dyson.
Just a brief observation and a question. I keep thinking, wouldn’t it be wonderful if we regulated health the way we’re talking about regulating data—in terms of looking at the impact of companies’ activities, and the stuff they sell, on people’s health? The question is about GDPR and—(feedback)—whoops—ICANN, the WHOIS system. So in Europe, ICANN—I think very properly, though they’ve done a really bad job of actually implementing it—requires that if you have a website, you have to more or less state who you are, and there needs to be a way to reach you. That’s a requirement of the domain name system. And now people are claiming GDPR should give privacy to those website owners. So there’s this fundamental battle between transparency for things that have power versus individuals who deserve privacy. And I’d just like your opinions on how you deal with that particular conflict.
KIRKPATRICK: It’s a really interesting—that’s a complicated—can anybody make a very quick observation on that? Karen, you got a thought on that?
KORNBLUH: Nope. (Laughter.)
KIRKPATRICK: OK. Well, I would like to treat it as an observation and a comment rather than a question. I think it’s a really good one. I mean, you’re an expert too. Let’s go to somebody in Washington, because we’ve got so little time and so many hands.
KORNBLUH: David, I’m going to combine our last two, here and here. And could you ask one-part questions, not two-part questions, and identify yourself?
Q: Thanks, Karen. John Croft with Northrop Grumman Corporation.
I want to go back to Marc’s original point about certainty and security. In the last three years, a typical company has had to do safe harbor. Safe harbor goes away. You have to find an interim strategy to move your data after safe harbor. Then privacy shield comes along. You register for privacy shield. Then GDPR comes along. You become compliant with GDPR. Now Brussels is saying: Maybe privacy shield’s not relevant anymore. My question is really, what are the prospects for some kind of transatlantic stability between the U.S. and the EU so that data can keep moving? Because it seems to be very, very uncertain right now.
Q: Hi. Charlotte Newman with Amazon. I focus on financial services public policy and not this area for the company, just at the outset—just to say that.
My question is for Karen. So you talked about the historical underpinnings that really—
KIRKPATRICK: Hold the mic a little closer, if you would.
Q: Sure. So, again, Charlotte Newman. I’m at Amazon.
My question is for Karen. Karen, you spoke about the historical underpinnings in the EU that really drove the individual data privacy commissions or organizations that exist across the EU. And I’m curious, here in the U.S., now that we kind of live in a post-Cambridge Analytica world: Was that a sufficient watershed moment to drive the public advocacy and outcry that would lead to a federal law of the sort we’ve been discussing?
KORNBLUH: So transatlantic and will there be a U.S. law. So, Marc, you want to try?
GROMAN: OK. So obviously I think that consistency would benefit all countries and all industry. And we need to promote and have cross-border data flows, which are essential to the fourth industrial revolution and the digital economy. I’m skeptical that, at least in the short term, we’ll achieve that. I think there are questions now about even the viability of privacy shield. I know when we negotiated it—and I was there and participated—we made a very big deal about our Privacy and Civil Liberties Oversight Board, an independent body that had the ability to review our intelligence community.
We don’t have one now. And so—and every time I speak with a European former colleague, they highlight that, and many other things. So I would like to see consistency. And I would like to see certainty around that for industry. I’m skeptical we can get there in the short term. In the longer term, I’m more optimistic. I actually think there are a lot—there’s a lot more in common—and I’ve read, I don’t know, privacy laws from twenty countries—at a foundational level. And I think that we can draw on and find those common principles.
Government surveillance is a whole separate issue—very complicated and beyond this scope—that we’ll have to tee up differently. As for the prospects, I don’t know that Cambridge Analytica in and of itself is the motivator, but I loved Karen’s point that it highlighted, for a lot of people, a very different kind of harm that they really hadn’t thought about before. And obviously it’s prompted hearings and discussions. It’s brought out all of the great competitive tools and swords of K Street and industry to frame the debate. But I think that, combined with GDPR and other issues that are percolating—you know, California and the states, as Karen pointed out, being among them—is really going to force this debate. So I would put that as one element of this moment in time.
KIRKPATRICK: Lynn, I know you had had a comment about what will actually inspire us to act. I mean, quick.
GOLDSTEIN: Just to close us out, which is we’ve had huge data breaches. We’ve had Cambridge Analytica. We’ve had all sorts of events. And we still don’t have an omnibus privacy law in the United States. I’m not sure what’s going to drive us there. I don’t know what it is.
KIRKPATRICK: Well, on that possibly slightly grim note—unfortunately, despite the hands still up, we have to wrap. I’m sorry we didn’t get to everyone. But I think it just shows this issue is unbelievably important, multifaceted, and subject to further discussion. So thank you all very much. (Applause.) And thank you to the panelists in Washington too, and here—sorry, I should have said that.