Political Risk: How Businesses and Organizations Can Anticipate Global Insecurity

A generation ago, political risk mostly involved a handful of industries dealing with governments in a few frontier markets. Today, political risk stems from a widening array of actors, including Twitter users, local officials, activists, terrorists, hackers, and more. Condoleezza Rice and Amy B. Zegart, coauthors of the forthcoming book Political Risk: How Businesses and Organizations Can Anticipate Global Insecurity, discuss how businesses can prepare for an increasingly complex set of political risks, suffer fewer surprises, and recover better.

FARRELL: Good afternoon. Thank you all for joining us on this beautiful summer day. I want to welcome you all officially to today’s Council on Foreign Relations Corporate Program meeting on Political Risk: How Businesses and Organizations Can Anticipate Global Insecurity. I’m Diana Farrell. I’m the president and CEO of the JPMorgan Chase Institute. And I am absolutely delighted to be presiding over today because my connection to this issue and to our wonderful panelists is extensive. In addition to having served in government myself, Condi co-chairs the Aspen Strategy Group now. And I’ve been a member of that for a while. And, of course, Amy and I were at McKinsey together. So I sort of feel like all of these connections are coming together. So it’s a real honor to have with us two authors, and extremely distinguished women and leaders, who really need no introduction. And if you see their bios on your chairs, you will already know half of it, but then be impressed by the other half.

I wanted to start off with, of course, your book. And most of our conversation will be about your book, I hope, because I found it really interesting. My understanding is that the genesis of this book is a class that you’ve been teaching for a while. So maybe you could bring it to life. Why did this book come about? And what made you finally put pen to paper?

RICE: Well, Amy and I both teach in the Graduate School of Business at Stanford. And we started teaching in a course called The Global Context, which was a multi-faculty course. And we taught that for a while. And as we were doing that, it was a course that all MBAs took. And we realized that there was very little in that course, really about political risk as we understood it as political scientists. And so we decided to teach a course for the business school for MBAs, mostly second-year MBAs, called Political Risk. Then we couldn’t find anything to assign, because there weren’t really very many good readings on political risk. So we sort of cobbled things together. And we taught the course, did simulations and the like. And eventually I said to Amy: You know, maybe we should actually write some of this down, since we’ve had trouble finding materials. And that was sort of the genesis. And our students also encouraged us to actually write it down and do a book.

ZEGART: At the end of every quarter, the students would say two things: Make this class longer—no students ever say that—(laughter)—and you really should turn this into a book. So after class one day, Condi said: You know, it’s time. We really should turn this into a book. And so we did.

FARRELL: But they asked for that a.m. slot, did they? (Laughter.)

ZEGART: Class is in the afternoon.

FARRELL: Class is in the afternoon, great. Well, one of the things that really comes out of this book, and I hope you’ve had a chance to buy it coming in or on the way out, is that it organizes a lot of complicated information in really good ways. And I was telling Amy, I do see the former McKinsey-ite and I do see the MBA in all of this. But maybe we can sort of talk a little bit about this. You lay out a very powerful framework, I thought, to understand, analyze, mitigate, and respond to risk. Those are straightforward things. But what you do is really bring them to life. So I don’t know in what order you want to take some of these, but let’s just kind of bring that to life for folks.

ZEGART: Oh, sure. OK, so, you know, I’m always reminded of the line from the movie “Bull Durham,” where they say, “Baseball is a simple game. You throw the ball, you catch the ball, you hit the ball.” But of course, baseball is actually quite a difficult game to play. Political risk is the same thing. So our framework is you have to understand the risks, you have to analyze them, you mitigate, and then you respond. It sounds obvious, but we found many companies actually don’t do these obvious things. So if you take just understanding risk as an example, understanding—you have to look for political risk to find it. So one of the examples we have in the book is legendary CEO of GE Jack Welch, right? One of—the biggest merger at the time, in 2001, was the GE Honeywell merger. And Jack Welch was so excited about the business case that he postponed his retirement, you may recall. Sailed through the U.S. regulatory process. He thought: This is—he said, the cleanest deal you’ll ever see.

Except that European regulators had a different point of view. And they had a different philosophy. They actually were looking for a way to show their independence. And if you had been looking, you would have seen that four years before they nearly scuttled another big merger in the aviation industry. So the risk was there if you were looking for it. And Jack Welch himself said, at the end of this process when the deal got scuttled, you’re never too old to get surprised. So you have to look for political risk to find it. Now, that sounds obvious, but many companies get so caught up in the business case—because it’s quantifiable, it’s immediate—that they miss political risks, which tend to be more abstract and longer term and are often hard to see.

FARRELL: Amy, what is a particularly creative way to look for risk that you’ve seen in a company, something that sort of struck you both as very innovative, as opposed to just throwing out some people to do some Google searches and some analysis? (Laughs.) What are creative ways to look for risk?

ZEGART: So we found a variety of mechanisms that companies use. And we don’t advocate one particular best answer, because it really depends on the size of company you are, what kind of business you’re in. But just to give you a range, one company, called Paychex, has something that they call—you’ll appreciate this, right—we use a lot of sports, actually—

RICE: Sports, actually, yes. I wonder why. (Laughs.)

ZEGART: So a tournament of risk. So like March Madness, the NCAA Tournament, they have a tournament of risk. And it’s a way to get people talking about risk. So they have the business leaders throughout the company—not just the CEO, but it’s really led from the top down—and they have risks go head-to-head, where business leaders actually have to vote on which risk that year they think is going to be more important. So more than just identifying the risk, it’s a way to get conversation across the units of the business about how do we prioritize risk, what’s our risk appetite, do we know what it is, do we agree on what it is? Because that’s—you have to understand your own risk appetite, before you look at opportunities out there.

RICE: And you have to understand the particulars of your industry. We—I was an oil company director. I was a Chevron director at one point. And I remember saying to the then-CEO, Ken, is it true that you only find oil in unstable places? And he said, yeah, it may seem that way. He said, probably the most unstable was actually Santa Barbara, because—(laughter)—apparently that’s the only concession that Chevron was never able to really recover. And so—but we know that for a company—for an oil company, you follow the geology. But even within oil companies, there are different risk appetites given the size of the company, what the share of the market looks like, what your other options are. And so we try to be—to ask companies to be systematic also in looking not just at your industry, but your specific profile within that industry, and what are you trying to achieve.

Because we didn’t want—I think in some of our first drafts, we felt that it sounded a little bit risk-averse, if you will. There’s political risk, and that’s to be avoided. But in fact, you’re always going to have risk. The question is, how do you size it, how do you work to make sure that it doesn’t overwhelm you? But you don’t want to forgo opportunities because they’re risky.

FARRELL: And I found particularly compelling the exposition that you both have around aligning people on risk. Because within an organization not everyone will see it the same way. And I thought you had some good examples on that. If we move further on in the framework, let’s talk about sort of mitigating and responding to risk. There too you brought up some very good examples. What are the most important things that people can do to mitigate risk, if they’re able to prevent it, and, if they can’t, to respond to it? So maybe each of you could take one of those.

RICE: Well, if you take mitigation, for instance, you need to mitigate before you need to mitigate. And so one of the things is what can you do in advance. So let’s say that you are going to go into a risky proposition because it is in a country where there are ethnic issues, ethnic divisions, where it’s known that there’s even been violence around ethnic divisions. Well, you’re going to want to have allies under those circumstances. So you might seek allies in civil society, making sure that you’ve sought people’s views, that you’ve kept people informed so that if something happens, and something will happen, you already have people who are—they may not agree, they may still criticize, but at least you’re not out there on an island by yourself.

And so thinking through mitigation strategies ahead of time—who do you want on your side—is one of the ways that you can help to mitigate. And one thing that we do in the class is that we actually do a lot of simulations. And very often, the students—one of the things that they don’t really think through is how do you work with stakeholder groups, how do you work with those who can ruin your day if they’re not brought on board, before you need to bring them on board?

ZEGART: And so a couple of points. The most important thing is relationship building. We—you know, our colleague George Shultz always likes to say: Diplomacy is gardening, right? You have to tend the garden. So you don’t want to have the first difficult ask be a moment of crisis. But in addition to that, we found—and we talked to leaders of a variety of different companies to find out what they’re doing in the real world. So it’s not just, you know, what we dream up in our heads. It’s really based on what we find in the real world. And one example that we found really compelling is FedEx.

So every night of every day of every year, FedEx flies an empty plane from Denver to Memphis, their super hub. Now, why would they spend all this money flying an empty plane? It’s so that they have that spare capacity to pick up packages that are somehow not picked up in the way that they should be every day. Because, as Fred Smith told us, he’s not delivering packages, he’s delivering trust. And people are sending very important things through FedEx. And they better get there on time, right? That’s their value proposition.

And so what we find is, as many of you all know, these dramatic changes in supply chain management mean that just in time has replaced just in case. So we have very lean, long, and global supply chains. But if there’s not any slack in the system, it’s very hard to mitigate risk, because the chances are growing that political risk or disruptions in your supply chain are going to happen somewhere in the world, given how many nodes there are on the supply chain. And so you need to have more slack and fly that spare plane, fly that extra plane.

RICE: Yeah. To the degree that you can, redundancy is a mitigation strategy. Now, you have to decide how much redundancy can you actually afford? I actually remember when Katrina hit, and I was secretary of state, and our passport center in New Orleans went down. We were able to use redundancy in other places to deal with the issues. And so redundancy can be important. But of course, redundancy costs. And one of the kind of themes that runs through the book is nobody ever gets rewarded for preventing something because it didn’t happen.

FARRELL: Right.

ZEGART: Right.

RICE: And so you actually have to value the people and the efforts that are going to prevent things from happening, when that’s actually very hard to do.

FARRELL: Very hard to do.

One of the things that the book does extremely well, I think, is kind of expand the ways in which one should think about political risk. And you list out 10 categories of risk: geopolitics; internal conflict; laws, regulations, and policies; breaches of contract; corruption; extraterritorial reach; natural resource manipulation; social activism; terrorism; and cyberthreats. So that’s a lot of risks. Maybe you could highlight the ones that you think people are not understanding as well as they should, although you might argue that that was true about most of them, and which ones are newer? Which ones are the ones that are really popping up on the horizon now?

RICE: Well, let me—I’d like to talk about geopolitical risk and—as one that, interestingly, everybody would say, all right, we’ve always known there are geopolitical risks, you know? If you’re an oil company, you know there are geopolitical risks. But actually, the kind of framework that people often have in mind when they’re—when they think about geopolitical risks is a rather old framework that came from the days when you worried about the socialist dictator who expropriated your—

FARRELL: Your Chavezes.

RICE: Your Chavezes and nationalized your property. And so there is a kind of overhang of thinking of geopolitical risk and meaning that. Now, when you think about today’s geopolitical risk, you have to think about the fact that we are in a period where great powers are behaving badly. So the Russians are behaving badly. You might think, OK, well, that’s really a risk if I’m trying to do business in Siberia. But it turns out that the most recent example of a political risk that came from great powers behaving badly is that great power interfering in American elections, using a social media platform in exactly the way that it was supposed to be used, by the way, and then getting that company in a lot of trouble.

Now, that you wouldn’t think of as geopolitical risk. But of course, because it is a great power with really kind of interesting capabilities at its disposal, and using this platform, that became a political risk for that company. So I think you have to start to think of geopolitical risk and not just kind of old-fashioned, what can the government do to me in its own territory. Governments have reach, particularly big governments and big powers.

ZEGART: And they have reach in more industries and in more ways than they had before. I would say, to your question, Diana, about what’s the newest or hardest risk—I would say, I don’t know if you agree, social media activism.

RICE: Yeah, absolutely.

ZEGART: So we spent a lot of time talking about what’s going to be on the list and why. We had a lot of sessions about that. And social media activism in some ways unfolded as we were teaching this class. And the reason why it’s become so difficult for companies and governments to deal with is we have now lowered the cost of collective action, whether it’s protests in a country or whether it’s protests against a company. And so we start the book with my favorite example—which Condi says I like too much—which is Sea World.

RICE: (Laughs.) I just said we didn’t have to cite it quite as many times as Amy—(laughter)—

ZEGART: Every time in the manuscript I’d say, and Sea World. And she’d say, I think we covered that already.

RICE: (Laughs.)

ZEGART: So, Sea World is an example of here’s a company that loses more than 50 percent of its shareholder value, still hasn’t recovered five years later, because of a $76,000 documentary called Blackfish that was made. And there was an infrastructure of animal rights activists. And this went viral. Now, who would have predicted that this cheap documentary film would provoke a massive change in consumer behavior, which then had cascade effects? So political risk isn’t just: I’m a government and I do this to Condi’s company. Political risk is the cascade effects of one event. So in Sea World’s case, this documentary causes a sort of viral sensation on Twitter. Well, what happens next? The California Coastal Commission gets involved and passes regulations about orcas in captivity. Congressional hearings, investigations. So what started off with consumers quickly triggered governmental questioning and intervention that exacerbated Sea World’s problems.

RICE: And you don’t even have to make a documentary. You might just be a passenger on a plane with a cellphone and notice that a passenger’s being treated roughly by the crew. And you take a picture, and before you know it the CEO of United Airlines is defending himself before Congress. And so you’re getting these multiple sources of this. And one other point that we make is the social activism is also now transnational. And it’s organized across borders. And so if you are—have a fire in your factory in Bangladesh, it’s bound to get the attention of transnational human rights activists, not just activists there.

FARRELL: So, Amy, I was surprised—although I shouldn’t have been too much—that you would put social media on the top of the list. I know it’s important, but you’ve spent a lot of time in the world of technology and cyber. And that too has taken on a whole new meaning in the last while. Maybe comment on that, since I know CFR does a lot of work in that area and many members here are interested in that.

ZEGART: So when we first started the course we had—we talked about cyber, but increasingly we devoted more time to cyber. And so cybersecurity is—one of the—one of the quotes we have in the book is it’s not just for sexy companies anymore, right? So whether you’re a building manufacturing company or an airline, cybersecurity’s affecting all of them in ways that we hadn’t thought about before. There are some fundamental challenges with cybersecurity. Most companies will say that they’re very worried about it, but very few have actually done any kind of systematic assessment.

So they know it’s a problem, but they don’t quite know what to do. So I did a session at Stanford for executives, and I asked the crowd three questions. I said: Have you been breached in the last year? Do you have a cybersecurity strategy? And have you stress tested that strategy? And if your answer to any of those questions is no, you’re in trouble. And I got some quizzical looks. Well, what do you mean? If I haven’t been breached in the last year I’m doing well. I said, well, you just don’t know it.

FARRELL: You just don’t know it, right, right.

ZEGART: So, you know, the three numbers that I always try to keep in mind are 11 minutes, 97 percent, and 205. One of the Google directors reported publicly a couple of years ago that they are fending off a state-sponsored cyberattack on average once every 11 minutes that they have to tell their customers about. That’s just serious enough to tell their customers, and that’s just state-sponsored attacks, right? Ninety-seven percent of Fortune 500 companies, according to some studies, have been the victim of a cyberattack. It’s probably 100 percent; the other 3 percent don’t know it. And the median amount of time before you detect a breach, according to Kevin Mandia, is 205 days. That’s a long time to have adversaries mucking around in your, you know, proprietary information before you know it.

FARRELL: That’s a lot of damage.

ZEGART: And it’s also an area where the Edward Snowden affair has done damage to our ability to actually respond and react. So when it comes to terrorism as a threat, I’ve never seen a company that was not willing to, anxious to work with the government to try to figure out the terrorist problem. And indeed, one of the points we make is that if you’re, for instance, in the hospitality industry then you’ve become a soft target because we have hardened so many hard targets. But they are very willing, and anxious indeed, to work with the government, to know what’s coming, and so forth.

When it comes to cyber, though, there is no worse example of government-private interaction than cyberspace, where for a variety of reasons companies don’t want to be, quote, “caught” working with the government. And I think it really does come down to the lack of trust out of the Snowden matter. And this is particularly true out where we live, in the Silicon Valley. And yet, the government has some tools that you probably want to take advantage of, particularly in state-sponsored attacks, where questions like attribution are much easier for the government to figure out than they are for you to figure out. And you want the protection of the government on some of these issues, because while the government may not have all of the high-techy hacker abilities that the companies in Silicon Valley do, they have something called intelligence. And they can marry them up in ways that private companies can’t.

And so we find that there is a lot of reluctance to work with the government, to tell the government what’s going on, and then to share best practices. So about three years ago, four years ago maybe, Mike Rogers and the intelligence committees tried to put together just a sort of very basic understanding between the private sector and the government on how to publish and share best practices. And it got nowhere.

FARRELL: Thank you. You’ll see in these examples that even those risk categories that you thought you understood are being opened up in this book. So that was interesting. I thought another aspect of what you all exposed well was the levels of risk. So you mentioned the exposure to an individual dictator, there are local organizations, there are national governments, there are transnational groups, supranationals. Bring that to life, because it is complicated and there are many levels of risk that interconnect. Your example of Russia was a particularly good one.

ZEGART: I think—you know, if you think about it—we naturally go to the national security world and then think about what’s applicable in the business world. If you look at the threat environment that the United States confronts today, it’s much more complex. And it’s at many more levels than, say, during the Cold War, where the question, what would Moscow think, was the sort of centralizing question for a lot of different policies. So we have five levels of risk generators that we talk about in the book. And of course, they intersect and overlap. But what we wanted to do was just delineate them analytically so it’s more helpful to think through what’s going on at each level.

So much like we’re worried in the national security world about the diffusion of actors—what can a lone individual do to me today—businesses have to worry about that too. So we talked about collective action, right, social media. So small groups or lone individuals can have very outsized effects on companies today. So that’s the first level. The second level is local organizations. So, not-in-my-backyard movements, for example, which are taking on greater resonance. And then you move up. And we have, of course, national governments. And we actually had a robust dialogue about two different kinds—I don’t know if you want to talk about whether it’s the unity of national governments that’s a risk or the bureaucratic fragmentation of national governments that’s a risk.

FARRELL: (Laughs.) Or both.

ZEGART: Or both. So there’s that, which people are more familiar with. But there are also local government officials. So we have one example in the book of local officials in Manila who wanted to solve a local problem. They had traffic jams during the day. And they came up with this idea, let’s ban trucks for 16 hours a day, so that people who live in Manila can get to where they want to go. It sounded like a good idea, except that Manila, as you may know, is the most important port in the Philippines. And so suddenly the port grinds to a halt. So if you’re Toyota, which is a major manufacturer that uses this port to put together part of its supply chain in the region, this has a big impact on your business. And so the national government ended up having to step in and work with local officials to reverse what was a domestically oriented move. So local officials are now generating risk because we have a global supply chain. And then we have, of course, transnational groups and international and supranational institutions, like the U.N. and the European Union.

RICE: And the European Union is a particularly interesting one because, as you know, it has three major institutions, the parliament, the council, and then the commission. And one of the things that causes a problem with the European Union is you often have overlapping jurisdiction or unclear jurisdiction. And so if you are talking about fracking, for instance, if it is energy policy then it is the purview of the council and of the member states. So you have energy policies as different as the French, with 80 percent of their generating power from nuclear, and the Germans, who have shut down civil nuclear. And on the other hand, you have the commission that has an environmental policy. And so is it environmental policy or is it energy policy? And so you can get, particularly with the supranational organizations, these kinds of overlapping jurisdictions, and where do you go?

FARRELL: So you’ve both made an incredibly strong case for why this is so important and why companies should spend a lot more time. But why is this so hard? You mentioned one, which is it’s just hard to reward things that don’t happen. But what are other things that make this so hard?

ZEGART: OK. So there are four other things that make it hard. Political risk is hard to understand. We’re terrible with numbers. We—all of us are terrible with numbers. Even our very smart MBA students, right? Probabilities are hard for us to fathom. So why are people more afraid of dying in an airplane than in a car accident, or of shark attacks versus heart attacks? It’s, you know, the cognitive biases that we read a lot about, right, the availability heuristic. If we can readily recall something, we think it’s more likely to happen in the future.

One of my favorite biases that we found in the research is the optimism bias. So we all tend to be more optimistic about our futures, our investments, even our favorite NFL team winning.

RICE: Well, not if you’re a Cleveland Browns fan, like me. (Laughter.)

ZEGART: And so optimism bias also gets in the way. Even when we want to try to understand the risk world, we can’t very easily. So it’s hard to understand. It’s also hard to measure because risk is squishy. Political risk is about the intentions of others. It’s not just about GDP per capita. So it’s harder to measure in a quantitative way. You have to deal with perception and emotion, national mood. And so we have some examples of that.

Perhaps my favorite hard that we talk about—why is it so hard for companies to deal with political risk—is it’s really hard to communicate risk. So we do this little exercise with our students every year. Imagine that we were giving you a pill. And it’s the beauty pill. And if you take this pill, you’re going to look the best you ever looked in your life. Whatever that age may be, whatever that haircut was, for the rest of your life. And that pill is 99.9 percent safe, no side effects—99.9 percent safe. How many of you would take the beauty pill? All the hands go up, right? Your hand—right. Thank you. (Laughter.) Everyone else’s hand wants to go up. And then we say, so now let’s say we tell you that if you take the beauty pill, there is a 1 in 1,000 chance that you will drop dead. How many of you would take the pill now? Hands go down. Statistically speaking, they’re exactly the same, right? So how you communicate risk really makes a big difference in what you do about it.

FARRELL: Mmm hmm. Great. Well, my last question, before I turn to the audience, you’ve answered, Amy, which is as you’ve been teaching all these—business school is such a case-oriented thing—what is your favorite teaching case? And clearly yours is Sea World. (Laughter.) What is—what is one of your other favorites?

RICE: Well, I’ve enjoyed watching the students—and by the way, this has evolved over time—deal with the question of whether or not you would take an oil concession that is contested between the Kurds and Baghdad. And what assumptions do you make about the possibility that you will have a concession that is worth nothing if—and then we actually use the examples. These are different sizes of companies. And usually when the company is very small and has no other option, they’re willing to go for it. But we were surprised at the number of students who are just willing to take the chance, right? They just are willing to take the chance that Baghdad and the Kurds will work it out, or it won’t matter, or whatever. So that’s always been one of my favorite cases.

We also have done a very interesting case on cruise liners in Mexico. And we paint a fairly grim picture of the violence issue in and around certain parts of Mexico, and what is your strategy? Because of course, as one of our students said, if anything happens to any cruise liner, the whole cruise liner industry is in trouble from then on. And so what are your mitigation strategies? What are you going to do? And interestingly they split about half and half between those who just want to get out and not deal with it—they’re going to go to the Caribbean or someplace else—and others who say, you know, we’re going to stay in this market because it gives us a real advantage. So we find that there’s a lot of—a lot of variation on some of the cases and not much variation at all on others.

FARRELL: Great. So we are now at the time when I get to turn it over to members to ask questions. And you can see how big this group is. So let me ask a couple of things. This is on the record, so keep that in mind. Please wait till the microphone makes it to you. And we have people on the side to do that. Please stand so everyone can hear you. Introduce yourselves. And please keep the question brief so that many other members can participate. I saw the first one up right there. And I will try to get around the room as best as I can.

Q: Yes. Thank you. Great presentation. Bill Courtney with RAND Corporation.

Recently the U.S. government took two actions in the metals markets that seemed to have unintended consequences. One was tariffs—additional tariffs on steel and aluminum—and the other was sanctioning a Russian oligarch who was important in the aluminum industry. In both cases, the government seemed to have to backtrack, with waivers on tariffs, and then the Treasury Department gave more time for consumers of aluminum from Rusal to back out of contracts. Should the government be required to do some kind of modeling before it takes actions that can have big political risks?

RICE: You know, theoretically it would be a terrific idea. (Laughter.) Do I think the government’s going to do it? Not likely. I mean, people do actually—I was actually in the administration when we did some protective action in the Bush administration on steel and on aluminum. And people do go through the calculations of what the backlash might be, and what the consequences might be. But as Amy was suggesting, the problem is it’s sometimes hard to look around that corner. You should probably know that you’re going to have an effect on the markets if you do this, but there are always these other actors, and how exactly they’re going to react is just very hard to predict.

And usually—and this may be the difference between now and then—at the time there was a kind of tolerance for administrations doing these, you know, anti-dumping moves, legislation and so forth, because it was really understood to be a domestic policy issue, that some president had in some campaign promised the state of West Virginia or something that they were going to take action. And so they take action, and then that’s the end of it. It’s a sort of you’re going to keep your campaign promise and then that’s the end of it. I think the reason you’re getting a different response this time is that people are not certain if this isn’t really protectionism. Nobody really thought that Ronald Reagan or George W. Bush were protectionists—or Bill Clinton or Barack Obama for that matter. But if you really do have protectionism, now the imposition of these measures, which other administrations have used, carries a different kind of risk. So I’d like to think governments would model it, but I’ve actually not ever seen them do it, though the arguments are there.

FARRELL: Excuse me. We had a question right up front here.

Q: Yes. I’m Ray Tanter from the Iran Policy Committee and the American Committee on Human Rights.

I have one question for Dr. Zegart. When I was a visiting professor at Stanford, I worked with Amos Tversky and his colleagues in the decision analysis program. And have you incorporated your work into the rubric of the Tversky decision analysis framework?

And for Dr.—Professor Dr. Rice. Hello.

RICE: Hi. How are you, Ray?

Q: Another book, your book on democracy. I tied the Richard Haass book on A World in Disarray with your book on democracy and The New York Times article by Walter Russell Mead. And how would you relate your democracy book to this particular book?

FARRELL: OK. Good. I think that was two questions. So you get a half-answer each. (Laughter.)

ZEGART: It was a valiant effort to smuggle two questions into one. (Laughter.) So, yes, we do incorporate Kahneman and Tversky. One of the things that we wanted to do in this book was to marshal the best of social science, not just political science—although we do have a two-by-two matrix which is a requirement for political scientists. But to marshal the best thinking in the social sciences and apply it as best we can to the real world so that it’s usable for business leaders. So, yes, we do incorporate that into it.

RICE: Right. As to the relationship between the democracy book and the political risk book, so I think it’s an interesting question. Which is riskier? To deal with a democracy that has multiple voices, multiple points of failure, multiple veto groups but rule of law and transparency in its activities because by its very nature it’s transparent, or to deal with an autocratic government that may be of a single voice, can actually efficiently execute policy, but can also—is not transparent, does not have rule of law, and can change its mind? And so that’s an interesting question. I’ll take, myself, the transparency, rule of law. And I think what you see is investors tend to take transparency and rule of law over the ability to act in an autocratic and efficient way.

And the other point I’ll make is that sometimes we get very frustrated in democracy because we can’t get things done. Remember that autocrats efficiently exercise—efficiently carry out both good policy and bad policy. So China efficiently executed the One Child Policy. And now 34 million Chinese men don’t have mates. So I think the autocracy bias that we’re starting to see—I almost call it autocracy envy—(laughter)—is somewhat misplaced. I’ll take the transparency and rule of law.

FARRELL: Excellent. Yes, please.

Q: Barbara Slavin from the Atlantic Council.

Madam Secretary, pleasure to see you again after so long.

RICE: Nice to see you.

Q: I have to ask you a foreign policy question, since I’m not a businessperson. You were very effective in uniting the United States and Europe toward a policy toward Iran when you came in. We are now in a situation where the president of the United States looks like he’s going to walk out of this nuclear agreement, and the Europeans are desperately trying to find some way to keep him in. What do you think U.S. policy should be? What do you anticipate would be the second- and third-order effects of a U.S. withdrawal on our relations with Europe, on the region, any—you take it anywhere you’d like. Thanks.

RICE: Sure. Well, Barbara, let me start by saying I didn’t support this deal. That’s on the record. I didn’t actually think it was a particularly good deal. And I think that there were better deals available, given what I think was Iranian desperation at that point to relieve sanctions. That said, I have said that I would have stayed in the deal, largely because of the alliance management issues that you mention. And the allies are quite wedded to it. That said, I don’t think we should overhype what happens if the president decides to leave. First of all, he’s given plenty of warning that he does not like this deal, that he wants out of this deal. It’s not like he did it on day one, which I think would have been—you know, when people run for office they have a tendency to say: On day one I will. Well, on day one they don’t. And fortunately, we’ve given it some time. And I hope that perhaps in the conversations with Macron and Merkel, they came to some understanding of how this could be done smoothly and, if we do leave the deal, what we might do next.

And I would say that there’s plenty of room for looking for ways to improve the verification particularly. I know there was a lot of pooh-poohing of the Israeli presentation, but I’ll say one thing: I think I know a lot about this. And there was a lot there in terms of the depth of the Iranian cheating about the program. And so the baseline on which the deal was based looks to me not very clear, not very accurate. So maybe it is a good time to go back and see if you can improve the deal. But it’s not as if they haven’t known this was coming for some time.

FARRELL: Yes. I’ve got a few here and I’ll come back around. Please. Yes. If you don’t mind waiting for the microphone, please.

Q: Hi. Allison Binns with Chevron.

You’ve talked about a number of cases in which shareholder value has been impacted by mismanagement of political risk. Can you talk about what investors should be looking for to make sure that these risks are being managed correctly as far as equities valuation, and whether or not they should be pushing for more disclosure around how these are managed?

ZEGART: Well, the big message, I think, of our book is that you want to look for companies that take political risk seriously, that manage it systematically, and that lead from the top. So it’s not just the sort of C-suite that needs to make it a priority. Is the board of directors focused on risk? Where is risk located in the board of directors? We don’t take a stand in the book, but from my own experience sitting on a board, I think that the audit committee is probably for many companies not the best place to think about risk, because it’s compliance oriented as opposed to creativity oriented, because risk is also an exercise in imagination, not just analysis. So I would look for whether companies—and Chevron is one of them—have a robust political risk unit, and whether that unit is forward deploying throughout the company. Because what we heard in the interviews we did with Chevron folks was that they know they have to be useful. They don’t want to be the Debbie Downer of every conversation, right? You can’t go here because of this risk, to Condi’s point earlier.

So is it integrated into the business in a systematic way? So if it’s just—if political risk and risk in general is just a function of the risk folks down here and not integrated throughout the company, then you’re not going to excel in seeing those risks around the corner.

RICE: One thing that’s happening in some companies, as cyber risk has become so much greater, is that you’re actually seeing boards of directors seek to put among them people who really understand cyber. And so we had a couple of companies saying we’ve actually sought directors who have that as a kind of specialization, you know? And you usually look at a board and it’s somebody who’s had government experience, and somebody who’s a former CEO, and maybe somebody who’s a lawyer and so forth and so on. Well, now you’re asking yourself, given the threat of cyber—and, by the way, it’s not just being hacked, it’s all the things that could happen to you through cyber—whether or not it’s worth actually putting that expertise on boards. I think it’s a very interesting idea.

FARRELL: We had a question back there somewhere. Thank you. Yes, I’m looking at you in the red shirt. Thank you.

Q: Hi. Pamela Bates from Securitas Global Risk Solutions.

We are a specialty insurance brokerage for political risk. And I just wondered if you could talk about the role of political risk insurance in decision-making, along with trade credit insurance and, now, the emerging field of cybersecurity insurance.

RICE: This is yours. You always talk—Amy talks about this a lot, so I’m going to—

ZEGART: She’s heard enough from me talking about insurance. (Laughter.) So insurance is certainly one mitigation strategy. It’s obviously a really important mitigation strategy. But of course, there’s moral hazard with insurance. The more you insure, the more you think you have it covered. And particularly with cybersecurity, the moral hazard challenge is certainly an issue. The more that you think that you’ve got secure networks and you’re insured against any kind of downside, the less impetus there is to deal with the weakest link in cybersecurity, which is the human being, right? So we all know that humans are the biggest danger when it comes to cyber vulnerability. And you can’t eliminate that risk.

We’re starting, actually, a pretty—a large new project at Stanford looking at cyber insurance, the industry, because many companies have found—and we have a case in our class—that there isn’t enough insurance. They try to insure and mitigate risk by purchasing insurance, and they can’t get enough of it. So Target and Home Depot each tried to get more insurance than they were able to. They were both hit with major breaches in the same year. So it’s a vital part of risk mitigation. But to Condi’s earlier point, one of the more—I wouldn’t say more important—but equally important strategies is building relationships with external stakeholders, whether they’re third-party NGOs, whether they’re government officials where you’re investing, whether they’re local communities, so that you have those relationships that can come to your defense and help you understand risks around the corner before they’re knocking on your door.

FARRELL: Great. I’m looking at this side of the room. We had some hands up earlier. No takers. OK, I’ll come back to you.

Q: Hi. I’m Matthew Assada. I’m a Foreign Service officer at the Department of State.

And my question is about incentive structures. How do the incentive structures in the corporate world, in government, in national security—how can they be changed or modified to encourage employees at all levels of the food chain to think more about risk? Is it an evaluation process? Is it, you know, through monetary bonuses? What research has been done on this particular aspect of incentive structures?

RICE: Go ahead. OK, so I was just going to say, you know, there’s a whole literature on incentives, which I used a lot as a political scientist because I was interested in how institutions incentivize certain kinds of behavior, particularly military institutions. And it’s really interesting that when I was then trying to get certain behaviors in big organizations—whether as provost of Stanford or secretary of state—I found the way people thought about incentive structures totally un-useful, because the first thing with incentive structures is to make sure that you actually know what it is you’re trying to incentivize, right? And it’s extremely important to know that when you’re trying to get an organization to change.

So just to give you an example from State, you know, I asked how many awards do we have in the State Department for civil-military cooperation? This is a time when we needed foreign service officers in Kabul and in Baghdad. And so, none. How many did we have for democracy promotion? At the time, none. How many did we have for political reporting? Thirty-five, right? At a time when, frankly, a lot of the political reporting was coming directly from contacts. So just use that as an example: incentive structures also have to change when you know what it is you’re trying to incentivize. So when you’re looking at risk, you definitely want to incentivize people to be aware of risk, to sound the alarm bell about risk, to find a way to get it up the organization. But you don’t want to incentivize Debbie Downers. And you don’t want to incentivize crying wolf, because if you do that, then the risk people are not going to be listened to, and when there really is a real risk they’re really not going to be listened to.

And so I actually think the question of how you build incentives into an organization on the risk side is very difficult. And I would look to whether you can incentivize people to have strategies for understanding it, for mitigating it. One of the things is that if you can have strategies for mitigating up front, maybe you don’t face these horror situations down the road. And if you can also incentivize people to have response plans in mind. So one of the things that we do in the class is we look at how Home Depot and Target responded to hacks that took place in very close proximity to one another. And one organization had really not terribly good marks on almost all of our risk profile measures. But, boy, they responded really, really well. And so when you’re looking at building incentives into a structure, ask yourself all the time: What are you actually trying to incentivize in people?

And one—just one point that Amy made that I just want to follow up on. Incentivizing humans not to do stupid things—the number of companies that have experienced this because somebody clicked on a virus and should have known not to click on an email that doesn’t look familiar, you know, how do you incent people not to do that?

FARRELL: Yes. We’ll start again around here, and then you.

Q: Odeh Aburdene, the Capital Trust Group.

I would like to ask you about the Aramco deal. Many U.S. investment banks will participate in underwriting. It might be the largest oil company on Earth. What are the risks for U.S. investors, for U.S. banks? And what are the political risks for the Saudis in case they list in New York, or London, or Riyadh?

RICE: Well, when you are moving from state-owned to private, there are innumerable risks, right, because of course it goes back to the question of transparency, and are the measures in place in what have been state-owned operations and enterprises that one would expect if you’re actually dealing in a market environment? And so the question of how much do you know I think is always an issue. You’ve dealt with state-owned enterprises. People will deal with state-owned enterprises in China. I certainly have seen state-owned enterprises go from state-owned to private in Russia, in, actually, far too quick a fashion. And this question of how you get transparency and really know what you’re getting into I think is really the issue.

It doesn’t mean that you don’t do it, because obviously something of that size and importance and potential wealth generation, you’re not going to get people not to do it. But knowing the right questions to ask so that you can get to the core of whether or not you really know what you’re getting into, what has been the history of that, what’s hidden, that really, to me, is the question. From the Saudis’ point of view—and, look, I believe that the Saudi crown prince is trying to do some extraordinary things. And I think on balance if he succeeds Saudi will be a more liberalized place, and that will be good for the future of the Middle East. But I’ve often found that when people go from state-owned to private, they are not aware of what it is really like now to be in private markets, and the questions that get asked.

I look at what has happened to the Chinese when they tried to open their markets, and they suddenly realized stocks could go up and stocks could go down. And they actually told a lot of citizens—ordinary citizens, go ahead and invest. And then, of course, markets went down, and then they had social unrest. So I very often find that people who deal in state-owned enterprises just don’t really understand the market. And they think they do. But they don’t understand that markets are unpredictable, volatile because there’s so many actors. So there are risks both ways. But I will tell you, the opening up of Saudi Arabia could be one of the major and most important developments in the last 50 years in international politics.

FARRELL: Right behind, yes. Nice to see you.

Q: Hi. Dov Zakheim—(off mic)—CSIS.

MR.     : Dov, say your name again.

Q: I’ll say my name again, Ray. Dov Zakheim. (Laughter.)

One of the boards I sit on has a risk committee, although it doesn’t define political risk anything like the way you have. Do you think, because most boards don’t have one, do you think that’s something they really should have? And to the extent that they should have it, you know, board members aren’t as sophisticated as you are and don’t write books. What should they be thinking about?

ZEGART: So when we wrote this book we asked—we did not want the book just to be for a Fortune 100 company. We wanted it to be useable if you’re a 25-year-old Silicon Valley MBA from Stanford starting a company. And so there are some back-of-the-envelope things that you can do. I tend to think it’s really useful to have a board that is focused on risk generally—political risk being a subset, of course, of a lot of other risks. But one piece of advice we got from a prominent investor has really stuck with us. And that is just ask a simple question: What if we are wrong?

That alone—and what they do is they train everybody in the company to say: OK, here’s our modeling. But what if we’re wrong? Are the returns going to come in time for us to make this a useful investment? In the case of Saudi Arabia, if we’re wrong are we really going to be exposed here to downside risk that we can’t handle as a company? So risk is very individual, right, as we talked about earlier. The first step is being self-aware about what business we’re in and how we think about the world. But just the simple act of having a board that always says, this sounds good, but what if we’re wrong, can go a long way.

FARRELL: Great. Yes, please.

Q: Allan Gerson, AG International Law.

Particularly to Dr. Rice, I’d be very interested in your assessment of your own intellectual evolution since returning to academia and looking at decision-making through the prism of decision-making by businesses. If you were to return to a senior position in government dealing with national security decisions, how might you look at things differently?

RICE: Well, I’ve been back and forth a couple of times. And so one of the great things about going back is you have time to reflect on what it was like to be there, so to speak. And I think that when I go to teach about decision-making to my class, I try to get them to see that the biggest and hardest thing about being in those positions in government—particularly when you’re going through a period like we were going through after 9/11, when there wasn’t much of a compass and there wasn’t much of a playbook, and the United States hadn’t been attacked on its territory since the War of 1812, and things were coming at you that you really never expected.

The problem is that you actually have to do things on the fly. You don’t have an option to sit and have 15 different options papers about how you’re going to protect critical infrastructure the day after you get a terrorist message that they’re going to go after critical infrastructure. And so you have to develop a muscle that I think sometimes we don’t develop very well, which is the ability to know how to go to the essential questions, where to go to get the essential questions answered, and how to put something in place that can evolve. The one thing that I think I learned about times like that is don’t assume that what you put on the table at that moment is going to survive the test over a long period of time. It’s got to be able to evolve.

And so I’ll give you an example of this: Homeland Security. I remember the days immediately after 9/11. We looked around. The day before we’d actually had a National Security Council meeting and it was, you know, the chairman of the Joint Chiefs of Staff, and the president, and the secretary of state, secretary of defense. The day after 9/11, we go to the National Security Council meeting, and it’s the people who do borders, and it’s the Treasury, and it’s the people who do energy and so forth. And I’m thinking, I’m supposed to make sense of this. And so we started right away just saying, OK, you’re in charge of critical infrastructure. It was Larry Thompson, who at the time was the deputy AG. And he said to me: I didn’t know anything about critical infrastructure protection. I said, Larry, nobody knew anything about critical infrastructure protection. But you were good at your job. We figured you’d be good at this one. (Laughter.)

So recognizing that you’re going to have some temporary solutions, and let them evolve. We often think of policy as something that’s fixed: I made a decision, and this is how I’m going to carry it out. Having the ability for things to breathe, change, evolve, is extremely important. And I don’t think we have that muscle very often in government. We just aren’t able to do that.

FARRELL: Yes, please.

Q: David Slade, Allen & Overy.

I understand that there’s a bill that’s been introduced in both the House and the Senate to dramatically expand OPIC, the Overseas Private Investment Corporation, our primary government agency in charge of political risk insurance. Should everyone on Capitol Hill be reading your book right now—(laughter)—or has political risk moved on so far beyond that that there’s not much of a role for government to play?

RICE: Yeah. You know, I worked a lot with OPIC, particularly when we were trying to figure out what to do with the collapsing Soviet Union. You’ll remember that, David. We were interns together at the State Department back as kids—when we were kids, which we’re not anymore. So the whole idea initially was really to safeguard the investment for risky—so that people would go into risky environments. I actually am not sure that I think that’s the way to think about the government’s role. I would rather have companies actually go into environments in which they’re willing to take the risk, either because they believe it’s for good economic purposes, they want to be in the market, they think they can mitigate and manage the risk.

And it’s a little bit of a moral hazard. I think if there’s too much government backing, it’s a little bit like what we got into with mortgages, right? If there’s a little too much government backing, then you don’t really think it through. I don’t know what the expansion of OPIC looks like. I haven’t really looked at the bills. But I’d be very careful that you’re not actually—to the point of incentives that was made—that you’re not actually incenting people to do things they shouldn’t do. I actually think some of the decisions that people made about the former Soviet Union were probably not good decisions on the face of it, because it was a cheaper way to go in.

FARRELL: Well, I’m going to hold the last question, if you don’t mind, mostly to celebrate CFR for having an all-female panel. Thank you very much. (Applause.) These two pioneering women in their roles, having done such extraordinary work, and just to ask if you think there is anything—I know both of you are very active in women’s leadership and getting people—that speaks to geopolitical risk and women.

RICE: Go ahead.

ZEGART: Thanks. (Laughter.)

RICE: Amy was my student, so I can always throw the harder questions to her and then fill in.

ZEGART: I think that the sort of analytic thread through our work—and this can apply to the issue of women—is that you have to bring different perspectives to bear. Now, sometimes those perspectives are correlated with gender. Often they’re not. But the more that you have people who genuinely bring different perspectives to the table, the better your thought process can be, whether you’re an FBI Joint Terrorism Task Force or a corporate board. So I think we need to be thinking broadly about that. If we have organizations that automatically exclude half the talent in the nation from participating in that process, that’s a problem. So we need to be able to harness the talent, wherever it resides. I think that’s the big takeaway for me.

RICE: Yeah. I would also say, when you’re looking out there and you’re thinking about countries—let’s now talk about international investment or international business. If you’re about to be involved in a country that mistreats women, look out, because I really do believe that the treatment of women is a kind of bogey for something else. Countries that treat women badly are dangerous places. So I’ve often said, if I could wave a magic wand, I would empower women in the following ways: If you educate women—you want to do something about population explosion—women will not have their first child at 12 and they won’t have 13 of them. If you want to do something about women being trafficked into brothels, educate them and you won’t have a sex trafficking problem. If you want to have women have political rights, educate them and it won’t be long before they start demanding political rights.

And the opposite is true too. If you are in a place where the undercurrent is that half of the population is mistreated, that’s eventually going to explode. And so one of the risk factors that you always deal with internationally is what really are the underlying tensions in this country? What—you may not be able to see it at that moment, but what is—what makes this country brittle? And places that have extreme gender inequality, that have extreme policies toward women, there’s a lot of tension building up underneath. And I would say be absolutely careful in those cases.

FARRELL: Wonderful. Well, thank you very much. It’s—one big round of applause. (Applause.)

(END)
