Renewing America

For the United States to succeed, it must fortify the political, economic, and societal foundations fundamental to its national security and international influence. With its Renewing America initiative, the Council on Foreign Relations is evaluating nine critical domestic issues that shape the ability of the United States to navigate a demanding, competitive, and dangerous world.

This project is made possible in part by the generous support of the Bernard and Irene Schwartz Foundation.

 

Democracy and Governance

A well-functioning political system is necessary for advancing competitive economic policies. Political division and democratic backsliding in the United States hamper its ability to project power abroad.

United States

The experience of the soon-to-be-concluded 117th Congress offers some hope for the next two years.

U.S. Congress

Daniel Silverberg, managing director at global strategy firm Capstone and adjunct senior fellow at the Center for a New American Security, and Christopher Tuttle, senior fellow and director of the Renewing America initiative at the Council, sit down with James M. Lindsay to discuss the results of the 2022 U.S. midterm elections and their impact on U.S. foreign policy.

India

In their news coverage, Indian analysts highlighted important takeaways from the U.S. midterm results, including how the outcome might affect U.S. foreign policy and what it could mean for India.

Education

An educated citizenry and high-skilled workforce are critical to maintaining the United States’ economic edge. The United States has fallen behind its peers in K–12 schooling, and today’s education models are not delivering the skills students need for the future.

United States

Education in science, technology, engineering, and math is critical, but the United States needs a balanced approach to building the workforce of the future.

United States

Higher education provides students with many socioeconomic benefits and increases the global competitiveness of the United States, but mounting student loan debt has sparked a debate over federal lending policies.

Education

Mordecai Ian Brownlee, president of the Community College of Aurora, will lead the conversation on navigating the digital equity gap in higher education.   FASKIANOS: Welcome to CFR’s Higher Education Webinar. I’m Irina Faskianos, vice president of the National Program and Outreach at CFR. Today’s discussion is on the record, and the video and transcript will be available on our website, CFR.org/academic. CFR takes no institutional positions on matters of policy. We are delighted to have Mordecai Ian Brownlee with us today to talk about the digital equity gap in higher education. Dr. Brownlee is president of the Community College of Aurora in Colorado. He also teaches for Lamar University in the College of Education and Human Development. Dr. Brownlee publishes frequently and serves as a columnist for EdSurge. He has been featured on a number of national platforms including by Diverse Issues in Higher Education magazine as a new school leader representing the next generation of college presidents, and he was most recently appointed to serve on the board of directors of the American Association of Community Colleges. So, Dr. Brownlee, thank you very much for taking the time to be with us. I thought we could begin by having you define digital equity and give us an overview of the digital equity gap in higher education, and I know you are going to share a presentation with us so we look forward to seeing that on screen. BROWNLEE: Absolutely. Thank you so much for the opportunity to the Council on Foreign Relations. Just thank you all so much. And to answer that question as we talk about digital equity, it’s the assurance of ensuring that all have access to the information technology available and to have the capacity to engage in society and productive citizenship. And so we’ll talk about that and let me just start sharing the screen and we’ll jump right into it. All right. Here we go. So, once again, thank you all for the opportunity, again, to the Council of Foreign Relations for this opportunity to talk about navigating digital equity. Bringing greetings on behalf of the Community College of Aurora here in Aurora, Colorado. And let’s just jump right into it. You know, as we talk about defining this work, how to navigate this work, we have to first understand the work, and to understand digital equity we must first understand the digital divide. And so, you know, as we talked about the digital divide at the beginning of the pandemic it, certainly, was dealing with the voice and mindset, the texture and tone, of accessibility and being able to engage in learning throughout the pandemic and, first of all, I would say as educators it’s so critical that even as we are, quote/unquote, “coming out of the pandemic” that we still acknowledge part of the challenges that are happening across the country and across the world in regards to accessibility—equitable accessibility to information technology, to the tools, and to have the capacity to not only learn but, certainly, engage in the economy and society. So as we talk about digital equity, we must understand the digital divide and so let’s kind of define that. One of my favorite definitions for the digital divide defined comes from the National League of Cities and they say the digital divide is the gap between individuals who have access to computers, high-speed internet, and the skills to use them, and those who do not. 
There’s two critical components as we talk about digital equity that I want to call out with the digital divide definition here. One is access. The other is skill. Access and skill. So as we think about equity and just think about how do we level the playing field, how do we close the gap on accessibility and skill attainment to engage. And it’s not just being able to access and that’s the other—I think the complexity here as we think about the term equity because just because I provide you the computer, right—and we found this during the pandemic—just because I provide you the computer do you even have broadband access? And if you have broadband access do you have dependable sustainable broadband access? And then if you have sustainable broadband access, are you skilled to not only learn but and engage through this instrument and tool, and that in itself is where we have found there to be challenges as we think throughout the pandemic and, certainly, beyond the pandemic on what we must do to close the gap for equity and the digital divide. So digital divide provides that access, skill. Equity will then take us deeper into this work. Here are key factors I want to call out in regards to how we must eradicate or address these challenges, these factors, in order to close the gap on the digital divide. Number one, what we have seen through research—and digitalresponsibility.org has done a great job of calling this out—number one, age-related issues as we think about the various generations that are engaged in society and still present in society. We have digital natives. I consider myself to be a digital native as a millennial. But this is very different than previous generations that may not have had the proper training and skill and their jobs do not have them engaging, utilizing these tools and instruments on a regular basis and so that in itself has created some challenges. And, again, there is, certainly, all those that are outliers and those among the generations that have been able to engage in these instruments and tools. However, it is truly a fact through research that age-related issues have been a part of this challenge, more specifically, speaking to our older population. Socioeconomic factors—have to talk about it. I think about it, especially in the higher education space. Our tribal institutions is where I’ve heard throughout the pandemic some of our most severe challenges that have been experienced in regards to the digital divide. One of the stories that I heard that just breaks my heart—I remember the first time I heard it, it truly had me in tears—we were at the height of the pandemic at this point and what we were learning is in one particular tribal community in order for those students to complete—these are young K-12 students—in order for them to complete their assignments they had elders and community members of that tribe that would walk the students up to the highest point on the mountain within that particular tribal territory just to be able to pick up an internet signal, and they were able to do this when there was not as much traffic on that internet broadband access—that grid, if you will. And so those students were having to do their work—their homework—between the hours of 2:00 a.m. and 5:00 a.m. in the morning. Very interesting reality—unfortunate reality. We, certainly, have to come up with the solutions to addressing this. This in itself is part of that digital divide conversation. Geographic causes—it depends on where you are in the country. 
I remember at one point in time I was teaching and served the University of Charleston out of Charleston, West Virginia, and for those that are familiar with that part of the country in the Appalachia, I would have my students that were having to use their own cell phones in order to complete their assignments and upload their assignments. They did not have either, in some cases, the actual tools or accessibility, would have to drive in to more populated spaces to pick up a signal. This was impacting their learning experience. This in itself is all a part of that digital divide. Last, certainly, not least, racial, culture, language. All of this plays a role and more in that skill set component along with accessibility component and how are we going to as educators, as key stakeholders within our community, leaders, be a part of the solution to close that divide. Age-related issues, socioeconomic factors, geographic causes, racial, cultural, and language. Again, digitalresponsibility.org is the source on that there. Step two, to navigate digital equity we must understand digital equity, and so now we’re going to go and delve into what does it mean—what does digital equity mean. So I’m taking my definition, again, from the National League of Cities. Digital equity is a condition in which all individuals and communities have the information technology capacity needed for full participation in our society, democracy, and economy. This is huge. So, again, as you heard me talk about the digital divide just moments ago, it’s the component of accessibility and skill. That skill is then where we get into productive citizenship through society, democracy, and economy, and so now we’re talking about how does this tool, this instrument—it’s much more than just accessibility. Now how do I engage? How am I advancing my family, my economic—social economic realities through this instrument and tool? The definition goes on to say—again, by the National League of Cities—digital equity is necessary for civic and cultural participation, employment, lifelong learning, and access to essential services. Case in point, life. As we think about all aspects of life from employment to social participation—as we think social media engagement, employment, we all understand what that means; lifelong learning, certainly as educators we have to think about that component—and then accessibility to the tools that we need, I think about my own child who this past weekend had to reach out for virtual assistance from medical care for an earache that he was having. My ability to have the skill set and accessibility to reach out, obtain those resources for my family, and engage through an electronic means to fulfill what my needs were are all a part of this equity. Life in itself should be able to remain whole in what I produce and how it is able to produce within me, and that is in itself digital equity. So step three, let’s discuss how to navigate digital equity in higher education and, again, hello to all of our educators that are on the call today. So here’s some tips that I want to leave for you on today just to think about, and I look forward to our conversation that we’re about to have here in a moment. Number one, as educators—and we’re talking about navigating digital equity—it is so important that we understand who we’re serving. 
I say that because, unfortunately, what can happen is especially as educators and we think about the economy, the disruptions that we’re experiencing in the marketplace right now, we’ll sometimes pursue who we want, not necessarily who we have, and that’s unfortunate. As we think about the respective institutional missions and the spaces in which we serve, we have to be mission centered and embrace who it is that we’re serving because we owe it to those students who are pursuing their academic endeavors and their professional endeavors through our respective institutions to totally be served. We must understand their realities. One of the conversations we have here at the Community College of Aurora is the conversation about you don’t know who is actually sitting, respectively, in that seat in that classroom and what they had to overcome in order to sit in that seat that particular day. Do we know how many bus routes they had to take? Do we understand the challenges that they were having with their children? Do we know are they now leaving their second job that they’ve worked for the past twenty-four hours to now sit in your classroom? So we have to understand, be aware, and approach that engagement with a sense of grace. I think that’s a word that we, perhaps, haven’t necessarily embraced in the academy in the way in which we have—should have, but now more than ever we have to. Secondly, create systems that level the learning engagement field. So it’s this idea of privilege—this thought of privilege—and, perhaps, what we assumed that everyone had access to and what everyone had the ability to engage with that they don’t necessarily have, and if they do have accessibility to it do we have a true understanding of what all they have to do to have that level of engagement and accessibility? Again, case in point, bus routes. Think about what’s happening around our country. There has been a reduction from a transportation standpoint financially, and many of the routes and the transportation services that have been provided—some of this due to disruption, others due to areas in which there have had to be a funneling of tax dollars and resources in other spaces and places in our communities. Long story short, the reality is, is that in many communities the bus routes have had to be reduced, which means that individuals are either having to walk or find ways to public accessibility to some of these resources in terms of broadband access and computer access. So then as we’re teaching and we’re instructing and we’re providing services, we have to think about how can we level the playing field and remove barriers? Does it have to be performed—does that learning outcome have to come in the form of computer access and broadband accessibility? And maybe it does, so this takes us to point number three. Let’s promote community resources to close the digital divide. I think that laser focus on how we’re going to close that divide creates this space for equity, and so, perhaps, it’s through libraries. There’s one organization out of North Carolina in some of their rural spaces they have now through grant funds created different spaces in their rural communities for those in more rural spaces to gain access to a computer lab and the grants are sustaining that accessibility through computer labs in those rural spaces. Amazing resource. There’s many others and examples that we can share around the country. So with that said, let’s promote these community resources. Sometimes it’s a library. 
Sometimes it’s a grant-funded opportunity. Sometimes it’s a local nonprofit. So let’s talk about how we can be creative in our respective communities to close the gap there. Fourth, adjust learning experiences to be more inclusive. Not only do we need to create the systems to level the playing field but we must then adjust the learning experiences to be more inclusive to create learning spaces and engagement spaces for all, going back to not only accessibility but skill. Last, certainly not least, providing institutional resources to close the digital divide. What I mean by this is, is that, in closing, due to—through the pandemic and many of our institutions received the Higher Education Emergency Relief Funds—the HEERF funds. Those HEERF funds were utilized in many different ways. In many cases, we were able to do laptop loan programs. In some spaces they were even doing hotspot loan programs. And so now that we are coming out of the pandemic what does it look like to sustain these resources, OK, because now that we provide these resources how do we sustain them? How do we ensure that we’re having long-term engagements? One of the things that I want and I ask from my educators, especially administrators, to look at: How do we close this—(inaudible)—without placing the costs on the backs of our students? They already have enough going on. We don’t need to just move the cost of something on to their tuition and fees. How can we be even more creative with the engagements and enrollments of our students to being laser focused on what we’re doing to close, again, many of those factors and gaps that were highlighted earlier? So grateful for the opportunity. Have a website. Would love to engage with you all more. I know we’re getting ready to go into conversation. But itsdrmordecai.com and, again, thank you all so much for the opportunity. FASKIANOS: Fantastic. Thank you so much for that overview. So we’re going to go to all of you for your questions now. You can click the raised hand icon on your screen to ask a question, and on an iPad or a tablet click the more button to access the raised hand feature. When you’re called upon accept the unmute prompt and please state your name and affiliation followed by a question. You can also submit a written question by the Q&A icon and I will read out the question, and if you do write your question please include your affiliation just to give us a sense of where you’re coming from. And there are no questions as of yet but I know that will change, or else you were so thorough that nobody has questions. (Laughs.) So do you see now with the pandemic experience that there will be continued—I’m going to ask the first question—you know, that this has opened up the space now for deeper understanding of the digital divide and bringing the resources to bear? Or now that we’re kind of post-pandemic or whatever this is people have forgotten about it and are moving on? BROWNLEE: Thank you so much for the question, my friend. I think that it’s twofold. There’s two sides of this coin, right. So there’s the one side of the coin where the awareness now is so much deeper and richer than it ever has been because of the amount of resources and what it took to sustain since 2020 those resources that were being provided to the students in the community. 
So now there’s many that have learned and they’re now having those conversations about how to sustain the resources because, as we all know, while there’s been an extension of HEERF funds through the Department of Education, that day is coming to an end here pretty soon and so we have to talk about sustainability. The other side of that coin is, unfortunately, there are those that acknowledge what the realities were but their agenda is more on how do we move past it, not necessarily sustain what we were providing. That’s part of the issue for some that we have to address because we don’t just move on from hardship, right. That hardship is real and we have to still maintain a laser focus on how we’re going to close the digital divide, especially in the academic spaces, but also understanding our responsibility as not only educators but community leaders, stakeholders within our community, to be a part of the solutions and the expansions on equitable access and resources being made available. And so I think with both sides of those coins we’re seeing two different realities. But I think that there’s also a need now more than ever to maintain the senses of urgency around the haves and have nots and what we’re going to do to be a part of the solution to ensure that we’re raising the level of accessibility and skill for all within our communities. FASKIANOS: I noted in your presentation you talked about knowing who your students are. So what advice do you have for higher education educators and leaders who are trying to navigate the digital divide in their classroom and to get to know—to figure out where their students are coming from and what their needs may be? BROWNLEE: So, as we all know, especially in the IR space, right, there’s different tools, resources, that we can use to survey our students. There’s different splash pages, if you will, that we can utilize in terms of the enrollment processes or the readvising processes, or even think of some of our learning management tools that we can engage with students to determine what their needs truly are. I think that it’s important that we create tools and instruments that will have high engagement rates. Sometimes those have to be incentivized. But we have to think about outside of our normal student leader responses how we’re capturing the voice of all of our students. And so that’s those that would not typically provide response, and as we think about the digital divide we have to acknowledge that that tool, that instrument, can’t just be electronic. What are we going to do to have paper resources or maybe through phone conversations, outreach, being able to have, certainly, the walk around conversations around our respective campuses and the universities. And so we need to have those conversations to make sure that we’re capturing the voice of all of our students, I think, is in the true spirit of continued improvement. We have to understand who we serve and then acknowledge, through the development of systems and the recalibration of our student experiences, are the voice of these students. FASKIANOS: Right. And in terms of the skills, because community colleges are so focused on developing the skills, what specifically are you doing at Aurora or are you seeing in the community college space to help students develop those skills that they need to navigate digitally? BROWNLEE: Absolutely. 
One of the things I’ll talk about—and those that may not be aware and I don’t know who all has visited Denver—but the history of Aurora—Aurora is the most diverse community—city—in the state of Colorado. I call that out because immigrants—it has a strong—there’s a strong population in this community and so part of our young thirty-nine years of existence in this community has been providing English second language courses. We’re noticing that especially our immigrant families and communities that are seeking social and economic mobility, highly skilled from where they come from but now we must create learning opportunities to close that gap, not only through language but through accessibility in this American market. And so through our community ESL programs we’ve been able to educate upwards of two thousand students a year and walk them through the various levels of learning and engagement with the English language, and then at some point in that process—learning process—we then engage and begin the computer engagement in utilizing the English language in their native language and beginning to close that gap. So I think that that work in itself is a part of that digital equity that must be created—how do you create the foundation to build upon to then advance the engagement. And there’s been some other great examples that I’ve seen around the country in doing that work, a lot of grant programs that I’ve seen in respective communities. You heard me talk about what’s happening out there in the Carolinas. But I think about what’s also happening over in California. California has been a great state that’s been able to do some work about working and identifying through heat maps and institutional resource—research and resources and community resources, looking at demographics, identifying low socioeconomic spaces, and putting concentrated efforts in those particular communities to increase the level of engagement, accessibility, and skill, and it’s critical and key. FASKIANOS: Great. We have a question from Gloria Ayee. So if you can unmute yourself and state your affiliation. Q: Hello. Thank you so much for sharing this important work that you’re doing. I am Gloria Ayee and I am a lecturer and senior research fellow at Harvard University, and my question is about the connection between the digital divide and also how it mirrors to current inequities that we see in the educational system in general. So thinking about that type of relationship, what do you think are the most significant challenges to addressing the digital divide, given the issues that we continue to see with the educational system in general at all types of institutions, and what do you foresee as the best way to actually address these challenges? BROWNLEE: Oh, that’s a great question. Great question. Thank you so much for asking that question, Gloria. I would say two things come to mind—funding and agenda, right. So if—I’ll tell you what comes to mind for me. So as we think about financially and we look at how these institutions are funded around the country, let’s think K-12. So grade schools. Think K-12. Let’s also think higher education. Are we talking headcount? Are we talking full-time equivalency? Are we talking success points? Are we talking—even as we think about developmental education, how are these institutions being funded to sustain the work of working especially with low socioeconomic communities? Let’s just take, for example, full-time equivalency, especially in this higher education space. 
So if I were someone who wanted to work to create programs that I’m going to help in the advancing and addressing of the digital divide and advancing digital equity, I need funds in order to do that. Now, could I pursue grant funds? Absolutely. But even—we all know that grant funds are not necessarily all the time sustainable funds. Short-term funds, but it still has to be a hard-lined. So then as we think about doing this work—I’ll go back to funding and agenda—realizing and looking at what would need to shift within particularly my state’s legislative agenda or, perhaps, in that particular district how the funding is occurring. If I’m working with a high population, which we are here at the Community College of Aurora—a high population of part-time students, these are students that are maybe taking one class and engaging. However, if I’m funded by a full-time equivalency model it then takes several students that are taking one class to then equal that one full-time equivalent, which then impacts my funding structure. So then how do I then serve, yet, I am seeking to obtain? And this is where we then get into, I think, a part of that friction of agenda and funding models. So I think that as we think equity—with an equity mindset beyond just the initiatives of overlay—we actually want to bake in the equity experience within our respective states and communities—then we’re going to have to take a look at the funding agenda, the agenda and funding—how are we truly going to advance equity and closing the digital divide. It has to be funded properly towards sustainability. We’ve seen this same thing occur in developmental education as well for those who’ve been a part of those conversations where we saw around the country there will be a reduction in developmental education funding, which has been impacted, in some cases, the success rates and resources that were historically provided through community colleges in certain communities. Same thing in this digital divide space and digital equity. So funding an agenda, and I think that the solution is, is really coming to the table and saying what does equity look like without it being an overlaid agenda, without it just being a conversation? What does it look like for it to be baked into the experience of how we’re going to transform lives, which then means that, in many cases, legislatively and funding models. We have to move from a transactional mindset to a transformational mindset and we have to go all in on ensuring that we’re creating equitable communities and engagements for those that we serve. Oh, you’re muted, my friend. FASKIANOS: Yes. Thank you. After two-and-a-half years—(laughter)—I should know that. Encourage all of you to share your best practices and what you’re doing in your communities as well. You know, we have seen the Biden administration really focusing on diversity, equity, and inclusion. They’re focusing on bringing more diversity to the State Department and other parts of the government. Is the Department of Education looking at the funding model? Is this an area that they are actively trying to reform and adjust? BROWNLEE: I get the sense—and I’ve had the pleasure of speaking in front of several legislators in different venues—I get the sense that there is a major conversation that’s happening. I do. I truly get the sense that there’s a major conversation happening, not just with our current administration from thinking about our U.S. president but also thinking local legislators as well. 
I really think that there’s conversations—many conversations that are happening. If anything, I feel as though the major—I don’t want to use the word barrier so I’m searching for the appropriate word here. But I think the major hurdle that we’re going to have to think about is how we have built and designed our funding models to date. You know, some of these funding models were built in early 1990s, mid-1990s in some cases. Really, you don’t see it too much early 2000s, and so we have older financial modeling infrastructure that we’re trying to pursue this work and how to change it. And so it can’t be a Band-Aid approach. I think in some spaces and communities that’s what’s been done is that rather than changing the actual model, the infrastructure itself, it’s received a Band-Aid in the form of grants. And I do believe that grants are significant and, certainly, necessary and appreciated. However, I think that we’re reaching a point in society where there has to be a total restructuring of our funding models and taking a look at what percentages are going where, taking a look at the demographics in our respective communities, taking a look at the economic realities in our respective communities. Take a look at just how much the demographics are shifting in our respective communities and building a model that’s ready to engage, sustain, and raise the level for all, and I think that we’re on our way. I, certainly, hope that we are. FASKIANOS: Thank you. I’m going to take the next question from Rufus Glasper. Q: I am here. FASKIANOS: Wonderful. Q: Hi, Mordecai. How are you today? BROWNLEE: How are you, sir? Q: Hi, Irina. FASKIANOS: Hi, Rufus. Great to hear from you. Q: Mordecai, talk a little bit about digital equity and faculty. How have they accepted, rejected, embraced what you were describing as all of the different factors that are affecting our students, and what kind of practices have you developed or can be developed to ensure that faculty can continue the progress and include our students who are most needy? BROWNLEE: Great question, Dr. Glasper. I didn’t expect anything different coming from you. So, let me just say, we’ve had some very intense conversations, and I have to really give our faculty and our instructors kudos because I will tell you this is probably by far one of the most engaged communities that I’ve ever worked in of educators that are committed to just truly getting to the solution. There’s some strong work that was done around inclusive excellence here at the Community College of Aurora, certainly, prior to my arrival. It led to this college receiving an Inclusive Excellence Award from the American Association of Community Colleges right around 2017. Part of their work at that time was looking at, as our faculty and our academy, how were we going to close the gap on success rates, particularly in English and math, and part of that work was creating resources towards gap closure to ensure that those that had not traditionally and historically had access to some of those learning materials and plans and resources that they were being provided those in a more intensive way. 
Now as we think more into the digital space and, certainly, think through the pandemic, what we’ve now done as an institution is that we’ve become—Community College of Aurora has become the very first Achieving the Dream institution in the state of Colorado and one of the projects that our faculty and our instructors are delving into—I’ve got a big meeting tomorrow on this, matter of fact—is taking a look at the respective success rates in our gateway courses—our key courses that are gateways into our respective academic programs—and asking ourselves how can we create more equitable learning experiences. Two things—critical things—that I’ve seen our faculty do. Number one, looking at the data. I think that the data is key and critical—taking a look, disaggregating that data. And our faculty and our instructors continue to do that work, looking at a three-year spread, a five-year spread, and saying: Where is the success occurring? Who’s it occurring with and those respective identities of those students? And then really asking the hard questions: Why isn’t this population succeeding at the same rate as this population? The other part of this criticality is, is also then accepting that there can’t be an excuse in the work. There can’t be an excuse in the work and that we must ensure then that we are creating the equitable resources and infrastructure to close the gap, create learning experiences, and say, listen, if our students can’t access the internet and the Web then what can we do to create for them the resources, whether it be paper? If they can’t come to the teaching demonstration at this particular day how can I create an opportunity for them to engage and obtain that information at another given time? Perhaps they’re a working parent and can’t necessarily attend at 10:00 a.m. but they can at 5:00 p.m. What are we doing to level the playing field with accessibility? And the other aspect of that is just that our faculty and instructors have been partnering to create these more holistic learning engagement opportunities where if we’re having a conversation in English then what can we do within our math department and almost cohorting, in a sense, the learning experiences amongst those two separate classes but then creating like engagements where the same conversations happening in English could be happening in math and science to begin to bring about a new learning within the students to say, OK, well, this particular world issue, now I’m understanding it through various lenses and I understand the interconnectivity in these learning experiences. And so more integrated learning, and I think that we’ve got a long way to go but we’re committed to doing that work. FASKIANOS: So Rufus Glasper is the chancellor of Maricopa Community Colleges, and I just thought I would ask you, Rufus, to maybe share your experience as the chancellor what has been working in your community. Q: I am the chancellor emeritus. I have not been at the colleges for a little over six years now. But I am the president and CEO for the League for Innovation in the Community College. And one of the things that I’d like to connect with with our experience right now we are involved in the state of Arizona with a project which is—which we are embracing. We are working with four different types of institutions right now—urban metropolitan, we have a couple of rural institutions and we have a couple of tribal, and we’re trying to make that connectiveness between insecurities—student insecurities. So we’re looking at housing. 
We’re looking at hunger. We’re looking at jobs. And one of the things that we have found is that we can’t make either of these items connect and work without broadband first, and the reason being when you’re looking at access it’s critical when you start to look at the activities that are occurring throughout the U.S. now and specifically within Arizona—I’ll talk about the connections we have now made that are national in scope, that are city, town, and county in scope, and the commitments that we are now working to obtain from all of those who are in position relative to enhancing broadband access and digital equity. There’s actually a Center for Digital Equity at Arizona State University (ASU), and last week we had a gathering of all of our institutions to get a better understanding of what does digital equity mean as it comes from the ASU center. What does it mean for each of our different types of institutions, and I will tell you that the one that was hardest hit was the one you talked about and that’s tribal just in terms of access, in terms of resources. But I am pleased with the dollars that are out there now at all levels. So if this is a time for us to increase access, increase affordability, than I think we should seize the moment. My question then, which would lead to another one, is on the whole notion of sustainability and you talked about that in terms of stimulus kinds of resources, and equity is in everyone’s face right now, especially broadband and others. Is it a sustainable initiative and focus and what are the elements that need to be connected in order to make sure that it stays in the forefront and that our students who may have benefited from buses sitting in their neighborhood during the pandemic and others but are still trying to make choices? And I’ll make the last connection point, and you made the opening—how flexible should our institutions be around work-based learning so that our students who are not able to come to the campus and be there on a regular basis but want to balance having a virtual environment? Do you see a balance coming or do you see us forced into staying the old, antiquated model of face-to-face classes and sixteen and eighteen weeks? BROWNLEE: Let me start with the sustainability component then. Thank you again, Dr. Glasper. From a sustainability standpoint, I’ll say here at the institution part of the conversation—it’s a hard conversation. But I encourage every educator to have this conversation, this brave conversation, in your spaces. Let’s take a look at your success rates, and I’m just particularly speaking to higher education right now. Let’s take a look at your various academic profiles. Let’s take a look at what has been your engagements with your workforce partners, your advisory councils, in many cases, and let’s talk about two things—one, the sustainability of those programs and, two, the social and economic mobility of those programs directly to workforce. I think what we will find is what we found here at the Community College of Aurora is that over time the various disruptions that have occurred has shifted the needs of our students. However, the institutions respectively delivering these services have not shifted with the times. And so it is quite possible that either our approach to the work or the actual lack of proper programming is prohibiting social and economic mobility in many of these communities and especially for us. Fifty-two percent of our students are first generation. 
Sixty-seven percent of our students are students of color. So as we talk about sustainability, we’re right there on the front line of having to take a look at enrollment, full-time equivalency, completion, graduation, and employment rates, and we began to find a shifting of that. And so when we talk sustainability, I bring this up as a framework, if you will, to say once you’ve had those conversations now let’s talk about where there are losses—financial losses—and areas in which we can truly be innovative and reallocate dollars that were once going in certain areas and infuse that into other areas that are going to have a higher return. So I think thinking, truly, with a return on investment—an ROI mindset—will then help us to not only meet the needs of our mission, meet it in its current state and its current needs and the disruption that’s currently being experienced, which will then help create new opportunities for sustainability beyond what has just been HEERF funding or potential grant funding, it can be hardlined into the institutional mission. I think the other component of that sustainability, too, is looking at the strategic plans of our respective organizations, looking at those—not only the mission but the objectives and asking how equity is not necessarily a separate objective but equity is actually ingrained in all aspects of the objectives—the strategic objectives—because, at that point, we can then understand the significance in resourcing and funding equity all the way through the entirety of the institution. In regards to your latter question about work-based learning and the old model of doing things, I, certainly, believe and hope, Dr. Glasper, that there’s this new movement that’s occurring where we’re going to have to embrace, whether we like it or not, the next era of higher education, and that next era will require us to not approach things in the same modalities and same ways. We’re watching, especially in research, the confidence levels reduce—heavily reduced now in the public’s perception of what higher education is to provide in comparison to what it once provided. Higher education in many communities isn’t necessarily being seen as the sole or the primary tool towards social economic mobility as it once was twenty, thirty years ago. So what does this mean? Our approach to sixteen-week instruction is, certainly, going to have to be transformed. What does it look like to have five-week instruction? Eight-week instruction? What does it look like for us to have true noncredit instructional programs that’s in direct partnership with business and industry to ramp up the training and social economic mobility opportunities within our communities? Folks aren’t necessarily looking for a two-year or a four-year or a six-year learning experience. They need to put food on their family’s table today. What does it look like for them to engage with the institution and have that kind of learning experience, and we have to do it with a digital equity mindset, right, because they’re seeking opportunity. So it doesn’t necessarily mean that they have accessibility in their current state. We want to get them to a state where they can have that accessibility. So how then do we create those tools? One key component of this is even looking at our college application processes. What is the readability score on some of these applications? We want to educate those that may have a reading level of a—seventh or eighth grade reading level. 
But some of these college applications are reading at a fourteen, fifteen grade reading level. That in itself is creating a barrier to those that are seeking opportunity, that need the opportunity to up skill. And so I think that the old model is going to, in my opinion, and hopefully quickly deteriorate and we’re going to have to be more effective. But let me also say this. It is critical that we have our faculty and our instructors at the table. These decisions shouldn’t be thrown upon them. It should be conversations that we’re having collectively together, and then how can then we resource our faculty and our instructors and our staff to be a part of those solutions, drive those solutions, reinvest in them to be able to create more innovative and more, I’ll say the word, relevant learning experiences because I truly believe that relevance is not necessarily a word that we’ve used in higher education in terms of our approach, but now more than ever we’re going to have to. FASKIANOS: OK. So I’m going to take a written question from Nicole Muthoni, who is an entrepreneur and innovator at the University of Connecticut. She has been passionately working on bridging the divide in emergent nations, especially Kenya. Therefore, in this regard, the key factors creating the digital divide in this space is geographic causes, socioeconomic factors, and culture. So the question is what tools and programs can we use to effectively educate teachers to learn the necessary skills that they can use to teach their students in the classrooms. This is because most of the teachers have not been empowered with the necessary and needed skills for educating in the space of digital equity. BROWNLEE: I think—I began to speak to that right towards the end of what I was just sharing, right. FASKIANOS: Right. BROWNLEE: It’s this idea of we’ve got to get out of the blame game. Oh, I want you to come up with the solution. Well, how are you investing in me to be a part of the solution? How are you even engaging me in part of being the solution? You know, as I talked earlier about those conversations we’re having at CCA about what are those programs that have been unsustainable or times have shifted and changed and we needed to create some more relevant learning experiences. It is our faculty and our instructors that made that decision to be able to say, hey, it’s time to pivot. They were at the table. Not just present for the sake of inclusion but, truly, the decision makers in that work. Now, I think, the next component of this work as we talked about achieving the dream and us being the first in the state of Colorado, part of our strategic plan is creating a—we don’t have a name so just work with me here conceptually. We don’t have a name yet. But I can tell you what the desired outcome is, and the desired outcome is that we create a learning center for our faculty and our instructors to grow and to be invested in and to learn what are those emerging approaches that will—on the verge of becoming best practices. However, they’re not, quote/unquote, “best practices” around the country yet. What could we create here at CCA to be a part of those solutions? And also exposure to national best practice. What are we doing to invest into our people? So I think that part of that shifting that Dr. Glasper was calling out is going to have to occur now more than ever because, unfortunately, what’s happened, I think, in the academy too many of our instructors and faculty have been blamed. 
Too many of our staff had been blamed, not engaged and brought about to be the solution, and not just thrown right out there in the fire to say come up with something. No. You need to care for your folks more deeply, more passionately, and more genuinely than we have ever before and really ask the question how are we going to be relevant and make sure that our folks feel cared for and that they’re valued in the spaces in which they’re serving. FASKIANOS: Thank you. So the next question is from Krishna Garza-Baker from the University of Texas at San Antonio. What would you say is the role of private service providers and their ability to assist in reducing the digital divide? Are they doing enough to collaborate with higher education institutions to address this area, specifically, internet service providers? And I’m going to add on to that. What are your recommendations for how schools can and should be leveraging corporate and community partnerships to help address the digital divide? BROWNLEE: You know, you heard me earlier talk about how we can’t just do this overlay approach. Yes, I want to give you a voucher for reduced broadband access. That’s wonderful. It is. It is grateful. It’s better than not having it. But now let’s talk about how we’re truly going to hardline in opportunities for all. As we think about the spirit of advocacy, unfortunately, sometimes, as they say, it’s the squeaky wheel gets the grease, I think, is how it’s communicated. And so what I would say is, is that now we have to think about those that don’t have a voice how we’re still meeting their needs. And so working directly with corporate industry partners, those who have the access. What does it look like if we focus less on trying to make a dollar and more on trying to create opportunity? What would it look like if we all came about and said we want to be the solution to the issue? Yes, there’s areas and opportunities where we’ll make that dollar. But as we think about society as a whole, what does it look like to create experiences and a life for the goodness of all? And so I think that now we really more than ever have to have these conversations. More than ever it just can’t be who gets the voucher. It’s how do you create the accessibility for all, those who have a voice and those who know how to use their voice. And I think that—if I understand the nature of that question now, I will say with private entities, corporate partnerships, I think it’s more visibility in these colleges and universities and these nonprofit spaces beyond the cameras and just looking at the campaigns. What does it look like for us to have the conversations day in and day out to say we’re neighbors, we’re all going to collectively be a part of the solutions and to bring the rising up, if you will, of our communities to raise the level for all and that’s, certainly, what we’re seeking to do. We’ve seen some major responsiveness in this particular community to say, listen, outside of just some campaign and a picture, what does it look like for you all to be a part of our learning experience, a part of our community, a part of our solutions, and to hardline these experiences for all. So equity causes and it charges and it demands that, and we have to realize the power of that. FASKIANOS: Thank you. I’m going to take the next question from Laila Bichara from SUNY Farmingdale. Many of my students are immigrants and are first-generation college students. 
My question is about skill transfer—once our students get access to technology for themselves and their families who are then losing their jobs due to automation. BROWNLEE: Demographic shift. I talked about it earlier. You know, I think about here in the Denver Metro area and I’m going to—I attended a site visit conversation with their chamber of commerce there in Denver. It was pretty telling. In looking at the demographics, it broke down how for millennials, I think, there’s currently—so there’s 3.3 million in the greater Denver area. It broke down for millennials, which I fall into this group—I think it was eight hundred and sixty-four thousand millennials currently in that space. Then it had Xers. Not Xers. It had generation Z. Z accounted for, roughly, six hundred thousand. But get this. So my children, my eight- and my four-year-old—they’re generation alpha—were only accounting for, roughly, three hundred thousand in the space currently right now. I say that as an example that I’m going to walk us through really quickly, and that is, is with the lens of equity and we think about the shifting and the disruptions in market and we think about especially now in the markets humanization versus automation, and we want to create social and economic mobility for these respective spaces wherever those realities are and we think about accessibility to the internet and we talk about that digital equity and the digital divide, we then have to have a high degree of urgency within us to say that what will—can we create today that will prevent communities of color and low socioeconomic communities that traditionally in this current market would have been given opportunities but that in the future market, due to a lack of potential skill and accessibility, will not be provided the resources and the opportunities that they once were in an automated world. And so what do we do then to make sure that they’re not the one pressing the button. They’re the one that’s coding the button, right, and that’s all a part of that work and that shifting. So it’s going to take stronger math and science skills and accessibility and equity all built into their learning experiences because if not the wide—we will widen the gap—the poverty gap—because we move, again, deeper into automation, lessen the humanization, and then we are essentially moving an entire population of folks further down the supply chain, if you will, which then will prohibit their learning—not learning, their earning ability. And so we have to be laser focused on those realities and, really, look to eradicate what’s going to be future barriers now so systematically we are able to address it. FASKIANOS: Great. So the last question I wanted to ask you is you’ve just completed your first year as president. What are the lessons that you’ve learned? BROWNLEE: Oh, my gosh. I will tell you that, you know, I just released an article on this talking about my first year in the presidency and through EdSurge and lessons learned, and one of those lessons I would say is is—that I highlighted in that article is, you know, don’t do more for an institution than you would do for your own family. I think that as educators, as community leaders, and anyone that’s on this call, I’ll just take the opportunity to encourage you. You know, sometimes we give our all to these entities in which we serve, and we do it and we give it countless hours. 
You know, we say it’s a forty-hour job but we’re probably spending fifty, sixty, seventy, if not more, and we get lost in that, right. And so there’s good work to be done. However, what is the biggest mockery of all to save the world but lose your own family? And I think that part of my lesson that I had to really reflect on was, like, right now as I’m giving this lecture my eight-year-old son is here in the office with me right now that I’m trying to get to be quiet and work with me as I’m giving—having this time with you all now, right. He doesn’t have school today. It’s an in-service day. But really creating those engagements for my family to be engaged in the experiences and making sure that they’re part of the process. I think the other component of this is, too—and I talked about this in the article—is realizing that it is a privilege to serve, never taking for granted the ability, the opportunity, that we have to serve because there’s others that wish that they had these opportunities. So, yes, even in our most—our days of most frustration it still is a pleasure and a blessing and an opportunity to serve and honor. And so what would life look like if we embraced it for the pleasure and the honor that it truly is and how we treat and create spaces for others to thrive, because they’re sacrificing being away from their families and loved ones to do this work. We need to create more communities for all to thrive. FASKIANOS: Oh, your son should be very proud of you. I have to say that—what a role model. BROWNLEE: Thank you. FASKIANOS: I’m going to go next to Laurette Foster. Laurette, please say your affiliation. It’s great to have you on. Q: Hi. Laurette Foster, Prairie View A&M University in Texas. And I really don’t have a question. I just want to say how delighted I was to hear the conversation and hear about what the next steps are, because looking back at the pandemic and how we wanted to step up and do so much and I’m just afraid that even though we did those things that needed to be done that many of us now are settling back into the old ways. And it’s still funny that when you told the story about the tribal community happened to go to the top of the mountain from 2:00 in the morning to do—the passion for education is there with the kids. But we have to continue to do our part. So I just appreciate all the comments and—that you did today. It was really enlightening. So thank you very much. BROWNLEE: And thank you, and I will say that my wife is a proud product of Prairie View A&M. The Hill as well. So just thank you for your comments. FASKIANOS: We have another thank you from John Marks of LSU of Alexandria just saying that it was really great to take time out of his day and to—said they—definitely in Louisiana access and skills are, indeed, real obstacles that are typical of every online class that he’s taught. I’m going to take the final question from Haetham Abdul-Razaq from Northwest Vista College, again, from San Antonio, Texas, working on a research project regarding online learning and community college students. One of the interesting findings is that some students might be considered as tech savvy, yet they have problems engaging in online classes. Do you think that we should build on the strengths of our students’ digital knowledge when it comes to these sorts of skills? BROWNLEE: Great question. Absolutely. 
I think, you know, we talk about creating student-centered approaches, and sometimes we're successful at that and other times we're not—perhaps because, if we were to really delve into student-centered approaches, we'd see just how far from our current base of how we approach higher education they would take us. But I would say, going back to an earlier conversation, now's the time more than ever to go there. Matter of fact, we should have gone there already. It's time, truly, for a revolution and an evolution in our approach to learning, engagement, and advancement with an equity lens. And I go back to that word relevance. We have to create more relevant learning experiences. Think about business and industry, and look at what's happened over the past ten years due to some of our bureaucracies and our lack of responsiveness. Business and industry are creating learning experiences right around higher education—in some cases not even engaging higher education anymore, but working directly with middle schools and high schools to create their own strong pipelines. What has happened that that even came about, right? It came about due to a lack of responsiveness, of true innovation, and of that student-centered approach—which we, perhaps, moved far from, or from which we maybe took only the parts that were easier to tackle, not the harder aspects—and so we now have to tackle it. We have to embrace it, because if not, I think that five, ten years from now—certainly twenty years from now—we'll have more institutional closures and more reductions in enrollment if we fail to be responsive and create these more equitable learning opportunities geared toward digital equity. FASKIANOS: Right. Well, we are just at the end of our time. Thank you very much, Dr. Mordecai Brownlee. We really appreciate your being with us and sharing your insights, and to all of you for your questions and comments. And so you can follow Dr. Mordecai and also go to his website, itsdrmordecai.com, and at @itsdrmordecai, correct? BROWNLEE: That is correct. That is correct. I look forward to engaging with everyone. FASKIANOS: Wonderful. We really appreciate it. Just as a reminder for all of you, our next Higher Education Webinar will be on Wednesday, November 2, at 1:00 p.m. Eastern time. Rebecca Granato, associate vice president for global initiatives at Bard College, will talk about refugees, migration, and education. So we hope you'll tune in for that. In the meantime, I encourage you to check out CFR fellowships for educators at CFR.org/fellowships; this is a program that allows educators to come for a year in residence at CFR, or else we place you in government to get some policy-relevant experience. The deadline is October 31, so if you're interested, email us and we can send you information about that. Also, go to CFR.org, ForeignAffairs.com, and ThinkGlobalHealth.org for research and analysis, and follow us at @CFR_Academic. Thank you all again. Thank you, Dr. Brownlee. We appreciate it, and we hope you have a good rest of the day. (END)
Energy and Climate Change

The effects of climate change, including the increase in wildfires, severe storms, and other extreme weather events, require the United States to shift to cleaner forms of energy, to become more energy efficient, and to improve resilience through adaptation.

United States

Progress on President Biden’s climate agenda will slow with a split Congress. But with federal efforts dulled, state-level action could supply added momentum.

United States

It probably won’t, and that would be a mistake.

Climate Change

International efforts, such as the Paris Agreement, aim to reduce greenhouse gas emissions. But experts say countries aren’t doing enough to limit dangerous global warming.
Future of Work

Rapid technological changes and automation are dramatically changing the nature of work. U.S. economic competitiveness hinges on the ability of U.S. workers to adapt to new modes of employment.

United States

Dan asks a question that everyone who has ever signed up for military service has likely asked: was it worth it? His answer is worth reading. 

Immigration and Migration

The broken system hurts immigrants—and makes it harder for the United States to compete.

United States

Legal immigrants and noncitizens—service members enriching the U.S. military since the American Revolution.
Immigration

The United States’ ability to attract talent from around the world remains one of its core strengths. Outdated and cumbersome immigration rules weaken this economic advantage as workforce mismatches grow and millions of U.S. residents are left in limbo.

United States

Spurred on by worsening economic and political crises across Latin America, migration to the United States reached record levels in 2022. Here’s a look at the year’s major immigration stories.

Infrastructure

U.S. infrastructure lags behind that of its peers in international rankings. Major investments are needed to both upgrade U.S. infrastructure and adapt it for a changing climate.

Energy and Environment

In October 2012, Superstorm Sandy made landfall as one of the most destructive hurricanes to ever hit the United States, causing tens of billions of dollars in damage and dozens of casualties. A decade later, climate change has continued to intensify the impacts of hurricanes in the United States, as made evident by recent Hurricane Ian. Our panelists discuss lessons learned in climate resiliency over the past decade, and how the United States can better prepare for natural disasters moving forward.  The Lessons From History Series uses historical analysis as a critical tool for understanding modern foreign policy challenges by hearing from practitioners who played an important role in a consequential historical event or from experts and historians. This series is made possible through the generous support of David M. Rubenstein.

Energy and Environment

Billion-dollar disasters such as Hurricane Ian are on the rise in the United States. Officials should take swift action to reduce the damage and protect Americans.

Innovation

The United States has led the world in innovation, research, and technological development since World War II, but its primacy is now being challenged by China and other countries. Maintaining U.S. dominance in emerging technologies is critical for economic and national security.

Economics

The consequences of the COVID-19 pandemic and the rise of China have prompted renewed debate about the U.S. government’s role in shaping the economy.

Robots and Artificial Intelligence

Lauren Kahn, research fellow at CFR, leads the conversation on AI military innovation and U.S. defense strategy.   FASKIANOS: Thank you, and welcome to today's session of the Fall 2022 CFR Academic Webinar Series. I'm Irina Faskianos, vice president of the National Program and Outreach at CFR. Today's discussion is on the record, and the video and transcript will be available on our website, CFR.org/Academic, if you would like to share it with your colleagues or classmates. As always, CFR takes no institutional positions on matters of policy. We're delighted to have Lauren Kahn with us to talk about AI military innovation and U.S. defense strategy. Ms. Kahn is a research fellow at CFR, where she focuses on defense, innovation, and the impact of emerging technologies on international security. She previously served as a research fellow at Perry World House, the University of Pennsylvania's global policy think tank, where she helped launch and manage projects on emerging technologies and global politics, and her work has appeared in Foreign Affairs, Defense One, Lawfare, War on the Rocks, the Bulletin of the Atomic Scientists, and the Economist, just to name a few publications. So, Lauren, thanks very much for being with us. I thought we could begin by having you set the stage for why we should care about emerging technologies and what they mean for us as we look ahead in today's world. KAHN: Excellent. Thank you so much for having me. It's a pleasure to be here and be able to speak to you all today. In setting the stage, I'm going to speak a little bit about recent events and the current geopolitical situation and why we care about emerging technologies like artificial intelligence and quantum computing—things that seem a little bit like science fiction but are now becoming realities—and how our military is using them. And then we'll get a little bit more into the nitty-gritty of U.S. defense strategy in particular and how it is approaching adoption of some of these technologies, with a particular focus on artificial intelligence, since that's what I'm most interested in. So I'll say that growing political competition between the United States, China, and Russia is increasing the risk of great-power conventional war in ways that we have not seen since the end of the Cold War. I think what comes to everyone's mind right now is Russia's ongoing invasion of Ukraine, which is the largest land war in Europe that we've seen since World War II, and the use of a lot of these new emerging capabilities. For the past few decades, really, until now, we thought about war as something that was largely contained to where it was taking place and the parties involved, and most recent conflicts have been asymmetric warfare limited to terrestrial domains—on the ground, in the air, or even at sea—where the most prominent conflicts were those between nation-states and either weak states or nonstate actors, like the U.S.-led wars in Afghanistan and Iraq or interventions in places like Mali and related conflicts as part of the broader global war on terrorism, for example. And so while there might have been regional ripple effects and dynamics that shifted due to these wars, any spillover from these conflicts was a little bit more narrow or due to the movement of people themselves—for example, in refugee situations.
I'll say, however, that the character of war is shifting in ways that are expanding where conflicts are fought, where they take place, and who is involved, and a large part of this, I think, is due to newer capabilities and emerging technologies. It's not entirely due to them, but the prominence of influence operations, misinformation, deep fakes, artificial intelligence, and commercial drones—which make access to high-end technology very cheap and accessible for the average person—has meant that these wars are going to be fought in new ways. We're seeing discussion of things like information wars, where battles are being fought on TikTok and in social media campaigns, where individuals can film what's happening on the ground live, and where states no longer have, so to speak, a monopoly on the dissemination of information. I'll speak a little bit more about some of the examples of technologies that we're seeing. But, broadly speaking, this means that the battlefield is no longer constrained to the physical. It's being fought in cyberspace and even in outer space, with the involvement of satellites and the reliance on satellite imagery and open-source satellite imagery like Google Maps. And so, as a result, this will drive new sectors and new actors into the fray when it comes to fighting wars, and militaries have been preparing for this for quite a while. They've been investing in basic science, research and development, and testing and evaluation across all of these new capabilities—artificial intelligence, robotics, quantum computing, hypersonics. These have been priorities for a few years, but I'll say that the conflict in Ukraine and the way we're seeing these technologies being used have really put a crunch on the time frame that states are facing, and I'm going to speak a little bit more about that in a minute. But to give you an example of what it means to use artificial intelligence on the battlefield—what these applications look like—my work before this conflict was largely hypothetical. It was hard to point to concrete cases. But I think now, as these technologies mature, you're seeing them used in more ways. Artificial intelligence, for example, has been used by Russia to create deep fakes. There was a very famous one of President Zelensky, which they combined with a cyberattack to put it on national news in Ukraine and make it look a little bit more believable—even though, with the deep fake itself, you could tell it was computer generated. These examples show how some of these technologies are evolving and how, especially when combined with other technological tools, they are going to be used to make influence operations and propaganda campaigns a little bit more persuasive. Other examples of artificial intelligence: there's facial recognition technology being used to identify civilians and casualties, for example, and there's natural language processing, a type of artificial intelligence that analyzes the way people speak—think of Siri, think of chatbots.
But more advanced versions are being used to read in radio transmissions and translate and tag them, so that forces are able to go through them more quickly and identify what combatants are saying. There's the use of 3D printing and additive manufacturing, where individuals are able, very cheaply—a 3D printer costs maybe a thousand dollars, and perhaps less if you build it yourself—to add different components to grenades, and people are taking smaller commercial drones to make a MacGyvered smart bomb that they can maneuver. So those are some of the commercial technologies that are being pulled into the military sphere and onto the battlefield. They might not be large. They might not be military in their first creation. But because they're such general-purpose technologies—they're dual use—they're being developed in the private sector, and you're seeing them used on the battlefield and weaponized in new ways. There are other technologies that originated more in the military and defense sectors, things like loitering munitions, which we're seeing more of now, and a lot more drones. I'm sure a lot of you have been seeing a lot about the Turkish TB2 drones and the Iranian drones that are now being used by Russia in the conflict. These are not especially new technologies. We've seen them. They've been around for a couple of decades. But they're reaching a maturity in their technological lifecycle where they're a lot cheaper, a lot more accessible, and a lot more familiar, and now they're being used in innovative and new ways. They're being seen as less precious and less expensive. It's not that they're being used willy-nilly or that they're expendable, but militaries, we're seeing, are willing to use them in more flexible ways. For example, in the early days of the campaign, Ukraine allegedly used the TB2 as a distraction when it wanted to sink a warship, rather than actually using it to try to sink the warship itself—using it for things it's good at, but maybe not what it was initially designed to be used for. Russia is now using the Iranian-made loitering munitions. They're pretty reasonable in price—about $20,000 a pop—and so using them in swarms to take out some of Ukraine's infrastructure has been a pretty effective technique. Ukraine, for example, is very good at shooting them down; I think at some point they were reporting an ability to shoot them down at a rate of around 85 percent to 90 percent. So not all of a swarm was getting through, but because the munitions are so reasonably priced, it was still a reasonable tactic and strategy to take. There are even some more cutting-edge, a little more unbelievable, applications—like what's now being touted as an Uber for artillery, where you use algorithms similar to the kind Uber uses to identify which passengers to pick up first and where to drop them off to decide how to target artillery systems—which target is most efficient to hit first. And so we're seeing a lot of these technologies being used, like I said, in new and practical ways, and it has really condensed the timeline on which states—especially the United States—feel they need to adopt these technologies.
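The economics behind that swarm tactic can be made concrete with a quick back-of-the-envelope calculation. The sketch below takes the roughly $20,000 unit price and the 85 to 90 percent shoot-down rate cited above as given; the salvo size and the defender's cost per interception are invented placeholders rather than figures from the discussion.

```python
# Back-of-the-envelope arithmetic for the swarm economics described above.
# The ~$20,000 unit price and the 85-90 percent shoot-down rate are the
# approximate figures cited in the discussion; the salvo size and the
# defender's cost per interception are hypothetical placeholders.

UNIT_COST = 20_000          # rough price per loitering munition (from the talk)
SALVO_SIZE = 20             # hypothetical number launched in one wave
INTERCEPT_COST = 150_000    # hypothetical cost of one interceptor engagement

for intercept_rate in (0.85, 0.90):
    leakers = SALVO_SIZE * (1 - intercept_rate)        # expected munitions that get through
    attacker_spend = SALVO_SIZE * UNIT_COST            # total cost of the salvo
    cost_per_leaker = attacker_spend / leakers         # attacker's cost per hit that lands
    defender_spend = SALVO_SIZE * intercept_rate * INTERCEPT_COST
    print(
        f"intercept rate {intercept_rate:.0%}: "
        f"{leakers:.0f} expected leakers, "
        f"attacker pays ${cost_per_leaker:,.0f} per leaker, "
        f"defender spends about ${defender_spend:,.0f} on intercepts"
    )
```

Under these illustrative numbers, even a 90 percent interception rate leaves the attacker paying only a couple hundred thousand dollars per munition that gets through, while the defender spends far more stopping the rest of the salvo, which is why the tactic can stay reasonable despite heavy attrition.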
Back in 2017, Vladimir Putin famously stated that he believed whoever became the leader in AI would become the leader of the world, and China has very much publicized its plans to invest a lot more in AI research and development and in bridging the gaps between its civil and military engineers and technologists to take advantage of AI by the year 2023—so we've got about one more year to go. And so I think that the United States, recognizing this time crunch, feels the heat is on, so to speak, to adopt some of these newer capabilities. We're seeing that a lot now. There's a lot of reorganization happening within the Department of Defense to better leverage and better adapt in order to take advantage of some of these technologies. There's the creation of a new chief digital and artificial intelligence office and a new emerging capabilities policy office, which are efforts to better integrate data systems and ongoing projects within the Department of Defense and to implement them as part of broader U.S. strategy. There have been efforts as well to partner with allies to develop artificial intelligence. As part of the Indo-Pacific strategy that the Biden administration announced back in February of 2022, for example, it announced that, along with the Quad partners—Japan, Australia, and India—it is going to fund graduates from any of those four countries to come study in the United States if they focus on science, technology, engineering, and mathematics, to foster that integration and collaboration between allies and partners to make better use of some of these things. Even so, recently—in April 2022, for example—looking at how Ukraine was using a lot of these technologies, the United States was able to fast-track one of its programs, the Phoenix Ghost. It's a loitering munition, and it's still not well known. But the United States saw the capabilities requirement that Ukraine had and fast-tracked its own program in order to fulfill it, so these systems are being used for the first time. So, again, we're seeing that the United States is using this as an opportunity to learn as well as to really kick AI and defense innovation development into high gear. That doesn't mean it's without its challenges—the acquisitions process in particular, meaning how the Department of Defense takes a program from research and development all the way to an actual capability that it's able to use on the battlefield. What used to take maybe five years in the 1950s now takes a few decades; there are a lot of processes in between that make it challenging. All these sorts of checks and balances are in place, which are great, but they have slowed the process down a little bit. And so it's harder for the smaller companies and contractors that are driving a lot of the cutting-edge research in these fields to work with the defense sector. So there are some of these challenges, which, hopefully, some of the reorganization that the Pentagon is doing will help with. But that's the next step, looking forward, and it's going to be, I think, the next big challenge that I'm watching over the rest of this year and the next six months.
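To make the natural-language-processing example from the opening remarks more tangible (reading in transmissions, translating them, and tagging them so analysts can triage faster), here is a minimal sketch of that kind of pipeline. It assumes the open-source Hugging Face transformers library and a publicly available Helsinki-NLP Russian-to-English model; the sample messages and watch-term list are invented for illustration and are not drawn from any program discussed here.

```python
# Minimal sketch of a "translate and tag intercepted text" workflow.
# Assumes the open-source Hugging Face `transformers` library and a public
# Helsinki-NLP translation model; the messages and watch terms are invented.
from transformers import pipeline

translator = pipeline("translation", model="Helsinki-NLP/opus-mt-ru-en")

# Hypothetical already-transcribed snippets (e.g., output of a speech-to-text step).
messages = [
    "Колонна выдвигается на север в пять утра.",
    "Нужна доставка топлива к мосту.",
]

# Trivial tagger: flag translations that mention terms an analyst cares about.
WATCH_TERMS = {"column", "bridge", "fuel", "north"}

for msg in messages:
    english = translator(msg)[0]["translation_text"]
    tags = sorted(term for term in WATCH_TERMS if term in english.lower())
    print(f"{english!r} -> tags: {tags or ['none']}")
```

The point of the sketch is the division of labor: the model handles the repetitive translation pass, and a human analyst only reviews whatever gets flagged.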
I know I threw a lot out there, but I'm happy to open it up for questions now and focus on anything in particular. I think that gave an overview of some of the things that we're seeing now. FASKIANOS: Absolutely. That was insightful and a little scary—(laughs)—and I look forward now to everybody's questions. As a reminder, after two and a half years of doing this, you can click on the raise hand icon on your screen to ask a question, and on an iPad or tablet click the more button to access the raise hand feature. When you're called upon, please accept the unmute prompt and state your name and affiliation. You can also submit a written question via the Q&A icon, and please include your affiliation there, and we are going to try to get through as many questions as we can. All right. So the first question—raised hand—comes from Michael Leong. Q: Hi. Is this working? FASKIANOS: It is. Please tell us your affiliation. Q: Hi. My name is Michael Leong. I'm an MPA student in public administration at the University of Arizona in Tucson. And I just have a question: with the frequent and successful use of drones in Ukraine, and given how easily such accessible technology is being adapted to warfare, is there any concern that it could be used maliciously domestically, and what steps might be considered? Thanks. KAHN: Absolutely. That's a great question. I think it's broader than just drones as well, when you have this proliferation of commercial technology into the defense space and you have these technologies that are not necessarily, for example, weapons, right. I think a good example is Boston Dynamics. They make this quadruped robot with four legs. It looks kind of like a dog. Its name is Spot. And it's being used in all sorts of commercial applications—helping local police forces, et cetera—for very benevolent uses. However, there's been a lot of concern that someone will go and, essentially, duct-tape a gun to Spot, and what will that mean? And so I think it's a similar kind of question with some of these technologies: it depends on how you use them, and so it's really up to the user. And when you get things like commercial drones that individuals are using for reconnaissance or, again, in combination with things like 3D printing to make weapons, it is going to be increasingly difficult to control the flow. Professor Michael Horowitz over at the University of Pennsylvania, who's now in government, has done a lot of research on this, and you see that the diffusion of technologies happens a lot quicker when they're commercially based rather than of military origin. So I think it's definitely going to pose challenges, especially when you get things like software and artificial intelligence, which are open source and can be used from anywhere. Controlling export, and controlling after the fact how they're used, is going to be extremely difficult. A lot of that right now is falling to the companies that are producing them to self-regulate, since they have the best ability to limit access to certain technologies—like, for example, OpenAI.
If any of you have played with DALL-E 2 or DALL-E Mini, the image-generating prompt sandbox tools—the companies behind them have limited what the public can access, certain features, right, and are testing themselves to see, OK, how are these being used maliciously. I think a lot of them are testing how they're being used for influence operations, for example, and making sure they're able to regulate some of the features that allow more malicious use. But it is going to be extremely hard, and the government will have to work hand in hand with a lot of these companies and private actors that are developing these capabilities in order to do that. It's a great question, and not one that I have an easy answer to, but it is something that I've been thinking about a lot. FASKIANOS: Thank you. I'm going to take the next question from Arnold Vela, who's adjunct faculty at Northwest Vista College. What is the potential value of AI for strategy, e.g., war planning, versus tactical uses? KAHN: Great. Honestly, I think a lot of the benefit of artificial intelligence is in replacing repetitive, redundant tasks, right. It's not replacing the human; it's making the human more efficient by reducing things like data entry and cleaning and by being able to pull resources together. And so it's actually already being used, for example, in war planning and war gaming—Germany and Israel have used AI to create sort of 3D battlefields where they can see all the different inputs of information and sensors. And I think that's really where the value add—the competitive advantage—of artificial intelligence is. Having an autonomous drone is very useful, but I think what will really be the game changer, so to speak, will be making forces more efficient and giving them a better sense both of themselves and of their adversaries, for example. So, definitely, I think the background, nonsexy work—the data cleaning and all the numbers—will be a lot more important than having a drone with AI capabilities built in, even though those kinds of things suck the oxygen out of the room a little bit because they're really exciting. It's shiny. It's Terminator. It's I, Robot-esque, right? But I think a lot of it will be making linguists within the intelligence community able to process and translate documents at a much faster pace—making individuals' lives easier, I think. So definitely. FASKIANOS: Great. Thank you. I'm going to go next to Dalton Goble. Please accept the unmute. Q: Thank you. FASKIANOS: There you go. Q: Hi. I'm Dalton. I'm from the University of Kentucky, at the Patterson School of Diplomacy and International Commerce. Thank you for having this talk. I really wanted to ask about the technology divide between the developed and developing world, and I wanted to hear your comments about how the use of AI in warfare, and the proliferation of technologies like it, can exacerbate that divide. KAHN: Absolutely. I've been focusing a lot on how the U.S., China, and Russia, in particular, have been adopting these technologies, because they're the ones investing in them the most—though countries in Europe are as well, and Israel, et cetera, and Australia also.
Except I still think we’re in those early stages where a lot of countries—I think, over a hundred or something—have the national AI strategies right now. I don’t think it’s as far along yet in terms of its—at least its military applications or applications for government. I will say that, more broadly, I think, again, because these technologies are developed in the commercial sector and are a lot more reasonably priced, I think there’s actually a lot of space for countries in the developing world, so to speak, to adopt these technologies. There’s not as many barriers, I think, when it’s, again, necessarily a very expensive, super specific military system. And so I think that it’s actually quite diffusing rapidly in terms—and pretty equally. I haven’t done extensive research into that. It’s a very good question. But my first gut reaction is that it actually can—it actually can help kind of speak—not necessarily exacerbate the divide but kind of close the gap a little bit. A colleague of mine works a lot in health care and in health systems in developing countries and she works specifically with them to develop a lot of these technologies and find that they actually adopt them quicker because they don’t have all of these existing preconceived notions about what the systems and organizations should look like and are a lot more open to using some of these tools. But I will say, again, they are just tools. No technology is a silver bullet, and so I think that, again, being in the commercial sector these technologies will diffuse a lot more rapidly than other kind of military technologies. But it is something to be cognizant of, for sure. FASKIANOS: Thank you. I’m going to go next to Alice Somogyi. She’s a master’s student in international relations at the Central European University. Could you tell us more on the implications of deep fakes within the military sector and as a defense strategy? KAHN: Absolutely. I think influence operations in general are going to be increasingly part of the—part of the game, so to speak. I mean, I mentioned there’s going to be—it’s very visible to see in the case of Ukraine about how the information war, especially in the early days of the conflict, was super, super important, and the United States did a very good job of releasing information early to allies and partners, et cetera, to kind of make the global reaction time to the invasion so quick. And so I think that was a lot—very unexpected and I think has shown just—not to overstate it but the power of individuals and that a lot of propaganda will have. We’ve known—I’m sure if you studied warfare history, you can see the impact of propaganda. It’s always been—it’s always been an element at play. I will just say it’s another tool in the toolkit to make it a little bit more believable, to make it harder, to make these more efficient, and I think what’s really, really interesting, again, is how a lot of these technologies are going to be worked together to kind of make them more believable. Like, again, creating deep fakes. The technology isn’t there yet to make them super believable, at least on a—like, a large scale that many people at—that a state could believe. But combining them with something like a cyberattack, to place that in a place that you would have a little bit more—more willing to believe it, I think, will be increasingly important. And we’ll see it, I’m sure, combined in other ways that I can’t even imagine. 
And that goes back to one of the earlier questions we had about the proliferation of these technologies and, like, it being commercial and being able to contain the use and you can’t, and that’s the hardest part. And I think that especially when it comes to software and things where once you sell it out there they can use it for whatever they want. And so it’s this kind of creativity where you can’t prevent against any possible situation that you don’t know. So it has to be a little bit reactive. But I think there are measures that states and others can take to be a little bit proactive to protect against the use. This isn’t specifically about deep fakes but about artificial intelligence in general. There’s a space, I think, for confidence-building measures so informal agreements that states can kind of come to to set norms and kind of general rules of the road about, like, expectations for artificial intelligence and other kind of emerging technologies that they can put in place before they’re used so that when situations that are unexpected or have never seen before arise that there’s not—there’s not totally no game plan, right. There’s a kind of things and processes to kind of fall back on to guide how to advance and work on that situation without having to—without regulating too much too quickly that they become outdated very quickly. But I think it’ll definitely be as the technology develops that we’ll be using a lot more deep fakes. FASKIANOS: Yes. So Nicholas Keeley, a Schwarzman Scholar at Tsinghua University, has a question that goes along these lines. Ukrainian government and Western social media platforms were pretty successful at preempting, removing, and counteracting the Zelensky deep fake. How did this happen? I mean, he’s—asks about the cutting-edge prevention measures against AI-generated disinformation today that you just touched upon. But can you just talk about the Ukrainian—this specific what we’re seeing now in Ukraine? KAHN: Yeah. I think Ukraine has been very, very good at using these tools in a way that we haven’t seen before and I think that’s, largely, why a lot of these countries now are looking and watching and are changing their tack when it comes to using these. Again, they seem kind of far off. Like, what’s the benefit of using these newer technologies when we have things that are known and work. But I think Ukraine, kind of being the underdog in this situation and knowing since 2013 that this was a future event that might happen has been preparing, I think, in particular, their digital minister. I’m not sure what the exact title was, but they were able to mobilize that very quickly. It was originally set up to better digitize their government platforms and provide access to individuals, I think, on a phone app. But then they had these experts that work on how—OK, how can we use digital tools to kind of engage the public and engage media. I think when they—they militarized them, essentially. And so I think a lot of the early days, asking for—a lot of people in that organization asked Facebook, asked Apple, et cetera, to either put sanctions, to put guardrails up. 
You know, a lot of the early, like, Twitter, taking down the media, et cetera, was also engaged because specifically this organization within Ukraine made it their mission to do so and to kind of work as the liaison between Silicon Valley, so to speak, and to get—and to engage the commercial sector so they could self-regulate and help kind of the government do these sort of things, which, I think, inevitably led to them catching the deep fake really quickly. But also, if you look at it, it’s pretty—it’s pretty clear that it’s computer generated. It’s not great. So I think that, in part, was it and, again, in combination with a cyberattack you could then notice that there was a service attack. And so, while it made it more realistic, there’s also risks about that because they’re practiced in identifying when a cyberattack just occurred, more so than other things. But, absolutely. FASKIANOS: Thank you. I’m going to go next to Andrés Morana, who’s raised his hand. Q: Hi. Good afternoon. I’m Andrés Morana, affiliated with Johns Hopkins SAIS International Relations. Master’s degree. I wanted to ask you about AI and then maybe emerging technology as well. But I think artificial intelligence, as it applies to kind of the defense sector, like, the need to also at the same time reform in parallel the acquisitions process, which is notorious for—as we think about AI kind of where these servers are hosted a lot of commercial companies might come with maybe some new shiny tech that could be great. But if their servers are hosted in maybe a place that’s so easy to access then maybe this is not great, as it applies to that defense sector. So I don’t know if you have thoughts on maybe the potential to reform or the need to reform the acquisitions process. Thank you. KAHN: Yeah, absolutely. I mean, this is some people’s, like, favorite, favorite topic on this because it has become sort of a valley of death, right, where things go and they die. They don’t—they don’t move. Of course, there’s some bridges. But it is problematic for a reason. There’s been a few kind of efforts to create mechanisms to circumvent that. The Defense Innovation Unit has created some kind of funding mechanisms to avoid it. But, overall, I do think it needs—I don’t know what that looks like. I’m not nearly an expert on specifically the acquisitions process that a lot of folks are. But it is pretty—it would make things a lot easier. China, for example, people are talking about, oh, it’s so far ahead on artificial intelligence, et cetera, et cetera. I would argue that it’s not. It’s better at translating what it has in the civilian and academic sectors into the military sphere and being able to use and integrate that. And so overcome that gap. It does so with civil-military fusion. You know, they can kind of do—OK, well, we’re saying we’re doing it this way so it’s going to happen, whereas the United States doesn’t have that kind of ability. But I would say the United States has all the academic and industry leading on artificial intelligence. Stanford recently put out their 2022 AI Index that has some really great charts and numbers on this about how much—how much research is being done in the world on artificial intelligence and which countries and which regions and specifically who’s funding that, whether it’s governments, academia, or industry. And the United States is still leading in industry and academia. 
It's just that the government has a problem tapping into that, whereas in China, for example, government funding is a lot greater and there's a lot more collaboration across government, academia, and industry. And so I think that is right now the number-one barrier that I see. The second one, I'll say, is accessing data and making sure you have all the bits and pieces that you need to be able to use AI, right. What's the use of having a giant model—an algorithm that could do a million things—if you don't have all of the data set up for it? And so those are the two organizational infrastructure problems that I'll say are really hindering the U.S. when it comes to adopting these technologies. But, unfortunately, I do not have a solution for it. I would be super famous in the area if I did, but I do not, unfortunately. FASKIANOS: Thank you. I'm going to take the next question from Will Carpenter, a lecturer at the University of Texas at Austin. It also got an up vote. What are the key milestones in AI development and quantum computing to watch for in the years ahead from a security perspective? Who is leading in the development of these technologies—large-cap technology companies such as Google and ByteDance? Venture capital-backed private companies, government-funded entities, et cetera? KAHN: Great question. I'll say quantum is a little further down the line, since we do not yet have a really big quantum computer that can handle enough data. China is kind of leading in that area, so to speak, so it's worth watching them. They've created, I think, their first quantum-encrypted communications line and have done some work on that, so keeping an eye on that will be important. But, really, just getting a computer large enough that it's reasonable to use quantum, I think, will be the next big milestone there—and that's quite a few years down the line. When it comes to artificial intelligence, I'll say that artificial intelligence has had waves and dips in interest and research. They call them AI winters and AI springs: winter is when there's not a lot of funding, and spring is when there is. Right now we're in a spring, obviously, in large part because of breakthroughs in the 2010s in things like natural language processing and computer vision, et cetera. And so I think continued milestones in those will be key. There are a few that I've worked on—there's a paper right now, hopefully out in the next few months, forecasting when AI and machine learning experts think those milestones will be hit. There were a couple that were already hit—ones like AI being able to beat all the Atari games, or AI being able to play Angry Birds—and there are lots of those mini milestones, as well as bigger leaps than just the efficiency of these algorithms. I think of things like artificial general intelligence. Some point to the ability to create one algorithm that can play a lot of different games—chess and Atari and Tetris. But, broadly speaking, I think that's a bit further down the line also. I'll say that for the next few months—and the next few years—it'll probably be about making some of these algorithms more efficient: making them better, making them leaner, having them use a lot less data.
But I think we've largely hit the big ones, and so we'll see these shorter, smaller milestones being achieved in the next few years. And I think there was another part to the question—let me just look back at what it was. Who's developing these. FASKIANOS: Right. KAHN: I would say large companies like Google, OpenAI, et cetera. But I'll say a lot of these models are open source, for example, which means that the models themselves are out there and available to anyone who wants to take them and use them. I mean, I'm sure you've seen it—once you saw DALL-E Mini, you saw DALL-E 2 and DALL-E X. They proliferate really quickly and they adapt, and that's a large part of what's driving the acceleration of artificial intelligence. It's moving so quickly because there is this culture of collaboration and sharing that companies are incentivized to participate in, where they just take the models, train them against their own data, and, if that works better, use it. And so those kinds of companies are all playing a part, so to speak. But I would say that academia right now is still really pushing the forefront, which is really cool to see. So I think that means that if a lot more blue-skies basic research keeps being funded, we'll continue to see these advances. I'll also say that when it comes to defense applications in particular, the challenge is that, more than is typical, these artificial intelligence capabilities are being developed by niche, smaller startup companies that might not have the capacity that, say, a Google or a Microsoft has when it comes to working and contracting with the U.S. government. So that's also a challenge. The acquisitions process is challenging at best even for the big companies, and for these smaller companies that really do have great applications and great specific uses for AI, I think it's a significant challenge as well. So I think it's, basically, everybody. Everyone's working together, which is great. FASKIANOS: Great. I'm going to go next to DJ Patil. Q: Thanks, Irina. Good to see you. FASKIANOS: Likewise. Q: And thanks for this, Lauren. So I'm DJ Patil and I'm at the Harvard Kennedy School Belfer Center, as well as Devoted Health and Venrock Partners. And so, Lauren, on the question you addressed a little bit on the procurement side, I'm curious what your advice to the secretary of defense would be around capabilities, specifically given the question of large language models and the efforts that we're seeing in industry, and how much separation in results we're seeing even between industry and academia. The breakthroughs that we're seeing reported are so stunning. And then if we look at the datasets those companies are building on, they're basically open, or there are copyright issues in there. There are defense applications which have very small datasets, and also, as you mentioned on the procurement side, a lack of access to these capabilities. And so, if you looked across this from a policy perspective, what are the mechanisms for how we start tapping into those capabilities to ensure that we stay competitive as the next set of iterations of these technologies takes place? KAHN: Absolutely. I think that's a great question. I've done a little bit of work on this.
When they were creating the chief digital AI office, I think they had, like, people brainstorming about, like, what kind of things we would like to see and I think everyone agreed that they would love for them to get kind of a better access to data. If the defense secretary asks, can I have data on all the troop movements for X, Y, and Z, there’s a lot of steps to go through to pull all that information. The U.S. defense enterprise is great at collecting data from a variety of sources—from the intelligence community, analysts, et cetera. I think what’s challenging to know—and, of course, there are natural challenges built in with different levels of how confidential things are and how—the classifications, et cetera. But I think being able to pull those together and to clean that data and to organize it will be a key first step and that is a big infrastructure systems software kind of challenge. A lot of it’s actually getting hardware in the defense enterprise up to date and a lot of it is making sure you have the right people. I think another huge one—and, I mean, the National Security Commission on AI on their final report announced that the biggest hindrance to actually leveraging these capabilities is the lack of AI and STEM talent in the intelligence community in the Pentagon. There’s just a lack of people that, one, have the vision to—have the background and are willing to kind of say, OK, like, this is even a possible tool that we can use and to understand that, and then once it’s there to be able to train them to be able to use them to do these kind of capacities. So I think that’ll be a huge one. And there are ways that kind of—there are efforts right now ongoing with the Joint Artificial Intelligence Center—the JAIC—to kind of pilot AI educational programs for this reason as a kind of AI crash course. But I think there needs to be, like, a broader kind of effort to encourage STEM graduates to go into government and that can be done, again, by kind of playing ball, so to speak, with this whole idea of open source. Of course, the DOD can’t do—Department of Defense can’t make all of its programs open and free to the public. But I think it can do a lot more to kind of show that it’s a viable option for individuals working in these careers to address some of the same kind of problems and will also have the most up to date tech and resources and data as well. And I think right now it’s not evident that that’s the case. They might have a really interesting problem set, which is shown to be attractive to AI PhD graduates and things like that. But it doesn’t have the same kind of—again, they’re not really promoting and making resources and setting up their experts in the best way, so to speak, to be able to use these capabilities. FASKIANOS: Thank you. I’m going to take the next question from Konstantin, who actually wrote a question—Tkachuk—but also raised his hand. So if you could just ask your question that would be best. Q: Yes. I’m just happy to say it out loud. So my name is Konstantin. I’m half Russian, half Ukrainian. I’m connecting here from Schwarzman Scholarship at Tsinghua University. And my question is more coming towards the industry as a whole, how it has to react on what’s happening to the technology that the industry is developing. 
Particularly, I am curious whether it's the responsibility and interest of industry and policymakers to protect the technology from such misuse, and whether they actually have the control and responsibility to make these technology frameworks unusable for certain applications. Do you think this effort could be possible, given the resources we have, the amount of knowledge we have? And, more importantly, I would be curious about your perspective on whether countries have to collaborate on that for such an effort to be effective, or whether it should be incentive models inside countries that contribute to the whole community. KAHN: Awesome. I think all of the above. Right now, because there's relatively little understanding of how these work, a lot of it is the private companies self-regulating, which I think is a necessary component. But there are also now indications of efforts to work with governments on things like confidence-building measures or other mechanisms to best understand and develop transparency measures, testing and evaluation, and other guardrails against misuse. There are different layers to this, of course, and all of them are correct and all of them are necessary. For the specific applications themselves, there needs to be an element of regulation. At some point there needs to be a user agreement as well—when companies are selling technologies and capabilities, buyers agree to abide by the terms; you sign the terms of use, right. And then there are, of course, export controls that can be put on—in certain cases you allow the commercial side, but you make the system itself incompatible with other kinds of systems that would make it dangerous. But I think there's also definitely room, and necessary space, for interstate collaboration on some of these. Say, for example, you introduce artificial intelligence into military systems, right—it makes them faster. It makes the decision-making process a lot speedier, basically, and so the individual has to make quicker decisions. And when you introduce things like artificial intelligence into increasingly complex systems, you have the ability for accidents to snowball, right—one little decision can make a huge impact and end up with a mistake, unfortunately. And so you could have that kind of situation where—God forbid—it's in a battlefield context, right. Let's say the adversary says, oh, well, you intentionally shot down XYZ plane, and the individual says no, it was an automated malfunction and we had an AI in charge of it—who, in fact, is responsible now? If it was not an individual, the blame kind of shifts up the pipeline. And so you've got problems like these—that's just one example—where you have increasingly automated systems and artificial intelligence that shift how dynamics play out, especially in accidents, which traditionally require a lot of visibility, and you have these technologies that are not so visible, not so transparent. You don't really get to see how they work or understand how they think in the same way you can when, if I press a button, you see the causality of that chain reaction.
And so I think there is very much a need because of that for even adversaries—not necessarily just allies—to agree on how certain weapons will be used and I think that’s why there’s this space for confidence-building measures. I think a really—like, for example, a really simple kind of everyone already agrees on this is to have a human in the loop, right—a human control. When we eventually use artificial intelligence and automated systems increasingly in nuclear context, right, with nuclear weapons, I think everyone’s kind of on board with that. And so I think those are the kind of, like, building block agreements and kind of establishment of norms that can happen and that need to take place now before these technologies really start to be used. That will be essential to avoiding those worst case scenarios in the future. FASKIANOS: Great. Thank you. I’m going to take the next question—written question—from Alexander Beck, undergraduate at UC Berkeley. In the context of military innovation literature, what organizational characteristics or variables have the greatest effect on adoption and implementation, respectively? KAHN: Absolutely. I’m not an organizational expert. However, I’ll say, like before, I think that’s shifting, at least from the United States perspective. I think, for example, when the Joint Artificial Intelligence Center was created it was, like, the best advice was to create separate organizations that had the capability to kind of enact their own kind of agenda and to create separate programs for all of these to kind of best foster growth. And so that worked for a while, right. The JAIC was really great at promoting artificial intelligence and raising it to a level of preeminence in the United States. A lot of early success in making—raising awareness, et cetera. But now we’re seeing, there was some—a little bit of confusion, a little bit of concern, over the summer when they did establish the chief data—a digital and artificial intelligence office—excuse me. A lot of acronyms—when they—because they took over the JAIC. They subsumed the JAIC. There was a lot of worry about that, right. Like, they just established this great organization that we’ve had in 2019 and now they’re redoing it. And so I think they realized that as the technology develop, organizational structures need to develop and change as well. Like, in the beginning, artificial intelligence was kind of seen as its own kind of microcosm. But because it’s in a general purpose enabling technology it touches a lot more and so it needs to be thought more broadly rather than just, OK, here’s our AI project, right. You need to better integrate it and situate it next to necessary preconditions like the food for AI, which is data, right. So they reorganized to kind of ideally do that, right. They integrate it research and engineering, which is the arm in the Defense Department that kind of funds the basic research, to kind of have people understand policy as well. So they have all of these different arms now within this broader organization. And so there are shifts in the literature, I think, and there are different best cases for different kind of technologies. But I’m not as familiar with where the literature is going now. But that was kind of the idea has shifted, I think, even from 2018 to 2022. FASKIANOS: Thanks. We’re going to go next to Harold Schmitz. Q: Hey, guys. I think a great, great talk. 
I wanted to get your thoughts on AlphaFold, RoseTTAFold—DeepMind—and biological warfare and synthetic biology, that sort of area. Thank you. KAHN: Of course. I— Q: And, by the way—sorry—I should say I’m with the University of California Davis School of Management and also with the March Group—a general partner. Thank you. KAHN: I am really—so I’m really not familiar much with the bio elements. I know it’s an increasing area of interest. But I think, at least in my research, kind of taking a step back, I think it was hard enough to get people within the defense sector to acknowledge artificial intelligence. So I haven’t seen much in the debate, unfortunately, recently, just because I think a lot of the defense innovation strategy, at least in the Biden administration, is focused directly on the pacing—addressing the pacing challenge of China. And so they’ve mentioned biowarfare and biotechnology as well as nanotechnology and et cetera, but not as much in a comprehensive way as artificial intelligence and quantum in a way that I’m able to answer your question. I’m sorry. FASKIANOS: Thank you. I’ll go next to Alex, who has raised—and you’ll have to give us your last name and identify yourself. Q: Hi. Yes. Thank you. I’m Alex Grigor. I just completed my PhD at University of Cambridge. My research is specifically looking at U.S. cyber warfare and cybersecurity capabilities, and in my interviews with a lot of people in the defense industry, their number-one complaint, I suppose, was just not getting the graduates applying to them the way that they had sort of hoped to in the past. And if we think back at ARPANET and all the amazing innovations that have come out of the internet and can come out of the defense, do you see a return to that? Or do you see us now looking very much to procure and whatever from the private industry, and how might that sort of recruitment process be? They cited security clearances as one big impediment. But what else might you think that could be done differently there? KAHN: Yeah. Absolutely. I think security clearances, all the bureaucratic things, are a challenge, but even assuming that individual wants to work, I think right now if you’re working in STEM and you want to do research I think having two years, for example, in government and being a civilian, working in the Pentagon, for example, it looks—it doesn’t necessarily look like—allow you to jump then back into the private sector and academia, whereas other jobs do. So I think that’s actually a big challenge about making it possible for various reasons, various mechanisms, to kind of make it a reasonable kind of goal for not necessarily being a career in government but allowing people to kind of come and go. I think that’ll be a significant challenge and I think that’s in part about some of the ability to kind of contribute to the research that we spoke about earlier. I mean, the National Security Commission has a whole strategy that they’ve outlined on it. I’ve seen, again, like, piecemeal kind of efforts to overcome that. But nothing broad and sweeping reform as suggested by the report. I recommend reading it. It’s, like, five hundred pages long. But there’s a great section on the talent deficit. But, yeah, I think that will definitely be a challenge. I think cyber is facing that challenge. 
I just think anything that touches STEM faces it in general, especially because the AI and, in particular, machine learning talent pool is global, and so states, interestingly, are actually fighting over this talent pool. I did research previously at the University of Oxford that looked at the immigration preferences of researchers and where they move, and a lot of them are Chinese and studying in the United States. And they stay here. They move, et cetera. But a lot of it is actually also immigration and visas. And so other countries—China specifically—have made special visas for STEM graduates. Europe has done it as well. And so I think that will be another element at play. There are a lot of these efforts to attract more talent. I mean, one of the steps that was tried was the Quad Fellowship that was established through the Indo-Pacific strategy. But, again, that's only going to be for a hundred students. And so there needs to be a broader effort to facilitate the flow of experts into government. To your other point about whether this is going to be what it looks like now—the private sector driving the bus—I think it will be for the time being, unless DARPA, the defense agencies' research arms, and DOD change the acquisition process and, again, are able to get that talent. If something changes, then I think defense will again be able to contribute in the way that it has in the past. I think that's important, too, right. There were breakthroughs out of cryptography. And, again, the internet all came from defense initially. And so I think it would be really sad if that was not the case anymore, especially as right now we're talking about being able to cross that bridge and work with the private sector, and I think that will be necessary. I hope it doesn't go so far that it becomes entirely reliant, because I think DOD will need to be self-sufficient. It's another kind of ecosystem for generating research and applications, and not all problems can be addressed by commercial applications. It's a very unique problem set that defense and militaries face. And so right now there's a little bit of a push of, OK, we need to work better with the private sector. But I think, hopefully, overall, if it moves forward it will balance out again. FASKIANOS: Lauren, do you know how much money DOD is allocating towards this in the overall budget? KAHN: Off the top of my head, I don't know. It's a few billion. It's, like, a billion. I have to look—I can look it up. In the 2023 research budget request there was the highest amount requested ever for STEM research and engineering and testing and evaluation. I think it was—oh, gosh, it was a couple hundred million (dollars)—but it was a huge increase from the last year. So it's an increasing priority. But I don't have the specific numbers on how much. People talk about China funding more. I think it's about the same. But it's increasing steadily across the board. FASKIANOS: Great. So I'm going to give the final question to Darrin Frye, who's an associate professor at Joint Special Operations University in the Department of Strategic Intelligence and Emergent Technologies, and his is a practical question. 
Managing this type of career, how do you structure your time between researching and learning about the intricacies of complex technologies, such as quantum entanglement or nano-neuro technologies, versus informing leadership and interested parties on the anticipated impact of emergent technologies on the future military operational environment? And maybe you can throw in there why you went into this field and why you settled upon this, too. KAHN: Yeah. I love this question. I have always been interested in the militarization of science and how wars are fought, because I think it allows you to study a lot of different elements. I think it's very interesting working at the intersection. Broadly speaking, a lot of the problems that the world is going to face moving forward are large transnational problems that will require academia, industry, and government to work on together—from climate change to all of these emerging technologies to, for example, global health, as we've seen over the past few years. And so it's a little bit of striking a balance, right. I came from a political science and international relations background, and I did want to talk about the big picture. And I think there are individuals working on these problems and recognizing them. But in that, I noticed that I'm speaking a lot about artificial intelligence and emerging technologies and I'm not from an engineering background. And so, me personally, I'm doing a master's in computer science right now at Penn in order to shore up those kinds of deficiencies and gaps in my knowledge. I can't learn everything. I can't be a quantum expert and an AI expert. But having the baseline understanding and taking a few of those courses regularly has meant that when a new technology shows up, I know how to learn about that technology, which, I think, has been very helpful—it lets me speak both languages, so to speak. I don't think anyone's going to be a master of one, let alone a master of both. But I think it will be increasingly important to spend time learning about how these things work, and just getting a background in coding can't hurt. And so it's definitely something you need to balance. I would say I'm probably balanced more towards the broader implications, since if you're talking at such a high level, getting into the nitty-gritty doesn't necessarily help people without that technical background. It can get jargony very quickly, as I'm sure you understood listening to me even. So I think there's a benefit to learning about it, but also to making sure you don't get too far into the weeds. There's a lot of space for people who understand both and who can bring in the experts—on quantum entanglement or nanotechnology, for example—so that when they're needed they can come in and speak to people in a policy setting. So there definitely is room, I think, for intermediaries—people who sit in between the policy side and, of course, the highly specialized expertise, which I think is definitely, definitely important. But it's hard to balance. But I think it's very fun as well, because then you get to learn a lot of new things. FASKIANOS: Wonderful. Well, with that we are out of time. 
I’m sorry that we couldn’t get to all the written questions and the raised hands. But, Lauren Kahn, thank you very much for this hour, and to all of you for your great questions and comments. You can follow Lauren on Twitter at @Lauren_A_Kahn, and, of course, go to CFR.org for op-eds, blogs, and insights and analysis. The last academic webinar of this semester will be on Wednesday, November 16, at 1:00 p.m. (EST). We are going to be talking with Susan Hayward, who is at Harvard University, about religious literacy in international affairs. So, again, I hope you will all join us then. Lauren, thank you very much. And I just want to encourage those of you on this call—students and professors—to look into our paid internships and our fellowships. You can go to CFR.org/careers for information on both tracks. Follow us at @CFR_Academic and visit, again, CFR.org, ForeignAffairs.com, and ThinkGlobalHealth.org for research and analysis on global issues. So thank you all, again. Thank you, Lauren. Have a great day. KAHN: Thank you so much. Take care. FASKIANOS: Take care.

Social Justice and Equity

Racial inequality and other forms of social injustice undermine U.S. moral leadership abroad. The United States should renew its commitment to upholding U.S. values at home.

United States

Welcome to “Women Around the World: This Week,” a series that highlights noteworthy news related to women and U.S. foreign policy. This week’s post covers November 5 to November 10.

United States

With the U.S. Supreme Court overturning Roe v. Wade, it’s up to states to decide their own abortion laws. Watch to see what has changed so far in the United States and how it compares with other countries on abortion access.  

Trade and Finance

The United States will have to recalibrate its trade and finance policies to address a range of challenges in the coming years, including rising tensions with China and shifting global supply chains.

Economics

The consequences of the COVID-19 pandemic and the rise of China have prompted renewed debate about the U.S. government’s role in shaping the economy.

United States

It probably won’t, and that would be a mistake.

Monetary Policy

Over the past decade, the Fed kept interest rates low while it deployed trillions of dollars in stimulus and expanded its regulatory oversight. Now, the central bank is back in the spotlight for its battle against inflation. 

Experts in this Program

Edward Alden

Bernard L. Schwartz Senior Fellow

Thomas J. Bollyky

Senior Fellow for Global Health, Economics, and Development and Director of the Global Health Program

Heidi Crebo-Rediker

Adjunct Senior Fellow

Roger W. Ferguson Jr.

Steven A. Tananbaum Distinguished Fellow for International Economics

Alice C. Hill

David M. Rubenstein Senior Fellow for Energy and the Environment

Jennifer Hillman

Senior Fellow for Trade and International Political Economy

Sebastian Mallaby

Paul A. Volcker Senior Fellow for International Economics

Matthias Matthijs

Senior Fellow for Europe

Shannon K. O'Neil

Vice President, Deputy Director of Studies, and Nelson and David Rockefeller Senior Fellow for Latin America Studies

A. Michael Spence

Distinguished Visiting Fellow

Laura Taylor-Kale

Fellow for Innovation and Economic Competitiveness

Christopher M. Tuttle

Senior Fellow and Director of the Renewing America Initiative