In 2016, the World Bank’s World Development Report explicitly highlighted the importance of cybersecurity, noting that “some of the perceived benefits of digital technologies are offset by emerging risks.” This observation remains relevant today. However, integrating cybersecurity or digital risk management into development projects has been slow. In a recent New America report, I lay out the reasons why cybersecurity should be folded into existing development activities, rather than isolated as a separate specialization.
Cybersecurity's potential role in delivering development outcomes and attaining the Sustainable Development Goals is massive. Today, approximately 80 percent of World Bank projects have a tech component, and that share will only creep closer to 100 percent. If information technology is untrustworthy and unreliable, countries and citizens may not fully reap the rewards of digitization. Worse yet, increased dependence on unreliable digital tech may threaten to undo development gains. Simply put, people must be able to trust the technology if it is to deliver on its immense promise.
It is imperative that development actors in lower- and middle-income countries have the ability to manage digital risk. Yet, cybersecurity or principles of cyber risk management remain underused in the development community for a number of reasons.
The first, and perhaps primary, challenge to mainstreaming cybersecurity in development is donors’ hesitancy to fully embrace cybersecurity as a development issue. In the development world, as elsewhere, metrics steer investment. Efforts to identify ways to measure the outcomes of cybersecurity investments in developing nations have started, mainly through the Global Forum on Cyber Expertise, but meaningful progress on actually developing and collecting metrics is slow. Lacking these metrics, it is difficult to craft empirically driven arguments about which cybersecurity capacity development interventions produce the most positive outcomes.
A second challenge is aid recipients’ uncertainty about how to build their own cybersecurity capacity and manage cybersecurity risk. While aid recipients are increasingly interested in investing in cybersecurity, three factors prevent them from doing so: (1) the complexity and technical nature of cybersecurity and risk management sometimes leaves aid recipients unsure where to start; (2) the perception that internet access and cybersecurity compete for political attention and finances; and (3) cost. Some tools have been developed to mitigate these concerns, but they fail to provide a clear, accessible outline of which capacity-building steps to prioritize, and when.
A third challenge is balancing cybersecurity capacity building with other development objectives. Development spending can be perceived as zero-sum—money spent on cybersecurity could be seen as taking away from money spent on addressing other development needs, like education, access to water, or health. Policymakers seeking to mainstream cybersecurity in development must understand that, while cybersecurity often contributes to the attainment of development goals, it is not always the primary contributor, and sometimes other equities will justly receive greater attention and funding.
Finally, there is a shortage of cybersecurity expertise in important development institutions. Integrating cybersecurity talent into the development community is necessary to equip it with the expertise to implement cybersecurity projects. Some have sought to solve this problem by contracting outside experts, but consultants’ profit motives can lead to advice that encourages inefficient or ineffective spending.
So what is to be done? As a start, cybersecurity needs to be reframed, especially in the context of development. This should include discussions around cybersecurity, risk management, sustainability, resilience, and trust. Cybersecurity, in this framing, is an enabler for the attainment of Sustainable Development Goals, not just a defense from malicious actors or foreign threats.
Inherent in this reframing is the need to clearly demonstrate the positive effects of cybersecurity on development. Because the development community relies on metrics to drive decision making, more must be done in academia, government, and the private sector to identify, gather, and analyze empirical data on both how best to manage cybersecurity risk in developing countries and how best to develop local capacity to do so. Together, this information should form the basis for an empirical argument for increased investment in managing cybersecurity risk. In parallel, cybersecurity experts should provide those in the donor community with tools to understand and manage cyber risk. In order to scale the inclusion of cyber risk management in development activities, the cybersecurity community must create templates for digital risk impact assessments (modeled after assessments like Human Rights Impact Assessments and Environmental Impact Assessments) for development programs and projects.
Increased investment in managing cybersecurity risk, if done properly, would help development actors improve their cybersecurity savvy. With their new knowledge and in-house expertise, donors and development institutions will be in a better position to guide aid recipients toward cost-effective cybersecurity solutions.
Hard-working and well-meaning individuals and organizations have taken on the challenge of building cybersecurity capacity in the developing world. They should be commended. However, the development community, with its immense financial resources and powerful networks, must become more heavily invested in cybersecurity given that, in the near future, all development programs will rely on digital technology in some form or another.