The Path of Most Resistance: Artificial General Intelligence as a Non-Kinetic Innovation
This article was published alongside the PWH Report “The Future of Artificial Intelligence Governance and International Politics.”
When new technologies enter international politics, we reach for comparisons. Electricity, chemical compounds that can decimate local populations, and, above all, nuclear weapons loom large in the psyche of strategists and analysts. But an innovation that is still on the horizon and has not yet "arrived," like artificial general intelligence (AGI), is additive to the international security landscape and is likely to follow the path of similar non-kinetic technology innovations in international security.
How states cooperated, or failed, to control other new information technologies offers lessons for managing any introduction of AGI. The sooner analysts and strategists recognize that AGI would be among the non-kinetic technology innovations in security affairs, the sooner they can find solutions that minimize civilian harm and restrict the direst security consequences of AGI.
AGI Would Be Information Technology
Artificial intelligence (AI) and AGI fit the standard definitions of information technology. Like electricity or the first digital computers, AI promises to transform flows of information and communication. Its use is not transparent to a bystander operating in real life or in the "offline world." To have military effect, or to create vulnerability, the technology must be integrated with hardware or infrastructure, such as civilian water plants, pipelines, or fighter jets. AGI is unlikely to emerge as a kinetic weapon.
Historically, arms control has relied on transparent tests, quantities, or use, and is unlikely to apply to AGI. Normative and regulatory approaches are more likely; the record of international approaches to managing other non-kinetic technologies reflects this (table 1). AI and AGI are likely to forge a path in international security similar to that of other non-kinetic technologies, such as access to the electromagnetic spectrum, internet-enabled digital technologies, and process automation. Each has transformed warfighting and civilian life, but international cooperation and controls over them have been less advanced, binding, or sophisticated than those for kinetic weapons. It is also notable that past non-kinetic technologies, too, were compared to nuclear weapons when they appeared, even if scholars later determined those comparisons were misleading.
| Non-Kinetic Technology | Sample International Security Implication(s) | International Coordination & Controls |
| --- | --- | --- |
| Internet/networked connectivity | Offensive cyber exploits; vulnerabilities of military and civilian systems, some with limited discrimination | UN GGE reports (e.g., no binding international cyber treaty; laws of armed conflict apply), confidence-building measures |
| Access to the electromagnetic spectrum | Information and communication warfare (e.g., radio-frequency jamming, signals intelligence) | Spectrum regulations, International Telecommunication Union |
| Automation | Precision-guided munitions, missile defense, enhanced command, control, and communications | None |
The International Telecommunication Union (ITU), established in 1865, helps coordinate electromagnetic spectrum operations. No international agreements have restricted process automation, although myriad military applications have been fielded since World War Two. Concerns about cyberwarfare prompted multiple United Nations Groups of Governmental Experts, which recommended confidence-building measures and non-binding constraints for states to address cyber threats. Multinational enforcement or accountability measures to curb cyber threats remain pending, and the United States remains vulnerable to adversary attacks. In short, cyber and electromagnetic spectrum use brought about international cooperation, standards-building, and controls that reflect their dual-use and non-kinetic nature.
Kinetic technologies, on the other hand, have received more attention from leaders and in some cases have led to binding treaties, quantitative arms control measures, and oversight and enforcement mechanisms (table 2). Use of the skies led to multilateral aviation regulation to distinguish civilian from military uses. Chemical and biological compounds with non-discriminatory effects on militaries and civilians alike led first to normative, then to controlling schemes, including some oversight or enforcement mechanisms. Even spaceflight, arguably still in the early stages of its exploitation for military and civilian purposes, is subject to multilateral efforts to restrict the most egregious harms and to some coordination on satellite communications. Nuclear fission, revealed in the "wonder weapons" of the United States' bombing of Japan, led to a multilateral non-proliferation treaty and a safeguards regime to ensure civilian uses were not converted into military applications. While some kinetic weapons are governed through a mix of formal treaties, informal norms, and standards, non-kinetic technologies have, at best, been subject only to informal cooperative arrangements.
| Kinetic Technology Development | Sample International Security Implication(s) | International Accords, Controls |
| --- | --- | --- |
| Aviation | Bombing raids, air-launched missiles and munitions, transport | Chicago Convention, International Civil Aviation Organization to separate military and civilian uses |
| Chemical compounds | Mass, non-discriminatory effects of chemical weapons use | Geneva Protocol, Chemical Weapons Convention, Organisation for the Prohibition of Chemical Weapons |
| Biological compounds | Mass, non-discriminatory effects of biological weapons use | Geneva Protocol, Biological Weapons Convention (limited enforcement mechanisms) |
| Spaceflight | Orbital access for intelligence and surveillance | Outer Space Treaty, ITU satellite registration |
| Nuclear fission | "Nuclear revolution"; mass, non-discriminatory, multi-generational effects of nuclear weapons use | Treaty on the Non-Proliferation of Nuclear Weapons, bilateral arms control |
| Autonomy | Unmanned, autonomous warfighting and munitions | Political Declaration on the Responsible Use of Military AI |
Dual-use information technologies raise distinct civilian concerns. The intangibility of their harms and their ambiguous, multifaceted uses reduce the urgency for leaders to develop controls, especially when controls would mean ceding national advantage. Why would AGI be an exception?
Information Warfare
Recent analyses of AGI, and of the more advanced "superintelligence" that may follow, have cast AGI in international security as a nuclear-style "wonder weapon." Could AGI prompt strategic surprise? Yes, but only if its applications are both transparent and integrated into military capabilities. The most acute way AGI could transform international security is through information.
AGI's status as a non-kinetic tool does not mean it cannot introduce novel harms. Capable large language models can democratize previously sensitive, protected information, such as recipes for building weapons of mass destruction or conducting terrorism. Novel cyberattacks on critical infrastructure (once networked) demonstrate that new non-kinetic technologies can heighten civilians' vulnerability to harm. Pipelines and water treatment plants, recent targets of successful cyberattacks in the United States, were always vulnerable to kinetic attack, but the barriers to such attacks were higher and, historically, they could signal the launch or escalation of a wider conflict.
Information technology innovations also democratize access to information, inviting malign actors to shape narratives, introduce falsehoods, or wage psychological battles against adversary leaders and civilians. The risk is high that an AGI-driven deepfake could alter the information that intelligence analysts or well-placed policy staff of a U.S. president receive. More dangerous still is the potential for civilians and leaders to distrust reputable information sources in the age of AGI, leaving no clarity about what is true or false. AGI could change minds.
If militaries decide to integrate AGI into weapons, its use could amplify lingering security concerns and complicate the effectiveness of existing international controls. For example, the investigative activities of the Organisation for the Prohibition of Chemical Weapons could become more challenging if adversarial AGI alters the data on, or access to, potential cases of chemical weapons use. If nuclear-armed militaries insert AGI tools into early-warning data analysis, the lack of transparency into the dataset and analysis would be highly likely to spur nuclear crises and convey inaccurate information through chains of command. Nuclear use stemming from miscalculation or oversight would bypass the tenets of the international non-proliferation regime (with the Non-Proliferation Treaty as its backbone). Inserting AGI into autonomous systems could enhance the effectiveness of unmanned/uncrewed capabilities in theater but may also undermine multinational efforts to maintain responsible use of AI in the military domain. Integrating AGI into military capabilities also carries inevitable human-machine teaming and technical challenges: in addition to the risk of automation bias among operators, test and evaluation protocols must contend with sandbagging or scheming by more advanced AI models to ensure effectiveness.
AGI could also prompt strategic surprise when connected to or powering a kinetic capability, enabling a novel or more lethal use. The Stuxnet operation applied cyber tools to slow Iranian centrifuges capable of producing weapons-grade nuclear material. AGI's entry would likewise prompt remediation and response from adversaries, as the fields of fiber optic cables stretching across the Ukrainian battlefield remind us.

Figure 1: Field of fiber optic cables for FPV drones (Ministry of Defence of Ukraine)
Learning Lessons
States have a weak track record of cooperating on and controlling non-kinetic technologies. It is unlikely that the introduction of limited AGI applications would prompt novel international approaches.12 Prior innovations and their diffuse applications do not disappear and are rarely supplanted by a new technology, even a powerful information technology. As James Fearon writes, AGI cannot supplant the nuclear revolution.13 In the case of nuclear weapons and the revolution in military affairs they introduced, security concerns in the air, sea, and space domains persisted even as the implications of mass destructive capability were realized. In the age of AGI, international security will continue to grapple with "legacy" concerns. But by recognizing AGI as an inherently non-kinetic information technology, analysts and strategists may find lessons for averting the worst harms in the successes and failures of controlling other non-kinetic technologies in military and security affairs.