
As the US, China, and Russia build new nuclear weapons systems, how will AI be built-in?

Written by Matt Field

Researchers in the United States and elsewhere are paying close attention to the prospect that, in the coming years, new nuclear weapons—and the infrastructure built to operate them—will include greater levels of artificial intelligence and automation. Earlier this month, three prominent US defense experts published a comprehensive analysis of how automation is already involved in nuclear command-and-control systems and of what could go wrong if countries implement even riskier forms of it.

The working paper “A Stable Nuclear Future? The Impact of Autonomous Systems and Artificial Intelligence” by the team of Michael Horowitz, Paul Scharre, and Alexander Velez-Green comes on the heels of other scholarly takes on the impact artificial intelligence (AI) will have on strategies around using nuclear weapons. All this research reflects the fact that militaries around the world are incorporating more artificial intelligence into non-nuclear weaponry—and that several countries are overhauling their nuclear weapons programs. “We wanted to better understand both the potentially stabilizing and destabilizing effects of automation on nuclear stability,” Scharre, a senior fellow at the Center for a New American Security, told the Bulletin.

“In particular, as we see nations modernize their nuclear arsenals, there is both a risk and an opportunity in how they use automation in their nuclear operations.”

The report notes that nuclear weapons systems already include some automated functionality: For example, warning systems automatically alert nuclear weapons operators of an attack. After the Cold War, Russian missiles were programmed to automatically retarget themselves to hit US targets if they were launched without a flight plan. For its part, the United States at one point designed its entire missile arsenal so that it could be retargeted in seconds from its peacetime default of flying into the ocean. Even these forms of automation are risky, as an accidental launch could “spark a nuclear war,” the report says. But some countries, the report warns, might resort to riskier types of automation still.

Read more at the Bulletin of the Atomic Scientists

About the author

Matt Field

Matt Field is an associate editor at the Bulletin of the Atomic Scientists. Before joining the Bulletin, he covered the White House, Congress, and presidential campaigns as a news producer for Japanese public television. He has also reported for print outlets in the Midwest and on the East Coast. He holds a master’s degree in journalism from Northwestern University.
