A Massachusetts Democrat is calling on Congress to pass legislation that would keep artificial intelligence out of nuclear weapons launch decisions.
On Thursday, Sen. Edward Markey said, “78 years ago this weekend, Robert Oppenheimer witnessed the world’s first nuclear weapons explosion. In 2023, we face a new kind of nuclear threat: the militarization of increasingly powerful artificial intelligence systems.”
“We must pass legislation to keep AI away from the nuclear button before it’s too late,” he asserted.
Markey’s office said he filed over a dozen amendments to the National Defense Authorization Act, including language that would prohibit the use of AI in the U.S. military’s nuclear launch decisions.
Additionally, the office said the amendments would advance nuclear disarmament and nonproliferation and save billions of federal dollars by redirecting what it called wasteful spending on nuclear weapons development toward vaccine research.
One amendment is based on the Block Nuclear Launch by Autonomous Artificial Intelligence Act, which would prohibit AI from making nuclear launch decisions.
Markey introduced the legislation in April alongside a bipartisan group of House members: Reps. Ted Lieu, Don Beyer and Ken Buck.
“We need to keep humans in the loop on making life or death decisions to use deadly force, especially for our most dangerous weapons,” Markey said then.
“While U.S. military use of AI can be appropriate for enhancing national security purposes, use of AI for deploying nuclear weapons without a human chain of command and control is reckless, dangerous, and should be prohibited,” said Buck, R-Colo.
Their bill would codify existing Pentagon policy that requires a human to be “in the loop” for any decisions regarding the use of nuclear weapons.
In its fiscal year 2024 budget request, the Pentagon is seeking $1.8 billion solely for research and development of AI capabilities.
Fox News’ Elizabeth Elkind contributed to this report.