06 June, 2018
The United States military is reported to be increasing spending on a secret research effort involving artificial intelligence (AI).
The Reuters news agency says military officials are exploring whether they can use AI to help predict the launch of a nuclear missile. Officials hope that, with the help of computer programs, they will be able to follow and target mobile launchers in North Korea and other places.
The effort has gone largely unreported. The few public details about it are buried in complex language in the latest Defense Department budget. But Reuters spoke with U.S. officials with knowledge of the research. They said the military now has several projects exploring how to develop AI systems to protect the country against a possible nuclear missile strike.
If the research is successful, such computer systems would be able to 'think' for themselves, reproducing intelligent human behavior. AI programs could study huge amounts of information, including satellite imagery, faster than human beings and find signs of preparations for a missile launch.
Reuters says its story about the secret research is based on at least six sources. They included U.S. officials, who spoke on condition that they would not be publicly identified.
If the U.S. government had early information about a possible missile launch, it could seek a diplomatic solution. And if the government believed an attack was coming, it would have more time to try to destroy the missiles before they were launched, or to shoot them down.
"We should be doing everything in our power to find that missile before they launch it and make it increasingly harder to get it off (the ground)," one of the officials said.
President Donald Trump and his administration support the research effort. The administration plans to increase spending to $83 million in next year's budget for just one of the AI-powered missile programs, according to U.S. officials and budget documents.
The budget increase shows the growing importance of research on AI-powered anti-missile systems. It comes at a time when the United States faces a more militarily aggressive Russia and a serious nuclear threat from North Korea.
One person with knowledge of the programs said they include a project aimed at North Korea. The U.S. government is worried about the North's development of mobile missiles that can be easily hidden.
While the project is secret, the military has been clear about its interest in AI. The Defense Department has said it is using AI to identify objects from video gathered in its drone aircraft program.
Yet some military officials are worried about AI spending. They say the current amount is too low.
AI arms race
The U.S. military is in a race against China and Russia to add more AI to its war machine. Officials want to create new systems that can teach themselves to perform complex actions.
The Pentagon's research on using AI to identify possible missile threats and follow mobile launchers is very new. It is also just one small part of the growth of AI. But few details have been made public.
A U.S. official told Reuters that the military was already testing an early model of a system to follow mobile missile launchers.
The project involves military and private researchers in the Washington, D.C. area. It is making use of new technology developed by private businesses. Those companies got financial assistance from an investment fund called In-Q-Tel, officials said.
The project is using the intelligence community's cloud service and studying how computer data is organized. It also uses complex radar that can see through storms and greenery.
Budget documents seen by Reuters show the Pentagon will expand the program to "the remainder of the (Pentagon) 4+1 problem sets."
The U.S. military often uses the term '4+1' when talking about China, Russia, Iran, North Korea and terrorist groups.
Turning turtles into guns
Both supporters and critics of using AI to hunt missiles agree that it carries a lot of risk. It could speed up decision-making in a nuclear crisis, but the use of computers could also increase the possibility of computer errors. It might also start an AI arms race with Russia and China that could overturn the current nuclear balance.
The top commander of U.S. nuclear forces, Air Force General John Hyten, said the Pentagon must create safeguards when AI systems become operational. This will be the only way to make sure that humans – not computers – control nuclear decisions, he explained.
The series of steps leading to a decision to use nuclear weapons has been called the "escalation ladder."
"[Artificial intelligence] could force you onto that ladder if you don't put the safeguards in," Hyten said. "Once you're on it, then everything starts moving."
Experts at the RAND Corporation and other research groups say it is likely that China and Russia would learn to hide their missiles from identification.
There is some evidence that they could succeed.
At the Massachusetts Institute of Technology, students tricked a Google image-recognition AI system. The computer program was made to believe that a plastic turtle was really a gun.
Dr. Steven Walker, director of the Defense Advanced Research Projects Agency (DARPA), said the Pentagon will need humans to look at AI decisions.
"Because those systems can be fooled," Walker said.
DARPA is working on a project to make AI systems better able to explain themselves to human researchers. DARPA officials believe the new project will be important for national security programs.
We can't be wrong
Among the people working to improve AI is William "Buzz" Roberts of the National Geospatial-Intelligence Agency (NGA). He works on the government's efforts to develop AI to help examine satellite imagery. Information from satellite imagery is used by missile hunters.
Last year, the NGA said it used AI to look at 12 million images. So far, Roberts said, NGA researchers have made progress in getting AI to help identify whether or not a target of interest is present. He would not talk about individual programs.
In trying to understand possible national security threats, the NGA researchers work under a different kind of pressure from researchers in private businesses.
"We can't be wrong," said Roberts.
I'm Susan Shand, and I'm Dorothy Gundy.
Reuters reported this story. Susan Shand adapted this story and George Grow edited it.
_____________________________________________________________
Words in This Story
artificial – adj. not natural or real
mobile – adj. able to be moved
source – n. someone or something that provides what is wanted or needed
according to – prep. as stated by or in
fund – n. an amount of money collected or set aside for a special purpose
error – n. a mistake or problem
cloud – n. the large computers (called servers) that you can connect to on the Internet and use for storing data
escalation – n. the process of becoming worse or more severe
ladder – n. a series of steps or stages by which someone moves up to a higher or better position
turtle – n. a reptile that lives mostly in water and that has a hard shell which covers its body