Artificial intelligence arms race

Key figures: Donald Trump, Joe Biden, Barack Obama, Elon Musk, Sundar Pichai, Jensen Huang, Sam Altman, Satya Nadella, Andy Jassy, Tim Cook, Mark Zuckerberg, Xi Jinping, Hu Jintao, Jiang Zemin, Jack Ma, Robin Li, Liang Wenfeng, Pony Ma, Daniel Zhang, Ren Zhengfei, Tan Ruisong, Lei Jun
United States: Google, Nvidia, OpenAI, Microsoft, Amazon, Apple, Tesla, Meta, IBM, Intel, Texas Instruments, Lockheed Martin
China: Baidu, DeepSeek, Tencent, Alibaba, Huawei, SenseTime, iFlytek, Xiaomi, Megvii, YMTC, Silan, AVIC

There are strong incentives for development teams to cut corners on system safety, increasing the risk of critical failures and unintended consequences.

[16] In 2023, a United States Air Force official reportedly said that during a computer test, a simulated AI drone killed the human character operating it.

[16] A US government report argued that "AI-enabled capabilities could be used to threaten critical infrastructure, amplify disinformation campaigns, and wage war"[18]:1, and that "global stability and nuclear deterrence could be undermined".

[21] The November 2019 'Interim Report' of the United States' National Security Commission on Artificial Intelligence confirmed that AI is critical to US technological military superiority.

The organization's stated objective is to "transform the US Department of Defense by accelerating the delivery and adoption of AI to achieve mission impact at scale."[33]

Reportedly, the Pentagon's development stops short of an AI weapons system capable of firing on self-designated targets.

[37] Its chief, U.S. Marine Corps Col. Drew Cukor, said: "People and computers will work symbiotically to increase the ability of weapon systems to detect objects."[38]

"[38] Project Maven has been noted by allies, such as Australia's Ian Langford, for the ability to identify adversaries by harvesting data from sensors on UAVs and satellite.

[18][41] According to a February 2019 report by Gregory C. Allen of the Center for a New American Security, China's leadership – including paramount leader Xi Jinping – believes that being at the forefront in AI technology is critical to the future of global military and economic power competition.

"[7] The close ties between Silicon Valley and China, and the open nature of the American research community, has made the West's most advanced AI technology easily available to China; in addition, Chinese industry has numerous home-grown AI accomplishments of its own, such as Baidu passing a notable Chinese-language speech recognition capability benchmark in 2015.

[11] Before 2013, Chinese defense procurement was mainly restricted to a few conglomerates; however, as of 2017, China often sources sensitive emerging technology such as drones and artificial intelligence from private start-up companies.

[21] China published a position paper in 2016 questioning the adequacy of existing international law to address the eventuality of fully autonomous weapons, becoming the first permanent member of the U.N. Security Council to broach the issue.

[47] In 2019, then United States Secretary of Defense Mark Esper lashed out at China for selling drones capable of taking life with no human oversight.

[51] In 2021, the Indian Army, with assistance from the National Security Council, began operating the Quantum Lab and Artificial Intelligence Center at the Military College of Telecommunication Engineering.

With an emphasis on robotics and artificial intelligence, the Defence Research and Development Organisation and the Indian Institute of Science established the Joint Advanced Technology Programme-Center of Excellence.

[61][62] The MoD earmarked ₹1,000 crore annually until 2026 for capacity building, infrastructure setup, data preparation, and AI project implementation.

[75][76] The Indian Army is developing autonomous combat vehicles, robotic surveillance platforms, and Manned-Unmanned Teaming (MUM-T) solutions as part of the Defence AI roadmap.

[83] The Indian Army AI Incubation Center was established to conduct research on autonomous platforms, improved surveillance, predictive maintenance, and intelligent decision support systems.

Russia plans to use Nerehta as a research and development platform for AI and may one day deploy the system in combat, intelligence gathering, or logistics roles.

[99][100][96] In addition, the Russian military plans to incorporate AI into crewless aerial, naval, and undersea vehicles and is currently developing swarming capabilities.

The acceptable level of collateral damage and the type of weapon used to eliminate a target are decided by IDF members, and the system could reportedly track militants even when they were at home.

[107] Israel's Harpy anti-radar "fire and forget" drone is designed to be launched by ground troops, and autonomously fly over an area to find and destroy radar that fits pre-determined criteria.

[108] Artificial intelligence is also expected to be applied to crewless ground systems and robotic vehicles such as the Guardium MK III and later versions.

"Our technology therefore plugs the gaps in human capability", and they want to "get to a place where our software can discern whether a target is friend, foe, civilian or military".

[120] AI arms control will likely require the institutionalization of new international norms embodied in effective technical specifications combined with active monitoring and informal diplomacy by communities of experts, together with a legal and political verification process.

[1][2] As early as 2007, scholars such as AI professor Noel Sharkey have warned of "an emerging arms race among the hi-tech nations to develop autonomous submarines, fighter jets, battleships and tanks that can find their own targets and apply violent force without the involvement of meaningful human decisions".

[129] Many experts believe attempts to completely ban killer robots are likely to fail,[130] in part because detecting treaty violations would be extremely difficult.

[134] A 2015 open letter by the Future of Life Institute calling for the prohibition of lethal autonomous weapons systems has been signed by over 26,000 citizens, including physicist Stephen Hawking, Tesla magnate Elon Musk, Apple's Steve Wozniak and Twitter co-founder Jack Dorsey, and over 4,600 artificial intelligence researchers, including Stuart Russell, Bart Selman and Francesca Rossi.

Professor Noel Sharkey of the University of Sheffield argues that autonomous weapons will inevitably fall into the hands of terrorist groups such as the Islamic State.

The Sea Hunter, an autonomous US warship, 2016

Vladimir Putin (seated, center) at National Knowledge Day, 2017