AI-assisted targeting in the Gaza Strip

Israel has greatly expanded the bombing of the Gaza Strip, which in previous wars had been limited by the Israeli Air Force running out of targets.

These tools include the Gospel, an AI that automatically reviews surveillance data looking for buildings, equipment and people thought to belong to the enemy and, upon finding them, recommends bombing targets to a human analyst, who may then decide whether to pass them along to the field.

Critics have argued the use of these AI tools puts civilians at risk, blurs accountability, and results in militarily disproportionate violence in violation of international humanitarian law.

"[8] Bianca Baggiarini, lecturer at the Australian National University's Strategic and Defence Studies Centre wrote AIs are "more effective in predictable environments where concepts are objective, reasonably stable, and internally consistent."

Khlaaf went on to point out that such a system's decisions depend entirely on the data it is trained on,[b] and are not based on reasoning, factual evidence or causation, but solely on statistical probability.

In a France 24 interview, Abraham, of +972 Magazine, characterized this as enabling the systematization of dropping a 2,000 lb bomb into a home to kill one person and everybody around them, something that had previously been done to a very small group of senior Hamas leaders.

NPR cited a report by +972 Magazine and its sister publication Local Call as asserting that the system is being used to manufacture targets so that Israeli military forces can continue to bombard Gaza at an enormous rate, punishing the general Palestinian population.

In principle, the combination of a computer's speed to identify opportunities and a human's judgment to evaluate them can enable more precise attacks and fewer civilian casualties.

Richard Moyes, a researcher and head of the NGO Article 36, pointed to "the widespread flattening of an urban area with heavy explosive weapons" to question these claims,[29] while Lucy Suchman, a professor emeritus at Lancaster University, described the bombing as "aimed at maximum devastation of the Gaza Strip".

The system depends entirely on training data,[16] and intelligence that human analysts had examined and judged not to constitute a target had been discarded, risking bias.

The six intelligence officers said Lavender had played a central role in the war, rapidly processing data to identify potential junior operatives to target, at one point listing as many as 37,000 Palestinian men linked by AI to Hamas or Palestinian Islamic Jihad (PIJ).

Citing multiple sources, the Guardian wrote that in previous wars identifying someone as a legitimate target would be discussed and then signed off by a legal adviser, and that after 7 October the process was dramatically accelerated: there was pressure for more targets, and to meet the demand the IDF came to rely heavily on Lavender for a database of individuals judged to have the characteristics of a PIJ or Hamas militant.

According to the IDF, information systems are merely one type of tool that helps analysts gather and optimally analyze intelligence from various sources in the process of identifying military targets, and under IDF directives analysts must conduct independent examinations to verify that the targets meet the relevant definitions in accordance with international law and the additional restrictions of those directives.

Citing unnamed conflict experts, the Guardian wrote that if Israel has been using dumb bombs to flatten the homes of thousands of Palestinians who had been linked to militant groups in Gaza with AI assistance, it could help explain what the newspaper called the shockingly high death toll of the war.

"[47] The IDF's response to the publication of the testimonies said that unlike Hamas, it is committed to international law and only strikes military targets and military operatives, does so in accordance to proportionality and precautions, and thoroughly examines and investigates exceptions;[48] that a member of an organized armed group or a direct participant in hostilities is a lawful target under international humanitarian law and the policy of all law-abiding countries;[49] that it "makes various efforts to reduce harm to civilians to the extent feasible in the operational circumstances ruling at the time of the strike"; that it chooses the proper munition in accordance with operational and humanitarian considerations; that aerial munitions without an integrated precision-guide kit are developed militaries' standard weaponry; that onboard aircraft systems used by trained pilots ensure high precision of such weapons; and that the clear majority of munitions it uses are precision-guided.

In an opinion piece in Le Monde, reporter Élise Vincent wrote that automated weapons are divided into fully automated systems, which are not really on the market, and lethal autonomous weapons, which in principle allow human control, and that this division allows Israel to claim the Gospel falls on the side of the more appropriate use of force.

"[61] United Nations Secretary-General, Antonio Guterres, said he was “deeply troubled” by reports that Israel used artificial intelligence in its military campaign in Gaza, saying the practice puts civilians at risk and blurs accountability.

Speaking about the Lavender system, Marc Owen Jones, a professor at Hamad Bin Khalifa University, stated, "Let's be clear: This is an AI-assisted genocide, and going forward, there needs to be a call for a moratorium on the use of AI in the war".

Ben Saul, a United Nations special rapporteur, stated that if reports about Israel's use of AI were true, then "many Israeli strikes in Gaza would constitute the war crimes of launching disproportionate attacks".