
File photo: Palestinians pray over the bodies of people killed in the Israeli bombardment, brought from Shifa hospital, before burying them in a mass grave in the town of Khan Younis, southern Gaza Strip, on Nov. 22, 2023. AP
According to a report in independent Israeli-Palestinian magazine +972, Israel has used AI to identify targets in Gaza -- in some cases with as little as 20 seconds of human oversight.
UN Secretary-General Antonio Guterres said he was "deeply troubled by reports that the Israeli military's bombing campaign includes Artificial Intelligence as a tool in the identification of targets, particularly in densely populated residential areas, resulting in a high level of civilian casualties."
"No part of life and death decisions which impact entire families should be delegated to the cold calculation of algorithms," he said.
The +972 report revealed that "the Israeli army has marked tens of thousands of Gazans as suspects for assassination, using an AI targeting system with little human oversight and a permissive policy for casualties."
The report said that, according to "six Israeli intelligence officers", a system dubbed Lavender had "played a central role in the unprecedented bombing of Palestinians, especially during the early stages of the war."
"According to the sources, its influence on the military's operations was such that they essentially treated the outputs of the AI machine 'as if it were a human decision'," +972 reported.
Two sources said "the army also decided during the first weeks of the war that, for every junior Hamas operative that Lavender marked, it was permissible to kill up to 15 or 20 civilians".
If "the target was a senior Hamas official... the army on several occasions authorized the killing of more than 100 civilians," it added.
Israel's deadliest war on Gaza has killed at least 33,091 people, mostly women and children, and injured over 75,750 others, according to the Gaza health ministry.
The United Nations has warned of imminent famine in the Palestinian territory under siege by Israel.
'Mass assassination factory'
Israel began hyping AI-powered targeting after an 11-day Israeli onslaught on Gaza in May 2021, which commanders branded the world's "first AI war".
The military chief during the 2021 war, Aviv Kochavi, told Israeli news website Ynet last year the force had used AI systems to identify "100 new targets every day", instead of 50 a year previously.
Weeks into Israel's war on Gaza, a blog entry on the Israeli military's website said its AI-enhanced "targeting directorate" had identified more than 12,000 targets in just 27 days.
An unnamed Israeli official was quoted as saying the AI system, called Gospel, produced targets "for precise attacks on infrastructure associated with Hamas, inflicting great damage on the enemy and minimal harm to those not involved".
But an anonymous former Israeli intelligence officer, quoted in November by +972, described Gospel's work as creating a "mass assassination factory".
In a rare confession of wrongdoing, Israel on Friday admitted a series of errors and violations of its rules in the killing of seven aid workers in Gaza, saying it had mistakenly believed it was "targeting armed Hamas operatives".
'War crimes'
Alessandro Accorsi, a senior analyst at Crisis Group, said the +972 report was "very concerning".
"It feels very apocalyptic. It's clear... the degree of human control is very low," he told AFP.
"There are a thousand questions around this obviously -- how moral it is to use it -- but it is hardly surprising it is used," he said.
Johann Soufi, a human rights lawyer and former director of the UN Palestinian refugee agency UNRWA's legal office in Gaza, said the +972 article described methods that were "undeniably war crimes".
They were "likely crimes against humanity" in view of the high civilian casualties, he added on X, formerly Twitter.
*This story was edited by Ahram Online