April 7 (UPI) – Israel has defended its use of Lavender, an artificial intelligence tool, to gather and analyze information on suspected Hamas targets ahead of military strikes.
Nadav Shoshani, a spokesman for the Israel Defense Forces, responded to a report in The Guardian alleging that Israel used the AI technology to identify 37,000 suspected Hamas targets. The progressive Jewish magazine +972 first disclosed Israel's use of the AI in collaboration with its affiliated outlet, Local Call.
According to the +972 report, "Lavender has been instrumental in pinpointing Palestinians, especially in the initial stages of the conflict." Sources said the government essentially treated the AI-generated data "as if it were a human decision."
The Guardian verified the authenticity of the report, which The Washington Post echoed in a piece suggesting that the use of such technology "might elucidate the extent of destruction unleashed by Israel in Gaza." White House national security spokesperson John Kirby told CNN that the United States was looking into Israel's use of the tool.
Shoshani said the database described in The Guardian's report, like databases used by other militaries, cross-references existing intelligence on terrorist groups. He contended on social media that it is not a roster of personnel capable of carrying out attacks.
While Israel has consistently asserted its right to target any Hamas operative, Shoshani's distinction is unclear: a database of information on alleged Hamas affiliates effectively amounts to a roster of potential targets.
Shoshani stated, “This database is meant to assist human research, not supplant it. Adhering to IDF directives and international law, all intelligence analysts must conduct an impartial review in the target selection process.”
Each intelligence target requires approval from an intelligence officer and a field commander. IDF targeting is a separate process in which commanders independently decide whether to strike the target.
Israeli intelligence officers who used the Lavender software, The Guardian reported, questioned the value of their role in the target selection process.
"At this juncture, I would allocate 20 seconds to each target, a task I perform numerous times daily. I contributed no value as a human, except for providing approval. It was a time-saver," one officer said, while another described feeling "constantly pressured" by superiors to identify more targets.