
### Unveiling Israel’s Military AI: A Terrifying Insight

The IDF is allegedly using an AI-powered database to identify targets for elimination, according to a new…


It’s hard to imagine a more innocuous name than this one. A recent report from +972 magazine and Local Call alleges that Israel used an AI-driven database to pinpoint suspected Hamas and other militant targets in the blockaded Gaza Strip. According to the report, the tool, developed by Israeli military data experts, combed through vast troves of surveillance data and other inputs to generate targets for potential elimination. It likely played a significant role, particularly in the early phases of the ongoing conflict, as Israel carried out relentless airstrikes on the territory, flattening homes and entire neighborhoods. By the Gaza Health Ministry’s latest count, more than 33,000 Palestinians, the majority of them women and children, have been killed there.

The AI program in question? Dubbed “Lavender.”

Israeli journalist and filmmaker Yuval Abraham recently published a detailed investigation into the existence of the Lavender program and its use in the Israeli offensive in Gaza that followed Hamas’s deadly terrorist attack on southern Israel on October 7. Abraham’s exposé, which appeared in +972 magazine, a progressive Israeli English-language outlet, and Local Call, its Hebrew-language counterpart, drew on the accounts of six unnamed Israeli intelligence officers who were directly involved in using AI to designate targets for elimination during the conflict. According to Abraham, Lavender flagged as many as 37,000 Palestinians, along with their residences, for potential strikes. (The IDF denied the existence of such a “kill list” to the journalist, describing the program as a database for cross-referencing intelligence sources.) White House national security spokesperson John Kirby told CNN on Thursday that the United States was looking into the media reports about the alleged AI tool.

“In the initial phases of the conflict, the military granted broad authorization for officers to act on Lavender’s target lists, without the need for thorough scrutiny of the rationale behind the machine’s selections or a detailed review of the underlying intelligence data,” Abraham detailed.

“One source revealed that human operators often merely endorsed the machine’s decisions, indicating that they typically spent only about ‘20 seconds’ evaluating each target before approving a strike — primarily to confirm that the target designated by Lavender was male,” he added. “This occurred despite the awareness that the system commits what are considered ‘errors’ in approximately 10 percent of cases, sometimes identifying individuals with loose affiliations to militant groups or no ties at all.”

This account could help explain the scale of the devastation Israel has inflicted on Gaza in its efforts to retaliate against Hamas, as well as the enormous loss of life. In previous rounds of Israel-Hamas fighting, the Israel Defense Forces went through a slower, human-driven process of selecting targets based on intelligence and other data. Amid profound Israeli anger and grief after Hamas’s October 7 attack, Lavender may have helped Israeli commanders rapidly assemble a sweeping program of retribution.

“We were constantly under pressure: ‘Bring us more targets.’ They were really yelling at us,” recounted an intelligence officer, as reported by Britain’s Guardian, which gained access to testimonies initially disclosed by +972.

Many of the munitions Israel dropped on targets allegedly identified by Lavender were conventional “dumb” bombs — heavy, unguided weapons that cause significant damage and civilian casualties. According to Abraham’s reporting, Israeli authorities chose not to expend more costly precision-guided munitions on the many low-ranking Hamas “operatives” singled out by the program. They also showed little hesitation in targeting the buildings where the designated individuals’ families lived, he noted.

“We didn’t aim to eliminate [Hamas] operatives solely when they were in a military facility or engaged in military actions,” shared an intelligence officer, identified as A, with +972 and Local Call. “On the contrary, the IDF struck them in residential homes without reluctance, as a primary option. It’s much simpler to target a family’s residence. The system is designed to locate them in such scenarios.”

Significant concerns about Israel’s targeting methods and tactics have been raised throughout the conflict. “It is inherently difficult to differentiate between legitimate military targets and civilians” in that setting, Brian Castner, senior crisis adviser and weapons investigator at Amnesty International, told my colleagues in December. “Therefore, based on basic discretion, the Israeli military should be utilizing the most precise available weaponry and opting for the smallest suitable munition for the target.”

In response to the reporting on Lavender, the IDF issued a statement disputing some of Abraham’s claims and challenging the characterization of the AI program. Lavender “is not a system but simply a database used for cross-referencing intelligence sources to compile updated information on the military operatives of terrorist groups,” the IDF said in the statement, which was published by the Guardian.

“The IDF does not employ an artificial intelligence system to identify terrorist operatives or predict whether an individual is a terrorist,” the statement emphasized. “Information systems are merely tools for analysts in the target identification process.”

A recent Israeli drone strike on a convoy of vehicles belonging to World Central Kitchen, a prominent humanitarian food aid organization, killed seven of its staff members and has intensified scrutiny of Israel’s conduct in the war. In a call with Israeli Prime Minister Benjamin Netanyahu on Thursday, President Biden reportedly urged Israel to change course and take concrete steps to protect civilian lives and facilitate aid distribution.

Simultaneously, hundreds of distinguished British lawyers and judges penned a letter to their government, urging a halt to arms sales to Israel to prevent “complicity in serious violations of international law.”

While the use of AI technology represents only a fraction of the concerns raised by human rights advocates regarding Israel’s actions in Gaza, it portends a troubling future. Lavender, noted Adil Haque, an international law expert at Rutgers University, embodies “the nightmare scenario of every international humanitarian lawyer.”
