
### Report Reveals Israeli AI System Responsible for High Civilian Casualty Rate in Gaza

The targeting program has reportedly been used in conjunction with another called “Where’s Daddy,” …

The Israeli military has used an AI-driven targeting system named “Lavender” to designate bombing targets in Gaza with minimal human oversight, according to a recent report.

The report, published by +972 Magazine and Local Call, found that military personnel approved the AI-selected targets after only cursory review, spending roughly “20 seconds” on each before green-lighting a bombing operation.

That review often served mainly to confirm that the target was male, even though an internal check reportedly found the program misidentified non-militants in roughly 10 percent of cases. Despite this, sources said the military authorized the automatic adoption of Lavender’s kill lists about two weeks into the ongoing conflict. Troops also reportedly tracked targeted individuals to their residences, at times killing family members, with the help of a companion program ominously named “Where’s Daddy?”

Because of delays in the “Where’s Daddy?” system, families were purportedly struck at home even when the primary target was absent. How extensively the programs are still in use remains unclear, though they were notably active in the initial stages of the war.

One senior officer, identified in the report only as “B,” stated: “We took out thousands of people. We put everything into automated systems, and as soon as one of the marked individuals was at home, he immediately became a target. We bombed him and his house.”

The report indicated that this approach produced tens of thousands of designated targets and caused the deaths of many Palestinian civilians, including women, children, and non-combatant men.

The global community has expressed concern over civilian casualties in Gaza, where approximately 33,000 people have been killed in Israel’s military campaign following the Hamas-led attack of Oct. 7. In response, President Joe Biden pressed Israeli Prime Minister Benjamin Netanyahu for an immediate temporary cease-fire, emphasizing the need for specific actions to address civilian harm and humanitarian suffering.

The Israeli military, however, denied that it uses an artificial intelligence system to identify militants, asserting that such systems are merely tools that assist analysts in target identification and that targets undergo independent review to ensure compliance with international law.

The report also described an alleged “kill list” of around 37,000 suspected Hamas militants, with the military reportedly deeming it permissible to kill 15 to 20 civilians for each junior Hamas operative targeted.

The military’s reliance on unguided “dumb” bombs early in the conflict also contributed to significant civilian casualties, as precision munitions were reportedly not used against the junior militants Lavender identified.

The author of the report, Yuval Abraham, highlighted the program’s impact on civilian populations, emphasizing the need for responsible military use of AI technologies in conflict zones.
