In the aftermath of Hamas’s surprise attack on Israel on October 7th, the Israel Defense Forces allegedly targeted civilian residences deliberately and used an AI-driven program named Lavender to select assassination targets, carrying out numerous airstrikes with minimal human oversight.
According to the reports, the system drew on extensive surveillance of Gaza during this period to compile a list of 37,000 potential bombing targets, many of them low-ranking alleged Hamas members who would not traditionally have been the primary targets of such operations.
The revelations, published by +972 Magazine and Local Call, stem from interviews with six Israeli intelligence officers who served during the war with Hamas in Gaza and were involved in using AI to select targets.
One officer said his role in the system amounted to little more than quickly approving Lavender’s targeting decisions, devoting only a brief moment to personally reviewing each of the system’s suggestions.
The officers also described how Hamas targets were pursued while they were at home with civilians, because a residence made it easier to verify a target’s location with intelligence tools. Planners were reportedly willing to risk the lives of as many as 15 to 20 civilians to target a single low-ranking Hamas member.
According to one anonymous officer, the IDF did not limit its strikes to moments when Hamas operatives were in military buildings or engaged in military activity. Instead, the officer said, it struck them in residential areas without hesitation, treating this as a preferred option, and the system was specifically designed to find targets in such situations.
The IDF responded to the investigation by saying that its procedures require an independent examination by an intelligence analyst to verify that identified targets are legitimate targets for attack, in accordance with IDF directives and international law.
Critics have condemned these tactics as inhumane, with Tariq Kenney-Shawa from Al-Shabaka: The Palestinian Policy Network describing the reports as “sickening.”
Alex Hanna of the Distributed AI Research Institute expressed dismay, stating, “This is sick and the future of AI warfare for US Empire.”
Following the events of October 7th, soldiers reportedly placed more trust in the system than in the judgment of their grieving comrades. One intelligence officer who used Lavender, which was developed by Israel’s elite Unit 8200, told The Guardian that he had never experienced that level of detachment before, and that it made the act of targeting easier despite personal losses.
The IDF denied using AI to identify confirmed military targets, saying instead that Lavender was used to cross-reference intelligence sources in order to produce up-to-date information on the military operatives of terrorist organizations.
The IDF stressed that it does not use an artificial intelligence system that identifies terrorist operatives or tries to predict whether a person is affiliated with a terrorist group, asserting that these information systems are merely tools that assist analysts in the target identification process.
The war in Gaza has killed an estimated 33,000 Palestinians, predominantly civilians, according to the territory’s health ministry. Israel has faced criticism for the heavy civilian toll of its operations, which have struck civilian areas, medical facilities, and refugee camps. The IDF maintains that Hamas frequently embeds its military activities in civilian locations and uses civilians as shields.
Israel’s targeting strategies have drawn renewed international condemnation following an Israeli airstrike in Gaza that killed seven World Central Kitchen aid workers.