Human rights and technology experts say the Israeli military's use of an untested and undisclosed artificial intelligence-powered database to select bombing targets in Gaza could amount to "war crimes".
The Israeli army has been using an AI-assisted targeting system known as Lavender to identify thousands of Palestinians as potential bombing targets, according to reports by the Israeli-Palestinian publication +972 Magazine and the Hebrew-language outlet Local Call.
Al Jazeera's Rory Challands, reporting from occupied East Jerusalem, said the database had been used to generate kill lists of as many as 37,000 targets.
Despite an error rate of about 10 percent, Israeli intelligence officials, who spoke anonymously, used the Lavender system to speed up the identification and targeting of suspected Hamas operatives in Gaza.
Marc Owen Jones, an assistant professor specializing in Middle East studies and digital humanities, said it was alarming that the Israeli army was deploying untested AI systems to help make life-and-death decisions about civilians, describing the campaign as an "AI-assisted genocide".
The method has reportedly contributed to the heavy civilian toll in Gaza, where the Ministry of Health says Israeli attacks have killed at least 33,037 Palestinians and wounded 75,668.
Critics say human interaction with the AI database was minimal, with officers often giving the kill list only a cursory review before authorizing air strikes.
Responding to the criticism, the Israeli military said its analysts carry out independent checks to verify that identified targets comply with international law and its own internal rules.
However, the reported allowance of "five to 10 acceptable civilian deaths" for each targeted Palestinian fighter underscores the high civilian death toll in Gaza.
AI expert Professor Toby Walsh said such targeting systems raise serious ethical questions and likely violate international humanitarian law by reducing meaningful human oversight of decisions in warfare.
Sources cited in the reports said the Israeli army sanctioned disproportionate numbers of civilian casualties for each Hamas operative flagged by Lavender, which could constitute war crimes.
United Nations special rapporteur Ben Saul warned that, if the reports are substantiated, many Israeli strikes in Gaza could amount to the war crime of launching disproportionate attacks.
There are also concerns that Israel is seeking to sell these AI tools to foreign governments, normalizing and exporting the technology of occupation used in Gaza to other countries.