
Leveraging AI: Israel’s Use of Technology to Target and Attack Palestinian Homes

The Israeli publications +972 Magazine and Local Call have exposed how the Israeli military used an artificial intelligence…

AMY GOODMAN: Welcome to Democracy Now!, your source for independent news. I’m Amy Goodman.

A recent investigation by +972 Magazine and Local Call revealed the Israeli military’s use of an artificial intelligence system named “Lavender” to generate a “kill list” in Gaza of up to 37,000 Palestinians marked for assassination, with minimal human oversight. The report was based on interviews with six Israeli intelligence officers who worked directly with the AI system.

According to +972 Magazine, Lavender’s influence on military operations during the conflict was so profound that personnel effectively treated the machine’s outputs as if they were human decisions. A second AI system, called “Where’s Daddy?”, was used to track Palestinian men on the kill list so they could be targeted when they were at home with their families, often at night.

For our extended coverage, we speak with Yuval Abraham, the Israeli investigative journalist behind the exposé. His investigation, titled “‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza,” details the mechanisms and consequences of these AI-driven targeting operations.

YUVAL ABRAHAM: In our investigation, we documented the highly automated process the military has used to identify and mark targets since October. Lavender was originally designed to identify low-ranking operatives in the military wings of Hamas and Islamic Jihad. After October 7th, however, the military made a pivotal decision to expand its scope, marking tens of thousands of individuals as potential targets for airstrikes in their homes, with significant collateral damage.

AMY GOODMAN: The targeting of individuals, from alleged low-ranking militants to senior Hamas commanders, came with a predetermined “collateral damage degree” permitting the killing of civilians alongside the intended targets. This approach, which used unguided munitions to collapse entire buildings, raised serious ethical and legal concerns about proportionality and distinction under international law.

YUVAL ABRAHAM: The integration of Lavender and Where’s Daddy? facilitated the tracking and targeting of suspects in domestic settings, often devoid of military activity. The deliberate use of unguided missiles in residential areas underscored a callous disregard for civilian lives, with officers rationalizing the choice of munitions based on the perceived importance of the target rather than the potential impact on innocent families.

AMY GOODMAN: The IDF’s response, denying the use of an AI system for identifying terrorists, contradicts public statements made by senior Israeli military officials about the deployment of AI technology for counterterrorism purposes. This disparity underscores the need for transparency and accountability in the military’s targeting practices.

YUVAL ABRAHAM: The systemic reliance on artificial intelligence in conflict zones raises profound ethical dilemmas, blurring the lines of responsibility and accountability while undermining established legal frameworks. The potential for AI-based warfare to erode fundamental principles of international law demands urgent scrutiny and action to prevent further human rights violations.

As we navigate the complexities of AI-driven warfare and its implications for civilian populations, we must confront the ethical and legal challenges posed by these evolving technologies. Stay tuned for more insights from Yuval Abraham on this critical issue.

This is Democracy Now!, democracynow.org, The War and Peace Report.

Last modified: April 7, 2024