
Lavender, Israel’s artificial intelligence system that decides who to bomb in Gaza

The Israel Defense Forces use an automated program to select their human targets, a practice that until now had no precedent

Images showing the attack on Hadi Ali Mustafa, a member of Hamas, in Lebanon, on March 13, 2024. @FDIonline
Manuel G. Pascual

Israel has crossed another line in the automation of war. The Israel Defense Forces (IDF) have developed a program supported by artificial intelligence (AI) to select the victims of their bombings, a process that has traditionally required manual verification before a person can be confirmed as a target. Called Lavender, the system identified 37,000 Palestinians as potential targets during the first weeks of the war, and between October 7 and November 24 it was used in at least 15,000 murders in the invasion of Gaza, according to a journalistic investigation by two Israeli media outlets, +972 Magazine and Local Call, which was also published in The Guardian.

The AI system has sparked controversy because of the coldness with which the military commanders responsible for overseeing Lavender’s suggestions listed the resulting deaths as mere statistics. Under the parameters of Lavender’s algorithm, it is acceptable for a hundred civilians to die in a bombing aimed at a single senior Hamas or Islamic Jihad official. The system is designed to attack targets when they are at home and at night, which increases the chances that the target will be there, but also that their family members and neighbors will die with them.

There have never before been reports of a country automating a task as sensitive as the selection of human military targets, one in which a false positive can mean the death of innocent people. In an official statement following the report, the Israel Defense Forces denied that they were letting a machine determine “whether a person is a terrorist.” The IDF claimed that information systems “are merely tools for analysts in the target identification process,” even though the sources cited by the journalistic investigation said that officers merely validate Lavender’s recommendations, without carrying out further verification.

The investigation, which cites several IDF and intelligence service officials, including members of Unit 8200, does not reveal what parameters are used to determine whether or not a target has a relationship with Hamas or Islamic Jihad. Some factors are listed, such as whether the individual frequently changes telephone numbers (something that happens constantly in a war context) or is male (there are no women with officer rank).

Like all AI systems, Lavender is a probabilistic model. It works with estimates and, therefore, makes mistakes: at least 10% of the individuals marked as targets were not, according to official sources cited in the report. That margin of error, added to the collateral deaths accepted by the army (up to 300 civilians in a single bombing on October 17 to kill a Hamas commander), results in the murder of thousands of Palestinians without any link to terrorism, most of them women and children, on the recommendations of the software.
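To gauge the scale implied by those numbers, the back-of-the-envelope arithmetic below combines just two figures cited in the report: the 37,000 people marked by Lavender and the error rate of at least 10%. It is purely illustrative and says nothing about how the system itself computes anything.

```python
# Illustrative arithmetic based only on figures cited in the article;
# it does not model the actual system in any way.

marked_targets = 37_000  # Palestinians flagged by Lavender in the first weeks of the war
error_rate = 0.10        # "at least 10%" of those marked were not actually targets

# Lower bound on the number of people wrongly marked, under the article's own figures.
wrongly_marked = int(marked_targets * error_rate)
print(f"People wrongly marked as targets (lower bound): {wrongly_marked:,}")  # 3,700
```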

The automation of war

The Lavender program is complemented by two other programs: Where is Daddy?, which is used to track individuals marked as targets and bomb them when they are at home, and The Gospel, which is aimed at identifying buildings and structures where, according to the IDF, Hamas militants operate.

Lavender processes information collected on the more than 2.3 million residents of the Gaza Strip, confirming the dense network of digital surveillance to which Gazans are subjected. Each individual is assigned a score from 1 to 100 that estimates the probability that they are linked to the armed wing of Hamas or Islamic Jihad. Individuals with a high score are killed, along with their families and neighbors. According to the investigation by +972 Magazine, the officers did little to verify the potential targets identified by Lavender, citing “efficiency” reasons. The report claimed that the officers, under pressure to collect new data and find new targets, spent just a few seconds looking at each case. In practice, this meant simply validating the algorithm’s indications.
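The sketch below is a purely hypothetical illustration of the mechanism described above: a noisy 1-to-100 score combined with a fixed cutoff. Every number in it (population size, base rate, cutoff, score distributions) is invented, chosen only so that roughly one in ten marked people ends up being a false positive, echoing the error rate cited in the report; it reflects nothing about how the real system works.

```python
# Hypothetical sketch of thresholding a noisy 1-100 score over a large population.
# All numbers are invented for illustration and bear no relation to the real system.

import random

random.seed(42)

POPULATION = 100_000  # hypothetical population size
BASE_RATE = 0.01      # hypothetical fraction of actual members of an armed group
CUTOFF = 80           # hypothetical score above which a person is "marked"

marked = 0
false_positives = 0

for _ in range(POPULATION):
    is_member = random.random() < BASE_RATE
    if is_member:
        score = random.randint(70, 100)  # members tend to receive high scores
    else:
        # a very small share of non-members still receive high scores (noise)
        score = random.randint(60, 100) if random.random() < 0.0015 else random.randint(1, 60)

    if score >= CUTOFF:
        marked += 1
        if not is_member:
            false_positives += 1

print(f"Marked: {marked:,}  False positives: {false_positives:,} "
      f"({false_positives / marked:.0%} of those marked)")
```

Even an error rate that sounds small in percentage terms translates, at the scale of an entire population, into a large absolute number of people wrongly marked.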

Is it legal to use this type of system? “The Israeli military uses AI to augment the decision-making processes of human operators. This use is in accordance with international humanitarian law, as applied by the modern Armed Forces in many asymmetric wars since September 11, 2001,” says Magda Pacholska, researcher at the TMC Asser Institute and specialist in the intersection between disruptive technologies and military law.

Footage prior to the assassination of Ayman Nofal, a commander in Hamas's armed wing, on October 17, 2023. @FDIonline

Pacholska recalls that the Israeli army had previously used automated decision-making support systems such as Lavender and Gospel in 2021, during Operation Guardian of the Walls. Forces from the United States, France and the Netherlands, among others, have also done so, although always against material targets. “The novelty is that, this time, [Israel] is using these systems against human targets,” the expert points out.

Arthur Holland Michel, who has been commissioned by the U.N. to write reports on the use of autonomous weapons in armed conflicts, points to other big differences. “What is different, and certainly unprecedented, in the Lavender case in Gaza, is the scale and speed at which the system is being used. The number of people that have been identified in just a few months is astonishing,” he says. “Another crucial difference is that the time between the algorithm’s identification of a target and the attack against it often appears to have been very short. That indicates that there is not much human review in the process. From a legal point of view, this could be problematic.”

According to the practices and doctrines of many Western states, including NATO, once it is determined that a person “directly participates in hostilities” they are a legal target and can also be attacked at their home, explains Pacholska. “It may be shocking to the public, but this is how contemporary conflicts against organized armed groups have been carried out since September 11, 2001,” she says.

Images taken after the bombing that killed Ayman Nofal, a high-ranking Hamas official, on October 17, 2023, in which another 300 people also died. @FDIonline

What is not legal is massacring civilians. For Luis Arroyo Zapatero, honorary rector of the University of Castilla-La Mancha in Spain and specialist in international criminal law, the deaths caused by this tool should be considered “war crimes,” while the set of these actions, including the massive destruction of buildings and people, should be defined as “crimes against humanity.” In international law, the professor explains, assassinations are not admitted as military action, although there is discussion about so-called selective assassinations. “Deaths caused as collateral damage are pure murder. The Lavender system is directly a civilian killing machine, as it admits collateral civilian deaths of between 10 and 100 people beyond the precise target,” he says.

Israel’s weapons laboratory in Palestine

The Palestinians know well what it is like to be watched. The Israeli intelligence services have been collecting all kinds of data on Palestinians for years. Their cell phones’ digital footprints, from locations to social media interactions, are processed and stored. Cameras with automatic facial recognition systems have been part of their daily lives since at least 2019. The Washington Post reported on a program, called Blue Wolf, aimed at recording the faces of every inhabitant of the West Bank, including children and the elderly. The individuals are added to a database along with information on their alleged dangerousness. In this way, a soldier can take a photo of a Palestinian and, using the information in the database, decide whether or not to arrest them. The New York Times has reported the use of a similar system in the Gaza Strip, deployed late last year, which also seeks to photograph and classify Palestinians without their consent.

All these technologies are developed by Israeli companies, which sell them to the Israeli armed forces and then export them to other countries, claiming that they have been tested in the field. “Facial recognition everywhere, drones, spy technology… This state is really an incubator for surveillance technologies. If you sell a product, you have to show how effective it is in real scenarios and in real time. That’s what Israel is doing,” says Cody O’Rourke, from the NGO Good Shepherd Collective, speaking from Beit Sahour, a Palestinian village east of Bethlehem. This American, who has been an aid worker in Palestine for two decades, knows that his name and those of other collaborators who have gone to Gaza are on a blacklist. That means additional searches and longer waits at Israeli military checkpoints. “It is one more layer of the application of technology to fragment the population,” he explains over a video call.

Slide shown at a trade show by a senior commander of Unit 8200, discussing the Lavender system, which is depicted with the iconography characteristic of startups.

Israel has made a name for itself in the international arms market. It sells tanks, fighter jets, drones and missiles, but also sophisticated systems such as Pegasus, the spy software developed by NSO Group that enables its operator to access a victim’s cell phone. “Israel had always considered itself a leader in cybersecurity and, for the past five or six years, it has also been specializing in AI-supported tools that can have military uses,” explains Raquel Jorge, technology policy analyst at the Elcano Royal Institute in Spain. Videos shared online show Israeli commanders at arms fairs presenting the Lavender program in entrepreneurial jargon and referring to the system as “the magic powder for detecting terrorists.”

Indeed, some believe the +972 Magazine investigation is an IDF marketing campaign. “While some have interpreted the report as a moral indictment of Israel’s use of a novel technology, I would suggest that it is more propaganda that is attempting to entrench Israel’s role in the global political economy as a weapons developer,” Khadijah Abdurraman, director of Logic(s) Magazine, a publication specializing in the intersection of technology and society, tells EL PAÍS. “One can easily imagine Sudan’s Rapid Support Forces placing an order for the Lavender systems before the end of the week,” he adds.

O’Rourke is of the same opinion. “The point is not that killing Palestinians is wrong, but that the system was used inappropriately, without carrying out the relevant checks. It seems like they want to sell the idea that there is a correct way to murder. The fact that this has been published should not bother the army, because if it has been published by Israeli media it means that the government has given its approval,” says the American, referring to the Military Censor’s office, which vetoes information that could harm the security of the state.

“Israel has spent decades delegitimizing the ‘peace process’ with the Palestinians while never being interested in making peace. It needs the world to legitimize its occupation and sells the technology used to maintain that occupation as a calling card,” writes Antony Loewenstein in his book The Palestine Laboratory, which delves into how Israel has used its occupation of Palestine as a showcase for the military technology that it has been selling around the world for decades.

The use of Lavender raises many questions and few answers. What type of algorithms does the system use to identify potential targets? What elements are taken into account in this calculation? How are the system’s target recommendations verified? Under what circumstances do analysts refuse to accept a system recommendation? “If we do not have answers to these questions, it will be very difficult to find a solution to the serious risks posed by the rapid automation of war,” concludes Holland Michel.


