
Lavender, Israel's artificial intelligence that decides who is bombed in Gaza

2024-04-17T04:50:37.197Z

Highlights: Israel has developed a program supported by artificial intelligence to select the targets of its bombings. Named Lavender, the system marked 37,000 Palestinians during the first weeks of the war and was used in at least 15,000 killings from October 7 to November 24 in the invasion of Gaza. It is designed to strike the target's home at night, which increases the chances that he will be there, but also that his family members and neighbors will die with him. It is considered within acceptable parameters, for example, that for every senior Hamas or Islamic Jihad official a hundred civilians die in the bombing, which usually hits several buildings. After the report was published, the Israeli Armed Forces denied in an official statement that they let a machine determine "whether someone is a terrorist," saying their information systems "are mere tools for analysts in the process of identifying targets"; the sources cited maintain that officers simply validate Lavender's recommendations without further verification.


The Israeli Armed Forces use an automated system to select their human targets, an unprecedented practice


Israel has crossed another line in the automation of war. Its Armed Forces have developed a program supported by artificial intelligence (AI) to select the targets of its bombings, a process that traditionally requires manually verifying evidence to confirm that a target deserves to be one. Named Lavender, this system marked 37,000 Palestinians during the first weeks of the war and was used in at least 15,000 killings from October 7 to November 24 in the invasion of Gaza, according to a journalistic investigation by two Israeli media outlets, +972 Magazine and Local Call, also published in The Guardian.

The tool has generated controversy because of the coldness with which the military commanders responsible for approving or rejecting Lavender's suggestions treat people's deaths as mere statistics. It is considered within acceptable parameters, for example, that for every senior Hamas or Islamic Jihad official a hundred civilians die in the bombing, which usually hits several buildings. The system is designed to attack the target's home, and at night, which increases the chances that he will be at home, but also that his family members and neighbors will die with him.

Never before has it been reported that anyone has automated a task as sensitive as the selection of human military targets, one in which a false positive can mean the death of innocent people. After the report was published, the Israeli Armed Forces denied in an official statement that they let a machine determine "whether someone is a terrorist." The statement says that information systems "are mere tools for analysts in the process of identifying targets," although the sources cited maintain that officers simply validate Lavender's recommendations without carrying out further checks.

The investigation, based on accounts from several Israeli army and intelligence officials, including from Unit 8200, does not reveal what parameters are used to determine whether a subject has ties to Hamas or Islamic Jihad. Some are mentioned, such as frequently changing telephone numbers (something that happens constantly in a war context) or being male (there are no women with officer rank).

It is known that, like all AI systems, Lavender is a probabilistic model. It works with estimates and therefore makes mistakes. At least 10% of the individuals marked as targets were not, according to official sources cited in the report. That margin of error, added to the collateral deaths accepted by the army (up to 300 civilians in a single bombing on October 17 to kill a Hamas commander), leaves a toll of thousands of Palestinians, most of them women and children, killed at the software's indication without having any connection to the militias.
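The scale implied by that margin of error can be illustrated with a minimal back-of-the-envelope calculation using only the figures cited in the report; the system's real error rates are not public, so this is purely illustrative arithmetic:

```python
# Back-of-the-envelope arithmetic only, using the figures cited in the report.
# Lavender's actual error rates are not public; this simply shows the scale
# implied by the numbers above.

marked_individuals = 37_000    # people flagged during the first weeks of the war
false_positive_rate = 0.10     # "at least 10%" of marked individuals were not targets

wrongly_marked = marked_individuals * false_positive_rate
print(f"Implied lower bound of wrongly marked people: {wrongly_marked:,.0f}")
# -> Implied lower bound of wrongly marked people: 3,700
```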

The automation of war

The Lavender program is complemented by another called Where is Daddy?, used to track individuals already marked and to carry out the bombing when they are at home, and by The Gospel, aimed at identifying buildings and structures from which, according to the army, Hamas militants operate.

Lavender processes information collected on the more than 2.3 million residents of the Gaza Strip, confirming the dense network of digital surveillance to which all of its inhabitants are subjected. Each individual is assigned a score from 1 to 100, from lowest to highest estimated probability of being linked to the armed wing of Hamas or Islamic Jihad; those with the highest scores are swept off the face of the Earth along with their families and neighbors. According to the +972 Magazine investigation, officers barely checked the targets Lavender suggested, citing reasons of “efficiency”: they spent a few seconds on each case, pressured by the need to collect new targets to shoot at every day. In practice, this meant validating the algorithm's indications.
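Stripped of detail, what the report describes is a familiar score-and-threshold pattern: each person receives a probability-like score, anyone above a cutoff becomes a candidate, and a human is supposed to review the result. The sketch below is purely illustrative of that pattern; the class, the threshold value and the review step are assumptions, and nothing about Lavender's actual model, features or cutoff is public.

```python
from dataclasses import dataclass

@dataclass
class ScoredPerson:
    person_id: str
    score: int  # hypothetical 1-100 score; higher means higher estimated probability

def flag_candidates(people: list[ScoredPerson], threshold: int = 90) -> list[ScoredPerson]:
    """Return everyone whose score meets or exceeds the cutoff (assumed value)."""
    return [p for p in people if p.score >= threshold]

def human_review(candidate: ScoredPerson) -> bool:
    # Per the report, this step was reduced to a few seconds per case,
    # effectively approving whatever the scoring model produced.
    return True

population = [ScoredPerson("A", 97), ScoredPerson("B", 42), ScoredPerson("C", 91)]
approved = [p for p in flag_candidates(population) if human_review(p)]
print([p.person_id for p in approved])  # ['A', 'C']
```

The sources' central claim is precisely that the last step in this chain carried almost no weight in practice.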

Is it legal to use this type of system? “The Israeli military uses AI to augment the decision-making processes of human operators. This use is in accordance with international humanitarian law, as applied by modern armed forces in many asymmetric wars since September 11, 2001,” says jurist Magda Pacholska, a researcher at the TMC Asser Institute and a specialist in the intersection between disruptive technologies and military law.

Pacholska recalls that the Israeli army had previously used automated decision-support systems such as Lavender and Gospel in Operation Guardian of the Walls in 2021. The forces of the United States, France and the Netherlands, among others, had also done so, although always against material targets. “The novelty is that, this time, it is using these systems against human targets,” the expert points out.

Arthur Holland Michel, who has been commissioned by the UN to write reports on the use of autonomous weapons in armed conflicts, adds another element. “What is different, and certainly unprecedented, in the Lavender case in Gaza is the scale and speed at which the system is being used. The number of people that have been identified in just a few months is astonishing,” he emphasizes. “Another crucial difference is that the time between the algorithm's identification of a target and the attack against it appears to have often been very short. That indicates that there is not much human review in the process. From a legal point of view, this could be problematic,” he concludes.

According to the practices and doctrines of many Western states, including NATO members, Pacholska recalls, once it is determined that a person “directly participates in hostilities,” that person is a legal target and can also be attacked at home: “It may be shocking to the public, but this is how contemporary conflicts against organized armed groups have been carried out since September 11, 2001.”

What is not legal is massacring civilians. For Luis Arroyo Zapatero, honorary rector of the University of Castilla-La Mancha and a specialist in international criminal law, the deaths caused by this tool should be considered “war crimes,” and the set of these actions, together with the massive destruction of buildings and people, “are crimes against humanity.” In international law, the professor explains, assassinations are not admitted as military action, although so-called targeted killings are debated. “The deaths caused as collateral damage are pure murders. The Lavender system is directly a machine for murdering civilians, as it allows collateral deaths of between 10 and 100 civilians beyond the precise target,” he says.

The Palestinian laboratory of Israeli weapons

The Palestinians know well what it is like to be watched. The Israeli intelligence services have been collecting all kinds of data on each of them for years. The digital trail of their mobile phones, from locations to interactions on social networks, is processed and stored. Cameras with automatic facial recognition systems have been part of their daily lives since at least 2019.

The Washington Post reported on a program, Blue Wolf, aimed at recording the faces of every inhabitant of the West Bank, including children and the elderly, and associating them with a “dangerousness” rating, so that when soldiers photographed a subject on the street with their cell phone, they would instantly see a color code telling them whether to arrest him.

The New York Times has reported on a similar system deployed in the Gaza Strip late last year that also seeks to collect and classify Palestinians' faces without their consent.

All of these technologies are developed by Israeli companies, which sell them to their armed forces and then export them with the guarantee of having been tested in the field. “Facial recognition everywhere, drones, spy technology… This State is really an incubator for surveillance technologies. If you sell a product, you have to show how effective it is in real scenarios and in real time. That's what Israel is doing,” says Cody O'Rourke, of the NGO Good Shepherd Collective, from Beit Sahour, a Palestinian village east of Bethlehem. The American, who has been an aid worker in Palestine for two decades, knows that his name and those of other aid workers who have gone to Gaza are on a blacklist. That means additional searches and longer waits at Israeli military checkpoints. “It is one more layer of the application of technology to fragment the population,” he explains by video call.

Israel has made a name for itself in the international arms market. It sells tanks, fighter jets, drones and missiles, but also sophisticated systems such as Pegasus, the spyware developed by NSO Group that allows access to victims' mobile phones and that in Spain was used to intercept the communications of pro-independence leaders in the midst of the procés. “Israel had always considered itself a leader in cybersecurity and, for the past five or six years, it has also been specializing in AI-supported tools that can have military uses,” reflects Raquel Jorge, technology policy analyst at the Elcano Royal Institute. Presentations by Israeli commanders at arms fairs have circulated online pitching the Lavender program in entrepreneurial jargon and referring to the system as “the magic dust for detecting terrorists.”

Some read the +972 Magazine investigation as a major marketing campaign by the Israeli armed forces. “While some have interpreted the report as a moral indictment of Israel's use of a novel technology, I would suggest that it is more propaganda attempting to entrench its role in the global political economy as a weapons developer,” Khadijah Abdurraman, director of Logic(s) Magazine, a publication specialized in the intersection between technology and society, tells EL PAÍS. “One can easily imagine Sudan's Rapid Support Forces placing an order for the Lavender systems before the end of the week,” she adds.

O'Rourke is of the same opinion. “The point is not that killing Palestinians is wrong, but that the system was used improperly, without carrying out the relevant checks. It seems they want to sell the idea that there is a correct way to kill. The fact that this has been published should not bother the army, because if it appeared in an Israeli outlet it means the government gave its approval,” says the American, referring to the Military Censor's office, which vetoes information that could harm state security.

“Israel has been delegitimizing the peace process with the Palestinians for decades, although it has never been interested in achieving peace. It needs the world to legitimize its occupation and uses technology to maintain that occupation as a calling card,” writes Antony Loewenstein in his book The Palestine Laboratory (Capitán Swing, 2024), which delves into how Israel has used its occupation of the Palestinian territories as a showcase for the military technology it has been selling around the world for decades.

The use of Lavender raises many questions and offers few answers. What type of algorithms does the system use to identify potential targets? What elements are taken into account in that calculation? How are the system's target recommendations verified? Under what circumstances do analysts refuse to accept one of its recommendations? “If we do not have answers to these questions, it will be very difficult to find a solution to the serious risks posed by the rapid automation of war,” concludes Holland Michel.


Source: El País
