Israel has developed an artificial intelligence-supported program to select the victims of its bombings. Dubbed Lavender, the system flagged 37,000 Palestinians during the first weeks of the war.

Lavender was used in at least 15,000 killings between October 7 and November 24 during the invasion of Gaza. The system is designed to strike the target's home at night, which increases the chances that he will be there, but also that his family members and neighbors will die with him.

After the report was published, the Israeli Armed Forces denied in an official statement that they let a machine determine "whether someone is a terrorist." The statement says that information systems "are mere tools for analysts in the process of identifying targets," although the sources cited maintain that officers merely validate Lavender's recommendations without carrying out any verification. It is considered within acceptable parameters, for example, for a hundred civilians to die in the bombing of a single senior Hamas or Islamic Jihad official, strikes that usually hit several buildings. The statement adds that the army uses AI to augment the decision-making processes of human operators, and that this use complies with international humanitarian law as applied by modern armed forces in the many asymmetric wars fought since September 11, 2001.