In 2021, a book titled “The Human-Machine Team: How to Create Synergy Between Human and Artificial Intelligence That Will Revolutionize Our World” was released in English under the pen name “Brigadier General Y.S.” In it, the author — a man who we confirmed to be the current commander of the elite Israeli intelligence unit 8200 — makes the case for designing a special machine that could rapidly process massive amounts of data to generate thousands of potential “targets” for military strikes in the heat of a war. Such technology, he writes, would resolve what he described as a “human bottleneck for both locating the new targets and decision-making to approve the targets.”
Such a machine, it turns out, actually exists. A new investigation by +972 Magazine and Local Call reveals that the Israeli army has developed an artificial intelligence-based program known as “Lavender,” unveiled here for the first time. According to six Israeli intelligence officers, who have all served in the army during the current war on the Gaza Strip and had first-hand involvement with the use of AI to generate targets for assassination, Lavender has played a central role in the unprecedented bombing of Palestinians, especially during the early stages of the war. In fact, according to the sources, its influence on the military’s operations was such that they essentially treated the outputs of the AI machine “as if it were a human decision.”
Dang, we’re even outsourcing war crimes to AI now smh
“It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”
… the IDF imposed pre-authorised limits on the number of civilians it deemed acceptable to kill in a strike aimed at a single Hamas militant… The IDF judged it permissible to kill more than 100 civilians in attacks on top-ranking Hamas officials. … earlier in the war they were authorised to kill up to “20 uninvolved civilians” for a single operative, regardless of their rank, military importance, or age.
So to recap:
- They knew, or had a very good idea of, the death toll their actions would have
- There was a conscious decision (born of apathy and indifference, it seems) to target the homes of militants instead of the individuals directly
- All this activity has operational oversight and vetting; this is not a rogue unit or drone operator but a policy that is being pursued even now.
Given the recent triple airstrike on the WCK aid convoy, quotes like this help us see behind the curtain and speak to their mindset:
“…you don’t want to invest manpower and time in it”. They said that in wartime there was insufficient time to carefully “incriminate every target”. “So you’re willing to take the margin of error of using artificial intelligence, risking collateral damage and civilians dying, and risking attacking by mistake, and to live with it,” they added.
Sure, you get to live with it.
I’ve already seen this one. I think they were called the Kree, the planet was Hala, and the AI that kept them in perpetual wars was called the Supreme Intelligence.
Also, some interesting lines:
“We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity,” A., an intelligence officer, told +972 and Local Call. “On the contrary, the IDF bombed them in homes without hesitation, as a first option. It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”
The Lavender machine joins another AI system, “The Gospel,” about which information was revealed in a previous investigation by +972 and Local Call in November 2023, as well as in the Israeli military’s own publications. A fundamental difference between the two systems is in the definition of the target: whereas The Gospel marks buildings and structures that the army claims militants operate from, Lavender marks people — and puts them on a kill list.
In addition, according to the sources, when it came to targeting alleged junior militants marked by Lavender, the army preferred to only use unguided missiles, commonly known as “dumb” bombs (in contrast to “smart” precision bombs), which can destroy entire buildings on top of their occupants and cause significant casualties. “You don’t want to waste expensive bombs on unimportant people — it’s very expensive for the country and there’s a shortage [of those bombs],” said C., one of the intelligence officers. Another source said that they had personally authorized the bombing of “hundreds” of private homes of alleged junior operatives marked by Lavender, with many of these attacks killing civilians and entire families as “collateral damage.”
One source said that when attacking junior operatives, including those marked by AI systems like Lavender, the number of civilians they were allowed to kill alongside each target was fixed during the initial weeks of the war at up to 20. Another source claimed the fixed number was up to 15. These “collateral damage degrees,” as the military calls them, were applied broadly to all suspected junior militants, the sources said, regardless of their rank, military importance, and age, and with no specific case-by-case examination to weigh the military advantage of assassinating them against the expected harm to civilians.
According to A., who was an officer in a target operation room in the current war, the army’s international law department has never before given such “sweeping approval” for such a high collateral damage degree. “It’s not just that you can kill any person who is a Hamas soldier, which is clearly permitted and legitimate in terms of international law,” A. said. “But they directly tell you: ‘You are allowed to kill them along with many civilians.’”
D. stressed that they were not explicitly told that the army’s goal was “revenge,” but expressed that “as soon as every target connected to Hamas becomes legitimate, and with almost any collateral damage being approved, it is clear to you that thousands of people are going to be killed. Even if officially every target is connected to Hamas, when the policy is so permissive, it loses all meaning.”
A. also used the word “revenge” to describe the atmosphere inside the army after October 7. “No one thought about what to do afterward, when the war is over, or how it will be possible to live in Gaza and what they will do with it,” A. said. “We were told: now we have to fuck up Hamas, no matter what the cost. Whatever you can, you bomb.”
Looks like the article is not accessible on Tor. Here’s as much of the article as I can paste before hitting Lemmy’s max character limit.
‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza
The Israeli army has marked tens of thousands of Gazans as suspects for assassination, using an AI targeting system with little human oversight and a permissive policy for casualties, +972 and Local Call reveal.
By Yuval Abraham | April 3, 2024
In 2021, a book titled “The Human-Machine Team: How to Create Synergy Between Human and Artificial Intelligence That Will Revolutionize Our World” was released in English under the pen name “Brigadier General Y.S.” In it, the author — a man who we confirmed to be the current commander of the elite Israeli intelligence unit 8200 — makes the case for designing a special machine that could rapidly process massive amounts of data to generate thousands of potential “targets” for military strikes in the heat of a war. Such technology, he writes, would resolve what he described as a “human bottleneck for both locating the new targets and decision-making to approve the targets.”
Such a machine, it turns out, actually exists. A new investigation by +972 Magazine and Local Call reveals that the Israeli army has developed an artificial intelligence-based program known as “Lavender,” unveiled here for the first time. According to six Israeli intelligence officers, who have all served in the army during the current war on the Gaza Strip and had first-hand involvement with the use of AI to generate targets for assassination, Lavender has played a central role in the unprecedented bombing of Palestinians, especially during the early stages of the war. In fact, according to the sources, its influence on the military’s operations was such that they essentially treated the outputs of the AI machine “as if it were a human decision.”
Formally, the Lavender system is designed to mark all suspected operatives in the military wings of Hamas and Palestinian Islamic Jihad (PIJ), including low-ranking ones, as potential bombing targets. The sources told +972 and Local Call that, during the first weeks of the war, the army almost completely relied on Lavender, which clocked as many as 37,000 Palestinians as suspected militants — and their homes — for possible air strikes.
During the early stages of the war, the army gave sweeping approval for officers to adopt Lavender’s kill lists, with no requirement to thoroughly check why the machine made those choices or to examine the raw intelligence data on which they were based. One source stated that human personnel often served only as a “rubber stamp” for the machine’s decisions, adding that, normally, they would personally devote only about “20 seconds” to each target before authorizing a bombing — just to make sure the Lavender-marked target is male. This was despite knowing that the system makes what are regarded as “errors” in approximately 10 percent of cases, and is known to occasionally mark individuals who have merely a loose connection to militant groups, or no connection at all.
Moreover, the Israeli army systematically attacked the targeted individuals while they were in their homes — usually at night while their whole families were present — rather than during the course of military activity. According to the sources, this was because, from what they regarded as an intelligence standpoint, it was easier to locate the individuals in their private houses. Additional automated systems, including one called “Where’s Daddy?” also revealed here for the first time, were used specifically to track the targeted individuals and carry out bombings when they had entered their family’s residences.
The result, as the sources testified, is that thousands of Palestinians — most of them women and children or people who were not involved in the fighting — were wiped out by Israeli airstrikes, especially during the first weeks of the war, because of the AI program’s decisions.
“We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity,” A., an intelligence officer, told +972 and Local Call. “On the contrary, the IDF bombed them in homes without hesitation, as a first option. It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”
The Lavender machine joins another AI system, “The Gospel,” about which information was revealed in a previous investigation by +972 and Local Call in November 2023, as well as in the Israeli military’s own publications. A fundamental difference between the two systems is in the definition of the target: whereas The Gospel marks buildings and structures that the army claims militants operate from, Lavender marks people — and puts them on a kill list.
In addition, according to the sources, when it came to targeting alleged junior militants marked by Lavender, the army preferred to only use unguided missiles, commonly known as “dumb” bombs (in contrast to “smart” precision bombs), which can destroy entire buildings on top of their occupants and cause significant casualties. “You don’t want to waste expensive bombs on unimportant people — it’s very expensive for the country and there’s a shortage [of those bombs],” said C., one of the intelligence officers. Another source said that they had personally authorized the bombing of “hundreds” of private homes of alleged junior operatives marked by Lavender, with many of these attacks killing civilians and entire families as “collateral damage.”
In an unprecedented move, according to two of the sources, the army also decided during the first weeks of the war that, for every junior Hamas operative that Lavender marked, it was permissible to kill up to 15 or 20 civilians; in the past, the military did not authorize any “collateral damage” during assassinations of low-ranking militants. The sources added that, in the event that the target was a senior Hamas official with the rank of battalion or brigade commander, the army on several occasions authorized the killing of more than 100 civilians in the assassination of a single commander.
Palestinians wait to receive the bodies of their relatives who were killed in an Israeli airstrike, at Al-Najjar Hospital in Rafah, southern Gaza Strip, October 24, 2023. (Abed Rahim Khatib/Flash90)
The following investigation is organized according to the six chronological stages of the Israeli army’s highly automated target production in the early weeks of the Gaza war. First, we explain the Lavender machine itself, which marked tens of thousands of Palestinians using AI. Second, we reveal the “Where’s Daddy?” system, which tracked these targets and signaled to the army when they entered their family homes. Third, we describe how “dumb” bombs were chosen to strike these homes.
Fourth, we explain how the army loosened the permitted number of civilians who could be killed during the bombing of a target. Fifth, we note how automated software inaccurately calculated the number of non-combatants in each household. And sixth, we show how on several occasions, when a home was struck, usually at night, the individual target was not inside at all, because military officers did not verify the information in real time.
STEP 1: GENERATING TARGETS
‘Once you go automatic, target generation goes crazy’
In the Israeli army, the term “human target” referred in the past to a senior military operative who, according to the rules of the military’s International Law Department, can be killed in their private home even if there are civilians around. Intelligence sources told +972 and Local Call that during Israel’s previous wars, since this was an “especially brutal” way to kill someone — often by killing an entire family alongside the target — such human targets were marked very carefully and only senior military commanders were bombed in their homes, to maintain the principle of proportionality under international law.