When algorithms murder: Israel is falsely designating Gaza areas as empty in order to bomb them

The army is using an algorithm it knows to be inaccurate to declare Gaza neighborhoods as “green,” or cleared of residents, and carry out airstrikes — killing hundreds of civilians in recent weeks, a joint investigation reveals.

All lies and jest,

Still a man hears what he wants to hear,

And disregards the rest …

~Paul Simon, “The Boxer”

5 June 2025 – When I began my video/film genocide project (you can watch the collection here on my YouTube channel), I also started a long-form essay entitled “Our Age of Progressive Atrocity”. It seemed a useful companion.

As I noted in my opening chapter, which many of you have read, 21st-century genocide shall run for a very long time, unabated. Someday they will say of us that we were living in a strange time, the kind that usually follows revolutions or the decline of great empires. It was no longer the heroic fervor of midcentury upheavals, the glamorous vices of concentrations of power, or the skeptical soullessness and insane orgies of the latest bubbles.

It was an age in which despair and material comfort, technological wizardry and political malaise, and a paradoxical freedom, both from criticism and to endlessly criticize, were mixed together, along with deep but narrow enthusiasms, the renunciation of utopias, condescension toward the past, weariness of the present, and pessimism for the future.

We thought we were sublime. We thought we’d solved everything. But we were actually complacent. We had lost the thread of the story. We were sleepwalking into the Apocalypse. It was the end of we knew not what, and so there was no real “we must put our house in order!!” because earth-shaking events had become the norm.

The Gaza Strip was one of the most hopeless places I have ever seen in my life. Most of us will never set foot there, but if you’ve been, you won’t forget it. The corrugated tin roofs perched on rickety huts that people call home in the sprawling refugee camps. The open sewage. The poorly paved roads. The lack of basic infrastructure. The undernourished children.

But most of all there is the death and despair and anger in the eyes of adults. Yet what is so striking, tragic in fact, is the obvious reality that Gaza could have been a vibrant gem for its people. This strip of land is crowded with some 2 million people – many still categorized as refugees five decades after the conflict in question.

But that did not happen. Instead, the current Islamic authoritarian government – Hamas (officially designated a terrorist organization by the United States) – spent its not-so-limited resources on other matters. In early October 2023 it mustered its strength and treasure to launch an attack on Israeli civilians. That led to Israel’s decision to pursue Gaza’s complete destruction. (I’ll leave for another day Israel’s complicity in the attack.)

I have been to the Middle East several times. But here is the problem: in my working life I inhabit a world seemingly corralled by algorithms, yet when I travel to the Middle East I must bring into my soul an element of chaos, to meet a culture that continually replays its deadly, dystopic sameness.

I have traveled across Israel and all through the West Bank, eating, shopping, and talking with people in Ashkelon, Haifa, Jerusalem, and Tel Aviv, but also in Hebron, Jericho, Nablus, and Ramallah. There was always one big takeaway from all these trips. Whenever I remarked that “things are certainly better,” I was corrected by everybody: “No, things are simply quiet. Here, anything can trigger an escalation. Any small incident can become a low-level conflict that simply escalates. At any time. But we all expect that at some point something massive will set it off.”

And so in October 2023 and its aftermath we saw what “massive” means.

And technology is ever present. In fact, it is on steroids.

Last year I wrote about Lavender, based on a report by +972 Magazine – a rare glimpse into the first-hand experiences of Israeli intelligence officials who have been using machine-learning systems to help identify targets during the Gaza war. The story exploded across the internet. The Israeli military’s bombing campaign in Gaza was using a previously undisclosed AI-powered database that at one stage identified 37,000 potential targets based on their apparent links to Hamas, according to intelligence sources involved in the war.

In addition to talking about their use of the AI system, called Lavender, the intelligence sources claim that Israeli military officials permitted large numbers of Palestinian civilians to be killed, particularly during the early weeks and months of the conflict, using a “calculus” of how many deaths were “sustainable”. 

At no point did Israeli intelligence officers question the AI’s accuracy, or approach, or the legal or moral justification for Israel’s bombing strategy. At no point were details provided about the specific kinds of data used to train Lavender’s algorithm, or how the program reached its conclusions.

Now, +972 Magazine has followed up. A few quotes from the piece (fully accessible here):

In recent weeks, the Israeli army has been launching airstrikes on residential neighborhoods in Gaza that they treat as evacuated, despite knowing that many of the houses bombed were filled with civilians who could not or did not want to leave, according to two intelligence sources who spoke to +972 Magazine and Local Call.

The army’s designation of a particular neighborhood as “green,” or cleared of residents, is based on a crude algorithmic analysis of phone usage patterns over a wide area — not on a detailed, house-by-house assessment before bombing, as previously revealed by +972 Magazine, Local Call and The New York Times.

Two intelligence sources noticed in May that the army was bombing homes and killing families, while internally recording that the homes were empty or nearly empty of residents, based on the flawed algorithmic calculation.

“This occupancy estimate is based on a bunch of incredibly crappy algorithms,” one intelligence source explained to +972 and Local Call. “It’s clear there are a lot of people in those houses. They haven’t really evacuated.

“You look at the evacuation tables, and everything is green — that means between 0 to 20% of the population remains. The whole area we were in, in Khan Younis, was marked green, and it clearly wasn’t,” the source added.

Last week, an airstrike in Khan Younis hit the home of Dr. Alaa Al-Najjar, killing nine of her 10 young children, and her husband Dr. Hamdi al-Najjar, who succumbed to his wounds a few days later.

 

Yes. *Bad algorithms* killing people – a new wrinkle to add to the literature of genocide.
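To make the mechanism concrete, here is a minimal sketch (in Python, with every name and number my own assumption rather than anything taken from the reporting) of the kind of crude proxy the sources describe: compare wide-area phone activity against a pre-war baseline and call the area “green” when the ratio drops below 20%. The point is not that this is the army’s code – it obviously isn’t – but that a proxy like this turns green whenever the *signal* disappears, whether or not the *people* do.

```python
# Hypothetical illustration only: a crude "evacuation" proxy of the kind the
# sources describe, not the army's actual system. All names and thresholds
# here are my own assumptions.

def evacuation_label(active_phones_now: int, baseline_phones: int) -> str:
    """Label a wide area from a single phone-activity ratio."""
    if baseline_phones == 0:
        return "no data"
    remaining_fraction = active_phones_now / baseline_phones
    # "Green" is described as 0-20% of the population remaining.
    return "green" if remaining_fraction <= 0.20 else "not green"

# The failure mode: phone activity is a proxy, not a head count. No
# electricity, dead batteries, or a downed cell network all make phones
# vanish from the signal while the families stay inside the houses.
print(evacuation_label(active_phones_now=150, baseline_phones=1000))  # -> "green"
```

A single number collapses, a map square turns green, and the houses are recorded as empty – while the people are still in them.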

As I have noted before, it boggles the mind that so many people are talking about AI and LLMs and their “bad” training sets, with all their biases and inaccuracies. They produce “fluffy content,” incorrect answers, false information, and skewed output – leading to poor human decisions made in reliance on that biased output.

Ah … but the coming glory of Artificial General Intelligence. “SAY HALLELUJAH!!”

All of that pales into insignificance in the real world when things like Lavender have been unleashed. War ruled by deliberate, murderous algorithmic bias.
