Israel has been using AI to assist in selecting their targets. The collateral damage in Gaza, represented by the Palestinian flag (left), continues to increase. (Source: Adobe Stock)

Israeli Use of AI Targeting Leads to More Civilian Deaths, Not Fewer

Nearly a year ago, we reported on how Ukrainians and Russians were using AI in war. For instance, Ukrainians have been deploying AI-assisted drones of all shapes and sizes. Now Israel is using AI in its war against Hamas.

A horrifying story we found offers some details about the AI Israel uses.

Israel is, according to multiple international news reports, using a secretive AI tool called Habsora to identify targets and to assassinate Hamas leaders.

“As The Guardian and the Israeli-Palestinian magazine report, military officials have confirmed on condition of anonymity the existence of a program known as ‘Habsora,’ translated to ‘The Gospel’ in English — and not to be confused with the Hebrew term ‘Hasbara,’ which roughly translates to ‘propaganda’ — that’s said to comprise a ‘factory’-esque production line of people slated for state slaughter.”

Yes, you read that right. They said they had built a death factory to unleash on Gaza. Reports say Israel has already killed more than 15,000 people, and this algorithm makes it likely that many more will die.

Collateral Damage

According to sources who spoke on condition of anonymity, the heightened bombing of non-military “power targets”—private residences, public buildings, important pieces of infrastructure, and high-rise blocks—is intended to “create a shock” in Palestinian civil society and ultimately “lead civilians to put pressure on Hamas.”

Israel has also allegedly used its tech and other intelligence-gathering capabilities under the umbrella of its so-called “target division” to estimate how many civilians would be killed in potential attacks. Those numbers, sources say, are calculated and relayed to the military in advance.

The number of civilians being killed is extraordinary under any circumstances.

“Nothing happens by accident,” one of the magazine’s sources explained. “When a 3-year-old girl is killed in a home in Gaza, it’s because someone in the army decided it wasn’t a big deal for her to be killed — that it was a price worth paying in order to hit [another] target.”

The “calculated” losses mean that AI is being used to aid in killing people on a greater scale than ever before. Previously, AI-assisted drone strikes were far more limited in scope.