Published April 24, 2025 | Version v1 | https://doi.org/10.59350/ht5xw-64562

Killing at Scale

AI has played a pivotal role in Israel's war in the Gaza Strip since October 7th, 2023, when Hamas militants massacred 1,200 Israeli civilians and soldiers and took more than 250 hostage (CNN 2025). Reporting by Israeli and international news outlets has detailed how large language models and predictive analytics are helping to determine when and where bombs fall from the sky and troops shoot on the ground (Abraham 2024). Developed by Israeli military units yet bolstered with computing infrastructure and technologies provided by private civilian firms, these systems mark, according to US military officials and AI experts, the first time automated systems have been used in warfare at such a large scale (Biesicker et al 2025).#violence, #prediction, #automation

Popular writing on automated warfare in Israel/Palestine tends toward techno-determinism. Headlines conjure Terminator-style death scenes of AI-powered weapons systems run amok. "Do the humans in Israel's army have sufficient control over its technology?" asks an Economist headline (Economist 2024). "War by algorithm raises new moral dangers," warns the Financial Times (Thornhill 2024). Such sensationalism elides the industrial supply chains and political structures enabling these systems' use.#automation, #data war

This short commentary offers a simple corrective. I chronicle how the embrace of automated warfare goes hand in hand with the steady rise of hard-core political conservatism and nationalism, in Israel and Palestine and worldwide. Orienting toward these political conditions expands understandings of AI as a sociotechnical system, enunciated through people and technologies as well as the ideological and material infrastructures that give them form (Seaver 2018).#infrastructure, #data war, #nationalism

Phase One

Let us go back a decade to particular political tectonics. In March 2016, Elor Azaria, a combat soldier deployed to patrol the city of Hebron in the occupied West Bank, shot and killed Abdel Fattah al-Sharif, a Palestinian assailant who already lay immobilized on the ground. Azaria was tried in an Israeli military court for manslaughter. The trial sparked historic protests across the country. Tens of thousands demonstrated outside Israeli army bases, in public squares, and in front of government buildings. The cause of their outrage was clear: they decried the military as weak. Some said the army appeared more intent on saving Palestinian lives than protecting the lives of Israelis, even its own soldiers like Azaria. It was the first time Israeli citizens had revolted en masse against their military.#nationalism, #politics

Scholars of Israeli militarism like Yagil Levy and Rebecca Stein hold up the Azaria affair as a tipping point (Levy 2024, Stein 2021): a moment when Israel's traditional military establishment – centrist, secular, and nominally rule-abiding – was losing its authoritative grip. The ideology of Israel's radical right, which had long championed deadly force against Palestinians, was moving from the margins to the mainstream. To shore up its legitimacy among an increasingly right-wing populace, military heads began promoting more deadly tactics in the occupied Palestinian territories. Soldiers were instructed to shoot to kill, drone strikes in the West Bank and Gaza became more frequent, and military spokespeople began publishing lists of Palestinians assassinated after every major operation.#violence

These changes marked a shift from the Israel Defense Forces' (IDF) earlier policies. Since the early 2000s, military leaders had been promising that a progressively technological occupation would make Israeli military rule easier to sustain. Digital and then automated technologies – reconnaissance drones, CCTV cameras, biometric cameras, remote sensing systems – were meant to effectively manage Israel's military rule. Wartime innovations were said to reduce the number of soldiers deployed to combat, prevent acts of terrorism, and minimize the intrusiveness of military rule for Palestinians. But by the late 2010s, violence in the West Bank was rising. At the same time, right-wing Israeli politicians were gaining unprecedented political power. They and their supporters were sick of promises of humane military strategies that could pave the way for gradual peace plans, even if the plans were sure never to materialize. They demanded more brutal military tactics, more violent displays of force, more lethal outcomes.#automation, #violence, #politics

The army met their demands. In 2019, Aviv Kochavi became chief of staff of the IDF. In his inaugural speech, he pledged to make the army into a "lethal, innovative and efficient fighting force," appealing to a fractured Israeli populace (Levy 2021). For the center-left, the words "innovative and efficient" tugged at a fantasy of a humane occupation, one effectively managed but never fully resolved by successive innovations in surveillance and killing. For the right, however, "lethal" carried a different power: it evoked a military that sees killing as the principal metric of military success.#violence, #innovation, #politics

Phase Two

The Israeli military's embrace of lethality was part and parcel of global trends. Worldwide, militaries obsessed with Silicon Valley's mode of production were turning to big-data analytics and machine learning to scale up their killing capacities. In 2018, the US Department of Defense made the "restoration of lethal force" a key concept in its official National Defense Strategy, touting innovations in algorithmic targeting – Palantir surveillance models, Google's cloud computing, AI-assisted drone swarms refined over two decades of counterinsurgency in the Middle East – as key to those efforts. The writer Jared Keller has described this as the Pentagon's "cult of lethality" (Keller 2019), one that helps, in the words of Matthew Ford, "preserve the fiction that science both offers certainty in war and underpins the utility of military operations" (Ford 2018, 2059).#data war, #data

The belief that all geopolitical conflicts can be approached as engineering problems has a uniquely modern genealogy. It spans a century's worth of industrial-scale military engineering, from nuclear armaments and radars to digital surveillance and drone strikes. Hype around big data, algorithmic processing, and an AI arms race has only compelled militaries to sink billions more into maintaining a technological edge over adversaries (Schwartz 2025). Generals now parrot technology CEOs, approaching military conflicts as products that can be optimized with better algorithms, more data, and more precise analytics.#engineering, #Big Data

Israeli intelligence units have exemplified these shifts. Over the last decade, soldiers were pulled away from analogue forms of surveillance and analysis and dispatched into what generals called "AI factories." Fewer soldiers were conscripted with the Arabic skills necessary to listen to conversations, compile intelligence briefings, and analyze raw data. More were given incentives to experiment with, evaluate, and refine targeting systems, tinkering with speech-to-text software, predictive systems, and large language models. By 2020, the number of conscripts devoted to technical roles, from user experience research to engineering, had ballooned.#Big Data, #data war, #engineering

Technology helped to mediate between the opposing sides of an increasingly polarized Israeli populace. New AI systems that made the army more lethal appeased an increasingly right-wing political base demanding more brutal tactics. Death tolls across the territories rose, and each year was more violent than the last. Yet, as the scholar Yagil Levy (2024) has noted, claims to technical precision helped to legitimize the bloodshed for Israel's more centrist elements, casting Israeli warfare as rational, restrained, and therefore humane. "AI makes our fighting more moral," Kochavi told reporters in 2021, "because we have a precise intelligence address for every bomb dropped" (Bohbot 2021).#violence, #data, #algorithmic intensification

Phase Three

None of the state-of-the-art systems managed to alert IDF troops to Hamas' attack on October 7th or prevent the atrocities of that morning from unfolding. Some ex-security officials lamented a military so blinded by the hype emanating from Silicon Valley that it abandoned tried-and-true intelligence tactics. Others critiqued an enduring hubris that prevented military heads from seeing Palestinians as real political adversaries.#violence, #Intelligence

In response to Hamas' bloody attack, Israeli politicians promised to exact vengeance. The Air Force attacked 1,500 targets in Gaza in the first 48 hours of war. However, Prime Minister Benjamin Netanyahu demanded more. According to the Israeli daily Yedioth Ahronoth, Netanyahu erupted in anger in a closed cabinet meeting on October 9th. "Why not 5,000?" he demanded of the IDF's Chief of Staff, Herzi Halevi. "We don't have 5,000 approved targets," Halevi replied. "I'm not interested in targets," Netanyahu responded. "Take down houses, bomb with everything you have" (Nahmun 2025).#algorithmic intensification, #politics

The military heeded his demands. Intelligence units used AI systems to churn out as many targets as possible for the Air Force to strike. According to investigative reporting by Yuval Abraham with +972 Magazine, generals ordered soldiers to lower the algorithmic thresholds used to determine who or what constituted a viable target, raised the number of civilians allowed to be killed in so-called targeted attacks, and gave troops as little as 20 seconds to sign off on each strike (Abraham 2024). In interviews I conducted, one reservist who served in intelligence units in those months put it this way: "I should be clear, after the seventh, they wanted to bomb as much as possible, so they let the machines do it" (Goodfriend 2024).#algorithmic intensification, #data war

AI-assisted targeting systems may have allowed the military to target and kill at an unprecedented scale, as international media outlets have reported. However, attention to developments on the ground shows that the violence was the result of concerted decisions: a prime minister ordering a campaign of destruction and a military echelon eager to heed his demands.#AI, #violence, #politics

My argument, then, is quite simple. The embrace of algorithmic warfare, in Israel and elsewhere, is enabled as much by technological innovations in killing as by particular political developments: the renaissance of far-right populism and militarism. The ideology looks different depending on one's vantage point, but an exaltation of closed borders and fortified homelands binds the disparate pieces together. As does the belief that more data and better algorithms will shore up national security. In practice, however, it simply allows deadly wars to drag on.#data war, #politics, #nationalism

Additional details

Identifiers

UUID
273fdfbb-2fe7-4f1f-a562-432441e1da2f
GUID
https://carrier-bag.net/?p=1480
URL
https://carrier-bag.net/killing-at-scale/

Dates

Issued
2025-04-24T23:23:42
Updated
2025-04-25T11:18:59