
Algorithmic Death-World: Artificial Intelligence and the Case of Palestine

Published online by Cambridge University Press:  21 January 2026

Sarah Fathallah*
Affiliation:
University of Cambridge, UK

Abstract

Since October 2023, residents of Gaza have been subjected to artificial intelligence (AI) target-generation systems by Israel. This article scrutinises the deployment of these technologies through an understanding of Israel’s settler-colonial project, racial-capitalist economy, and delineation of occupied Palestinian territories as carceral geographies. Drawing on the work of Andy Clarno, which demonstrates how Israel’s decreasing reliance on Palestinian labour rendered Palestinians inessential as exploitable labour, this article argues that they remain valuable to Israel for another purpose: experimentation. For over fifty years, Palestinians have been rendered as test subjects for the development of surveillance and warfare technologies, in what Antony Loewenstein calls “the Palestine Laboratory.” AI introduces a dual paradigm in which both Palestinian lives and deaths are turned into sites of data dispossession. This duality demands keeping Palestinians alive to generate constantly updating data for the lethal algorithmic systems that target them, while their deaths generate further data to refine and market those systems as “battle-tested.” The article describes this state as an algorithmic death-world, adapting Achille Mbembe’s conception of necropolitics. It concludes that as Israel exports its lethal AI technologies globally, it also exports a model of racialised disposability.

Information

Type
Article
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution-ShareAlike licence (http://creativecommons.org/licenses/by-sa/4.0), which permits re-use, distribution, and reproduction in any medium, provided the same Creative Commons licence is used to distribute the re-used or adapted article and the original article is properly cited.
Copyright
© The Author(s), 2026. Published by Cambridge University Press

[The] procedures adopted for the captive flesh demarcate a total objectification, as the entire captive community becomes a living laboratory.

– Hortense J. Spillers, “Mama’s Baby, Papa’s Maybe: An American Grammar Book”

“A mass assassination factory” was the headline of an investigation that brought mainstream attention to the use of a target-generation system powered by artificial intelligence (AI), used by Israel to identify an unprecedented number of targets to bombard in Gaza.Footnote 1 While garnering newfound attention, this targeting system follows a decades-long history of killings of Palestinians at the hands of Israeli occupation forces and settlers, including killings mediated by modern technologies of warfare.

Settler-colonial studies scholars distinguish settler colonialism from franchise colonialism, in part based on how colonisers relate to the colonised. Under franchise colonialism, the colonised subject has utility in providing labour; under settler colonialism, they do not. The eliminatory projects of settler-colonial regimes, which aim to eradicate or expel indigenous populations rather than use them as workers, thus bring a logic of elimination into tension with racial capitalism’s logic of exploitation.

In Neoliberal Apartheid, scholar Andy Clarno troubles this tension when comparing the modern history of racial capitalism in the apartheid regimes of South Africa and Israel/Palestine. In pre-1994 South Africa, European settlement required the violent displacement and exclusion of the Black population. This dynamic shifted as White-owned workplaces became dependent on the low-wage labour of Black workers, preventing their total eradication. Whereas elimination made way for exploitation in the case of apartheid South Africa, decades of centralised economic planning meant that the exploitation of Palestinians was not a central function of Israel’s racial capitalism. The neoliberal restructuring of the Israeli economy since the late 1980s aimed to decrease its reliance on Palestinian workers, making them more vulnerable to the eliminatory practices of the Israeli regime.Footnote 2

Settler-colonial elimination, however, does not fully capture the driving forces animating Israel’s subjugation of Palestinians. This article argues that Israel’s racial-capitalist economy operates through another logic, that of experimentation, which is inherent to racialised carceral geographies and the histories of testing new technologies on a captive human population. As demonstrated by Antony Loewenstein in The Palestine Laboratory, Palestinian captivity takes the form of rendering them as objects of experimentation. This, in turn, offers a new form of commodification for Israel’s war profiteers, who seek advantage in the market by packaging AI-powered systems as “battle-tested.”Footnote 3 Palestinian objectification in this way becomes a means of increasing the value of goods.

Given AI’s demands for data, this logic of experimentation requires constant and iterative cycles of data dispossession, which occur on two fronts. On the one hand, the biometric, personal, and other data of living Palestinians are constantly harvested under conditions of coercion or covertness, as they cross military checkpoints in the West Bank or flee through evacuation corridors in Gaza. On the other hand, as their data is used to train and feed the very technologies that surveil, target, and kill them, the facts and circumstances of their targeting and deaths also become a site of production of data. That data is then used to augment the deadliness of AI systems that target and kill others.

The latest use of algorithmic target-generation systems in Gaza exemplifies how the data needs of lethal AI systems require Palestinians to be both living and dying. This dual character situates Palestinians in a form of algorithmic “death-world,” borrowing a term from Achille Mbembe to depict geographies where populations exposed to necropolitical conditions of extreme cruelty become essentially living dead.Footnote 4 In this algorithmic death-world, Palestinians are kept in a state of condemned living, always under the threat of being targeted, all the while being dispossessed of data required to enhance the lethality of the tools that ultimately deliver their death. These tools are subsequently improved upon and sold by Israeli companies to customers worldwide. But more than simply selling products, Israel sells the very idea of disposability as a mode of managing unwanted populations and as a market opportunity for war profiteering.

This article is organised as follows. Section 1 explores settler colonialism’s relationship to labour to illuminate the differences between the logics of elimination and exploitation and understand how Israel exerts dominance over Palestinians. Section 2 argues that a logic of experimentation animates Israel’s racial-capitalist economy. Section 3 demonstrates that this logic of experimentation, when invoked in service of algorithmic systems, requires data dispossessed on two fronts: both in life and in death, Palestinians’ bodies are rendered as the sources of data for lethal AI technologies. Section 4 describes how this dual need for data places Palestinians en masse in a form of algorithmic “death-world,” amplifying their necropolitical disposability. This article concludes by discussing the precedent set by, and potential replication of, algorithmic death-worlds in other contexts.

1. Elimination versus exploitation of settler-colonial subjects

Racial capitalism encompasses a wide range of capitalist modes of production and accumulation that rely on the dispossession of racialised populations, from slavery and indentured servitude to colonialism. In their introduction to Histories of Racial Capitalism, Destin Jenkins and Justin Leroy point to examples around the world—including in India, the Philippines, and Hawai’i—where colonial military and administrative structures were crucial to racial capital accumulation.Footnote 5 In addition to extracting resources, colonial regimes recruited and controlled colonised peoples for their labour power.

That being said, settler-colonial regimes can appear to be incompatible with systems of racial capitalism and their demands for labour exploitation. Settler-colonial studies scholars distinguish settler colonialism from franchise colonialism, in part based on their relationship to labour.Footnote 6 The difference between franchise colonialism and settler colonialism is the difference between “a demand for labour and a demand to go away.”Footnote 7 While “going away” could be achieved through different means, including the expulsion, mass killings, or forced assimilation of indigenous populations, the common denominator of settler colonialism is that it requires the systematic erasure of indigenous peoples from the land in order to replace them with settlers. In a seminal paper in settler-colonial studies, Patrick Wolfe terms this a “logic of elimination,” prioritising the “elimination of the native” as a central feature in the process of establishing a settler society on expropriated land.Footnote 8 Conversely, franchise colonialism is sustained by “a determination to exploit” indigenous peoples for their labour.Footnote 9

While the seeming dichotomy between exploitation and elimination is established in settler-colonial studies literature, it is troubled when looking at the case of apartheid South Africa. In Neoliberal Apartheid, Andy Clarno presents a comparative study of racial capitalism in apartheid South Africa and Palestine/Israel, based on a decade of research in both contexts. Beginning in 1652, European settlement in South Africa relied on eliminatory practices that violently displaced Africans in order to conquer their land.Footnote 10 But come the late 1800s, Europeans in South Africa “sought to exploit and not just expel the African population.”Footnote 11 This dynamic intensified after the formation of the Union of South Africa in 1910, giving way to a period of industrialisation that made White-owned mines, factories, and farms dependent on the exploitation of Black workers. This is not to say that the South African Black population did not continue to experience exclusion and “persisting landlessness.”Footnote 12 However, Clarno contends that “the demand for low-wage Black labor prevented the complete elimination of the Black population,” especially when compared with Israel’s eliminatory puissance directed towards decimating Palestinians.Footnote 13

Rather than conceiving of elimination and exploitation as opposing logics, Clarno emphasises the need for a more subtle historical frame that accounts for transformation in the “shifting forms” of settler colonialism and neoliberal racial capitalism.Footnote 14 In the case of apartheid South Africa, what he shows is that a process that at one moment sought the elimination of the Black population mutated into one primarily organised around their labour exploitation.

As for the case of Palestine, in contrast, a logic of exploitation—understood as the extraction of surplus value through the production process—came to have a decreasing role in Israel’s racial-capitalist economy. To explain the relative intensity of eliminatory violence unleashed on colonised Palestinians, Clarno explicates that “the principal difference between Palestine/Israel today and South Africa before 1994 is the question of labor.”Footnote 15 Since the 1920s, the Zionist movement sought to establish a state that would reserve jobs for Jewish settlers, thereby reducing its dependence on Palestinian workers.Footnote 16 After the establishment of the state of Israel in 1948, the eliminatory aspect of this settler-colonial project, and the political and military capacities necessary to implement it, have only intensified.

Beginning in 1948, Palestinians provided low-wage labour to Israel’s economy. Later, after the 1967 Israeli occupation of the West Bank and the Gaza Strip, Palestinians in occupied territories were incorporated into Israel’s economy, paid even lower wages than 1948 Palestinians. By the mid-1980s, 40% of Palestinian workers worked inside Israel, mostly in agriculture and construction.Footnote 17 But from the late 1980s and through the 1990s and beyond, the neoliberalisation of Israel’s economy worked to render Palestinian labour progressively inessential. Israel transitioned to a high-tech economy, reducing its demand for industrial and agricultural labour, and free trade agreements allowed Israel to move production to neighbouring countries.Footnote 18 Israel also instituted a restrictive work permit regime for Palestinians, all the while importing large numbers of foreign workers to enshrine its non-reliance on Palestinian workers.Footnote 19 This shift in Israel’s racial-capitalist economy meant that Palestinians were more vulnerable to killing and displacement at the hands of the Israeli state and Israeli settlers, making way for settlement expansion across occupied Palestinian territory.Footnote 20 As Clarno puts it, “when a colonial state no longer values the labor of the colonized, there are few structural barriers to elimination.”Footnote 21 This raises the question: If elimination animates Israel’s settler-colonial project, but exploitation has been rendered unnecessary to its economy, what logic better characterises how Israel stabilises its racial capitalism?

2. Experimentation and Palestinians as test subjects

Having unsettled the apparent dichotomy between elimination and exploitation, and established that exploitation is decreasingly central to Israel’s racial-capitalist economy, this article turns to carceral geography studies, which offer a framework that can help more accurately define how Israel (de)values Palestinians. Indeed, Israel’s racial capitalism regards Palestinians as useful for another purpose: experimentation.

Racialised populations have a long history of being subjected to experimentation within the confines of carceral geographies, which scholar Britt Rusert defines as spaces of coercive containment that become sites of human experimentation.Footnote 22 From the abusive medical experimentation conducted on enslaved women to the Tuskegee syphilis experiments, Rusert traces the history of human experimentation in the United States in plantations, refugee camps, prisons, and other carceral spaces.Footnote 23 Taking a more global view, critical theorist Achille Mbembe speaks to how inventions such as forced sterilisation found “their first testing ground in the colonial world.”Footnote 24

The settler colony of Israel is no exception. In his eponymous work, investigative journalist Antony Loewenstein uses the phrase “the Palestine Laboratory” to portray how Israel relies on Palestine as a proving ground for arms and surveillance technologies that it then exports to other countries.Footnote 25 According to Loewenstein, “Palestine is Israel’s workshop, where an occupied nation on its doorstep provides millions of subjugated people as a laboratory for the most precise and successful methods of domination,” including “rockets, aerial defense systems, missiles, cyberweapons, and radar.”Footnote 26

Israel’s interest in maintaining an occupied population as a resource for experimentation is also economic. Scholar Jeff Halper refutes the claim that the occupation of Palestine poses a financial burden on Israel, asserting that, without it, Israel could not compete in the international weapons and securitisation markets.Footnote 27 This is because Israel’s competitive advantage is that the technologies it exports are “battle-tested” and “battle-proven.” These are terms Israeli cybersecurity and arms manufacturers publicise when showcasing their products at weapons expos to potential clients from around the world.Footnote 28 Veterans of the Israeli military, who often build careers in the arms industry after their service, market the fact that weapons have been fine-tuned on Palestinians as a selling point. As professor Neve Gordon put it, “They can say, ‘Hey this was used by the IDF [Israel’s military], this must be good.’ And that helps the marketing of the goods.”Footnote 29 In fact, SIBAT, the governmental agency in charge of the marketing and sale of IDF surplus systems and equipment, “highlights Israel’s ‘unique’ advantage of having the technologies of control it exports derive not from military laboratories […] but from the real controlled-conflict laboratory of the Occupied Territory.”Footnote 30 Not only are these weapons and technologies tested in real-world conditions; because there are seldom any time gaps between Israel’s military campaigns, the development and iteration of those systems can also happen in close to real-time cycles.Footnote 31

Even before the widespread adoption of AI, surveillance and carceral technologies used and perfected on Palestinians have been exported by Israel to other countries around the world. For example, the Pegasus software, developed by Israeli cybersecurity company NSO Group and first piloted on Palestinian human rights defenders, has been found to be in use in at least 45 countries.Footnote 32 The ability to experiment on Palestinians has similarly allowed Israel to become a global leader in unmanned aerial vehicles (UAVs), also called drones. Since 1985, Israel has been the world’s leading exporter of UAVs, accounting for 60.7% of sales worldwide.Footnote 33 First used in the 1950s, Israeli UAVs have been researched and developed at a rapid pace thanks to four decades of consistent use and iteration in the occupation of Palestine.Footnote 34 Besides UAVs, other Israeli innovations that have been widely exported include night-vision equipment, bombs that penetrate reinforced concrete buildings, and anti-tank missiles.Footnote 35 And, increasingly, Israel is relying on the development of AI to continue its mission as one of the world’s leading surveillance technology and weapons exporters.

3. Artificial intelligence and the dual fronts of data dispossession

Recent years have marked advancements in AI, and with them, a growing interest and accompanying surge in investment in AI surveillance and warfare systems.Footnote 36 Israel embraced AI as part of its military strategy, recognising the value of AI in enabling weaponry such as UAVs to shoot and bomb targets, armoured vehicles to drive autonomously, networked sensor systems to provide real-time battlefield data to shooters, and microcomputer-outfitted rifles to support infantry soldiers in “increasing lethality,” as well as in powering many other tools of biometric surveillance and autonomous weapons.Footnote 37

“Artificial intelligence” is a nebulous umbrella term that encompasses a wide range of technologies that have in common their need for data, whether to train models to recognise patterns or make predictions, to test models and assess their performance, or to operate functions like optimisation, processing, or automation. Quoting the Open Data Institute, “without data, there is no AI.”Footnote 38 Moving from training to testing to operation is not so much a linear process as a feedback loop: a dynamic process in which an AI system uses its own output as new input to continuously learn and improve.

Considering Palestinians as experimentation subjects for Israel’s AI systems requires considering them as sites of capture of crucial data. When speaking to lethal technologies more specifically, AI introduces a duality in this process of data capture: both Palestinian lives and deaths generate data points for Israel’s data-trained and data-fed systems of warfare and carcerality.

3.1. Palestinian lives as data

To understand how Palestinian lives supply data critical to the development and operation of AI systems, one must fully grasp the conditions of constant data capture imposed on Palestinians, which have been the subject of investigation by many journalists and human rights groups. Indeed, the state of incessant tracking and monitoring under which Palestinians live can be aptly qualified as data dispossession: a continual process of extraction of personal and biometric data.Footnote 39

In the West Bank, for instance, Israeli startup AnyVision was exposed in 2019 for secretly monitoring Palestinians across the West Bank through more than 115,000 cameras, the locations of which were largely unknown. This facial recognition apparatus was qualified by AnyVision as “nonvoluntary” given that Palestinians are automatically tracked, categorised, and detected without their consent or enrollment.Footnote 40 AnyVision is far from the only company implementing such biometric data capture technologies. Corsight is another facial recognition company that developed technology for police body cameras “that could immediately identify an individual in crowds, even if their face was covered, and match the person to photographs from years before.”Footnote 41 Besides private companies, the Israeli military itself employs an extensive network of cameras, mobile phones, and other means of data capture to document Palestinians. In 2021, investigative reporting revealed a vast database used by the Israeli military known as Wolf Pack, which contains profiles about virtually every Palestinian in the West Bank, including imagery, family members and history, educational status, licence plates, and more.Footnote 42 In 2023, human rights organisation Amnesty International revealed the existence of a previously unreported system called Red Wolf, which feeds this database by deploying 10 to 15 cameras at military checkpoints across the West Bank to scan and capture facial imagery of Palestinians without their consent. In addition to relying on this network of security cameras, Israeli soldiers themselves have been using a smartphone app called Blue Wolf since 2020 to capture images of Palestinians. Soldiers were encouraged to compete in a gamified system that rewards those who take and register the most photographs.Footnote 43 This widespread facial capture apparatus goes against condemnations and calls to ban facial recognition surveillance from civil society organisations around the world, including Amnesty International, Human Rights Watch, and over 170 more.Footnote 44

In Gaza, data capture was traditionally conducted through wiretapping, monitoring drone and satellite footage, and infiltrating social media accounts and telecommunications infrastructure, with Israel claiming the ability to intercept and record every phone conversation.Footnote 45 Since October 2023, Israel was reported to have extended a new mass facial recognition program to Gaza, in tandem with its military ground offensive in the area. Israeli soldiers set up checkpoints with facial recognition cameras along the same evacuation corridors through which they urged people in Gaza to flee and find safety, biometrically checking them along the way.Footnote 46 Using a custom tool by Corsight along with Google Photos, Israeli military forces aimed to build a “hit list” of suspected militants. Despite its touted accuracy, the system misidentified Palestinians who were later detained, interrogated, and beaten.Footnote 47 Potential plans by Israel to set up gated communities in Gaza, which would condition Palestinians’ access to humanitarian aid on their submission to biometric tests, including through facial recognition cameras, would further entrench data dispossession in what rights groups are describing as a “biometrics-for-food” mechanism of coercive surveillance.Footnote 48 In short, without Palestinians being alive (going to aid distribution sites in search of food, or passing through facial recognition cameras at checkpoints and evacuation corridors), Israel would not be able to dispossess them of their data.

3.2. Palestinian deaths as data

The data dispossessed from Palestinians is used by Israel to render them as targets. With AI’s predictive functions, Israel can target Palestinians preemptively, for example by monitoring their social media in order to inculpate them for activity in which they will allegedly engage in the future.Footnote 49 And in some cases, the very systems that capture the data are the ones targeting them. A prime example can be found in the city of Hebron, where “motion and heat sensors, license plate scanners, facial recognition algorithms, 24/7 CCTV surveillance” are used to monitor Palestinians, sometimes even inside their homes.Footnote 50 In 2022, Israel introduced a remotely controlled crowd control system that uses AI to determine its targets. This system, developed by Israeli firm Smartshooter, is capable of firing tear gas, stun grenades, or sponge-tipped bullets at the targets it identifies.Footnote 51 This example elucidates how moments of targeting serve to capture and feed data to AI-powered tools that construct Palestinians as targets of (potentially deadly) attacks from those same tools.

In addition to their lives, the deaths of Palestinians are, too, turned into data points for Israel to fine-tune and demonstrate the effectiveness of its lethal technologies. As AI requires a constant data feedback loop, data about and from the lives of Palestinians facilitates targeting them in the moment, and data from their targeting—and killing—improves the precision of future targeting. By way of example, Spice 250 is an AI system fitted on the wings of bomb-carrying warplanes, used by Israel for the purposes of automatic target recognition.Footnote 52 A representative of Rafael Advanced Defense Systems, the Israeli company that supplies this technology to the IDF, extolled how Spice 250 allows for “future deep learning to be based on data extracted from earlier launches,” completing two simultaneous functions: one being the real-time evaluation of the damage inflicted on a target during a mission, and the other being the extraction of post-mission information for future updates.Footnote 53 To put it simply, Spice 250 assesses the data from the damage caused by bombs in one strike mission to ameliorate the system’s model and upgrade the damage that can be caused by bombs dropped in future strike missions.

Eventually, the experimentation practices needed for the training, operation, and testing of tools like Spice 250 generate data that not only serves to refine them but also financially benefits the Israeli economic actors who develop, implement, and sell them. In fact, merely months after October 2023, Israeli startups such as Axon Vision, which are behind some of the AI-powered products deployed in Gaza, were already advertising that their products’ “precision has improved” from their use on the ground, in the hopes of increasing their marketability to potential customers.Footnote 54 Similarly, data proving that their tools have successfully hit their targets is wielded by Israeli weapons developers to demonstrate the effectiveness of their products. Those companies have been observed showing video footage at weapons expos, for example of systems flattening entire Palestinian neighbourhoods in a matter of seconds or of a drone strike that “killed a number of innocent Palestinians, including children,” as promotional material.Footnote 55 In short, without Palestinians dying, Israel would not have the data it needs to perfect the lethality of its AI systems, prove their capabilities, and market them to potential buyers.

4. Algorithmic death-world and necropolitical disposability

The data generated by and for the experimental deployment and refinement of lethal AI systems supports the neoliberal restructuring processes that, according to Clarno, facilitate “the expansion of necropolitical projects to eliminate the racialized poor.”Footnote 56 Necropolitics, a concept coined by theorist and scholar Achille Mbembe, refers to the politics of subjugating living beings to the power of death.Footnote 57 Relevant to the context explored in this article, the “most accomplished form” of necropolitics, to Mbembe, is “the contemporary colonial occupation of Palestine.”Footnote 58 Indeed, Israel’s necropolitical practices have been long documented over the years, and include torturing and killing detained Palestinian political prisoners, maiming Palestinian bodies, attacking healthcare and life-sustaining infrastructure, imposing restrictions on Palestinians seeking medical treatment, harvesting organs from dead Palestinians, and freezing and withholding their bodies from their families.Footnote 59

When it comes to advances in AI technologies, scholar Dan McQuillan references Mbembe’s work to warn against their necropolitical trajectory. He explains that AI is becoming a form of governance exercised in social services, healthcare, and carceral settings, one that “not only discriminates in allocating support for life but sanctions the operations that allow death.”Footnote 60 For McQuillan, AI does so by exacerbating neoliberal inequalities and denying people access to basic necessities like housing, food, and healthcare, making them vulnerable to preventable harms.

Recent investigative reporting by +972 Magazine shed light on the use of algorithmic target-generation tools in Israel’s ongoing genocidal campaign in Gaza, namely Gospel, a system that marks buildings as targets for airstrikes, and Lavender, a program that identifies human targets and generates kill lists.Footnote 61 Even though the use of AI programs like Gospel had been disclosed years before 2023, details surrounding the deployment of these target-generation systems have since revealed a more perturbing picture of the necropolitics of AI.Footnote 62 Echoing McQuillan’s warnings, the use of algorithmic targeting tools like Gospel and Lavender in Gaza marks a stark amplification in the capacity for automation and systematisation of Israel’s necropolitical techniques.

While the full picture of what data is used by these targeting systems remains unknown, IDF sources reported that AI targeting recommendations are based on the processing of billions of data points from aerial and subterranean sensors, satellite photographs, intercepted communications, social media activity, and more, facilitating large-scale damage and “a significant civilian death toll in Gaza through the fallible collection and use of personally identifiable information.”Footnote 63 As such, these tools are not only responsible for the direct killing and maiming of tens of thousands of Palestinians, they also contribute to the slower death of hundreds of thousands as a result of the destruction of food, water, shelter, medical, hygiene, and other support systems and infrastructure, or what Mbembe described as the “orchestrated and systematic sabotage” of infrastructure.Footnote 64

In closing, this article conceives of necropolitics as an analytical lens through which to view the algorithmic target-generation systems in use in Gaza. Three crucial interventions form part of that analytical framework. First, the conditions of immediate and slow death facilitated by these systems bring to bear Mbembe’s notion of death-worlds, defined as “new and unique forms of social existence in which vast populations are subjected to conditions of life conferring upon them the status of living dead.”Footnote 65 The data needs of these systems place Palestinians in a death-world where they become essentially living dead, alive enough to be dispossessed of data used to target them and enhance the lethality of the tools that continue to deliver death.

A second feature of necropolitics is the production of death-worlds by weapons “deployed in the interest of maximally destroying persons.”Footnote 66 Here, the unprecedented scale and speed at which Israel identifies targets cannot be overstated: IDF officials called AI a “force-multiplier” that grants them the capacity to execute far more airstrikes than previously possible.Footnote 67 They reported, for instance, that Gospel generates 100 targets per day, where the IDF had previously generated some 50 targets per year.Footnote 68 Achieving this level of “significant operational effectiveness” requires equally significant computing power, enabled by the data centres, server farms, and cloud services of tech giants like Amazon, Cisco, Dell, Google, Microsoft, OpenAI, and Palantir.Footnote 69 Compared to previous modes of target identification, these systems give Israel the ability to maximally target people en masse.

A third analytical intervention is Mbembe’s conception of death-worlds in a state of siege, and how such a state allows for a “modality of killing” where “[e]ntire populations are the target.”Footnote 70 This description is fitting for the deployment of algorithmic target-generation tools in the besieged Gaza Strip. For context, the Lavender system used by IDF officers reportedly processed the personal data of Palestinians to assign each individual a rating.Footnote 71 Once a person’s rating surpassed a predefined threshold, that person was marked as a target to be killed. After exhausting the list of targets above the initial threshold, IDF officers set a new, lower threshold that allowed them to target even more people.Footnote 72 The calibration of Lavender’s scoring system means that any Palestinian in Gaza could be turned into a target at any given time, rendering the entire resident population of Gaza potential targets.

Taken together, AI systems that rely on the dispossession of data from both Palestinian lives and deaths, that are capable of generating kill lists en masse, and that are calibrated to maintain the entire population of Gaza as targets epitomise the existence of an algorithmic death-world governed by a logic of experimentation inherited from other racialised carceral geographies. These systems also exemplify the disposability central to Israel’s necropolitical project. Commenting on the ongoing genocidal campaign in Gaza, Clarno determines that Israel’s racial-capitalist economy has turned “racialized disposability into a market opportunity.”Footnote 73 Similarly, Loewenstein warns that “Israel has provided a model in Gaza on how to obliterate a society without serious consequences (yet),” piquing the interest of other forces keen to deal with populations they deem undesirable in similar fashion.Footnote 74 In a sense, Israel is not only selling weapons and surveillance systems to willing buyers; it is also selling the blueprint for how to manage unwanted populations by rendering them expendable, with AI as the latest instrument of necropolitical disposability.

5. A dangerous precedent

As a racial-capitalist project that has effectively reduced its dependence on Palestinian labour, and given its geopolitical standing, Israel faces few obstacles to settler-colonial practices that aim to eliminate Palestinians in order to replace them with settlers. Elimination, however, does not fully characterise how Israel subjugates Palestinians. This essay has turned to a lineage of racialised carceral geographies—plantations, refugee camps, and prisons—as sites of human testing. It has done so in order to explore how Palestinians serve as objects of experimentation, adding a “battle-tested” stamp to the products Israeli war profiteers market and export worldwide.

Herein lies a tension central to Israel’s raison d’être as both a settler-colonial and a racial-capitalist project. On the one hand, from a political standpoint, the eliminatory practices of Israel as a settler-colonial regime point to an imperative to destroy the colonial subject, an imperative that is largely condoned by Israeli society.Footnote 75 On the other hand, from an economic standpoint, Israel’s racial-capitalist economy needs Palestinians as subjects of experimentation in order to cement its position as a leading surveillance technologies and weapons exporter.

The growing adoption and entrenchment of AI in surveillance and warfare alters the logic of experimentation characteristic of carceral geographies. The need for massive amounts of constantly updating data introduces a paradigm where AI systems dispossess Palestinians of their data both in life and in death. Israel relies on the information generated by the lives of Palestinians to target them, and by the deaths of Palestinians to improve and sell the AI systems that will keep targeting others.

The most recent deployments of algorithmic target-generation systems in Gaza since October 2023 have brought this duality to the forefront. These targeting systems take coercively extracted data from living Palestinians, data that they capture, process, abstract, and quantify, transforming life and social practice into measures of who is fit for killing. The deaths of Palestinians, in turn, provide more data to further calibrate and multiply future killings. The ability to target, at will, an entire population rendered disposable places Palestinians in a form of algorithmic death-world, accelerating the alarming necropolitical trajectory of AI: the logic of experimentation that drives Israel’s racial capitalism, and the lethal violence that sustains it, are sharply automated and systematised.

Experts are not only decrying the immediate, devastating consequences of these systems, but are also warning about the terrifying precedent that their use in Gaza is setting. As Mbembe foresaw more than twenty years ago, “Gaza might well prefigure what is yet to come.”Footnote 76 Loewenstein’s sources are already suggesting that “many nations are looking to Israel and its use of AI in Gaza with admiration and jealousy,” conjuring a perturbing omen for the future.Footnote 77 Short of putting a stop to the occupation of Palestine and to all colonial ambitions, Clarno predicts that “we will see Lavender and other AI kill lists again, along with even more frightening advances in necro-technology.”Footnote 78 After all, AI is but the latest tool in the necropolitical toolbox, one that reveals and stabilises the political and economic motivations of institutions that long precede its adoption.

Author contribution

Conceptualization: S.F.

Conflicts of Interest

The author declares no competing interests.

Footnotes

3 Loewenstein 2023.

4 Mbembe 2003, 40.

5 Jenkins and Leroy 2021, 16–17.

6 Wolfe 2001, 868.

7 Veracini 2011, 4.

8 Wolfe 2006, 387.

9 Veracini 2011, 2.

10 Clarno 2017, 25–26.

11 Clarno 2017, 6.

12 Clarno 2017, 35.

13 Clarno 2017, 197, 198.

14 Clarno 2017, 25.

15 Clarno 2017, 197–98.

16 Clarno 2017, 25–26.

17 Clarno 2017, 30.

18 Clarno 2017, 40.

19 Bartram 1998, 303; Berda 2018.

20 Diehl and Federman 2024; Office of the United Nations High Commissioner for Human Rights 2024.

21 Clarno 2017, 198.

22 Rusert 2019, 27–28.

24 Mbembe 2003, 23.

25 Loewenstein 2023.

26 Loewenstein 2023, 17–18.

28 Giovannetti 2019.

29 Kennard 2016.

31 Dowling 2023.

32 Buxbaum 2022.

34 Horowitz 2011.

35 Northam 2014.

36 Robins-Early 2024b.

41 Loewenstein 2023, 60.

42 Dwoskin 2021.

43 Amnesty International 2023a.

44 Amnesty International 2021.

45 Masarwa 2021.

46 Robins-Early 2024a.

47 Frenkel 2024.

48 Cogan and Scahill 2024; Frankel and Mednick 2025; Skyline International for Human Rights 2025.

50 Goodfriend 2023b, 461.

52 Frantzman and Atherton 2019.

53 Frantzman and Atherton 2019.

55 Baroud 2020; Loewenstein 2023, 7–8.

56 Clarno 2017, 201.

57 Mbembe 2003, 39.

58 Mbembe 2019, 80.

59 Alareer 2016; Amnesty International 2023b; Daher-Nashif 2021, 945; Deprez 2023, 8–9; Hanbali et al. 2024, 2.

60 McQuillan 2022, 85.

62 Ahronheim 2021.

63 Dwoskin 2024; Forensic Architecture 2024; Khlaaf, West, and Whittaker 2024, 2.

64 Hil and Polya 2025; Khatib, McKee, and Yusuf 2024, 237; Mbembe 2019, 82.

65 Mbembe 2003, 40.

66 Mbembe 2019, 92.

67 Biesecker, Mednick, and Burke 2025.

68 Abraham 2023.

69 Biesecker, Mednick, and Burke 2025.

70 Mbembe 2019, 82.

71 Biesecker, Mednick, and Burke 2025.

72 Abraham 2024.

74 Loewenstein 2025.

75 Hardigan 2024.

76 Mbembe 2019, 97.

77 Loewenstein 2025.

References

Abraham, Yuval. 2023. “‘A Mass Assassination Factory’: Inside Israel’s Calculated Bombing of Gaza.” +972 Magazine, November 30. https://www.972mag.com/mass-assassination-factory-israel-calculated-bombing-gaza/.
Abraham, Yuval. 2024. “‘Lavender’: The AI Machine Directing Israel’s Bombing Spree in Gaza.” +972 Magazine, April 3. https://www.972mag.com/lavender-ai-israeli-army-gaza/.
Ahronheim, Anna. 2021. “Israel’s Operation against Hamas Was the World’s First AI War.” The Jerusalem Post, May 27. https://www.jpost.com/arab-israeli-conflict/gaza-news/guardian-of-the-walls-the-first-ai-war-669371.
Alareer, Refaat. 2016. “Israel’s Killer Bureaucracy.” The Electronic Intifada, June 28. https://electronicintifada.net/content/israels-killer-bureaucracy/17226.
Amnesty International. 2021. “Amnesty International and More than 170 Organisations Call for a Ban on Biometric Surveillance.” June 7. https://www.amnesty.org/en/latest/press-release/2021/06/amnesty-international-and-more-than-170-organisations-call-for-a-ban-on-biometric-surveillance/.
Amnesty International. 2023a. “Automated Apartheid: How Facial Recognition Fragments, Segregates, and Controls Palestinians in the OPT.” https://www.amnesty.org/en/documents/mde15/6701/2023/en/.
Amnesty International. 2023b. “Israel/OPT: Horrifying Cases of Torture and Degrading Treatment of Palestinian Detainees amid Spike in Arbitrary Arrests.” https://www.amnesty.org/en/latest/news/2023/11/israel-opt-horrifying-cases-of-torture-and-degrading-treatment-of-palestinian-detainees-amid-spike-in-arbitrary-arrests/.
Arnett, George. 2015. “The Numbers behind the Worldwide Trade in Drones.” The Guardian, March 16. https://www.theguardian.com/news/datablog/2015/mar/16/numbers-behind-worldwide-trade-in-drones-uk-israel.
Baroud, Ramzy. 2020. “Technology of Death: The Not-so-Shocking Report on Israeli Weapons Exports.” The Jordan Times, March 24. https://jordantimes.com/opinion/ramzy-baroud/technology-death-not-so-shocking-report-israeli-weapons-exports.
Bartram, David V. 1998. “Foreign Workers in Israel: History and Theory.” International Migration Review 32 (2): 303. https://doi.org/10.2307/2547185.
Berda, Yael. 2018. Living Emergency: Israel’s Permit Regime in the Occupied West Bank. Stanford Briefs.
Biesecker, Michael, Mednick, Sam, and Burke, Garance. 2025. “As Israel Uses US-Made AI Models in War, Concerns Arise about Tech’s Role in Who Lives and Who Dies.” AP News, February 18. https://apnews.com/article/israel-palestinians-ai-technology-737bc17af7b03e98c29cec4e15d0f108.
Buxbaum, Jessica. 2022. “Israel Experiments on Palestinians with AI-Powered Gun at Checkpoints.” MintPress News, October 5. https://www.mintpressnews.com/israel-smart-shooter-palestinians-lab-rats-ai-powered-gun/282129/.
Clarno, Andy. 2017. Neoliberal Apartheid: Palestine/Israel and South Africa after 1994. University of Chicago Press. https://doi.org/10.7208/chicago/9780226430126.001.0001.
Clarno, Andy. 2024. “Israel’s Lavender Kill List: A Joint Imperial Production.” Spectre Journal 5 (10). https://doi.org/10.63478/LS2N6V2D.
Cogan, Yaniv, and Scahill, Jeremy. 2024. “The Israeli-American Businessman Pitching a $200 Million Plan to Deploy Mercenaries to Gaza.” Drop Site, October 21. https://www.dropsitenews.com/p/mercenary-deliver-humanitarian-aid-gaza.
Daher-Nashif, Suhad. 2021. “Colonial Management of Death: To Be or Not to Be Dead in Palestine.” Current Sociology 69 (7): 945–62. https://doi.org/10.1177/0011392120948923.
Deprez, Miriam. 2023. “Visual Necropolitics and Visual Violence: Theorizing Death, Sight, and Sovereign Control of Palestine.” International Political Sociology 17 (3): olad016. https://doi.org/10.1093/ips/olad016.
Diehl, Caleb, and Federman, Joseph. 2024. “A Look at How Settlements Have Grown in the West Bank over the Years.” AP News, July 5. https://apnews.com/a-look-at-how-settlements-have-grown-in-the-west-bank-over-the-years-0000019079d8d0f6a3da79dcbd0a0000.
Dowling, Paddy. 2023. “Dirty Secret of Israel’s Weapons Exports: They’re Tested on Palestinians.” Al Jazeera, November 17. https://www.aljazeera.com/features/2023/11/17/israels-weapons-industry-is-the-gaza-war-its-latest-test-lab.
Dwoskin, Elizabeth. 2021. “Israel Escalates Surveillance of Palestinians with Facial Recognition Program in West Bank.” The Washington Post, November 8. https://www.washingtonpost.com/world/middle_east/israel-palestinians-surveillance-facial-recognition/2021/11/05/3787bf42-26b2-11ec-8739-5cb6aba30a30_story.html.
Dwoskin, Elizabeth. 2024. “Israel Built an ‘AI Factory’ for War. It Unleashed It in Gaza.” The Washington Post, December 29. https://www.washingtonpost.com/technology/2024/12/29/ai-israel-war-gaza-idf/.
Fatafta, Marwa. 2023. “Apartheid Tech: Biometric ID and Surveillance in the Occupied West Bank.” In Resisting Borders and Technologies of Violence, edited by Aizeki, Mizue, Mahmoudi, Matt, and Schupfer, Coline. Haymarket Books.
Fatafta, Marwa, and Leufer, Daniel. 2024. “Artificial Genocidal Intelligence: How Israel Is Automating Human Rights Abuses and War Crimes.” Access Now. https://www.accessnow.org/publication/artificial-genocidal-intelligence-israel-gaza/.
Forensic Architecture. 2024. “A Spatial Analysis of the Israeli Military’s Conduct in Gaza since October 2023.” https://forensic-architecture.org/investigation/a-cartography-of-genocide.
Frankel, Julia, and Mednick, Sam. 2025. “Takeaways from AP’s Report into Claims of Excessive Force by American Contractors at Gaza Aid Sites.” AP News, July 2. https://apnews.com/article/israel-military-gaza-ghf-aid-un-3c1bef17093a2a3eeda0764c220b857b.
Frantzman, Seth J. 2019. “Israel’s Carmel Program: Envisioning Armored Vehicles of the Future.” C4ISRNET, August 5. https://www.c4isrnet.com/artificial-intelligence/2019/08/05/israels-carmel-program-envisioning-armored-vehicles-of-the-future/.
Frantzman, Seth J. 2020. “Israel Finds an AI System to Help Fight in Cities.” C4ISRNET, February 5. https://www.c4isrnet.com/battlefield-tech/2020/02/05/israel-finds-an-ai-system-to-help-fight-in-cities/.
Frantzman, Seth J. 2021. “New AI System Fills Rifle Sights with Extensive, Easy-to-Digest Info.” C4ISRNET, September 7. https://www.c4isrnet.com/artificial-intelligence/2021/09/07/new-ai-system-fills-rifle-sights-with-extensive-easy-to-digest-info/.
Frantzman, Seth J. 2022. “Israel Unveils Artificial Intelligence Strategy for Armed Forces.” Defense News, February 11. https://www.defensenews.com/artificial-intelligence/2022/02/11/israel-unveils-artificial-intelligence-strategy-for-armed-forces/.
Frantzman, Seth J., and Atherton, Kelsey D. 2019. “Israel’s Rafael Integrates Artificial Intelligence into Spice Bombs.” C4ISRNET, June 17. https://www.c4isrnet.com/artificial-intelligence/2019/06/17/israels-rafael-integrates-artificial-intelligence-into-spice-bombs/.
Frenkel, Sheera. 2024. “Israel Deploys Expansive Facial Recognition Program in Gaza.” The New York Times, March 27. https://www.nytimes.com/2024/03/27/technology/israel-facial-recognition-gaza.html.
Giovannetti, Megan. 2019. “‘Battle-Proven:’ Weapons Expo Shows One Way Israel Profits from Occupation.” The Progressive Magazine, June 10. https://progressive.org/latest/weapons-expo-israel-profits-occupation-giovannetti-190610/.
Goodfriend, Sophia. 2023a. “What the Rise of Drone Warfare Means for Palestinians.” Foreign Policy, January 20. https://foreignpolicy.com/2023/01/20/israel-palestine-west-bank-attack-surveillance-drones-far-right-settlers/.
Goodfriend, Sophia. 2023b. “Algorithmic State Violence: Automated Surveillance and Palestinian Dispossession in Hebron’s Old City.” International Journal of Middle East Studies 55 (3): 461–78. https://doi.org/10.1017/S0020743823000879.
Gray, Catriona. 2021. “Data Dispossession.” The Sociological Review Magazine, May. https://doi.org/10.51428/tsr.ilzc1791.
Halper, Jeff. 2015. War against the People: Israel, the Palestinians and Global Pacification. Pluto Press. https://doi.org/10.2307/j.ctt183pct7.
Halper, Jeff. 2023. “Global Palestine: Exporting Israel’s Regime of Population Control.” In Resisting Borders and Technologies of Violence, edited by Aizeki, Mizue, Mahmoudi, Matt, and Schupfer, Coline. Haymarket Books.
Hanbali, Layth, Kwong, Edwin Jit Leung, Neilson, Amy, Smith, James, Hafez, Sali, and Khoury, Rasha. 2024. “Israeli Necropolitics and the Pursuit of Health Justice in Palestine.” BMJ Global Health 9 (2): e014942. https://doi.org/10.1136/bmjgh-2023-014942.
Hardigan, Richard. 2024. “Polls Show Broad Support in Israel for Gaza’s Destruction and Starvation.” Truthout, February 10. https://truthout.org/articles/polls-show-broad-support-in-israel-for-gazas-destruction-and-starvation/.
Hever, Shir. 2023. “Israel Is Already Weaponizing AI — But Not in the Ways It Claims to Be.” Mondoweiss, July 1. https://mondoweiss.net/2023/07/ai-in-service-of-ia-artificial-intelligence-can-be-used-for-legitimizing-israeli-apartheid/.
Hil, Richard, and Polya, Gideon. 2025. “Skewering History: The Odious Politics of Counting Gaza’s Dead.” Arena, July 11. https://arena.org.au/politics-of-counting-gazas-dead/.
Horowitz, Adam. 2011. “New Resource Tracks and Analyzes Israeli Arms Exports.” Mondoweiss, February 27. https://mondoweiss.net/2011/02/new-resource-tracks-and-analyzes-israeli-arms-exports/.
Jenkins, Destin, and Leroy, Justin. 2021. “Introduction: The Old History of Capitalism.” In Histories of Racial Capitalism, edited by Leroy, Justin and Jenkins, Destin, 1–26. Columbia University Press. https://doi.org/10.7312/jenk19074-002.
Kennard, Matt. 2016. “The Cruel Experiments of Israel’s Arms Industry.” Pulitzer Center, December 28. https://pulitzercenter.org/stories/cruel-experiments-israels-arms-industry.
Khatib, Rasha, McKee, Martin, and Yusuf, Salim. 2024. “Counting the Dead in Gaza: Difficult but Essential.” The Lancet 404 (10449): 237–38. https://doi.org/10.1016/S0140-6736(24)01169-3.
Khlaaf, Heidy, West, Sarah Myers, and Whittaker, Meredith. 2024. “Mind the Gap: Foundation Models and the Covert Proliferation of Military Intelligence, Surveillance, and Targeting.” arXiv. http://arxiv.org/abs/2410.14831.
Loewenstein, Antony. 2023. The Palestine Laboratory: How Israel Exports the Technology of Occupation around the World. Verso Books.
Loewenstein, Antony. 2025. “Israel’s Use of AI in Gaza Is a Terrifying Model Coming to a Country near You.” Middle East Eye, January 28. https://www.middleeasteye.net/opinion/israel-use-ai-gaza-terrifying-model-coming-country-near-you.
Masarwa, Lubna. 2021. “Israel Can Monitor Every Telephone Call in West Bank and Gaza, Says Intelligence Source.” Middle East Eye, November 15. https://www.middleeasteye.net/news/israel-can-monitor-every-telephone-call-west-bank-and-gaza-intelligence-source.
Mbembe, Achille. 2003. “Necropolitics.” Public Culture 15 (1): 11–40. https://doi.org/10.1215/08992363-15-1-11.
Mbembe, Achille. 2019. Necropolitics. Duke University Press. https://doi.org/10.1215/9781478007227.
McQuillan, Dan. 2022. Resisting AI: An Anti-Fascist Approach to Artificial Intelligence. Bristol University Press.
Northam, Jackie. 2014. “With Homegrown Technology, Israel Becomes Leading Arms Exporter.” NPR, August 29. https://www.npr.org/sections/parallels/2014/08/29/344030354/with-home-grown-technology-israel-becomes-leading-arms-exporter.
Office of the United Nations High Commissioner for Human Rights. 2024. “State of Palestine: Israeli Settlements in the Occupied Palestinian Territory, Including East Jerusalem, and in the Occupied Syrian Golan.” https://www.ohchr.org/sites/default/files/2024-03/Palestine-March2024.pdf.
Robins-Early, Nick. 2024a. “How Israel Uses Facial-Recognition Systems in Gaza and Beyond.” The Guardian, April 19. https://www.theguardian.com/technology/2024/apr/19/idf-facial-recognition-surveillance-palestinians.
Robins-Early, Nick. 2024b. “AI’s ‘Oppenheimer Moment’: Autonomous Weapons Enter the Battlefield.” The Guardian, July 14. https://www.theguardian.com/technology/article/2024/jul/14/ais-oppenheimer-moment-autonomous-weapons-enter-the-battlefield.
Rusert, Britt. 2019. “Naturalizing Coercion: The Tuskegee Experiments and the Laboratory Life of the Plantation.” In Captivating Technology, edited by Benjamin, Ruha, 25–49. Duke University Press. https://doi.org/10.1215/9781478004493-003.
Skyline International for Human Rights. 2025. “Biometrics-for-Food: A Dangerous Shift from Humanitarian Relief to Coercive Surveillance.” Skyline International for Human Rights (blog), May 15. https://skylineforhuman.org/en/news/details/819/biometrics-for-food-a-dangerous-shift-from-humanitarian-relief-to-coercive-surveillance.
Snaith, Ben. 2023. “What Do We Mean by ‘without Data, There Is No AI’?” Short paper. Open Data Institute. https://theodi.cdn.ngo/media/documents/20231221_-_Data-centric_AI_Short_Paper_-_What_do_we_mean_by_without_data_there_3AEHdDW.pdf.
Solon, Olivia. 2019. “Why Did Microsoft Fund an Israeli Firm That Surveils West Bank Palestinians?” NBC News, October 28. https://www.nbcnews.com/news/all/why-did-microsoft-fund-israeli-firm-surveils-west-bank-palestinians-n1072116.
Veracini, Lorenzo. 2011. “Introducing: Settler Colonial Studies.” Settler Colonial Studies 1 (1): 1–12. https://doi.org/10.1080/2201473X.2011.10648799.
Wolfe, Patrick. 2001. “Land, Labor, and Difference: Elementary Structures of Race.” The American Historical Review 106 (3): 866. https://doi.org/10.2307/2692330.
Wolfe, Patrick. 2006. “Settler Colonialism and the Elimination of the Native.” Journal of Genocide Research 8 (4): 387–409. https://doi.org/10.1080/14623520601056240.