Deadly Reverberations
of Daily Tech

A Talk on the Entangled Necropolitics of our Digital Culture

4S 2025 Seattle · Xiu @ Pica-pica Hacklab / Neoterrism

I am 300 kilometers away from Gernika — one of the deadliest trial grounds of technological warfare during the Spanish Civil War, later employed in World War II. 88 years later, and 5,000 kilometers away, a new era of automated warfare is conducted against the population of Gaza.

Introduction

We, as users of technology, are inevitably entangled in its necropolitics. Technological monopolies have dispossessed us of the knowledge of the Web 1.0 era and locked us into systems that fuel an arms race toward ever-more-militarized futures.

Our daily devices — smartphones, messaging apps, social media platforms, AI, and IoT devices — operate on black-boxed systems that extract and profile our data not merely for consumer targeting but to train marketable surveillance systems, predictive policing tools, and, ultimately, killing machines. In this ecosystem, big tech is turning us, its users, into both the weapons and the targets of a war machine.

This is the sound of a drone. It has been so prevalent in the lives of Palestinians for over 20 years that they have coined a name for it: Zanana. The sound of drones is present 24/7. Palestinians have been a testing ground for Israel's drone technology of surveillance and warfare. Drones are part of a militarized technology complex, much of which is invisible and silent.

Three Entanglements

The reverberations of warfare and militarization reach us through our daily technologies. I’ll explore three of many entanglements:

  1. The Corporate-Military Nexus
  2. The Weaponization of Neutrality
  3. The Digital Violence Battlefield

1. The Corporate-Military Nexus

Technology is inherently dual-use. Civilian applications are easily repurposed for military ones. Since 2016 the US military has actively integrated Silicon Valley into its agenda. The Defense Innovation Board, created to formalize this integration, had former Google CEO Eric Schmidt as its first chairman.

In 2018, Google workers' mobilization against Project Maven publicly revealed that the corporation was building AI to assist the military through drone video analysis.

This convergence has only intensified. Between 2024 and 2025, both OpenAI and Google removed their prohibitions against military applications of their technologies. Two months ago, the U.S. Army Reserve created Detachment 201, a unit integrating executives from Meta, Palantir, and OpenAI into military leadership roles as formally commissioned lieutenant colonels.1

1 The four include Shyam Sankar, chief technology officer of Palantir; Andrew Bosworth, chief technology officer of Meta; Kevin Weil, chief product officer of OpenAI; and Bob McGrew, adviser at Thinking Machines Lab and former chief research officer of OpenAI. Web sources: Military.com, Medium, Data Center Dynamics.

Thanks to whistleblowers, researchers, and journalists, we know of concrete examples that illustrate these necropolitical entanglements:

  • Microsoft conducts mass surveillance of Palestinian communications via Azure, storing around 200 million hours of audio in data centers across the Netherlands and Ireland.2
  • Amazon and Google provide the cloud storage and infrastructure to hold data and train the faulty AI that sends drones to kill entire families based on a human-in-the-loop decision-making process of less than a minute.3
  • Meta's LLaMA models, trained on public Facebook and Instagram data and on users' interactions with Meta AI, will help automate real-time battlefield decision-making.4
  • Services we use daily, such as Google Photos, have reportedly been used for facial recognition in Gaza to create "hit lists" because of their effectiveness.5

It is chilling to think that technology like drones or the AI-powered guns deployed at Hebron's checkpoint might work with the help of a service so many of us use for personal memories (AP News).

2. The Weaponization of Neutrality

Technological progress is often disguised under a cloak of neutrality to win users' trust. Yet technology is rarely neutral; it usually amplifies existing power structures. A key strategy in maintaining this power is the weaponization of the very concept of neutrality itself.

Here are two strategies being employed for the weaponization of neutrality:

  1. Dismantling technologies, like hard encryption, that resist state and corporate control.
  2. Deploying a rhetoric of neutrality to enforce specific political agendas and suppress dissent.

I recently got entangled in this conflict myself while researching US-USSR/Russia nuclear disarmament treaties for a peace event in 2023. I copied a text from Wikipedia into a private WhatsApp chat with myself. I was immediately banned from the platform for one hour.

This small incident exposed that my messages were being scanned and flagged. At the time, the European Commission had already blocked Kremlin-affiliated media outlets by revoking broadcasting licenses, deplatforming content, and imposing DNS blocking.6 In addition, the recently created European Digital Media Observatory classified as misinformation any discourse that might label the US, Ukraine, or Europe as warmongers.7

My WhatsApp ban revealed how automated censorship had extended into the most private sphere of messaging. The "end-to-end encryption" promised by Meta was a façade.

6 EU Council Regulation 2022/350 amending Regulation (EU) No 833/2014 concerning restrictive measures in view of Russia's actions destabilising the situation in Ukraine.

7 EDMO.

8 Matthew Green, professor of cryptography at Johns Hopkins University: "These metadata correlations are exactly that: correlations. Their accuracy can be very good or even just good. But they can also be middling," Green said. "The nature of these systems is that they're going to kill innocent people and nobody is even going to know why." The Intercept.

[Chat Control v2]

This practice is now making its way into law.8 This month, the European Parliament votes on Chat Control v2. If approved, it will effectively outlaw hard encryption by forcing all communication providers, including e-mail, to scan user messages.

It will impose the same law-enforcement regime on both the public and private spheres of digital communication. The ongoing cryptowar casts defenders of hard cryptography as either naïve libertarians or potential criminals. Yet weaker encryption facilitates any type of hacking and endangers the very users it claims to protect.

My WhatsApp anecdote echoes at a deadly scale in Gaza, where the same app is reported to have been used to generate kill lists based on contacts and shared group chats (Paul Biggar, The Intercept, 972mag).

In Gaza, WhatsApp users are not only targets, but weapons to target new victims.

The use of so-called "neutral data" to promote "precise, efficient, and less deadly" warfare is leading instead to automated genocide ("A Digitized, Efficient Model of War" @ Carnegie Endowment for International Peace).

[Insecurity is a feature]

The pervasive control that enables automated censorship and kill lists is not a bug; it is a feature of structurally engineered insecurity. Smartphones prioritize unauditable black-box hardware and software.

Modems are made by a handful of authorized companies and run their own operating systems with privileged access to the phone's main OS. At the software level, services like Google Mobile Services spy on us through persistent identifiers from which our patterns of life may be distilled.
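The phrase "patterns of life" can sound abstract, so here is a minimal, entirely hypothetical sketch of why a single persistent identifier is so valuable: it lets otherwise unrelated app events be joined into one behavioral profile. The identifiers, places, and function below are invented for illustration; they do not describe any real service's API.

```python
from collections import Counter, defaultdict

# Hypothetical event stream: (persistent_id, hour_of_day, place) tuples,
# as a data broker might receive them from several unrelated apps that all
# report the same advertising identifier.
events = [
    ("ad-id-42", 8, "home"), ("ad-id-42", 9, "clinic"),
    ("ad-id-42", 9, "clinic"), ("ad-id-42", 20, "home"),
    ("ad-id-42", 13, "office"), ("ad-id-42", 13, "office"),
]

def pattern_of_life(events, device_id):
    """Return the most frequent place per hour of day for one identifier."""
    by_hour = defaultdict(Counter)
    for dev, hour, place in events:
        if dev == device_id:
            by_hour[hour][place] += 1
    # For each hour, keep only the most visited place: a daily routine.
    return {hour: places.most_common(1)[0][0] for hour, places in by_hour.items()}

profile = pattern_of_life(events, "ad-id-42")
```

A few dozen lines of aggregation are enough to turn scattered pings into a routine (home in the morning, clinic at 9, office at 13); the hard part is not the analysis but the persistent identifier that makes the join possible.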

Understanding all the processes running on our phones is an impossible enterprise. For "our own security's sake", Google's Play Store and Apple's App Store control the vast majority of popular apps. Developers must be certified and compile their code through corporate black-box compilers.8 Users who opt out face digital ostracism: in Spain, for instance, one cannot hold a debit card without a phone.


[Neutrality against Woke AI]

While we witness the dismantling of secure encryption, in the United States, Trump's executive order "Preventing Woke AI in the Federal Government" represents a new escalation in the weaponization of neutrality.

It mandates that federally used AI systems be "neutral" and "nonpartisan." It claims to enforce "truth-seeking, objective, ideologically neutral and scientifically accurate" LLMs while explicitly excluding perspectives associated with Diversity, Equity, and Inclusion (DEI).

It especially targets "critical race theory, transgenderism, unconscious bias, intersectionality, and systemic racism".9 Trump's "neutrality" is mobilized as an ideological project that declares war on epistemologies of empathy and decolonial awareness, proving that models can be tweaked to enforce specific biases over others.

[Censorship, Hate Speech and Disinformation]

Content moderation is a trench in the digital battlefield, dug along the fine line between censorship, hate speech, and disinformation. As a Spanish citizen I feel safe speaking about Gaza, at worst facing shadow-banning.

But under the new Digital Services Act (DSA), Palestinian digital rights activists claim that enforcement is uneven: antisemitic content is heavily targeted, while anti-Palestinian hate speech often persists.10

Palestinians note that in the West we see a sanitized version of Instagram and Facebook.10 Meanwhile, Sada Social has reported over 25,000 digital violations against Palestinian expression in 2025 alone.11 Human Rights Watch documents that Meta's censorship of pro-Palestinian content has been "systemic and global", while the Centre for Countering Digital Hate reports that X fails to remove 96% of antisemitic and Islamophobic content.12

10 Aya Omar, an AI engineer, told The New York Times that she was unable to see Palestinian media accounts she regularly reads because Meta and Instagram were blocking those accounts, sometimes cloaking it as "technical difficulty". She said that people were seeing a sanitized version of the events occurring in Gaza. @ New York Times.

11Sada Social Digital Index 2024.

12 Business & Human Rights Resource Centre.

[Repression of Pro-Palestinian Voices in Germany]

In Germany, the consequences of pro-Palestinian advocacy are severe. Artists are being ostracized, workers fired, speakers like UN Special Rapporteur Francesca Albanese cancelled, numerous people arrested, demonstrations banned, and European citizens expelled.

The country's Network Enforcement Act (NetzDG) - forerunner of the DSA - extends these dynamics into the digital ecosystem. Criticizing Israel online or posting "From the River to the Sea" (legally considered "hate speech"13) has resulted in fines14, job losses15, student suspensions16, and political dismissals17.

The digital and the real spheres merge as North Rhine-Westphalia's Interior Minister Herbert Reul ordered a recent and unprecedented large-scale raid targeting online hate speech. While aimed at far-right xenophobia, the same legal framework could criminalize political slogans for Palestine.18

13 Crackdown on pro-Palestinian voices in Germany @ Human Rights Research. Database of the systematic repression of pro-Palestinian voices in Germany @ Index of Repression.

14 Federal Court of Justice appeal.

15 Zalando's case.

16Archive of Silence.

17 Melanie Schweizer's case.

18Human Events. About raids in 2024 against pro-Palestinians: @ Reuters.

US Border Politics & Digital Repression

In the US, repression is tied to border politics. Campus protests against the war in Gaza have had serious digital repercussions for international students.

Since mid-2025, visa applicants must make their social media accounts publicly accessible for AI analysis under the "Catch and Revoke" initiative launched by Secretary of State Marco Rubio.19 20

This initiative echoes the arrests in 2015 of hundreds of Palestinians on the grounds of their Facebook posts.21

Financial and military disinformation strategies

As a counterpoint to censorship, disinformation campaigns worsen war conditions and profit warfare-tech giants. The defunding of the United Nations Relief and Works Agency for Palestine Refugees in the Near East (UNRWA) illustrates the stakes. Coordinated social media accounts, inauthentic profiles, and paid ads amplified allegations against its staff, leading to the suspension of funding and hampering humanitarian aid.22

Similarly, astroturf campaigns like "Facts for Peace" spent hundreds of thousands of dollars on Meta to viralize emotional content framing criticism of Israel as inherently antisemitic or pro-Hamas.23

Unmet promises that locked us in a war machine

As we witness the overwhelming horrors of automated genocide in Gaza, we are left with a seismic reckoning: How did data ever become so deadly?

The tech giants that emerged from the 1990s dot-com boom built their empires on civilian users, promising a global village of connectivity and democratization. Instead, they have ensnared us in a global killing machine where diverse national, transnational, corporate, and financial interests entangle users around the globe.

Over the years, they have locked us into a dependency that is nearly impossible to escape. They have done so by implementing aggressive marketing strategies and technological tricks that circumvent antitrust frameworks, and by deploying processes of disempowerment, black-boxing, and proletarianization of knowledge.

24 Following Bernard Stiegler's coinage of the term.

Hacklabs and environmentalists as digital dissidence

Personally, despite our efforts as a local hacklab to promote Free Software in Spanish public institutions, Microsoft Teams - which by default stores data in Microsoft's Azure cloud - became practically mandatory in Covid and post-Covid schools.24

In 2023, the Spanish government had to sign an agreement with Microsoft to repatriate minors' data, because enforcing the European General Data Protection Regulation (GDPR) against big tech is still not possible.25 The project was called "Operation Gernika."26

A few kilometers away from Gernika, Amazon and Microsoft data centers are being constructed and planned, threatening scarce water resources under new, specially tailored "Strategic Interest" laws that bypass the usually required environmental-impact assessments.27 28

24 Covid fuels Microsoft Teams' success @ Medium.

25About GDPR for big tech @ Pica-pica (in Spanish).

26Operación Gernika.

27 "Aragón, lung of the AI made in Europe" (in Spanish).

28Ecologistas en Acción (in Spanish).

How to Go about Digital Disarmament?

Microsoft's move into post-Covid schools exemplifies how corporations that engage in the necropolitics of the genocide in Gaza become the foundation of our digital culture. Their data centers embody the physical reality of a virtual killing machine.

How do we disentangle ourselves from such powerful dynamics? Confronting this issue requires more than consumer "choice," yet opting out is not an option.

We must find a way to build collective knowledge on effective eco-pacifist strategies that refuse black-box dependence and necropolitical neutrality in favor of digital disarmament. We must find alternatives rather than accept as inevitable that the infrastructures of our digital life must be infrastructures of death.

Global Military Expenditure (1998–2024)

[Animated chart: top 10 countries and global spending trend, 1992-2024, in million dollars at constant 2023 prices. Source: SIPRI]

Campaigns

Bibliography

Crawford, K. (2021). The Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence. Yale University Press.

Taplin, J. (2017). Move Fast and Break Things: How Facebook, Google, and Amazon Cornered Culture and Undermined Democracy. Little, Brown. [“Don’t Be Evil” reference]

Edwards, P. N. (1996). The Closed World: Computers and the Politics of Discourse in Cold War America. MIT Press.

Gray, C. H. (2025). AI, Sacred Violence, and War—The Case of Gaza.

Foroohar, R. (2019). Don't Be Evil: How Big Tech Betrayed Its Founding Principles. Currency (Penguin Random House).

Mbembe, A. (2003/2019). Necropolitics. Duke University Press.

Noble, S. U. (2018). Algorithms of Oppression: How Search Engines Reinforce Racism. NYU Press.

Stiegler, B. (2018). The Neganthropocene. Open Humanities Press.