These aren’t AI firms; they’re defense contractors. We can’t let them hide behind their models
TL;DR
AI companies like Palantir and Anduril supply targeting systems used in military operations in Gaza and Iran, yet they are still publicly perceived as neutral 'tech firms'.
Key Points
- The Israeli 'fog procedure' – firing blindly into darkness as deterrence – serves as a metaphor for how AI automates kill decisions while evading accountability.
- Civilian casualties, including children, are effectively built into the operational logic of AI-assisted weapons systems.
- Meaningful international regulation of AI in warfare is nearly nonexistent, with norms lagging years behind deployed technology.
Nauti's Take
The tech industry has successfully marketed itself as something other than what parts of it have become: a defense sector with a better PR department. Palantir CEO Alex Karp openly talks about securing 'western dominance' through AI – that is not neutral infrastructure; that is war policy.
As long as regulators treat AI firms like software startups rather than weapons manufacturers, nothing will change. Demanding transparency is the bare minimum – and even that demand keeps failing in the face of lobbying power and national-security rhetoric.