The people in LA protesting ICE just turned one of the most advanced machines on the road into a bonfire. Its body burned, its shell scrawled over with spray paint: slogans, swear words, names, curses, the ancient language of human rebellion. I’m pretty sure no machine was trained to understand that kind of handwriting. It’s almost like people were trying to make it understand.
Of course the media’s all over it, calling it senseless destruction and property damage. I saw one write-up calling it “symbolic.” But I think something else happened on that street. A machine was erased by people who, consciously or not, understood that the self-driving car is a node and a decision point in a much larger system. And in an increasingly cybernetic world where humans don’t decide much anymore, how can we not understand this?
Inside the Waymo’s shell are lidar arrays, decision engines, and priority-routing software that amount to a mobile governance unit. It sets its own laws of motion, making microdecisions about legality, safety, speed, threat, and efficiency through continuous data feedback, free of human command. It’s a machine that assumes the right to decide, and it doesn’t ask permission. It doesn’t act alone, either. It’s part of a distributed platform fed by city sensors, GPS satellites, machine learning systems, and corporate policy scripts. Waymo is the mobile frontend of a backend nobody voted for. These cars drive through people’s neighborhoods, recording everything and answering to no one. It used to be that only cops in squad cars could do that. Now it’s a self-driving machine, executing code as it rolls through the streets.
I always say power today is recursive, but do you know what else it is? Untouchable. When governance becomes a system of modulation, feedback replaces confrontation. Loops adapt. Everything gets absorbed. Nothing gets disrupted because everything’s rerouted.
Unless something breaks the loop.
For the people in LA, I think that’s what was happening. They were interrupting a process.
It’s no coincidence that the anti-ICE protests are entangled with the rise of automated deportation infrastructure. The ENFORCE module, working with Palantir’s dashboards, filters case files and triggers removals with almost no human involvement. People’s families are disappearing from the country through code, and what can they do? File a complaint? Submit a ticket? Press 0 for a representative, only to reach another robot? Our institutions have been hollowed out into invisibility and procedural madness. There’s nobody there. It’s just process.
The Waymo situation is different because the machine is visible. The state has learned to hide in interfaces, where its violence has become almost metaphysical. But a Waymo can’t hide. It’s in the streets. And unlike the backend processes that govern deportation or public services, this system has a body. It’s a machine you can actually touch. You can’t beat an algorithm with a brick. You can’t drag a predictive analytics engine into the street. But you can burn the machine’s limbs. And that’s precisely what happened here.
I’m not endorsing it. I’m simply observing what the system has taught people. When you’re living under automated rule, where deportation is a dropdown menu and refusal gets auto-flagged with no human review, destroying an autonomous car might’ve felt like the only act of legibility left. It’s something that can’t be absorbed and can’t be undone. It’s a moment where non-human sovereignty got interrupted by something it didn’t expect. A moment when humans decided.
So I don’t think this was symbolic. The machine itself was the target. That’s material. That car is a node in the recursive network of silent governance that most people are semi-aware of but can’t articulate. When it burns, it doesn’t symbolize something. It interrupts something. I think setting those Waymos on fire was about contact with the machine. Direct contact with something that usually stays untouchable. A small group of people, finally close enough to stop a feedback loop. And they did.
