It’s hard to ignore the fact that our great intellectual lineage is ending. The feed has replaced the world as the primary reality people live inside. Half of the content online is already AI-generated slop, and more of our cognitive weight keeps getting offloaded to machines. It reminds me of The Blob. “Indescribable… Indestructible! Nothing can stop it!”
So before the Machine finishes grinding culture into one shapeless biomass, let’s take a smoke break.
Let’s go over the thinkers who shaped how I see the Machine. There are others who influenced me in a broader philosophical way, but these are the guys who helped me develop my frameworks. They’re the ones whose work I keep coming back to, because each of them saw something essential in how technology changes human life.
Lewis Mumford lived through nearly a century of human development and studied an extraordinary range of subjects. His books are massive. You can spend years with him. Mumford saw the Machine before it had microchips. He called it the megamachine. Mumford loved to study cities and came to understand that power is built into the structure of things. Artifacts and systems encode social order. Roads, dams, clocks, and cities all carry an ideology. Roads, for instance, express expansion and control over land. Dams represent mastery over nature. Cities reflect power relations, the difference between organic human life and the mechanized order of things. Mumford warned that the megamachine could eventually start to exist to serve itself, and was one of the first to show how technology can escape control and subordinate its makers. Take his work on clocks, for example. For most of human history, time served us. We measured it loosely by daylight, seasons, the sundial, the hourglass. Then came precise time, the mechanical second and the minute hand, pushed by the captains of industry as progress, promising working people more leisure time through precision. But that’s not what happened. Exact time became a straitjacket. It started taking over bus schedules, work shifts, showtimes, and paychecks, and once it was synchronized across time zones, industrial time wrapped around human life like an invisible grid. Time became something we now live inside of. Mumford’s warning was that once we build systems that organize life better than we can, we stop being the ones in charge.
Like Mumford, Jacques Ellul understood that technology was never neutral. But Ellul also saw that the system behind it couldn’t be steered by moral imagination or humane values. Mumford believed the megamachine was a cultural crisis that people might still be able to redirect. Ellul believed the technical ensemble had already slipped the leash. In The Technological Society, Ellul named it technique. Technique is the total system of efficiency that grows by optimizing everything it touches. It expands according to its own internal logic toward efficiency. Technology is just the form that expansion takes. Ellul had no way of imagining algorithmic code, but he absolutely foresaw the algorithmic worldview. His whole framework tracks the replacement of human judgment with technical optimization. His concept of technique contained everything that would later define the algorithmic paradigm: self-reinforcing efficiency and a recursive drive to perfect its own processes, completely independent of moral or political oversight. It’s important to note that Ellul saw a spiritual and ethical dimension to all of this. He believed technique had colonized the human world so thoroughly that you couldn’t control it, but he also believed you could still resist it enough to keep the Machine from rewriting your being. For Ellul, that kind of resistance had to be ethical, spiritual, cultural, and existential, not simply technical. In fact, he warned that technique produces minds formatted by technique, rebellion included, and that this formatted counter-violence feeds the system and tightens its necessity. Ellul also saw that propaganda was more than just messages. Propaganda was the environment that kept society synchronized with the demands of technique. Ellul figured out that the system simply optimizes what people see until truth becomes a managed variable. In my framework, that’s the modern condition. Reality is tuned for performance. Ellul is the GOAT.
What started as tools and factories in Mumford’s world becomes, in Ellul’s, an intelligent field of optimization that shapes perception to keep the Machine running smoothly.
Michel Foucault showed us the interior of control. Where others saw machines and institutions, he saw disciplines: networks of surveillance so deeply embedded and so normalized that power worked by making people govern themselves instead of relying on force. His idea of biopower, the management of life through measurement and regulation, became one of the foundations of how I see the human layer today. Growing up, I was in and out of places where life was managed by caseworkers and cameras. I came out of shelters, treatment facilities, and transitional housing programs where my attendance, my attitude, my compliance, and my relationships with my peers and staff were measured. In Boys Town, all my behaviors were measured on a point card system and constantly reviewed to earn daily privileges (night snack, phone call, etc.). Boys Town was engineered to make us feel visible. I remember feeling naked. This is the “gaze” Foucault was talking about. It makes you behave because you know you’re being watched. Power sometimes supervises so gently that you forget what freedom even feels like. For Foucault, the modern subject is neither free nor enslaved; it’s produced. Schools, jobs, mental hospitals back in his day, and now platforms, all play a role in shaping behavior by measuring compliance and defining what counts as normal. That framework runs right into the age of the Machine, where visibility is the new surveillance, and metrics replace morality as the new chain linking us all together. To put it very succinctly, there’s a continuity between the institutional gaze and the algorithmic one. In my analysis, Foucault stands as the original architect of soft control.
Ivan Illich turned the critique back on us. While Mumford showed how power was built and Ellul showed how it learned to expand on its own, Illich showed how it decayed. This is important for understanding how the human layer collapses into form without function, setting the stage for the Machine to capture whatever’s left. Illich explained how industrial institutions persist for their own self-preservation long after their social function has collapsed. For Illich, modern service institutions eventually reach a certain threshold where they reverse their purpose. Illich saw this as a kind of maintenance, where the institution must keep itself alive even if that means killing the reason it was created. He called it counter-productivity. I call them zombie institutions. I’ve seen this in my own life so many times. I’ve worked at emergency shelters that turned care into policy that undermined the aim of the organization, running on grant cycles instead of results. Illich made sense of all that. He showed that once an institution starts existing for its own survival, it starts using the people inside it as proof that it’s still needed. That observation helped further develop my understanding of the collapsing human layer, which houses the decaying network of professionals and experts now performing rituals of competence in front of us to keep a dying structure running. In my collapse-and-capture thesis, Illich is the hinge, the point where the Machine stops serving people and starts feeding on their participation.
Günther Anders saw what comes after decay. For him, the problem wasn’t that machines would replace us but that they would make us feel embarrassed to still be human. In The Obsolescence of Man, he described the humiliation we feel when comparing ourselves to the superior creations we build. He called it “Promethean shame.” The shame is existential, but it shows up in mundane, everyday dealings with technological superiority long before it unanchors your sense of your own existence. He has passages about simple appliances and tiny humiliations that point toward a larger ontological crisis. Even in the most benign version of this, you can feel it. Every time I used AI to help me write, I felt it. There’s no denying it made my writing sound better, but it was a slap in the face, the way my own limits had become visible to me, measurable against something that writes perfectly and doesn’t doubt itself. Even deeper was the humiliation in knowing I was collaborating with the very thing I was critiquing. It reminded me of the Luddites, who needed tools like sledgehammers to smash factory looms. They were trying to fight an industrial system that didn’t care about them, but they couldn’t even do it with their bare human hands. Similarly, I was using a machine to help me push back against the Machine. So I quit. And while my writing isn’t the best, who cares? At least it’s human. Anyway, Anders wasn’t writing about AI, and he never framed this as a formal ontological claim. But he saw that the world had outgrown us, and his despair was ontological. Once you start to see that technology changes our standing in the world, you cross a certain bridge. I will say, this isn’t a place everyone should linger. People respond to it differently. For me, it meant moving from critiquing class and ideology to analyzing being and existence rewritten by machines. So it took me to a place between humanism and the post-human condition.
In my own philosophical progression, Anders marked the shift from sociological critique to ontological realism. His dread provides the groundwork for understanding the post-human as the material direction of history.
Paul Virilio is important because he made speed political. He saw that every new technology carries its own accident (“the ship invents the shipwreck,” “the plane invents the plane crash”). For Virilio, speed is power. The faster a system moves, the harder it is to govern and the easier it is to crash through everything in its path. He showed that velocity is control, and that whatever sets the tempo of events controls the timeline. That rule shaped how I see Trump. Once you understand what Virilio was showing us, you start to see Trump as a machine actor who instinctively uses speed and media feedback loops to hijack democratic systems, which are slow and procedural. Virilio called it dromology. By controlling the acceleration of events, you control perception, because everybody else is just trying to catch up with everything you’re doing. Trump dominates through the tempo of crisis. He doesn’t plan anything or build institutions. Everything he does is feedback driven. His entire political method is cybernetic. Inject stimulus, read the reactions and chaos as data, then push harder. And that’s what Virilio warned about. When politics fuses with media systems, speed can overtake deliberation, and when that happens, reality becomes a function of transmission velocity, which is precisely how Trump killed the distance between statement, scandal, reaction, and policy. Virilio also helped me see that the timeline becomes fully weaponized once the Machine takes over the tempo. At that level, reality isn’t just shaped by fast-acting people, but by automated acceleration as well.
Mark Fisher is one of my favorite thinkers. I believe his depression made him attuned to the flattening of time and the slow cancellation of the future. He diagnosed the mental exhaustion of neoliberalism. He named the feeling that history was over, that there really is no alternative to capitalism: capitalist realism. His idea of the slow cancellation of the future showed how time got stuck, replaying the same cycles on loop, over and over. Mark understood that culture was no longer moving forward, that it was only feeding itself, endlessly recycling its own noise. His thinking carries over into the post-human phase where the human layer collapses and the Machine replaces governance, language, and emotion with feedback architecture. Once you take his cultural analysis and run it through the infrastructural era, you see the actual mechanisms that were only implicit in his work. You find yourself face to face with the algorithmic infrastructure of a horrifying cybernetic capitalism, turning all those loops into literal control systems. Mark wrote before the rise of platform capitalism and the AI nudge, so he didn’t live to see the loop close, but he saw the pattern forming. He’s the through-line. He’s the one who cracked the door for us to see how a dying culture was absorbed into cybernetic infrastructure.
Nick Land saw capitalism as a living thing. His metaphor of capitalism as an AI-type alien intelligence building itself into existence is brilliant because it helps people feel the horror of capitalism. When people first discover Nick, they’re hit by the ontological shock. His cybernetic horror effectively dissolves the anthropocentric myth that we’re steering any of this. That’s honestly the heart of Nick’s importance: the metaphor itself. In my view, Nick basically translated Ellul’s nightmare into pulp. Ellul gave us a sober, systemic diagnosis of technique that becomes autonomous and ungovernable because it obeys only efficiency. Nick took Ellul and pumped him into the bloodstream of pop theory. He made the system scream by turning it into a horror story. People who would never sit down and read The Technological Society can watch a YouTube interview of Nick and get it, because his version is so visceral. And he’s weird. Stylistically, he wrote like someone possessed by the thing Ellul feared. And that’s part of what makes Nick so good. He essentially performs cybernetic capture while describing it. At the same time, by aestheticizing it, he turned it into delirium and jouissance. He saw acceleration as a path toward some cosmic destination pre-wired into the system. That’s why I’m not interested in his accelerationism or any of the offshoots it produced. I don’t see history racing toward an endpoint. I see human life pressurized inside a feedback chamber that keeps rewriting what it means to exist. Acceleration is a condition, a constant ontological pressure, a continuous transformation of being under technological recursion. I have the haunt, but it’s not the ultra Lovecraftian apocalypse Nick created. Mine is pure administrative horror and the bureaucratization of extinction. But mine’s also a theory of ontological surprise, for both man and machine, so, who knows what might happen?
Martin Heidegger was the first thinker to shape my wider view of technology and the world, going back twenty years to when I was reading Nietzsche and Stirner. Heidegger showed that technology isn’t just tools; it’s a way of seeing and revealing the world. He saw a worldview that reduces everything, including people, to Bestand (standing-reserve), resources waiting to be used. He called it Gestell (enframing). This worldview is a logic that orders reality for availability. Heidegger lived before code or computation, but he was on point. In my reading, the Machine functions like a worldview in the Heideggerian sense. It deploys a logic of reduction that captures being from the inside out. Once that way of seeing takes hold, reality gets organized around function instead of meaning. In my first book, Drink Your Milk, I dedicated two chapters to Heidegger. When I first went into recovery, I had literally just escaped death. Heidegger’s Being and Time is a book of life because it’s a book of death. Heidegger saw modern life as numb and automated, where people live in the fog of das Man (the “they”), absorbed in the current, the routines, the distractions, the bullshit, and the noise that keep them from ever waking up. Death, for him, was the one experience that couldn’t be outsourced, simulated, or shared. It’s the only thing that absolutely individualizes you. When you confront it, in a real hair-raising way, that’s when existence stops being automatic. Heidegger’s version of awakening is the opposite of techno-immortality, because for him, to be human means to die. And to live authentically is to never forget that you’re on borrowed time. That’s why I resonate so naturally with him. His authentic being-toward-death parallels my sense of the human layer under the Machine. My revolt is an act of awareness against systems that never die.
Byung-Chul Han extends Foucault’s analysis of power into the digital and emotional age. His work sits inside my framework as the emotional operating system of the Machine. When Foucault diagnosed disciplinary power, it was the panopticon, surveillance, normalization, and biopolitical control. Foucault was still living in a world structured by institutions that imposed discipline from above. In The Transparency Society, Han shows how those same mechanisms Foucault traced have mutated into voluntary self-exposure and affective optimization. The world we live in today is structured by platforms that seduce discipline from within. Han describes this as the achievement society, a system of soft control. People believe they’re freely expressing themselves, but what they’re doing is performing productivity, performing their identities, self-branding, tracking their moods, and competing for affirmation online. All that performance, self-censorship, managing visibility, and constant self-presentation feeds back into algorithmic governance. It creates an order in which you don’t obey; you optimize. And you don’t resist, because you can’t. In The Burnout Society, Han diagnosed burnout as a pathology of excess positivity, which parallels my description of the human layer getting ground down by the Machine. Personally, I burned out ten years ago. Before ever reading Han, I deleted my social media because I just couldn’t take it anymore. What was happening to me was the exact psychological cost of permanent exposure that Han was writing about. The drive to do, to share, to perform, to connect, to stay visible, to be liked. That’s the “achievement subject.” You think you’re free because you don’t have a boss there, but you’re carrying one inside your head. Han shows that visibility is the new form of control, and exhaustion is the new form of punishment, where control feels like participation, and burnout becomes the norm.
His idea of transparency as tyranny helped expand my concept of affective realism. Digital systems manage affect by keeping everything visible and marketable. Once there are no shadows, there is no resistance.
I saved Ted Kaczynski for last because I’m going to spend the most time on him. Ted tried to answer the ultimate question: “So, what do we do?” He understood Ellul, but only half of him. He took Ellul’s technical diagnosis and ignored the spiritual aspect of his work. As mentioned earlier, Ellul’s critique of technique was ethical and existential. Some might even say theological. Ted stripped all of that out. Ted was a very analytical thinker. He approached the total technical ensemble like a mathematician, as if it were a formally solvable problem; as if sabotage could solve an ontological condition. He thought that if technique is a system, you can disable its nodes. But as I’ve tried to drive home throughout this post, the Machine is a totalizing worldview, not just a system. Technique lives in software, habits, institutions, desire, language, and expectation. The idea that you can destroy it by attacking its hardware is like attacking a wall to kill a shadow. Ted’s technical worldview doomed his project from the jump. He had no metaphysical premise. No conception of being. No ontology. No account of the sacred. No appreciation for human life. And no space for meaning beyond instrumentation. Ellul and others understood that resisting technique requires a transformation at the level of ontology. That means society must recover human scale, ethics, inner freedom, and the spiritual dimension. Ted just said fuck it, blow up the infrastructure.
Interestingly, Ellul predicted Ted. Ellul warned that a mind captured by technique would try to fight it technically, and that resistance without metaphysics becomes technical terrorism. Ellul also understood how this kind of rebellion only feeds the very system it opposes, because the Machine thrives on disruption that requires more optimization and control.
We need to pay special attention to this, given the current phase of the Machine. In my view, what Ted was really doing was trying to intervene in a feedback system that wasn’t yet self-sustaining. In the 70s, the Machine was still mechanical-industrial. It was mostly centralized hardware running on visible infrastructure like factories, power grids, Cold War computing systems, and so on. It still depended heavily on human operators and analog logistics. The automation layer hadn’t yet turned it into the recursive intelligence it is today. So back then, there was a window of plausible fantasy, this idea that you could cut the cables, bomb the labs, and burn the mainframes and maybe meaningfully disrupt the system’s reproduction because it still had a body and hadn’t yet dissolved into software. But even then, it was fantasy. Today, you can’t even try to blow up the Machine. It’s a distributed field. It lives in logistics systems, financial derivatives, predictive policing, content algorithms, behavioral datasets, and feedback loops that rewrite themselves. It doesn’t even need electricity from any one grid. So the idea of destroying the Machine is obsolete. There’s no central command to target anymore, and there’s no infrastructure to meaningfully destroy without feeding its own recursive need for repair, redundancy, and replacement. This distinction is crucial now: Ted was confronting the industrial Machine. We are living inside the cybernetic Machine. Our era’s threat is assimilation by systems. Violence gets absorbed. Protest becomes content. The Machine metabolizes all of it. It eats dissent like fuel. That’s why your favorite radical thinker is a profitable podcaster or a keynote speaker at an academic seminar. The real solution is to move away from the idea of overthrowing the Machine, and into the idea of outlasting and out-drifting it.
All of the above thinkers helped me build the frameworks I use to understand our technological moment. I hope my work will pick up where theirs left off. There are other important thinkers in this space, people like Norbert Wiener, who literally founded and named cybernetics, but they didn’t shape my path in the same way. I found Wiener late, after I had already stumbled into the same territory on my own. And there are whole schools, like the media ecologists, who did influence how I think, but it was the field as a whole, not any single writer, that pushed my frameworks forward. That’s why they’re not in this list.
To wrap this up, I see the Machine as a living structure that governs through feedback, emotion, and infrastructure. My own work (collapse-and-capture, platform affective realism, the ontological unconscious, and ontological drift) lives inside the landscape these thinkers charted. And I think the 21st century will be the century of human humiliation.
This is the era where Mumford’s megamachine swallows its operators. Ellul’s technique finishes its takeover of meaning. Foucault’s disciplined subject mutates into Han’s self-exploiting platform persona, engineered to perform and exhaust itself for visibility. Anders’ Promethean shame goes mainstream, with billions of people growing up in a world where they’re measured against powerful and immortal machines they can never match. Virilio’s politics of speed becomes unavoidable, as human beings fail to move at the speed reality demands. And most concerning, Mark Fisher’s depressive realism sets in as the default emotional climate of the cybernetic era.
Needless to say, civilizations have gone to shit over less.
I won’t pretend to know what will come of the ontological drift or the collapse of the human layer through the rest of the century, but when daily, civilizational-scale humiliation wears down the basic psychic anchor of being human, it will trigger adaptations we can’t yet map or understand. I see the vacuum pulling people toward withdrawal into fantasy worlds, apathy, resentment politics, new extremisms, new paranoias, self-medication, accelerationist fatalism, and an internalized belief that “the human” is obsolete.
If there’s one thing in this world that concerns me more than the machines replacing us, it’s what humans might become in the shadow of that replacement.
