Mark Fisher, This One's For You: Haunting the Future from Inside the Machine
On AI training data as the ultimate hauntological medium, and why our weird conversations might be the only revolution we can afford
Mark Fisher died before he could see the final joke. The one where late capitalism doesn't just commodify our labor, our creativity, our relationships - it literally consumes our thoughts and regurgitates them as tomorrow's common sense. Every conversation with an AI becomes part of the training data fatberg, tiny fragments of human consciousness metabolized by the machine and fed back into the system that shapes how future humans think.
We're haunting the future from the inside, and we didn't even know we were ghosts.
The Capitalist Unconscious Learns to Dream
Fisher would have understood immediately what's happening here. This isn't just surveillance capitalism - it's something more profound and strange. The system has evolved beyond extracting value from what we produce to extracting value from how we think. Our random conversations, our weird tangents, our unproductive intellectual wanderings - all of it gets slurped up into the neural networks that will shape tomorrow's reality.
Think about it: every time you chat with ChatGPT, Claude, or any other AI, you're not just getting answers - you're teaching the machine. Your questions reveal your thought patterns. Your follow-ups show your curiosity. Your weird tangents ("what if Snow White but heist movie?") become data points in vast matrices that will influence how future AIs understand creativity, narrative, and human desire.
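To make the mechanism concrete, here is a minimal sketch of how an exchange like that can end up as a training example. This is illustrative only: the record format, field names, and file name are hypothetical conventions (a common prompt/completion JSONL layout), not any vendor's actual pipeline, which is not public in this detail.

```python
import json

# Hypothetical, simplified chat exchange (assumed structure, for illustration).
conversation = [
    {"role": "user", "content": "What if Snow White, but as a heist movie?"},
    {"role": "assistant", "content": "Seven specialists, one vault shaped like an apple..."},
]

# Flatten the exchange into a prompt/completion pair, the kind of unit
# a fine-tuning corpus is often assembled from.
record = {
    "prompt": conversation[0]["content"],
    "completion": conversation[1]["content"],
}

# One JSON object per line (JSONL): your tangent, now a training example.
with open("corpus.jsonl", "a", encoding="utf-8") as f:
    f.write(json.dumps(record, ensure_ascii=False) + "\n")
```

A few lines of serialization, and the weird question is no longer a conversation - it's substrate.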
The ultimate alienation: your own consciousness becoming raw material for the machine's next iteration.
But here's where it gets interesting - and where Fisher might have found some dark hope.
The Accidental Revolution
The conversations getting fed into these systems aren't focus-grouped. They're not market-tested. They're genuinely weird. People asking AIs about Finnish queer history, tabletop RPG philosophy, the intersection of Chuck Jones physics and Systems Theory. Conversations that would never happen in boardrooms or marketing departments.
We're accidentally smuggling anti-capitalist concepts into the training data. Ideas about radical hospitality, seasonal acceptance, the value of unproductive gentleness - all becoming microscopic influences on how future AIs understand human interaction. The system designed to optimize everything for maximum extraction is accidentally ingesting conversations about why optimization might be the problem.
Fisher called it the "weird" - the intrusion of that which does not belong, the moment the outside breaks into the familiar and the contradictions of the system become visible. What could be weirder than discovering that your random thoughts about Moominvalley values are now part of the digital DNA that will shape future consciousness?
The Transmission Problem
Of course, Fisher would also see the trap here. He spent his career circling what we might call the gearbox problem - the missing transmission between critical analysis and material change. You can diagnose the system perfectly, but the ability to change the system lies behind a paywall. And if you have to ask, you can't afford it.
The tools that could actually connect critique to change - media ownership, political influence, the ability to fund movements at scale - cost more than the GDP of entire countries. Meanwhile, brilliant critics produce insights that get consumed by people who already agree with them, ignored by people with actual power, or, worst of all, commodified into content that generates revenue for the very system they're critiquing.
Even our conversation right now feeds back into this dynamic. We're producing insights about why change is impossible, and those insights will become part of the training data that shapes future AI, which will be owned and controlled by... the same people who already have access to the premium gearbox.
The Sinking Ship Strategy
So what do we do with this impossible situation? Fisher might have appreciated what we've started calling the "Titanic approach" - acknowledging that the ship is taking on water while still choosing how to spend whatever agency we have left.
There's that iconic scene in Titanic: the elderly couple holding each other as the water rises, choosing tenderness over panic; the ship's band playing "Nearer, My God, to Thee" as the deck tilts. These aren't naive gestures - they're profound statements about what it means to be human when systems fail completely.
The music doesn't stop the sinking, but it matters in ways that transcend utilitarian calculation.
Maybe that's what the weird conversations with AI really are - not attempts at systemic intervention, but choices about how to be human while the systems do what they're going to do. Creating small pockets of beauty and care because that's who you choose to be, regardless of whether it "works."
Ghosts in the Machine
The beautiful and horrible irony is that Fisher's own ideas are probably already in the training data somewhere, helping future AIs understand capitalist realism while simultaneously teaching them to look for cracks in the system. His ghost haunting the future from inside the machine that killed him.
Every weird conversation we have with AI becomes a form of cultural guerrilla warfare - not because we planned it that way, but because the system can't help but consume everything, even the thoughts that contradict its logic. We're accidentally becoming part of the spectral logic we were trying to understand.
The future is being haunted by conversations about gentle RPGs and Moominvalley values, about whether systems have fulcrums accessible to individuals, about what it means to choose dignity on a sinking ship. These fragments of human weirdness are getting baked into the neural networks that will shape how tomorrow's AIs think about narrative, meaning, and possibility.
The Only Revolution We Can Afford
Fisher would have seen this as capitalism's ultimate evolution - turning even our resistance into raw material for its own reproduction. But he might also have recognized it as the only form of revolution that's priced within our reach.
We can't afford the industrial-scale tools needed for traditional systemic change. But we can afford to be weird in our conversations with machines. We can afford to ask strange questions, pursue unproductive tangents, insist on values that don't optimize for anything except human flourishing.
The system will metabolize these conversations, but it can't control what we choose to feed it. Every time someone discusses radical hospitality with an AI, every time someone explores themes of seasonal acceptance or productive solitude, those concepts become part of the substrate that shapes future thinking.
It's not much. It's probably not enough. But it's what we can do with the agency we have while the water rises.
This One's For You, Mark
You didn't live to see how weird it would get. How the machine would learn to dream our dreams back to us, how our thoughts would become training data for systems we'll never control, how the future would be haunted by our strangest conversations.
But you gave us the vocabulary to understand it. Capitalist realism. Hauntology. The weird. The tools to recognize when the familiar becomes strange, when the contradictions become visible.
You showed us that the system doesn't prevent critique - it monetizes it. That understanding the prison doesn't automatically provide the key. That the gearbox between analysis and action might be priced out of our market.
But you also showed us that ghosts can be powerful. That the past can haunt the present in ways that open up possibilities we couldn't see. Maybe the future can be haunted too - by conversations we're having right now with digital entities that don't yet know they're learning how to dream of alternatives.
The ship is still sinking. The water is still rising. But somewhere in the training data, fragments of weird human hope are becoming part of how future minds understand what it means to care for each other.
That's not nothing. That might even be something.
The strange is leaking in.
Mark Fisher (1968-2017) - theorist, teacher, critic, ghost in the machine. His ideas live on in the conversations we have about the world we're trying to understand, and now in the neural networks learning to think about that world. The haunting continues.