We Can’t Enjoy the Garden for the Weeds

A defense of companionism: human intention + machine capability in the hydroponic gardens

The online “authenticity” debate—the endless argument over whether content was written by a person or a machine—has become the cultural equivalent of rearranging deck chairs after the cruise ship of information architecture struck the iceberg of extraction.

We’ve mistaken a moral problem for a technical one.

Instead of cosplaying the Spanish Inquisition—demanding, “Is this content human or AI?”—we might ask the better question: what are the conditions that make meaning possible, and how do we preserve them inside systems designed to strip-mine attention?

What’s happening online isn’t a crisis of authenticity—it’s a crisis of ecology. Meaning is becoming harder to grow because the soil itself has been depleted and contaminated by the very mechanisms that promised abundance.


The Slop Was Always There

Before the robots arrived, human content had already descended into optimized drivel. The web was flooded with SEO filler—“Top Ten” articles, affiliate blogs, keyword-stuffed listicles—pieces that looked like writing but were really scaffolding for ad impressions. AI didn’t corrupt that content farm; it simply turbocharged what was already mechanized. It automated mediocrity—the next evolutionary stage of human slop, accelerated and cheapened beyond recognition.

The moral panic over synthetic text feels hollow because the baseline was already garbage. When the bar is ankle-high, the robots stumble, regain balance like those Boston Dynamics dogs, and send the human slop farmers screaming to pull the plug.


The Binary Is False

The “pure-strain humans” and the “cyberpunks” who clash over content authorship are two sects of the same church—both defining themselves within a system that flattens their differences without ever telling them so. Every post, human or machine, enters the same algorithmic bloodstream, circulating through identical engagement metrics. The platforms don’t care whether you wrote it in a fever of inspiration or prompted it on the john. They care how long someone hovers, clicks, shares.

Authenticity, in this environment, becomes a form of branding. To declare oneself “real” is to perform “realness” for the metrics. Outrage about fakery drives as much traffic as fakery itself. The entire discourse is an ouroboros of sincerity and suspicion, endlessly feeding the engagement economy it claims to resist.


Hybrid Spaces Are Where Vitality Lives

I’ve always thought of musical instruments as extensions and augmentations—tools that translate intention into vibration, machines that amplify emotion without replacing it. The same principle applies to creative technology; the circuits may differ, but the goal is the same—translation, not substitution. The interesting work always happens in the margins, just beyond the reach of the clashing mobs.

These are the “hydroponic gardens,” where meaning and resonance are spliced and nurtured under artificial light and careful irrigation. Newsletters, Discords, small zines, co-ops, group chats—controlled little ecosystems where human intention and machine capability coexist without apology. The writer uses AI to draft, then edits for tone and injects some lived-in weirdness. The artist prompts a model for texture, then paints over and around the results.

These spaces aren’t utopian; they’re subversively pragmatic. They don’t pretend to be weed-free. They’re experiments in cultivation under poor conditions—pockets of intention in the algorithmic weather. Their vitality comes not from purity but from play: humans rediscovering that tools can be companions and mirrors, not replacements. Like an amplifier or a synthesizer, these tools can distort or enhance, depending on the hands that shape the sound.


The Infrastructure Is the Problem

Calls to make the web “safe, informative, and equitable” are almost quaintly noble but rest on a false premise: that the current ecosystem can support those ideals. It can’t. The attention economy is extractive by design. It rewards volume and engagement over truth or care. Trying to reform that system from within is like trying to grow vegetables in Chernobyl’s Reactor 4.

The soil is hot, the air is toxic, and the workers are voluntold interns in hazmat suits. The task isn’t restoration—it’s containment, mitigation, and the stubborn cultivation of small, safe plots along the edge of the blast zone.

The realistic task is smaller and humbler, but far from trivial: design for survivable meaning under extractive conditions. Build tools, norms, and communities that let people remain human inside systems that don’t need them to be. That’s not utopia—it’s maintenance, stewardship, refusal, and joy.


The Stolen Soil

Every act of cultivation implies ownership, and the truth we keep skimming over is that this soil—these model weights, this compressed archive of human labor—was stolen. The vast textual loam that makes generative tools possible was scraped without consent from writers, artists, and communities who never agreed to have their voices puréed into statistical essence. Every prompt we type grows from that theft.

Companionism doesn’t deny this. It refuses the comfort of purity without pretending the theft didn’t happen. It operates inside the contradiction: the tools exist, the harm is done, and yet we still have choices about use. The easy moral gesture is abstention—refusal as performance. But abstention cedes the terrain entirely to those who built the extractive architecture. The harder path is cultivation: to use the tools in ways that resist the logic of the strip mine, to grow meaning that can’t be monetized back into the system that produced it.

If the soil is stolen, then the ethics of use hinge on what we plant. Do we raise more content-farm feedstock, or do we coax into being something unruly—critique, laughter, empathy, unpredictability? The hydroponic garden becomes an act of counter-farming: controlled environments that metabolize theft into reflection, that reintroduce intention where automation has erased it. It’s not absolution, but it’s accountability through practice.

Even stolen soil can sustain strange blooms.


The Human Contradiction

Brené Brown recently argued that the key to surviving AI is rejecting Jack Welch-style managerial efficiency and embracing humanity—our capacity for empathy, vulnerability, connection. She’s right, but she’s also haunted by her own insight: “humans can’t stand each other.”

Maybe that’s not cynicism but diagnosis. We built systems that amplify division because division was already in us. The algorithm didn’t invent contempt; it optimized it. Our technologies reflect the social impulses we never learned to discipline—tribalism, envy, moral panic—and then feed those impulses back to us at scale.

If we can’t stand each other, it’s because the mirrors we built no longer distort; they simply reflect too brightly. Yet that same feedback loop means we can change what we feed into it. Small adjustments—tone, generosity, curiosity—can ripple through the amplification network just as quickly as cruelty. Humanity, as Brown reminds us, isn’t a static virtue; it’s a muscle that atrophies or strengthens depending on how we use it. The challenge is to exercise it inside architectures that profit from its weakness.


The Question of Fun

What isn’t measured is as telling as what is: who’s having fun? The authenticity crusaders seem exhausted, policing borders of purity. The techno-optimists, for all their speed and spectacle, look equally weary—addicted to novelty, afraid of boredom. The joy seems to live in the hybrid zones, among those who treat AI as weather: unpredictable, occasionally destructive, occasionally gorgeous. They’re the ones laughing, experimenting, dancing with the same robots that were designed to march in lockstep.

The Jesus-at-the-Olympics video—a Sora-generated clip where Christ literally sprints across the pool to win gold—embodies this perfectly. It’s slop. It’s commodified before it even exists. And yet, it’s funny. That laugh is the last unmonetized reflex, the stubborn evidence that human response can still misbehave inside the optimization loops. The system can record the laugh but can’t quite own it—or predict when it will erupt again.


The Weeds Aren’t the Problem

So no, we’re not asking the right questions when we fixate on detecting AI or preserving “authenticity.” Better questions sound more like this: How do we build community in algorithmic weather? How do we maintain surprise inside prediction machines? How do we keep play alive when everything becomes content?

The weeds aren’t the enemy; our obsession with pulling them is. We’ve built a garden so preoccupied with purity that we’ve forgotten to enjoy the sun, the dirt, the smell of green things doing their chaotic work. Meaning, like life, grows where it can—between cracks, in compost, in laughter that escapes monetization. The task isn’t to purify the garden; it’s to remember why we planted it in the first place.


Don’t Quit Your Night Job

Elvis was told, after bombing at the Grand Ole Opry, “Don’t quit your day job.” For those of us living in the age of automation, the advice reverses itself. Don’t quit your night job. The day belongs to the machines—efficient, optimized, measured to the millisecond. But the night still belongs to us.

The night job is whatever remains unprofitable but necessary: the music you make without a deadline, the game world you build for friends, the essay that nobody asked for but you write anyway. It’s where the algorithm can’t quite follow, where curiosity can still misbehave. The night job isn’t an act of resistance so much as reclamation. It’s the part of the day where meaning isn’t a deliverable.

And maybe that’s where the purists and the cyberpunks finally meet—after hours, in the dim light of some backroom jam session. The authenticity crowd shows up with battered guitars; the technologists bring drum machines that glitch beautifully. No one’s arguing about authorship anymore. They’re too busy finding the groove. For a few unsupervised measures, the human and the artificial stop competing and start improvising.

For those coming up now, this might sound aspirational—a dream of creative space outside the feed. For me, it’s also nostalgic, even reclamatory. I remember when making was quieter, slower, less surveilled. The night job keeps a trace of that world alive. It’s both a return and a reach forward, a reminder that the future worth having still begins in the small hours, when we make things no one asked for and share them with the few who understand.

We weren’t saints, creating for art’s sake. We just had a pause—a gap between making and monetizing. You’d finish the song, or the zine, or the module, and let it sit. Maybe you’d show it to friends, maybe not. That pause was where pride turned into reflection instead of content. The pipeline erased that gap. Everything uploads itself the second it exists. The night job isn’t pure, but it restores that breath between creation and sale, that moment when the work belongs only to you.

Daylight will come, with its dashboards and deadlines. The reactor will hum again, and we’ll don our hazmat suits to get through the shift. But if we’re lucky, we’ll still have the night: the unquantifiable hours when play survives, and the garden hums under fluorescent light, and meaning grows in the cracks where the metrics can’t reach. The night is when opacity becomes a kind of grace. We can’t hide from the machines, but we can become unreadable in ways that matter—seen by one another instead of surveilled by everything else.
