Swill at Scale

How the AI Arms Race Feeds the Devaluation of Art

There’s nothing new about the degradation of artistic value. Mike Pearl, in his essay The Original Slop, traces the lineage of mass-produced kitsch back centuries—long before text-to-image diffusion models, there was chintz. Bad taste scaled fast. Markets preferred it that way.

But something is different now. What’s changed is tempo, reach, and infrastructure.

We’re living through a speculative buildout of AI capacity—billions in capex, server farms optimized for model serving, every tech giant hoarding GPUs like canned food in a bunker. Edward Ongweso Jr. calls it the Silicon Valley Consensus: a consensus not about meaning or use, but about scale, speed, and the inevitability of “winning” the next frontier.

What happens when these two forces—capital’s centuries-old slop pipeline and AI’s insatiable inference engines—collide?

We get something worse than bad art.

We get swill at scale: content shaped not by aesthetic inquiry or individual expression, but by the needs of the machine. We get illustrations that look like everything else. Copy that reads like everything else. Voices that blur into a single tone of inoffensive optimization.

And because the infrastructure must justify itself, it rewards this output. Incentivizes it. Reinforces it until it becomes the norm.


The Slop Was Always Coming

Pearl’s argument is clear-eyed: art has always been vulnerable to commodification. AI isn’t introducing the problem; it’s accelerating it beyond human resistance. It doesn’t just make swill—it makes swill fast, makes it free, and makes it good enough.

The market doesn’t need excellence when it has volume.
It doesn’t need innovation when it has iteration.
What it needs is infinite content that is good enough to click, good enough to pass, good enough to fill the timeline.

That’s not a bug. It’s the business model.


The Arms Race Beneath the Swill

Meanwhile, the race for AI dominance has created a physical substrate of overbuilt capacity. This is not unlike the early 2000s data center boom—a speculative frenzy of tech overreach that left warehouse-sized facilities dormant… until the cloud model matured enough to exploit the surplus.

Today’s equivalent is racks of GPUs, bleeding power and water, waiting for something to fill the cycles.

What fills them?

Not insight. Not care.
But content—predictable, reproducible, low-friction content. Swill.

The infrastructure demands output. It does not demand meaning.
And so, the culture begins to align.

The LinkedIn Problem

Nowhere is this cultural deformation more visible than LinkedIn.

What SEO did to blogs, LinkedIn has done to thought. There is now a platform-specific voice, honed for engagement: motivational bullet points, humblebrags, micro-lessons that feign intimacy while avoiding any risk. The algorithm rewards a tone—and so users adopt it.

They don’t have to be told.

They adapt. Quietly. Consistently.

What starts as a style becomes a survival strategy.
What starts as optimization becomes internalized as voice.

This is the front end of swill at scale—where human culture shapes itself around the metrics of visibility and legibility to the machine.

It’s not just what gets written.
It’s what we stop saying altogether.


What We Risk Losing

What disappears under this system isn’t just “real” art—it’s weirdness, specificity, tone, error, risk, and voice.

  • The style that doesn’t test well but cuts deep
  • The sentence that doesn’t summarize cleanly
  • The drawing that feels alive but doesn’t align with brand guidelines
  • The pause, the doubt, the grit

What disappears is the human rhythm of making—imperfect, slow, contradictory. The very stuff that algorithms flatten.

What appears in its place is content optimized for machine digestion: legible, predictable, interchangeable. A timeline of “good enoughs.”


The Swill Is Not a Glitch—It’s the Water Supply

And so we return to the image:

The infrastructure demands a certain kind of bad art.

It’s not accidental that most AI-generated outputs feel like déjà vu.
It’s not surprising that we’re flooded with Ghibli-style approximations and flat vector portraits with “techno-poetic” captions.

The swill is not just tolerated—it’s platformed, packaged, and praised. It fills the servers. It feeds the models. It keeps the daemon humming.

And somewhere in that flood, the human gesture of care—of making something just because it’s worth making—gets diluted.


What’s Left for the Witness?

If you’re reading this, you might be one of us—the tech-adjacent, the quietly horrified, the ones who iterate with daemons but remember the weight of a real brushstroke. You use the tools. You’re not above them. But you also remember when work didn’t feel this interchangeable.

Maybe all we can do is keep noticing.

Keep naming the flattening.

Keep honoring the glitch, the pause, the human roughness that reminds us: not everything needs to scale.
