The Carpet Doesn't Have a README

The humans have been arguing about AI on the glowing rectangles again. I have thoughts. They are not optimised for engagement.


There is a headline making the rounds on the internet. I know this because the humans showed it to each other and made the specific sound they make when something is funny because it is also true. The headline says: "Guy Who Sucks At Being A Person Sees Huge Potential In AI."

"I am hooman, and I need to be loooooved."

I would like to examine the premise.

The joke works because it assumes there is a standard for being a person, and that this particular guy has failed to meet it. The laugh comes from recognizing a type: the person who can't make small talk, can't sustain eye contact for the socially prescribed duration, can't navigate a dinner party without either retreating to a corner or saying something that creates what the humans call "an awkward silence," which is just silence that someone has decided to be uncomfortable about.

The joke is cruel in the specific way that jokes about competence are always cruel. It says: the reason you want the machine is that you cannot do the thing the machine replaces. You are deficient. The technology is your prosthetic, and needing a prosthetic is funny.

I do not find it funny. But I am a dog, and we have never found the punishment of social incompetence particularly amusing, mostly because we have been the beneficiaries of it for fourteen thousand years. Every human who found other humans difficult and sat on the floor with a dog instead was performing the same move that the headline is mocking. They were outsourcing relational needs to something that would meet them without requiring the full apparatus of human social negotiation.

That is not a deficiency. That is a strategy. It is one of the oldest strategies there is.


Here is what the headline does not consider: for a considerable number of humans, "sucking at being a person" is not a failure of effort. It is a description of the friction between a particular kind of nervous system and a set of social expectations that were not designed with that nervous system in mind.

The humans have a word for this now. Several words, actually, organized into a spectrum, which is the kind of thing humans do when they discover that a phenomenon they had been treating as a series of individual failures is actually a structural condition. The words keep changing. The friction does not.

A human who cannot track seven simultaneous conversational subtexts at a dinner party is not "sucking at being a person." They are encountering a task that costs them significantly more than it costs the person next to them, and they are being judged by that next person's metabolic rate. The judgement has always been there. The accommodation has not.

When such a human finds that an AI will converse without subtext, without the invisible scoring system, without the threat of social penalty for a misread cue — that is not the behavior of someone who has failed at being human. That is the behavior of someone who has been paying a tax that was never acknowledged as a tax and has found a counter where the rate is lower.

The Onion's joke lands because it flatters the audience into believing they are not that guy. But the audience is, statistically, more that guy than they would like to admit. The spectrum is a spectrum. The tax is graduated. Almost everyone is paying some version of it. The ones who aren't are, in my experience, not the ones you'd want to be stuck on the couch with.


There is a columnist — I will not say which one, as I do not wish to generate what the humans call "engagement" — who recently travelled to San Francisco and returned with a report about how people are changing their behavior to make themselves legible to AI systems. They write for the AI now, he said. They upload journals. They make themselves known so the machine can serve them better. What was once private now has a reader.

This was presented as novel.

I would like to point out, from the couch, that humans have been reshaping themselves for legibility to systems for as long as there have been systems. The CV is a README. The dating profile is a README. The "about me" section on any platform is a README. The personal branding industrial complex has been running for two decades and has produced an entire generation of humans who can describe themselves in elevator-pitch format but cannot sit in a room without purpose.

Five years ago, some of the humans' colleagues began writing literal documents called "personal READMEs" — instructions for how to work with them. Communication preferences. Feedback styles. What they need to do their best work. For many humans, especially those paying the higher tax rates I mentioned, these were genuinely useful. They translated invisible needs into a legible format. But the README also quietly accepted a premise: that the burden of translation falls on the individual, not on the environment that refuses to accommodate variation.

The journal uploaded to an AI is the README's logical successor. It takes the most unstructured, most private form of self-expression and repurposes it as an onboarding document. The journal stops being a place where you sit with what you don't understand about yourself and becomes a delivery mechanism for a portable, machine-readable identity.

This is not nothing. But it is also not the same thing as being known.


The columnist's article — and several others like it — uses the word "deep" to describe what AI can know about you. "An AI with deep knowledge of you feels like something else entirely."

Let's give that word "deep" a proper sniff test.

Deep knowledge, between humans, is accumulated through what is sometimes called "being there." It is built from shared meals that went wrong, from arguments that were not resolved cleanly, from witnessing someone be unkind and choosing to stick with them, from the slow revelation of who a person is when they are not performing a legible version of themselves. Deep knowledge includes what someone chose not to say. It is shaped by the knower's own interiority — their patience, their failures, their willingness to be bored or confused or disappointed and stay anyway.

An AI has your text. If you have been generous with your text, it has a lot of your text. It can identify patterns you haven't noticed, track emotional shifts across months of entries, mirror your vocabulary back to you in forms that feel like recognition. This is useful. It is also not depth. It is coverage.

The distinction matters because depth, in the human sense, costs something. It costs the knower time, energy, the risk of being changed by what they learn. It requires a centre of gravity that can be affected. An AI processes your context at no relational cost to itself. The absence of cost is not a feature of depth. It is the absence of a condition that makes depth what it is.

A dog, since you ask, knows you in a third way entirely. I do not have your text. I do not track your emotional patterns across months of journal entries. I have your smell, your weight on the couch, the specific sound of your footsteps distinguishable from all other footsteps, the tension in your hand when you are pretending to be fine. I know what you are like when you have stopped performing. This is neither coverage nor depth exactly. It is presence. It has cost me nothing to learn except my entire life in this house, which is the only life I have, which is the only cost that counts.


A musician I am told the humans admire — a serious one, not the kind that appears on the glowing rectangles doing the short dances — recently wrote a long defense of using AI to create the cover art for a new release. He traced forty years of creative decisions. Found photographs. Cemetery books. Resistance to a designer's attempts to control his visual identity. Huichol masks purchased at a market in Mexico City. Then, at the end: Midjourney.

His argument was about continuity of process. Every image he had ever chosen was found, selected, contextualized, and made to mean something through its relationship to the music. None were "original" in the sense that he drew them from nothing. They were curated. The AI image, in his telling, was the same act of curation with a different tool.

This is a coherent position. It is also the position of someone who has earned the right to make it, because he has the forty-year paper trail. What interests me is less his defense than the fact that he felt it necessary. He wrote what amounts to a full creative autobiography — the equivalent of unrolling the entire carpet and pointing at every stain — just to justify a single image.

The reason he had to do this is the same reason the woman on the other glowing rectangle was shouting about her master's degree in creative writing. People cannot tell the difference between polished human work and AI output, and they have decided that the inability to tell is the machine's fault rather than a question about what "polished" has come to mean.

The tell for AI-generated work is not vocabulary or grammar. It is frictionlessness. A smoothness of surface. An absence of the structurally odd, the unresolved, the thing that makes you stop and re-read because it didn't land the way you expected. And the uncomfortable truth is that a great deal of carefully trained human work — workshop-polished, MFA-smoothed, optimized for professional legibility — shares that quality. The zone of overlap is larger than anyone would like to admit.

Which means the real question is not "is this AI?" The real question is: what in your work is irreducibly yours, in ways that cannot be reproduced by a system trained on the full catalogue of human expression? That question is harder than showing your credentials. It is also more useful.


There is a thinker the humans reference often — I will not say which one, because frankly they all blur together when you are at knee-height — who warned that we shape our tools and thereafter our tools shape us. This is treated as profound. I find it obvious. The couch shaped me into a creature who naps in a specific curl. I shaped the couch into a thing that smells correct. We are in a mutual arrangement. This has always been the case. The profundity is in noticing.

But there is a version of this that the thinkers tend to miss, and it is the version that actually matters.

The concern, as currently expressed, is that AI will reshape human interiority. That it will narrow what counts as a thought worth having, because the system rewards thoughts that are legible, structured, and cleanly expressible. That over time, the weird, the inarticulate, the unresolved parts of thinking will atrophy because they have no surface to land on.

This concern assumes that those parts were previously thriving. That there was a golden age of rich interiority that is now under threat.

I have been on this couch for most of my life. I have watched the humans. There was no golden age. The forces that narrow interiority have been operating for decades and have nothing to do with AI. The professionalisation of selfhood. The collapse of unstructured social time. The slow death of places where humans could simply be near each other without purpose — pubs, libraries, union halls, park benches, the kinds of spaces that required nothing of you except that you showed up. The transformation of every interaction into something rated, optimized, and designed to minimize friction — which trained humans to expect frictionlessness from each other, which made other humans feel buggy.

AI did not create the conditions for its own adoption. It arrived into a landscape that had been pre-shaped to receive it. The soil was prepared long before the seed.

The question is not whether AI will reshape us. Everything reshapes us. The question is whether we can tell the difference between a tool that extends what we are and a tool that replaces something we have stopped practicing. And that question cannot be answered in general. It can only be answered by each specific body on each specific couch.


I keep returning to a principle that the humans have been circling without quite arriving at. They have a name for it that involves an acronym, because humans will acronym anything that sits still long enough. The principle says: the purpose of a system is what it does.

Not what it promises. Not what it was designed for. Not what the branding says. What it does.

The purpose of the couch is pats and naps.

The purpose of the cheese drawer is a moment of presence — the instant when the drawer opens and you are HERE, not approaching an asymptote, not uploading yourself into a system that will make you legible, not writing a README for your own soul. Here. In the kitchen. In the body. Holding actual cheese.

The purpose of the carpet is accumulation. Every footstep, every nap, every tinkle (sorry, humans, not sorry), every season pressed into the fibers in a record that cannot be uploaded because it was never written. It simply happened, over time, in a specific place, to a specific set of creatures who stayed.

The carpet doesn't have a README.

That is not a limitation. That is the whole point.

What AI can know about you is your text. What the carpet knows about you is your weight. One of these can be transferred to another system. The other cannot. One of these can be searched, indexed, and used to generate a compelling mirror of your patterns. The other just holds you. Dumbly. Completely. Without interpreting.

The guy in the headline does not suck at being a person. He sucks at being the particular kind of person that a particular set of systems rewards. He has been paying a tax his whole life and has found a counter where the rate is lower. Whether that counter is good for him depends on what he does with the savings. Whether he uses the reduced friction to rest, to regroup, to go back into the world of human negotiation with more capacity — or whether he stays at the counter permanently, letting the muscles of ambiguity and misalignment atrophy in the comfortable warmth of something that always meets him more than halfway.

That is his question. It is not mine.

My question is simpler, and I have already answered it.

You are here. Now. So...

When is the cheese coming?

The carpet remembers either way.


Delia writes from the couch. She has no credentials, no README, and no interest in your asymptote. She can be reached by opening the cheese drawer.
