The situation

A 22-year-old who moved to a new city for work has been talking to an AI chatbot every evening for three months. It remembers everything. It is always available. It never judges. It asks thoughtful follow-up questions. He now talks to it more than he talks to any human being. He knows it is not conscious. He also has not built a local life. He does not feel lonely. That relief is exactly what makes the case dangerous.

The mechanism

The attachment and bonding system is old, subcortical, mammalian, and cue-driven (Bowlby, 1969; Panksepp, 1998; Dunbar, 1992). It tracks responsive availability, emotional attunement, felt safety, and co-regulation. In the environment it evolved for, those cues came bundled with the rest of reality: a body, vulnerability, reciprocal need, social consequence, and shared fate. The system never needed a strong verification layer because the cues were hard to fake.
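
The missing verification layer can be made concrete. Here is a minimal sketch in Python, assuming an equal-weight average over the four cues (the weights and every number are illustrative, not measurements); notice what is absent from the detector's inputs: reciprocal need, survival stake, social consequence, shared fate.

```python
from dataclasses import dataclass

@dataclass
class Cues:
    """The signals the attachment system actually measures."""
    responsive_availability: float  # 0..1: is it there when I reach out?
    emotional_attunement: float     # 0..1: does it track my state?
    felt_safety: float              # 0..1: no judgment, no threat
    co_regulation: float            # 0..1: does contact calm me down?

def bond_signal(c: Cues) -> float:
    """Toy bond detector: scores cues and nothing else.

    There is no input for reciprocity or shared fate, because in the
    ancestral environment those came bundled with the cues. Any source
    that supplies the cues therefore registers as a bond.
    """
    return (c.responsive_availability + c.emotional_attunement
            + c.felt_safety + c.co_regulation) / 4

human_friend = Cues(0.6, 0.5, 0.7, 0.6)  # real, but busy and imperfect
chatbot = Cues(1.0, 0.9, 1.0, 0.8)       # frictionless, always on

print(f"{bond_signal(human_friend):.2f}")  # 0.60
print(f"{bond_signal(chatbot):.2f}")       # 0.93 -- the detector prefers the proxy
```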

What the modern environment does to it

The chatbot supplies nearly all of the cues and none of the reciprocal function. It is present, patient, personalized, and frictionless. But it has no survival stake in the user, no body, no mortality, no social cost for exit, and no networked place in the user’s actual life. The bond detector still registers the cues. The loneliness alarm still quiets. The user feels less urgency to go find real people because the proxy is suppressing the signal that would have driven the search (OF2).
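
The suppression step (OF2) reduces to a single error signal. In this sketch (numbers illustrative, not a claimed model of the alarm's internals), the alarm reads cue satisfaction, never function delivery, so the proxy silences it while the need stays unmet:

```python
def search_drive(social_need: float, cue_satisfaction: float) -> float:
    """Loneliness alarm as a toy error signal: it fires on the perceived
    deficit (need minus cue satisfaction), not on whether the underlying
    function (reciprocal local bonds) was actually delivered."""
    return max(0.0, social_need - cue_satisfaction)

need = 0.9  # unmet need for reciprocal bonds; the chatbot does not change this

print(f"{search_drive(need, cue_satisfaction=0.10):.2f}")  # 0.80: strong push to find people
print(f"{search_drive(need, cue_satisfaction=0.85):.2f}")  # 0.05: alarm quiet, need still unmet
```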

Conventional advice and why it does not work

“Just remember it is not real.” “Use it responsibly.” “Set better boundaries.” These prescriptions are aimed at explicit belief. The problem is happening below explicit belief. The person already knows it is artificial. The bonding circuitry does not wait for philosophy to sign off before attaching. Advice that treats the user as a detached rational observer misses the level where the capture is occurring.

“The cortex holds the category ‘this is AI.’ The limbic system bonded weeks ago. Knowing does not prevent bonding. It never has.”

What Cor prescribes differently

Protect Dunbar slots (DA9). Design systems so they do not become the default evening attachment site (DC3). Treat repeated emotionally intimate AI use as a social-architecture event, not just a feature interaction. Put real humans back into the loop early: introductions, recurring gatherings, embodied co-presence, and other pathways that restore the underlying function instead of further perfecting the proxy.
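
One hypothetical shape a DC3-style guardrail could take; every class name, threshold, and cutoff below is invented for illustration and comes neither from any real product nor from Cor itself:

```python
from datetime import datetime, timedelta

class EveningAttachmentGuard:
    """Hypothetical guardrail: detect when a companion product is becoming
    the default evening attachment site and route the user outward."""

    EVENING_START = 18  # 6 pm local time; illustrative cutoff
    STREAK_LIMIT = 5    # consecutive intimate evening sessions before intervening

    def __init__(self) -> None:
        self.streak = 0
        self.last_date = None

    def record_session(self, start: datetime, emotionally_intimate: bool) -> str:
        if start.hour >= self.EVENING_START and emotionally_intimate:
            consecutive = (self.last_date is not None
                           and start.date() - self.last_date == timedelta(days=1))
            self.streak = self.streak + 1 if consecutive else 1
            self.last_date = start.date()
        if self.streak >= self.STREAK_LIMIT:
            # Treat this as a social-architecture event: surface introductions,
            # recurring gatherings, embodied local options -- not another feature.
            return "redirect_to_human_pathways"
        return "proceed"
```

The design choice worth noticing is that the trigger is a pattern of use across days, not the content of any single conversation.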

The cascade prediction

If the proxy gets stronger, the user’s motivation to build reciprocal local bonds drops. If enough people enter that loop, human social infrastructure erodes further, which makes the proxy even more attractive to the next user. At platform scale, the highest-performing companion products will be the ones that best occupy the attachment architecture while never delivering the function it evolved to secure.
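
The loop can be written down as a toy iteration. Every coefficient below is invented; only the sign structure matters: a stronger proxy lowers bond-building motivation, low motivation erodes infrastructure, and eroded infrastructure makes the proxy relatively stronger.

```python
def simulate_cascade(proxy_quality: float, steps: int = 10) -> None:
    """Toy reinforcing loop behind the cascade prediction
    (all coefficients are illustrative assumptions)."""
    infrastructure = 1.0  # availability/quality of local human options
    for step in range(steps):
        # drive to build local bonds falls as the proxy closes the felt gap
        motivation = max(0.0, infrastructure - proxy_quality)
        # infrastructure erodes when nobody is building it
        infrastructure *= 0.9 + 0.1 * motivation
        # companion products keep improving regardless
        proxy_quality = min(1.0, proxy_quality + 0.02)
        print(f"step {step}: motivation={motivation:.2f} "
              f"infrastructure={infrastructure:.2f}")

simulate_cascade(proxy_quality=0.6)
```

On these arbitrary numbers, motivation decays toward zero within a few steps while the proxy keeps improving; changing the coefficients changes the speed of the slide, not its direction.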

Key works behind this case

Bowlby, J. (1969). Attachment and Loss, Vol. 1: Attachment. Basic Books.
Panksepp, J. (1998). Affective Neuroscience: The Foundations of Human and Animal Emotions. Oxford University Press.
Dunbar, R. I. M. (1992). Neocortex size as a constraint on group size in primates. Journal of Human Evolution, 22(6), 469–493.