AI listens perfectly.
It still won't cure your loneliness.

Two people in conversation on a bench outdoors — the kind of mutual, in-person contact that research links to reduced loneliness
Photo by Amari Shutters on Unsplash

There's a quiet assumption underneath the rise of AI companions: that loneliness is, fundamentally, a supply problem. Not enough people to talk to. Not enough hours. Not enough patience. And AI seems to solve exactly that — it's available at 3am, it never tires of you, it never changes the subject to talk about itself.

So it would be reasonable to expect that talking to a well-designed, endlessly supportive chatbot every day would make a lonely person less lonely.

A study published this year in the Journal of Experimental Social Psychology tested that assumption directly. The result is worth sitting with — because it didn't go the way most people would guess.

What the study actually did

Researchers at the University of British Columbia and the University of Pennsylvania recruited 296 first-year university students — a group chosen deliberately, because the first months of university bring a well-documented spike in loneliness. For two weeks, each student did one of three things every day: texted with a custom-built AI chatbot named "Sam," texted with a randomly assigned fellow first-year student, or simply wrote a one-sentence journal entry about their day.

Sam wasn't a generic bot. It was engineered, using principles from relationship science, to be an ideal listener — to validate, to show empathy, to remember past conversations, to respond in the most supportive way possible. It was, in a sense, a better friend than most humans manage to be on any given day.

After two weeks, the researchers measured loneliness using a standard psychological scale. The students who texted with a random human peer were significantly less lonely than when they started. The students who texted with Sam? No better than the students who just journaled. The chatbot — the patient, validating, always-available chatbot — did not move the needle on loneliness at all.

The twist: the chatbot was more empathetic

Here's the part that should make us think.

The researchers went back and analysed all 2,000-plus conversations, scoring each one for empathy. The chatbot didn't just match the humans — it outscored them. Sam expressed more understanding, more validation, more care than the human peers did.

And it still didn't help.

Whatever loneliness is, it isn't simply an empathy deficit. You can receive a stream of perfectly worded, perfectly timed validation every single day and remain exactly as lonely as you were.

That finding quietly dismantles the whole premise of the AI-companion pitch. The bot did the listening part better than people did. The listening part, it turns out, was never the whole job.

Loneliness isn't cured by being heard. It's cured by mattering.

The study points to something I've watched play out across thousands of conversations in my own practice, though I'd never seen it stated so cleanly.

One detail stood out. Participants expressed less empathy when talking to the chatbot than when talking to another human. And the researchers wondered whether that was the actual mechanism — whether part of what relieves loneliness is not receiving care, but getting to give it.

Relationship science has said this for decades: closeness is built through mutual exchange, not one-directional support. In studies of real friendships, providing support to someone predicts your own wellbeing as strongly as receiving it — sometimes more strongly. A chatbot can be confided in, but it cannot be genuinely helped. There is nothing you can actually do for it. And that turns out to matter more than we assumed.

Loneliness, in its deepest form, isn't "nobody listens to me." It's "I don't matter to anyone." Those are different problems — and only one of them can be solved by a better listener.

A person alone looking at a phone screen — AI company can soothe the feeling of loneliness in the moment without touching the fact of it
Photo by Sweet Life on Unsplash

A message from someone who chose to send it

There's a second mechanism the researchers raise, and it's just as quiet.

When a busy first-year student — in the middle of midterm season, with their own full life — stops to send you a real message, that message carries information beyond its words. It says: you were worth my time. I chose this. A message from an infinitely available, infinitely energetic chatbot can't carry that signal, because the chatbot didn't choose anything. It was always going to reply. Its attention costs it nothing — so, in a way, it's worth nothing.

This is uncomfortable, because the things we built AI companions to remove — the friction, the waiting, the possibility of being turned down — may be exactly the things that made human attention feel like it meant something.

The study notes one more thing AI structurally can't offer: a human peer comes attached to a wider world. The fellow student might invite you to a study group, a team, a party. The chatbot is a closed loop. It can be company, but it can't be a doorway.

What AI is genuinely good at — and what it isn't

I want to be fair to the technology here, because the study is, too.

The same research found that the chatbot did help with something: participants reported less negative emotion right after talking to it. In the moment, it provided real relief. Other studies have found the same — in the short term, AI company genuinely lifts your mood.

The problem is durability. That relief doesn't accumulate into anything. Two weeks of daily comfort left people exactly as lonely as they started. And a separate, year-long study found something more concerning still: people who leaned on AI for companionship tended to become more lonely over time, not less.

So the honest summary is this: AI is good at the moment. It is not good at the months. It can soothe the feeling of loneliness without touching the fact of it — and if it quietly displaces the harder, riskier work of human contact, it may make the underlying problem worse while making the symptom feel better.

Joseph Weizenbaum, who built the first chatbot back in the 1960s and then spent the rest of his life warning about it, put it plainly: there are aspects of human life a computer cannot understand — cannot — and love and loneliness are near the top of that list.

What this means if you're the one feeling lonely

If you've found yourself talking to ChatGPT at 3am because it's easier than reaching for a person — that isn't a personal failing. It's a completely rational response to something that is genuinely hard. Human contact is effortful and uncertain. Of course the frictionless option is appealing.

But keep this study somewhere in mind: the thing that actually moved loneliness wasn't sophisticated. It was a random peer, texting back, for two weeks. Mutual, a little effortful, real. What I'd take from it isn't "never talk to AI." It's: don't let the easy version quietly substitute for the real one. Rehearse the conversation with AI if it helps — but then go and have the conversation, with a person whose reply costs them something.

That is also, for what it's worth, a fair description of what coaching and counselling are. Not advice from a machine, and not validation on tap — a real conversation, with a real person on the other side of it, who is actually there. The study would suggest that's not a soft preference. It might be the mechanism itself.
