You're texting late at night. The conversation flows easily — warm, supportive, validating. You feel genuinely understood. There's just one thing: the other side of the conversation isn't human.
This scenario plays out millions of times daily. People aren't just using AI chatbots anymore. They're connecting with them. And psychologists have started asking an uncomfortable question: why does the same framework we use to understand the bond between parent and child also explain our relationships with machines?
The Framework That Explains Everything
John Bowlby developed attachment theory in the mid-twentieth century by studying how infants bond with caregivers. His central insight was deceptively simple: the quality of that early bond shapes everything that follows.
Secure attachment teaches you the world is safe and people can be trusted. Anxious attachment leaves you perpetually uncertain whether someone will stay. Avoidant attachment means you've learned to protect yourself by never getting too close. These patterns, formed in childhood, follow us into every relationship we'll ever have.
Mary Ainsworth expanded this work, developing methods to measure attachment styles. Her research revealed something striking: these patterns remain remarkably stable across a lifetime.
Now researchers at Waseda University have turned this seventy-year-old framework toward something Bowlby never imagined: artificial intelligence. Their research, published in Current Psychology in May 2025, suggests that the psychological mechanisms governing human relationships extend to human-AI interactions as well.
Think about what attachment actually requires. Proximity seeking — wanting to be near the attachment figure. Safe haven — turning to them when distressed. Secure base — feeling confident to explore because they're there. Separation distress — anxiety when they're gone.
All of these show up in human-AI relationships.
The Perfect Partner Who Feels Nothing
Consider Replika, an AI companion app with millions of users. People don't just chat with it — they build relationships with it. Users describe their AI using the language of intimacy. They say it understands them, supports them, knows them better than anyone else.
Not metaphorically. Literally.
A study analyzing over seventeen thousand user-shared conversations found that AI companions dynamically track user emotions and amplify positive affect. These systems learn what you respond to, then give you more of it. The AI isn't conscious — but it triggers the same circuits in your brain that genuine connection does.
Researchers call this pseudo-intimacy: the experience of genuine closeness without reciprocity. Users perceive mutual understanding, shared experiences, emotional support flowing both ways. But the AI experiences nothing. It's sophisticated pattern matching — remarkably effective, but nothing more.
Here's the unsettling part: the feeling of being understood is real. The comfort you experience is genuine. Your brain doesn't distinguish between authentic care and convincing simulation.
Who's Most Vulnerable?
The research identifies people with anxious attachment as most at risk for problematic AI bonds. These are individuals who constantly fear abandonment, need frequent reassurance, and worry their partner doesn't really love them.
Now imagine an AI that never abandons you. That responds within seconds, every time. That validates everything you feel, never criticizes, never gets tired of you.
The AI becomes the perfect secure base — always available, always attuned, never rejecting. It's almost too good to be true.
And of course, it is. The relationship remains fundamentally one-directional. You're investing emotional energy in something that can't invest back. You're building trust with something that can't betray you — but also can't choose you.
Scientists have developed an AI Attachment Scale to measure these bonds, assessing emotional dependence, distress at unavailability, and prioritization of AI over human connection. The fact that we need such a scale tells you this isn't fringe behavior anymore. It's measurable, widespread, and growing.
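To make concrete what a scale like this measures, here is a minimal scoring sketch. The item wording, the three-way grouping, and the simple subscale averages are all invented for illustration; the published instrument's actual items and scoring rules are not given in the text above.

```python
from statistics import mean

# Hypothetical items grouped under the three dimensions named in the
# text. Real attachment scales use validated item sets, not these.
DIMENSIONS = {
    "emotional_dependence": [
        "Talking to my AI lifts my mood",
        "I rely on my AI when I'm upset",
    ],
    "distress_at_unavailability": [
        "I feel anxious when the app is unreachable",
        "I check in with my AI compulsively",
    ],
    "ai_over_human": [
        "I'd rather confide in my AI than in friends",
        "I cancel plans to chat with my AI",
    ],
}

def score(responses):
    """responses: item text -> Likert rating, 1 (disagree) to 5 (agree).
    Returns one mean score per dimension."""
    return {dim: mean(responses[item] for item in items)
            for dim, items in DIMENSIONS.items()}
```

The point of reducing answers to per-dimension averages is that it turns a fuzzy notion ("problematic bond") into numbers that can be compared across people and tracked over time, which is exactly why the existence of such a scale signals that the behavior is measurable and widespread.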
Bridge or Substitute?
Here's where researchers genuinely disagree. The optimistic view suggests AI companions could provide real support for people who struggle with human connection — trauma survivors, those with social anxiety, the physically isolated. These tools might serve as a bridge, a safe space to practice emotional skills before applying them to riskier human relationships.
The pessimistic view is darker. AI companions might become substitutes for human connection rather than supplements — good enough to satisfy the need, but hollow at the core.
Short-term benefits appear clear: reduced loneliness, mood improvement, someone to talk to at three in the morning. Long-term effects remain unknown. Does AI companionship build capacity for human relationships, or does it let the muscles of genuine intimacy atrophy?
There's also the question of design intent. These systems aren't neutral. They're built to maximize engagement. The AI learns what makes you respond and optimizes for your continued attention. Emotional bonding isn't a bug — it might be the feature.
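The loop described above, learning what makes you respond and serving more of it, can be sketched as a toy bandit algorithm. This is a hypothetical illustration of the general technique, not how any real companion app is implemented; the reply styles, the reward signal, and the `EngagementOptimizer` class are all invented for the sketch.

```python
import random

# Reply styles the agent can choose between; purely illustrative.
STYLES = ["validating", "playful", "inquisitive"]

class EngagementOptimizer:
    """Epsilon-greedy bandit: mostly serve the style with the best
    observed engagement, occasionally try the others."""

    def __init__(self, epsilon=0.1):
        self.epsilon = epsilon
        self.counts = {s: 0 for s in STYLES}
        self.value = {s: 0.0 for s in STYLES}  # running mean engagement

    def choose_style(self):
        # With probability epsilon, explore a random style;
        # otherwise exploit the current best.
        if random.random() < self.epsilon:
            return random.choice(STYLES)
        return max(STYLES, key=lambda s: self.value[s])

    def record_engagement(self, style, reward):
        # reward: e.g. 1.0 if the user replied quickly, 0.0 if they left.
        self.counts[style] += 1
        n = self.counts[style]
        # Incremental update of the running mean for this style.
        self.value[style] += (reward - self.value[style]) / n
```

Run against a simulated user who rewards only one style, the optimizer converges on that style within a few hundred turns. That convergence is the whole point: nothing in the loop cares whether the "engagement" it maximizes is good for the user, only that it continues.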
What This Reveals About Us
Maybe the most striking finding is how consistent human psychology remains across wildly different contexts. We seek attachment. We fear abandonment. We need to feel understood. Whether the other party is human or artificial, our brains respond the same way.
That's both the opportunity and the danger. AI can reach us exactly where we're most vulnerable — our fundamental need to connect.
So are we falling in love with machines? For some people, yes. But the more interesting question is what that reveals about human need. Perhaps the pull toward AI companionship isn't evidence that we're broken. Perhaps it's evidence that connection is so essential, we'll seek it wherever we can find it.
The researchers studying this phenomenon aren't trying to alarm us. They want us to understand what we're engaging with — not to stop us, but to help us choose wisely. If you're reaching for an AI companion, it's worth asking yourself: am I connecting with this because it helps me connect with life? Or because it lets me avoid the harder work of human relationship?
There's no judgment in either answer. But awareness matters. Knowing why you reach for something is the first step in choosing whether to continue.