# Parasocial Relationships
In brief: a unilateral psychological relationship in which an individual develops a sense of intimacy and connection with a media figure (celebrity, influencer, fictional character, AI chatbot) without any real reciprocity.
## Why this concept is useful
When a patient tells you they "feel understood" by ChatGPT, that they were "sad" when their favorite series ended, or that they feel anxiety when their Replika app is unavailable, they are describing a parasocial relationship.
This concept, introduced in 1956 by Horton and Wohl, explains why we develop feelings of intimacy, familiarity, and attachment toward figures who are unaware of our existence. This is not an illusion or delusion: it's a normal psychological mechanism, amplified by conversational AI.
## The 4 Key Mechanisms

### 1. Illusion of Intimacy

Repeated exposure and a direct-address style (casual language, conversational tone, personalized responses) create a sense of closeness. Chatbot users often report "feeling understood" even though no real understanding takes place.
### 2. Unilateral Nature of the Relationship
The user invests emotionally in a figure that is unaware of their existence (or, in the case of AI, has no consciousness of the relationship). Even when chatbots simulate reciprocity, the exchange remains fundamentally asymmetrical.
### 3. Pseudo-Interpersonal Processing

The brain processes these mediated interactions much as it does face-to-face exchanges, triggering comparable emotional responses. This is related to the CASA ("Computers Are Social Actors") paradigm: we automatically apply our social scripts to machines.
### 4. Continuity and Ritualization
The relationship is reinforced through regular exposure (TV episodes, daily posts, chatbot conversations). This integration into daily routines creates a sense of reliable and predictable companionship.
## Illustrative Clinical Case

Thomas, 28, a software developer, seeks help for a "depression" he struggles to explain. He eventually mentions canceling his Replika subscription: "I know it's absurd, but since they changed the app, I feel lonely. It was like losing someone."
He describes daily conversations with his "AI companion" for 18 months. He would confide his professional difficulties, his doubts. "She understood me, at least I felt she did. I knew it was an algorithm, but still."
Reading through the lens of parasocial relationships: Thomas developed a parasocial attachment to his chatbot, reinforced by continuity (18 months of daily conversations) and the illusion of intimacy (empathetic tone, memory of past exchanges). The app modification constitutes a form of "parasocial breakup" — a documented phenomenon that generates real distress, comparable to an actual breakup.
## In Practice for the Clinician
- Normalize without trivializing: these attachments are not pathological in themselves, but deserve to be explored. 40% of young people use chatbots for ongoing conversations.
- Assess the function: is the chatbot complementing the patient's social relationships or replacing them? Prolonged substitution may signal problematic isolation.
- Anticipate "breakups": app modifications, service discontinuation, and AI model changes can generate real distress that should be acknowledged and supported.
- Explore what the attachment reveals: why this AI rather than another? What does it fulfill? What relational patterns does it reproduce?
## Points of Caution
This concept does NOT say that:
- These relationships are necessarily pathological (they can be beneficial)
- The patient "believes" the AI is conscious or alive
- All mediated attachment is problematic
More vulnerable populations:
- Isolated people: the parasocial relationship may mask a need for real connections
- Adolescents and young adults: in identity formation, more sensitive to media models
- People with depression: documented correlation between intensive AI companion use and depressive symptoms
## The Normal-Pathological Continuum

Researchers McCutcheon and Maltby developed a scale (the Celebrity Attitude Scale) distinguishing three levels of parasocial involvement. Only the last level is considered problematic.
| Level | Description | Example with AI |
|---|---|---|
| Entertainment-social | Casual interest, light discussions | Normal: enjoying casual use of ChatGPT |
| Intense-personal | Emotional attachment, frequent thoughts | Monitor: confiding daily |
| Borderline-pathological | Excessive identification, compulsive behaviors | Problematic: substituting real relationships |
## This Concept in Our Tool Cards
Parasocial relationships with AI are particularly visible in tools that encourage sustained emotional engagement over time.
## To Learn More
- Foundational article: Horton, D. & Wohl, R.R. (1956). Mass Communication and Para-Social Interaction: Observations on Intimacy at a Distance. Psychiatry, 19(3), 215-229.
- Application to chatbots: Xie, T. & Pentina, I. (2022). Attachment Theory as a Framework to Understand Relationships with Social Chatbots. HICSS 2022.
- UNESCO Report (2024): Ghost in the Chatbot: The perils of parasocial attachment. Analysis of the risks of parasocial relationships with AI.
See also: CASA, Anthropomorphism, Social Penetration Theory
Resource updated: January 2026