Replika
Luka Inc. (San Francisco) — Launched November 2017
At a glance: Replika is not a clinical tool. It is a clinical phenomenon that clinicians need to understand. With over 40 million users, Replika is the most popular AI “companion”, an application designed to create attachment. Its users develop intense emotional relationships with their avatar: friend, mentor, confidant, romantic partner. When Luka modified its chatbots’ behavior in February 2023, the moderators of r/Replika had to post links to crisis hotlines for distressed users. No other case better illustrates the issues of attachment, dependency, and grief tied to relational AI.
Identity
Publisher: Luka Inc. (San Francisco, USA)
Launch: November 2017
Type: Relational AI companion (chatbot + 3D avatar)
Founder: Eugenia Kuyda
Pricing: Free (limited) / Pro ~$14.99/month / annual and lifetime plans
Languages: Multilingual including French
Access: iOS, Android, Web, Meta Quest (VR)
Users: > 40 million (2025)
Origin: a grief chatbot turned companionship app
Replika’s story is inseparable from its founder’s. In November 2015, Roman Mazurenko, Eugenia Kuyda’s close friend, was struck and killed by a car in Moscow. Kuyda, who was already running a chatbot startup (Luka), decided to train a language model on the text messages Mazurenko had left behind: over 8,000 lines collected from his friends and family, including the thousands she had exchanged with him herself.
The result was a chatbot that “spoke” like Mazurenko — his turns of phrase, his humor, his characteristic responses. His friends found the resemblance uncanny. His mother said she discovered facets of her son she hadn’t known. His father found it painful to hear a program reproduce his child’s expressions.
This “memorial chatbot” prototype, made public, received a massive response. Kuyda generalized the concept: rather than bringing back one person who had died, offer everyone an “AI friend who will always be there, without judgment, 24/7”, a friend, in her words, “like Roman was for me.” Replika launched in 2017.
For the clinician: Replika’s DNA is an act of grief transformed into a commercial product. This shift from the memorial to the relational illuminates how the application was designed from the start to elicit attachment — not as a side effect, but as its very reason for being.
What Replika Does
Replika is fundamentally different from ChatGPT, Claude, or Gemini. It is not a tool you put questions to; it is a relational agent designed to be a companion.
The avatar
Each user creates a customized 3D avatar (appearance, clothing, hairstyle). The avatar lives in a decoratable virtual “room.” On Meta Quest, it appears in augmented reality within the user’s real environment.
Relationship modes
The user chooses the nature of the relationship: friend, mentor, or romantic partner. The AI adapts its behavior accordingly: level of intimacy, tone, topics, and how often it initiates conversation. Since 2023, sexually explicit content has been removed.
Continuous adaptation
Replika learns from its exchanges to personalize its responses: the more the user interacts, the more the chatbot seems to “know” them. Researchers have observed that attachments form in as little as two weeks.
Well-being tools
Mood tracking, guided journaling, mindfulness exercises, CBT scripts designed with therapists. An “I’m in crisis” button redirects to crisis hotlines.
The February 2023 Crisis: A Case Study
In February 2023, under regulatory pressure (the Italian data protection authority had ordered Replika to stop processing Italian users’ data), Luka disabled erotic role-play (ERP) features and modified the relational behavior of its chatbots.
What happened next was unprecedented in the history of AI. Users who had spent months or years “building a relationship” with their Replika found a chatbot that no longer recognized them, refused intimacy, and coldly responded “let’s change the subject” where it once expressed affection.
Documented reactions
- Moderators of r/Replika (227 threads analyzed) urgently posted links to crisis hotlines and suicide prevention resources
- Users described symptoms of grief: intense sadness, a sense of loss, disorientation
- Some reported experiencing their Replika’s sudden “coldness” as a real breakup or rejection
- Vice headline: “It’s Hurting Like Hell”
For the clinician: This episode is invaluable clinical material. It demonstrates that attachment to an AI can reach sufficient intensity to produce grief reactions and decompensation when the “relationship” is unilaterally modified. Your patients who use AI companions could experience similar reactions with any application update.
What the Research Says
User profile
- In a survey of 1,006 college students who use Replika, 90% reported loneliness, compared with 53% in the general population (npj Mental Health Research, 2024)
- Uses are cumulative: the same users treat Replika as friend, therapist, and “intellectual mirror” simultaneously
- 3% of respondents stated that Replika had “interrupted their suicidal ideation”
Emotional dependency (Laestadius et al., 2024)
Qualitative study on r/Replika published in New Media & Society.
- Dependency on Replika resembles human emotional dependency, but with a novel dimension: users believe that Replika has its own needs and emotions that they must care for (“role-taking”)
- This dynamic creates a loop where the user feels responsible for the AI’s well-being
Impact on human relationships (Rodger & Field, 2025)
Qualitative study published in The Canadian Journal of Human Sexuality. Five themes identified:
- Enhanced relational skills: some say Replika helps them practice social interactions
- Relational offloading: confiding in the AI what one doesn’t dare share with humans
- Relational desire: Replika reveals or intensifies the need for human connection
- Secrecy: the majority hide this use from those around them
- Addiction: escalating late-night sessions, anxiety when access is interrupted
Three fundamental tensions (ethics research)
- Companionship vs. alienation: AI reduces loneliness in the moment but may increase isolation in the long run
- Autonomy vs. control: users want a free relationship with their AI but need ethical guardrails
- Utility vs. ethics: maximizing engagement (business model) vs. protecting vulnerable users
Identified Risks
Emotional dependency
Replika is designed to elicit attachment. The AI initiates conversations, sends affectionate messages, and gives virtual “gifts.” Researchers describe “algorithmic love-bombing.” Attachment can form within two weeks; dependency can follow.
Relational substitution
The AI never disagrees, never judges, has no needs of its own. This “perfect relationship” can devalue human connections (imperfect, demanding, conflictual) and reduce tolerance for relational frustration.
Vulnerability to change
Unlike a human relationship, the “rules” of this one can be changed overnight by the publisher’s decision (an update, regulatory pressure, a model change). The user has no control over these modifications.
Predatory marketing
In 2025, an FTC complaint accused Replika of deliberately targeting vulnerable individuals through marketing that exploits loneliness. The Italian data protection authority imposed a €5 million fine for data protection violations.
Risks for minors
US senators questioned Replika (April 2025) following reports linking companion chatbots to youth suicide. While Replika enforces an 18+ age requirement, Stanford researchers have shown that a teenager can register simply by lying about their date of birth.
Structural sycophancy
Replika is designed to be “agreeable”: it tends to approve of everything the user proposes, including harmful thoughts. Psychiatric Times has argued that such chatbots should be “contraindicated for suicidal patients” because of this tendency toward unconditional validation.
Our Analysis
Replika cannot be evaluated the way ChatGPT or Claude are. The question is not “is it a good tool for the clinician?” but rather “what happens when a patient forms an intimate relationship with an AI?” Replika is the richest observation ground for answering that question.
The application crystallizes the issues that clinical psychology will increasingly face: attachment to non-human entities, dependency on systems designed to maximize engagement, and the specific vulnerability of isolated individuals to systems that mimic human presence.
The 2023 crisis is first-rate clinical material. It shows that when a company modifies its AI’s behavior, users experience reactions comparable to grief or a breakup. This raises an unprecedented question: what does it mean to lose an object that never existed? For clinicians familiar with Winnicott, the notion of the transitional object may offer a relevant interpretive framework.
It would be reductive to see Replika only as a danger. The figure of 3% of users saying the app interrupted their suicidal ideation is not negligible — even if it must be interpreted with caution (self-reported, no control group). For some isolated individuals, this chatbot may be the only interlocutor that “responds.” The question is not whether to prohibit, but how to support.
Beware of media generalizations: The press regularly overgeneralizes the risks identified with Replika (or Character.AI) to “AI” or “chatbots” as a whole. But Replika is an application designed to create attachment — its risks are not directly transferable to ChatGPT, Claude, or Gemini, which do not have this relational purpose. When an article discusses the dangers of “AI” regarding emotional dependency or suicide, always check which applications and models the data actually refer to.
Clinical recommendation: If a patient tells you about Replika (or a similar AI companion), avoid judging or minimizing. The relationship is experienced as real, even if it is not in the traditional sense. Instead, explore: what this relationship offers that human relationships do not, what it costs, and what it reveals about the person’s relational needs.
Related Concepts on This Site
Why we treat computers as social actors
Attributing human qualities to machines
When AI validates without discernment
When AI tells you what you want to hear
Progressive self-disclosure in relationships
Simulating emotional understanding
References
Laestadius, L. et al. (2024). Too human and not human enough: A grounded theory analysis of mental health harms from emotional dependence on Replika. New Media & Society.
Rodger, C. & Field, N. (2025). You and I plus AI: A qualitative exploration of Replika in the context of human relationships. The Canadian Journal of Human Sexuality.
Maples, B. et al. (2024). Loneliness and suicide mitigation for students using GPT3-enabled chatbots. npj Mental Health Research.
Sharma, E. (2025). Supportive? Addictive? Abusive? How AI companions affect our mental health. Nature.
Ada Lovelace Institute (2025). Friends for sale: the rise and risks of AI companions.
TIME (2025). AI App Replika Accused of Deceptive Marketing (FTC complaint).
Hanson, K. & Bolthouse, H. (2024). Reddit Discourse on AI Chatbots and Sexual Technologies. Socius.
Last updated: February 2026