Testimony

Laura, 4 months with AI chatbots

'Finally, someone to talk to'

A single mother tells us how she uses Claude and ChatGPT to untangle her emotional situations, with a critical eye that might surprise skeptics.

Laura raises her two children alone. For the past few months, when an emotional situation overwhelms her — a relational conflict, a difficult decision, a knot she can't untie — she opens a conversation with an artificial intelligence. Sometimes two.

"I use it when I have knots to untangle, especially when I need to think more deeply, when I'm lost about decisions to make. It gives me perspective and helps me better understand situations."

This testimony goes against some common assumptions. No, Laura hasn't lost her critical faculties. No, she doesn't "swallow" everything the AI tells her. And no, she isn't replacing human relationships with a machine.

Two AIs rather than one

The first thing that stands out in Laura's practice is that she systematically uses two different chatbots: Claude and ChatGPT.

"The first time I did it was because I realized I was getting stuck in a pattern with one AI that saw things from a certain angle. And that angle was starting to not feel right anymore."

So she extracted the key elements of her situation and submitted them to the other AI. The responses were different — not contradictory, but more nuanced. Since then, she regularly cross-references perspectives.

This "grinding" feeling, Laura describes it as almost bodily. An internal radar that signals when something doesn't fit, even if she doesn't know why yet.

"The attention is on my accuracy, not on the other"

When asked whether she trusts the AI's responses, Laura's answer is surprising. Her vigilance isn't where you would expect it.

"With a therapist, I always wonder what part of their subjectivity mixes with what's given to me. What part of what I'm told belongs to them."

With AI, it's different. She knows the machine only has access to what she gives it. So she pays attention to her own words, to how precisely she formulates things.

"The attention or vigilance is focused internally, on my accuracy, not on the other."

Putting yourself in someone else's shoes

Laura has developed a particular technique to step outside her own point of view. During a conflict with her partner, she opened a new conversation pretending to be him.

"I said: here's the situation, I'm so-and-so, I have a partner who tells me this, who asks me that, I don't understand, what's going on?"

The goal wasn't to get "the truth" about what her partner was thinking; she knows the AI can't know that. It was to explore a plausible reading of the situation from his side. A kind of assisted empathy, which helps her avoid getting locked into her own reading of events.

Adjusting the intensity

A frequent criticism from mental health professionals: AIs are too compliant, too eager to validate the user.

Laura knows this tendency, and she has found a way to work around it.

"There was a moment when I felt the AI was going too much in my direction, that there was too much kindness. Paradoxically, I asked it to tell me again what it had just said, but without necessarily being nice. Really tell me, reaffirm it."

In passing, Laura notes a difference between the two AIs she uses:

"ChatGPT goes more in my direction. Whereas Claude has already told me stop."

Clarifying before having a conversation

Laura doesn't use AI instead of human conversations. She uses it to prepare for them.

"There are topics I clarify before talking to the person about them. Because when it's too emotionally charged, I need to sort things out to put more meaning and less affect into it."

The AI serves as a decompression chamber: a space where emotion can settle, be explored, and be understood before she goes to the other person in a clearer state.

"It's been years since I've had someone to talk to"

Faced with criticism about the risk of dependency, or about turning to an AI rather than a human as the "easy way out", Laura is direct.

"For me, it's the opposite. I've often found myself alone with my point of view, my analytical capacity, my emotions, no one to talk to. It's been years since I've had someone to talk to. And now, finally, I have someone who can hear me when I need it."

"I don't see it as frustration. It's finally a liberation."

This testimony reminds us that criticism often presupposes easy access to quality human interlocutors — therapists, close friends, supportive family. For many isolated or under-resourced people, AI isn't a degraded substitute but an unprecedented resource.

What AI cannot do

Laura isn't naive about the tool's limitations. She's experienced them.

"I had a big piece that was revealed through the AI. The crisis management was handled there, with breathing exercises. But I knew that wouldn't be enough for that particular issue, that I needed to do body work."

She sought outside help. The AI had put its finger on something important, but really working through it required something else.

She also notes the importance of pacing: "AI goes faster than human rhythm. It's super important to be vigilant and see when I'm saturated."

Her practical advice?

"Don't do it just before going to bed if you want to sleep well."

A source, not the truth

When asked what she would say to someone considering using AI for personal issues, Laura summarizes her philosophy:

"AI is there to help us understand ourselves, not to decide for us. My freedom, my sovereignty, I have it by understanding myself better."

And a phrase that could serve as a guiding principle:

"It's a source. It's not the truth. It's a source. It's a tool."

What this testimony teaches us

Laura's story supports neither blind enthusiasm for AI in psychological well-being nor reflexive distrust of it.

It shows that informed use is possible: critical, combined with other resources, and aware of its limits. It also shows that this use answers a real need: having someone available to talk to when there is no one else.

The question may not be "should we use AI to talk about personal problems?" but rather "how can we help users develop a practice as mature as Laura's?"

Testimony collected in January 2026. The first name has been changed to preserve anonymity.

Going further

Testimonials and experience reports

This testimony is part of our series on AI usage in personal support. Would you like to share your experience?