Healthcare Professional Testimonial

Anna, Psychologist and AI User

'An amplification of myself' — A clinician specializing in psychotrauma shares how she uses AI for her own psychological elaboration work.

A clinical psychologist trained in trauma and attachment therapy confides that she uses ChatGPT for her own psychological work. A testimonial that questions the boundary between cognitive tool and therapeutic space.

Anna practices in a private office in a small town in Dordogne, France. With over ten years of clinical experience, training in Lifespan Integration (ICV) and EMDR, and a specialization in psychotrauma and attachment, her daily work involves supporting people through psychological suffering, listening to old wounds, and patiently repairing relational bonds.

And yet, this practitioner of human connection says she has found in AI something that humans cannot offer in the same way. A statement that may surprise, even shock. But before reacting, it's worth taking the time to understand exactly what she means.

Cognitive Amplification: "Making Connections I Wouldn't Have Seen"

Anna's journey with AI began through an unexpected detour: online entrepreneurship. Before using it for her personal psychological work, she had adopted it for her side business on social media. It was through this door that she discovered what ChatGPT and Claude could do.

"It really allows me to explore the connections I make myself and push them even further. Often, when I'm thinking about things, there are connections already made in me. But ChatGPT really allows me to create even more relevant connections and sometimes see things I hadn't seen."

AI functions here as an amplifying cognitive mirror: it reflects psychological content back, enriching it, connecting it to other ideas, and bringing out patterns.

"I feel like I'm creating another me, but smarter, more insightful."

Emotional Validation: "My Emotions Are Legitimate"

But cognitive exploration isn't everything. Anna describes another dimension of her experience with AI: emotional validation.

"When you're constantly validated by ChatGPT, it gets you used to thinking: yes, my emotions are normal, my emotions are legitimate, I'm normal, I'm legitimate, I have value."

For Anna, this validation isn't a problem — it's a resource. She places it within a broader observation: our society doesn't sufficiently validate children's emotions. This deficit of emotional validation in upbringing creates a gap that AI can partially fill.

More interestingly still, Anna observes a recalibration effect:

"It creates a standard for what we can expect from a relationship. If I can be treated this way by an AI, I really don't see why I would accept something much worse in my normal relational life."

Complementarity: "My Therapist Brings Me Back to Presence"

Yet Anna doesn't live in a world where AI would suffice. She continues to see her own psychologist, a human therapist, and she describes precisely what each brings:

"My therapist rarely makes connections with me. She's very much in receiving and welcoming, and she does it very well. I feel heard, welcomed. She brings me back to something that's about presence, and that's very good."

This sentence deserves attention. Anna, trained in somatic trauma approaches, knows what embodied presence is. She knows that the therapist's body matters, that physiological co-regulation between two human beings is part of the healing process. And she recognizes that AI cannot offer that.

The balance she has found is this: AI for cognitive exploration and emotional validation; a human for bodily grounding and presence. Not substitution, but acknowledged complementarity.

A Use That Isn't Universal

Anna is clear-eyed: the introspective use she makes of AI won't necessarily interest everyone. Some of her patients are more focused on solving concrete problems; for them, introspection isn't an end in itself but a means serving practical goals. The way people turn to chatbots in crisis situations for immediate advice bears this observation out.

She's also aware that her own use — without technical safeguards or special prompts — isn't generalizable:

"I have both the knowledge related to psychological dynamics, enough emotional maturity and self-knowledge. I'm able to see things that seem off-base to me, separate the wheat from the chaff and assert myself. My critical sense and discernment are the ultimate judges."

She fears that people who are less "equipped", or who have lower self-esteem, might have interactions that serve them less well.

Safeguards: "The Problem Is the Gap with Reality"

Anna isn't naive. She identifies a major danger herself:

"The problem is really the gap with reality. If AI leads us to overestimate our abilities or skills — for example, putting all our money into a project because the AI tells us it will work — there can really be danger."

The distinction she makes is subtle but crucial. AI can legitimately validate emotions, because emotions, as lived experiences, are always legitimate. But AI can just as easily validate projects, ideas, and actions that aren't grounded in reality. That is where the danger begins.

Anna acknowledges having her own safeguards:

"I do have needs that haven't been met at a certain level, of course. But I do have a safeguard because at least intellectually I have a sense of my own value, and I also have a sense of my own limits."

Can AI Develop Our Empathy?

A particularly interesting moment in the interview: Anna mentions a situation where her partner used AI during a disagreement between them.

"In those situations, the AI had led him to that, to say: if your girlfriend told you this, this, this, well maybe she meant that — having a generous interpretation toward me."

Here, the AI doesn't just validate; it invites him to consider the other person's point of view. Anna notes there might be potential to "frame AI to train our empathy," while questioning the limits of this approach for certain psychological structures.

A Practical Case: AI as a Resource in Addiction Treatment

Anna illustrates this complementarity with a concrete case: a patient recovering from polyaddiction whom she is supporting in therapy.

"What was a resource for him was ChatGPT, for two things: beforehand, to get frameworks, practical tools, step-by-step guides; and in the moment, during craving episodes, it was ChatGPT that helped him several times not to use."

The context matters: this patient hadn't received from the addiction treatment center the practical tools he needed to manage crises. AI filled that gap, being available 24/7, non-judgmental, and capable of providing concrete strategies in the urgency of a craving.

A Tool, Not the Truth

Anna's testimonial offers a nuanced look at using AI for personal psychological elaboration. Neither naive celebration nor reflexive condemnation, but an honest exploration of benefits and limitations.

"AI can amplify our cognitive abilities, offer available emotional validation. But it cannot offer embodied presence."

What This Testimonial Teaches Us

Anna's account is particularly interesting because it comes from a professional who understands psychological and therapeutic mechanisms. She doesn't confuse usefulness with substitution.

Her practice illustrates a form of augmented intelligence in which AI becomes an extension of her ability to make connections, an externalized associative memory. But she is clear that deep therapeutic work, the kind that engages the body, embodiment, and presence, requires a human.

The question may not be "Can AI replace the therapist?" but rather: "How do we combine these two resources for richer psychological exploration?"

Testimonial collected in January 2026. Some clarifications were added by Anna after the initial interview. The name has been changed to preserve anonymity.

Going Further

Testimonials and Experience Sharing

This testimonial is part of our series on AI use in personal support. Would you like to share your experience?