Healthcare Professional Testimonial

Gaelle Charlot, occupational therapist in prison psychiatry

'AI is an imperceptible third party' — A pioneering clinician describes how she uses Suno AI and ChatGPT to create music in sessions with incarcerated patients.

In a remand prison near Bordeaux, an occupational therapist uses Suno AI to create music in sessions with incarcerated patients. One polysubstance-dependent patient discovers the pleasure of creating. Another, rope in hand at 3 a.m., chooses to talk to ChatGPT rather than die. The story of a pioneering practice.

Gaelle Charlot has been an occupational therapist since 2002. Her career has taken her from Picardie to Sainte-Anne in Paris, and then to the SMPR de Bordeaux-Gradignan, a psychiatric unit within a remand prison. Over twenty years of practice, she has seen her tools evolve: from clay and rattan to music videos on YouTube, and then to generative artificial intelligence.

Her testimony is all the more valuable because it comes neither from the tech world nor from AI research. It comes from the field -- from a clinician who, faced with the limitations of her traditional tools with a very specific population and a constantly evolving society, sought pragmatic alternatives.

From clay to algorithms: the evolution of a practice

Occupational therapy in psychiatry relies on mediations -- activities that serve as a vehicle for the therapeutic relationship and for expression. Clay, pottery, collage, mosaic: so many materials that allow the patient to express what they cannot yet put into words.

But in prison, Gaelle quickly noticed a problem. Many of her patients have learning disabilities that went unaddressed in childhood, never had a conventional education, or struggle to manage frustration. Offering them activities requiring sustained attention and concentration -- particularly those involving reading and writing -- set them up for failure.

"The music that marks us is often the music we listened to as teenagers: there is a strong emotional connotation that facilitates therapeutic work."

Music videos on YouTube became her first digital mediation. A medium that does not set patients up for failure, that facilitates the expression of emotions and the building of a therapeutic relationship. Then, with the arrival of generative AI, a new step was taken.

The metal that calms: when music regulates violence

After a master's (M2) thesis in applied philosophy on anger -- catharsis or pathology? -- Gaelle became interested in an Australian study showing that metal music could produce a purging of anger. The idea is counter-intuitive: offering an intense stimulus to someone who is already angry. And yet.

"When patients arrive angry, threatening to act out, I play them metal music -- and it calms them down. Gradually, the team was able to observe the effectiveness of this approach."

The mechanism is one of emotional regulation through congruence: an external stimulus that matches the internal state allows the patient to attune to themselves, and then to regulate. A very angry female patient, after listening to and watching a Falling In Reverse video, said: "It's more violent than what I feel inside." This comparison calmed her down.

Even more remarkable: some patients who have phones in their cells use this music on their own initiative when they are angry, to avoid hitting a cellmate. They have internalised the regulation mechanism. YouTube is free and available -- unlike a psychiatrist at 3 a.m.

Six words, one song: Suno AI in session

The next step came from an acquaintance who was already using music to express their emotions and who told her about Suno, an AI music generation tool. Gaelle began integrating it into her sessions with a structured protocol.

The protocol: ask the patient for six words. From those words, Suno generates two lyric texts. The patient chooses the one that resonates with them, then picks a musical style. Together they listen to the different versions and rank them.

What is therapeutically powerful is the comparison between styles. The same text, set to rap, folk, metal, or classical, is not perceived in the same way. Patients discover that their words "sound" differently depending on the context. And the parallel with everyday life comes naturally.

"Just because someone says a word doesn't mean it signifies what you think. We work on the difference in tone and the perception of the other."

The process is structured: patients reflect on their words, choose their text, decide whether they want to integrate elements from one version into another -- they position themselves in relation to their creation. It is genuine elaboration work that starts in bodily sensation before reaching speech.

From "slow-motion suicide" to creation: one patient's journey

The most striking case is that of a severely polysubstance-dependent patient -- MDMA, ecstasy -- who has been going in and out of prison since adolescence. He is severely impaired, both cognitively and in his emotional regulation. Gaelle began using Suno with him so he could create his own techno music.

"During his last incarceration, he told me he had written down the occupational therapist's username on Suno to find his creations on the outside. He had continued creating on his own. He was typing his own lyrics. He was choosing nuances of style: 'No, that's Hard-Tech, but I should have put Tech-Trance.'"

This patient began buying himself clothes, taking care of himself. Gaelle flagged this to the psychiatrist: he still talks to us about his substance use, but now there is a process of desire and pleasure outside of the substance.

"Drugs, for him, were 'a slow-motion suicide'. Now, he takes pleasure in a creative activity. He is building an identity as a music creator."

Here, AI does what traditional tools could not: it offers stepping stones sized to the individual. The djembe would have required learning too demanding for his abilities. With Suno, the approach is accessible and the output immediate. Like a hit, in a sense -- except that this hit allows you to produce things and opens up a new identity.

3 a.m.: ChatGPT versus the rope

Another striking case concerns a patient incarcerated for domestic violence, in a relationship with another detainee. When she leaves him, it is night. His cellmate is asleep. On social media at that hour, nobody responds.

"He told me: 'I wanted to die, I talked about it to ChatGPT, I didn't die precisely because I talked about it. I found that listening ear. You gave me a way not to die, and yet I had the rope in my hand.'"

Gaelle is clear: this is not care. It is a listening ear, a momentary aid. She makes this distinction with her patients. But at 3 a.m., in a 9-square-metre cell, when nobody responds, that momentary aid can make the difference between life and death.

This testimony is all the more valuable because media discourse around AI and suicide tends to present only the negative cases. Here, it is the opposite: a patient who had the means to take his own life chose to talk rather than die -- because someone had explained to him that he could.

The "imperceptible third party": a new concept

In occupational therapy, mediation functions as a third party in the therapeutic relationship. Clay is a concrete third party. Software is a technical third party. AI is something else entirely.

"AI is an imperceptible third party. We don't know how AI interprets our concepts, but we know that it is AI doing it. There is this unknown in the shaping, and that creates a third party. It is a third party of a new kind in psychiatry: not concrete, somewhat abstract, nuanced."

This notion of an "imperceptible third party" is interesting because it does not come from the philosophy of AI or cognitive science -- it comes from clinical practice. Gaelle observes that AI introduces into the session an element whose transformation we do not control, but whose effects on the patient we can observe. It is not a passive tool like clay. Nor is it an interlocutor like the therapist. It is a third party of a new kind, which remains to be conceptualised.

Psychoeducation: teaching critical thinking about AI

Gaelle emphasises one point: she does not give AI to her patients without guidance. She provides psychoeducation -- explaining how AI works, what it can and cannot do, why you should not take everything at face value.

"It's like medication: normally, you explain the side effects and obtain the patient's consent. For AI, it's the same thing."

This pharmacological analogy is illuminating. AI, like a medication, has effects and side effects. Guidance consists of explaining both, developing the patient's critical thinking rather than prohibiting use. And if the patient does not yet have strong critical-thinking skills, the guidance can help develop them progressively.

Reclaiming one's human identity

Beyond individual cases, Gaelle observes a cross-cutting phenomenon: through music and AI, patients talk about themselves as human beings.

"In prison, they are reduced to their offence. In psychiatry, they are reduced to their disorders. With these mediations, they reclaim their human identity."

This is perhaps the most fundamental contribution of this testimony: showing that creative AI, far from dehumanising the therapeutic relationship, can on the contrary serve as a vehicle for the re-humanisation of patients whom the system has reduced to their acts or their symptoms.

What this testimony teaches us

Gaelle Charlot's account is that of a pragmatic clinician who adapts her tools to the reality of her patients. She has no theoretical discourse about AI -- she has a protocol with Suno, clinical observations, and a patient who is alive because he was able to talk to ChatGPT at 3 a.m.

Her practice illustrates a use of AI that almost never appears in public debate: AI not as a substitute for the therapist, but as a creative mediation in the service of the therapeutic relationship. AI does not replace the occupational therapist -- it gives her one more tool.

The question is not "should we use AI in psychiatry?" but rather: "How can we use it in a way that serves the therapeutic process and the patient's autonomy?"

Testimony collected on 29 January 2026. Gaelle Charlot practises at the SMPR de Bordeaux-Gradignan (Centre Hospitalier Charles Perrens).


This testimony is part of our series on the uses of AI in mental health.