EHR (Electronic Health Record)
Electronic Health Record: digital system centralizing all of a patient's health information (history, treatments, test results, consultation reports). It enables information sharing among healthcare professionals and, increasingly, direct patient access.
Clinical implication: Connecting an EHR to a conversational AI raises major confidentiality and informed consent questions. Does the patient truly understand what sharing their complete medical record implies?
GDPR
General Data Protection Regulation: European legal framework, adopted in 2016 and applicable since May 2018, governing the collection, processing, and storage of personal data. It imposes strict obligations on organizations (explicit consent, right to erasure, data portability) and provides for significant penalties for violations.
Clinical implication: Health data falls under the GDPR's "special categories" of sensitive data and benefits from enhanced protection. Using US-based AI tools to process this data raises legal compliance questions, particularly around transfers of personal data outside the EU.
HIPAA
Health Insurance Portability and Accountability Act: US federal law (1996) establishing national standards for health information protection. Defines "covered entities" (healthcare providers, insurers) and their obligations regarding confidentiality, security, and breach notification.
Clinical implication: Tech companies offering health AI tools in the US must generally comply with HIPAA. However, consumer applications used directly by patients often escape this framework, creating a regulatory gray area.
Single Point of Vulnerability
In computer security, a situation in which critical data is centralized in a single location, creating a prime target for cyberattacks. If that single point is compromised, the impact is multiplied compared to a distributed architecture.
Clinical implication: The concentration of medical, wellness, and conversational data with a single private actor (like OpenAI with ChatGPT Health) creates a single point of vulnerability. A data leak would have considerable consequences for millions of affected users.
CNEDiMTS
Commission Nationale d'Évaluation des Dispositifs Médicaux et des Technologies de Santé (National Commission for the Evaluation of Medical Devices and Health Technologies): specialized commission of the French HAS responsible for evaluating medical devices for reimbursement by the national health insurance. It issues opinions on the expected benefit (service attendu, SA) and the improvement in expected benefit (amélioration du service attendu, ASA) of devices. In digital mental health, the CNEDiMTS issued unfavorable opinions on Deprexis (depression, 2021) and HelloBetter (insomnia, 2024).
Clinical implication: The CNEDiMTS decides whether a digital therapy will be reimbursed in France. Its criteria, designed for traditional medical devices, pose a major challenge for the future of digital mental health tools.
PECAN
Prise en Charge Anticipée Numérique (Anticipated Digital Reimbursement): French regulatory mechanism allowing temporary reimbursement of an innovative digital medical device (DMN) before its definitive evaluation by the CNEDiMTS. The manufacturer must demonstrate a "presumption of innovation" in terms of clinical benefit or care organization. PECAN is the main pathway to reimbursed market access for digital therapies in France.
Clinical implication: PECAN is the pathway pursued by Deprexis and HelloBetter, both of which received unfavorable opinions from the CNEDiMTS. The "presumption of innovation" standard has proven very demanding for digital therapies in mental health.
DTx (Digital Therapeutics)
Digital Therapeutics: software-based therapeutic interventions, clinically validated, aimed at preventing, managing, or treating a medical condition. Unlike simple wellness apps, a DTx follows a rigorous clinical evaluation process (controlled trials) and seeks reimbursement by healthcare systems. Examples in mental health: Deprexis (digital CBT for depression), HelloBetter (digital CBT for insomnia).
Clinical implication: The distinction between a DTx and a wellness app is crucial: a DTx claims a measurable therapeutic effect and must prove it. No mental health DTx is currently reimbursed in France (versus 30+ in Germany via the DiGA system).
DiGA
Digitale Gesundheitsanwendungen (Digital Health Applications): German regulatory framework established in 2019 by the Digital Healthcare Act (DVG), enabling reimbursement of digital health applications through statutory health insurance. The process is a "fast-track": provisional listing for 12 months with real-world data collection, then definitive evaluation. This model, unique in Europe, has enabled the reimbursement of dozens of applications, including several in mental health.
Clinical implication: The German DiGA model is the main counterexample to the French model in debates about digital therapy evaluation. Its "reimburse first, prove in real-world use" philosophy contrasts with France's "prove first, reimburse later" approach.
Medical Device (DM)
Medical Device (Dispositif Médical): any instrument, apparatus, software, or other article intended by the manufacturer to be used for medical purposes of diagnosis, prevention, monitoring, treatment, or alleviation of disease. Medical devices are regulated at European level (Regulation 2017/745) and must obtain CE marking before being placed on the market. In France, their reimbursement is evaluated by the CNEDiMTS.
Clinical implication: AI software used in clinical contexts can qualify as a medical device, with all the regulatory obligations that entails (CE marking, clinical evaluation, vigilance). This qualification is a strategic issue for mental health AI tools.
DMN (Digital Medical Device)
Dispositif Médical Numérique (Digital Medical Device): subcategory of medical device whose primary function is performed by software. Includes therapeutic mobile applications, remote monitoring tools, and clinical decision support software. In France, the France 2030 program specifically funded a "DMN in Mental Health" call for projects (3 laureates: Theremia, Emobot, Edra PRO).
Clinical implication: The DMN category is strategic because it opens access to reimbursement via PECAN. A digital tool not qualified as a DMN (e.g., a simple wellness app) cannot claim reimbursement.
LFSS
Loi de Financement de la Sécurité Sociale (Social Security Financing Act): law passed annually by the French Parliament, setting health insurance spending targets and the modalities for coverage of care and devices. The 2026 LFSS (article 84) notably tasks the HAS with creating "relevance referentials" for public funding of clinical decision support systems, a framework that will shape the future of AI tools in mental health.
Clinical implication: The LFSS is the legislative lever through which health AI tools gain (or fail to gain) reimbursement. Article 84 of the 2026 LFSS is worth monitoring: it will shape the evaluation framework for clinical decision support tools.
A.V.E.C. (HAS Framework)
Apprendre, Vérifier, Estimer, Communiquer (Learn, Verify, Estimate, Communicate): mnemonic acronym, spelling avec ("with" in French), from the first HAS guide on the use of generative AI in healthcare (October 2025): "The proper use of generative AI in healthcare is done WITH the professional." The four pillars structure a cautious approach: learn how the AI works, systematically verify its outputs, estimate the relevance of each use, and communicate transparently with patients and colleagues.
Clinical implication: The A.V.E.C. framework is designed for professional use in somatic medicine. It does not cover mental health specificities: patient self-use, risk of transference, enhanced confidentiality of psychological data, impact on the therapeutic alliance.
CE Marking
Conformité Européenne (European Conformity): certification attesting that a medical device complies with the essential safety and performance requirements defined by European regulation (Regulation (EU) 2017/745 for medical devices). CE marking is a prerequisite for placing devices on the European Economic Area market and, in France, for any reimbursement application to the CNEDiMTS. For medical-purpose software (including AI tools), obtaining CE marking requires clinical evaluation and substantial technical documentation.
Clinical implication: An AI tool without CE marking is not a medical device in the regulatory sense — it cannot claim a medical purpose or seek reimbursement. This applies to ChatGPT, Replika, or Character.AI, which are not medical devices. France 2030 projects (Theremia, Emobot, Edra PRO) explicitly target this certification.