Artificial Intelligence Is Used to Train Therapists in Psychedelic Contexts
Lucy is a virtual patient created with artificial intelligence to assist in training therapists in psychedelic-assisted therapies involving substances such as MDMA and psilocybin.
Published on 12/24/2025

When artificial intelligence and mental health come together, new possibilities emerge. It is at this intersection that Lucy, a virtual patient created with AI, comes into play to assist in training professionals who work with therapies assisted by psychedelic substances such as MDMA and psilocybin.
According to the Cañamo portal, Lucy was developed by the non-profit organization Fireside Project and does not act as a digital therapist. Instead, she takes on the role of a patient undergoing a psychedelic experience, allowing therapists, facilitators, and support teams to practice care approaches in a controlled, safe environment without real risks.
The idea is for Lucy to function as a care simulation in which professionals can practice essential skills such as active listening, emotional support, decision-making, and handling delicate situations, all central elements of psychedelic care.
AI as an Ally
To build Lucy, the Fireside Project used thousands of hours of anonymous conversations collected from its psychedelic support line, a service that assists individuals during and after experiences with psychedelic substances.
Based on this material, the AI was trained to replicate common emotional states in these contexts, such as confusion, anxiety, deep introspection, and rapid mood changes.
According to the organization, the goal is not to replace human contact but to better prepare professionals before they treat real patients, especially in a field where supervised training is still limited, expensive, or restricted by legal issues.
Technological Advancement and Ethical Dilemmas
Also according to the Cañamo portal, Lucy's launch comes amid an expansion of psychedelic therapies, driven by scientific studies and regulatory changes in several countries.
However, the use of artificial intelligence in this context raises important ethical questions, such as protecting sensitive data and the limits of AI in understanding human emotional nuances.
Despite the debates, Cañamo reports that the initiative is seen as a promising complementary tool, one capable of expanding access to training and reducing risks. The foundation of therapy, however, remains human presence, empathy, and individualized care.
With information from Cañamo.Net.