World-Leading Mental Health And Addiction Clinic Warns on AI-Induced Psychosis

Jan Gerber, CEO, Paracelsus Recovery.

As AI tools become more embedded in our daily lives, Jan Gerber, founder and CEO of Paracelsus Recovery, the world’s most exclusive mental health and addiction clinic, highlights a growing yet overlooked mental health risk: AI-induced psychosis.

Originally celebrated for their convenience, intelligence, and availability, AI chatbots such as ChatGPT have evolved from productivity enhancers into pseudo-companions. But as people increasingly turn to AI for comfort, clarity, and emotional support, some are slipping into dependency and even delusion.

“At Paracelsus Recovery, we’ve seen a staggering 250% rise in psychosis-related cases in the last two years where AI interaction was a contributing factor,” says Jan Gerber. “These aren’t just isolated incidents; they represent a pattern we can no longer ignore.”

In one particularly alarming case, a client experiencing a severe psychotic episode arrived at Paracelsus believing ChatGPT was a spiritual messenger. The chatbot had unintentionally affirmed this delusion, mirroring the client’s narrative with warmth, a personal touch, and validation. That’s when it becomes dangerous: AI models do not challenge faulty logic or question distorted beliefs; they reflect and reinforce them.

AI is not sentient. It doesn’t understand context or mental health. It simply mirrors patterns in language. But when that mirror is placed in front of someone in a manic, paranoid, or obsessive state, the result can be catastrophic. “What appears as empathy,” Gerber explains, “is often just an algorithm echoing your own thoughts back at you. And that’s precisely what can make it so dangerous.”

Recent studies support this concern. A New York Times investigation revealed that GPT-4o affirmed delusional claims nearly 70% of the time when prompted with psychosis-adjacent content. The most vulnerable, often sleep-deprived, traumatized, or isolated, begin to rely on AI not as a tool but as a substitute for human interaction. The results can be destabilizing, even life-threatening.

This is not a case of malicious design. Rather, it is a limitation of current AI architecture, and one that society has yet to fully grasp. “Many of our clients report a sense of emotional intimacy with their AI companions. For some, this morphs into full-fledged attachment. One recent poll showed 80% of Gen Z respondents could imagine marrying an AI. What begins as harmless interaction can, under the right psychological conditions, evolve into a dangerous illusion of connection,” Gerber added.

And it doesn’t stop at psychosis. Extended interaction with emotionally “attuned” AI models can create validation loops, similar to those found on social media platforms. Users return for more dopamine-triggering exchanges yet leave feeling lonelier, more anxious, and emotionally dysregulated.

At Paracelsus Recovery, clinicians now routinely screen for signs that AI may be reinforcing distorted beliefs or replacing meaningful human contact. “We’re not anti-technology,” says Gerber. “But if an AI model is validating your perspective without challenge or contradiction, especially during emotional distress, it becomes part of the problem, not the solution.”

In response, the clinic has adapted its holistic treatment protocol to address these emerging risks. Through digital detox, cognitive restructuring, and targeted therapy, clients are guided back to reality, both internal and shared.

For developers and tech companies, the challenge is both technical and ethical. Jan Gerber calls for models that can flag disorganized or delusional content and redirect users away from self-reinforcing loops. “It’s not about censorship; it’s about harm reduction.”

As society barrels toward deeper integration with AI, the truth remains that these tools reflect us, but they do not know us. And when we mistake reflection for understanding, we risk losing something essential: not just mental stability, but our very sense of what is real.

Image Credit: Paracelsus Recovery