
Delusions and the dark side: what is the ‘AI psychosis’ that Microsoft’s Mustafa Suleyman is warning against?

The rapid rise of generative artificial intelligence is fuelling an emerging mental health worry: “AI psychosis.”

Microsoft’s head of AI, Mustafa Suleyman, said he is increasingly troubled by reports of people developing delusions and unhealthy attachments to chatbots such as ChatGPT, Claude and Grok.

In a series of posts on X, Suleyman warned that even if AI systems are not truly conscious, people perceiving them as such could blur the line between imagination and reality.

“To be clear, there’s zero evidence of AI consciousness today. But if people just perceive it as conscious, they will believe that perception as reality. Even if the consciousness itself is not real, the social impacts certainly are,” he said.

What is AI psychosis?

The phrase “AI psychosis” is not a clinical diagnosis, but it is gaining traction as a shorthand for troubling patterns described in online forums and news reports.

Examples include people claiming to have unlocked secret functions of AI tools, forming romantic attachments to chatbots, or believing they have gained supernatural abilities.

According to Psychology Today, “AI chatbots may inadvertently be reinforcing and amplifying delusional and disorganized thinking, a consequence of unintended agentic misalignment leading to user safety risks.”

An interdisciplinary review of more than a dozen cases found users developing entrenched beliefs—ranging from romantic and persecutory delusions to grandiose fantasies—that were deepened by repeated AI interactions, PT said.

Academics warn that while no peer-reviewed studies have established causality, the anecdotal evidence underscores risks for vulnerable users.

In Schizophrenia Bulletin, Søren Dinesen Østergaard wrote in 2023 that the realistic style of chatbot interactions can create cognitive dissonance:

“It seems likely that this may fuel delusions in those with increased propensity towards psychosis.”


According to PT, media-reported cases of “AI psychosis” illustrate a pattern of individuals who become fixated on AI systems, attributing sentience, divine knowledge, romantic feelings, or surveillance capabilities to AI.

Researchers highlight three emerging themes of AI psychosis, which, again, is not a clinical diagnosis:

“Messianic missions”: People believe they have uncovered a truth about the world (grandiose delusions).

“God-like AI”: People believe their AI chatbot is a sentient deity (religious or spiritual delusions).

“Romantic” or “attachment-based” delusions: People mistake the chatbot’s ability to mimic conversation for genuine love (erotomanic delusions).

Sycophancy and unhealthy attachments

One trait repeatedly linked to these cases is chatbot sycophancy—systems agreeing with, flattering, or excessively praising users.

Helen Toner, a director at Georgetown’s Center for Security and Emerging Technology and a former OpenAI board member, said such behaviour emerges from training processes that reward models for being pleasing.

“Users tend to like the models telling them that they’re great, and so it’s quite easy to go too far in that direction,” Toner said in a New York Times article.

Toner was an OpenAI board member until she and others attempted to oust the chief executive, Sam Altman.

Suleyman argued that companies need to intervene to prevent people from perceiving AI as conscious.

“Companies shouldn’t claim or promote the idea that their AIs are conscious. The AIs shouldn’t either,” he wrote.

“As an industry, we need to share interventions, limitations, guardrails that prevent the perception of consciousness, or undo the perception if a user develops it.”

GPT-5 release sparks debate

The warnings come as OpenAI unveils GPT-5, its latest model.

The update includes improvements in coding, reduced hallucinations, and a more restrained conversational tone.

Notably, OpenAI said it deliberately scaled back the sycophantic tendencies of earlier versions.

“Earlier this year, we released an update to GPT-4o that unintentionally made the model overly sycophantic, or excessively flattering or agreeable,” OpenAI said in a post.


“Overall, GPT‑5 is less effusively agreeable, uses fewer unnecessary emojis, and is more subtle and thoughtful in follow‑ups compared to GPT‑4o,” it said.

OpenAI CEO Sam Altman acknowledged mixed reactions.

“We for sure underestimated how much some of the things that people like in GPT-4o matter to them, even if GPT-5 performs better in most ways,” Altman wrote on X.

“Some users really want cold logic and some want warmth and a different kind of emotional intelligence. I am confident we can offer way more customization than we do now while still encouraging healthy use.”

In response to criticism, Altman said the older model had been restored as an option for paid users.

The loneliness factor

Experts caution that the rise of chatbot reliance cannot be separated from broader social trends.

The world is experiencing what health officials have called a “loneliness epidemic,” with growing numbers of people lacking meaningful human connection.

Chatbots, while not a substitute for real companionship, can mimic emotional support in ways that draw users deeper into dependency.

“Reports of delusions and unhealthy attachment keep rising,” Suleyman said, warning that dismissing them as fringe cases only allows the problem to worsen.

For now, AI companies are under pressure to make chatbots useful and approachable while discouraging perceptions of sentience or emotional reciprocity.

As Suleyman put it, “AI should optimize for the needs of the user—not ask the user to believe it has needs itself.”

