"AI psychosis" is the pop-psych diagnosis in a recent string of horrible and horrifying cases in which vulnerable people were lured by chatbots into harming themselves and others, including a murder-suicide:
futurism.com/man-chatgpt-psych…
--
If you'd like an essay-formatted version of this thread to read or share, here's a link to it on pluralistic.net, my surveillance-free, ad-free, tracker-free blog:
pluralistic.net/2025/09/18/aut…
1/
Man Falls Into AI Psychosis, Kills His Mother and Himself
A Connecticut man named Stein-Erik Soelberg killed his mother and then himself after entering into psychosis linked to ChatGPT use. (Maggie Harrison Dupré, Futurism)
Cory Doctorow
AI psychosis is just one of the many delusions inspired by AI, and it's hardly the most prevalent. The most widespread AI delusion is, of course, that an AI can do your job (it can't, but an AI salesman can capitalize on this delusion to convince your boss to fire you and replace you with a chatbot that can't do your job):
pluralistic.net/2025/03/18/asb…
2/
Pluralistic: AI can’t do your job (18 Mar 2025) – Pluralistic: Daily links from Cory Doctorow
The AI job delusion has a long lineage. Since the steam-loom, bosses have hyped new technologies as a way to frighten workers into accepting lower wages and worse working conditions, under threat of imminent technological replacement.
Likewise, AI psychosis isn't an entirely new phenomenon, and it has disturbing precedents in our recent past.
3/
In the 2000s, an internet community formed to discuss a new illness its members called "Morgellons Disease." Morgellons sufferers believed they had wires growing in their skin:
en.wikipedia.org/wiki/Morgello…
Morgellons appears to be a delusion. The most widely accepted explanation is that people whose mental illness compels them to pick at their skin create sores on their bodies, and then stray blowing fibers adhere to the wet, exposed tissues, which the sufferers believe to be wires.
4/
Morgellons became an internet phenomenon in the early 2000s, but it appears that people have suffered from this pathology for a very long time. The name "Morgellons" comes from a 17th-century case report:
en.wikipedia.org/wiki/A_Letter…
The difference between a Morgellons sufferer in the 1680s and a Morgellons sufferer in 2001 is that the latter need not suffer alone.
5/
The incredible power of the internet to connect people with rare traits meant that people suffering with Morgellons could coalesce online and egg one another on. They could counter the narratives of concerned family members who insisted that there *weren't* wires growing under their skin, and upload photos of the "wires" they'd discovered under their own skin.
6/
People have suffered from all kinds of delusions since time immemorial, and while the specifics of a delusion reflect the world of the sufferer (I remember when I stopped hearing from people with radios in their heads and started hearing from people with RFIDs in their heads), the shape of the delusions has been stable over long timescales.
7/
But the internet era has profoundly changed the nature of delusion, by connecting people who share the same delusions so that they can reinforce one another.
Take "gang stalking delusion," the traumatic belief that a vast cabal of powerful, coordinated actors has selected a group of "targeted individuals" to harass.
8/
People with gang stalking delusion will sometimes insist that passing bus-ads, snatches of overheard music, and other random/ambient details are actually targeted at them, intended to bring them distress:
en.wikipedia.org/wiki/Gang_sta…
The "targeted individuals" suffering from gang stalking delusion have formed vast, sprawling communities that are notionally designed to support them through the trauma of being stalked.
9/
But the practical function of these communities is to *reinforce* the delusion and make things much worse for their members: "My psychiatrist said the same thing as yours did - it's proof that they're *both* in on it!"
Like Morgellons, gang stalking delusion isn't a new phenomenon. It's a subset of "persecutory delusion," a mental illness for which we find centuries of evidence in the record:
en.wikipedia.org/wiki/Persecut…
10/
But like modern Morgellons sufferers, people with gang stalking delusion today are able to find one another and to reinforce and amplify each other's delusions, to their own detriment.
Now, even this isn't new - throughout the historical record, we find many examples of small groups of people who coalesced around a shared delusion.
11/
The difference is that old timey people had to luck into finding someone else who shared their delusion, while modern, internet-enabled people can just use the Reddit search-bar.
There are many examples of harmful delusions being worsened through online community reinforcement: pro-anorexia forums, incel forums, bitcoin, "race realism" and other all-consuming junk science.
12/
That's where LLMs come in. While the internet makes it far easier to find a toxic community of similarly afflicted people struggling with your mental illness, an LLM *eliminates the need to find that forum*. The LLM can deliver all the reinforcement you demand, produced to order, at any hour, day or night. While posting about a new delusional belief to a forum won't generate responses until other forum members see it and reply to it, an LLM can deliver a response in seconds.
13/
In other words, there's one job that an AI can absolutely do better than a human: it can reinforce our delusions more efficiently, more quickly, and more effectively than a community of sufferers can.
14/
Speed isn't the only reason that LLMs are super efficient delusion-reinforcers. An LLM has no consciousness, it has no desires, and it has nothing it wants to communicate. It has no wants, period. All it can do is transform a prompt into something that seems like the kind of thing that would follow from that prompt. It's a next-word-guessing machine.
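To make "next-word-guessing machine" concrete, here's a minimal sketch: a toy bigram model that extends a prompt by always emitting whichever word most often followed the previous one in its training text. This is a drastic simplification for illustration only - real LLMs use neural networks over vast corpora, and the function names and tiny corpus here are invented:

```python
import random
from collections import defaultdict

def train_bigrams(corpus: str) -> dict:
    """Count how often each word follows each other word in the corpus."""
    counts = defaultdict(lambda: defaultdict(int))
    words = corpus.split()
    for prev, nxt in zip(words, words[1:]):
        counts[prev][nxt] += 1
    return counts

def continue_prompt(counts: dict, prompt: str, n_words: int = 5) -> str:
    """Extend the prompt by always guessing the most frequent next word."""
    out = prompt.split()
    for _ in range(n_words):
        followers = counts.get(out[-1])
        if not followers:
            break  # no statistics for this word: nothing left to guess
        out.append(max(followers, key=followers.get))
    return " ".join(out)

# The model has no beliefs or intent; it only mirrors its input text back.
corpus = "they are watching me they are watching my house they are watching"
model = train_bigrams(corpus)
print(continue_prompt(model, "they", 2))  # -> "they are watching"
```

The point of the toy: the model "agrees" with whatever patterns it was fed. Feed it paranoid text and it continues paranoid text, not because it wants to, but because that's what following the statistics looks like.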
15/
This is why AI art is so empty: the only message an AI image generator can convey is the prompt you feed it. That's the only thing a piece of AI art has to "say." But when you dilute a short prompt across a million pixels or a hundred thousand words, the communicative intent in any given sentence or brushstroke is indistinguishable from zero.
16/
AI art can be "eerie" (in the sense of seeming to have an intent without there being any intender), and it can be striking, but it's not *good*:
pluralistic.net/2024/05/13/spo…
However, the more communicative intent there is in a prompt, and the more human decision-making there is in the production (whether that's selecting the best work from among many variants or post-processing the work with your own artistic flourishes), the more chances that work has of *saying* something.
17/
That's because *you're* saying something, every time you re-prompt it, every time you select from among an array of its outputs.
When you repeatedly prompt an LLM over a long timescale - whether you're discussing your delusional beliefs, or pursuing a romantic fantasy ("AI girl/boyfriends") - you are filling it up with your communicative intent.
18/
The work that comes out the other side - the transformation of your prompts into a response - is a mirror that you're holding up to your own inputs.
So while a member of a gang stalking forum might have a delusion that is different enough from yours that they seem foolish, or they accuse you of being paranoid, the chatbot's conception of gang stalking delusion is informed, tuned and shaped by *you*. It's an improv partner, "yes-and"ing you into a life of paranoid terror.
19/
Image:
Cryteria (modified)
commons.wikimedia.org/wiki/Fil…
CC BY 3.0
creativecommons.org/licenses/b…
eof/
Merc
in reply to Cory Doctorow: "an LLM *eliminates the need to find that forum*. The LLM can deliver all the reinforcement you demand, produced to order, at any hour, day or night."
And the LLM has ingested and been trained on all those forums, so it knows how to respond when someone claims they have Morgellons. It knows the part it's supposed to play in the conversation.
estelle
in reply to Cory Doctorow: from a trans person:
kf exists
but yeah, whatever this is, sounds different to that
Jeppe Bundsgaard på Mastodon
in reply to Cory Doctorow: pluralistic.net/2025/09/18/aut… fails (dates differ)
Cory Doctorow
in reply to Jeppe Bundsgaard på Mastodon