AI therapists could reduce humanity to predictive patterns and thus sacrifice the intimate, personal touch expected of traditional human therapists. “The logic of PAI leads to a future in which we may all find ourselves as patients in an algorithmic asylum run by digital overseers,” Oberhaus writes. “In an algorithmic asylum, there is no need for bars on windows or padded rooms, because there is no way to escape. The asylum is already everywhere – in your homes and offices, schools and hospitals, courtrooms and barracks. Wherever there is an Internet connection, the asylum awaits.”
Chatbot Therapy: A Critical Analysis of AI Mental Health Treatment
Eoin Fullam
ROUTLEDGE, 2025
Eoin Fullam, a researcher who studies the relationship between technology and mental health, echoes some of the same concerns in his book, Chatbot Therapy: A Critical Analysis of AI Mental Health Treatment. A heady academic primer, the book examines the assumptions behind the automated treatments offered by artificial intelligence chatbots and how capitalist incentives can corrupt such tools.
Fullam notes that the capitalist mentality behind new technologies “often results in questionable, illegitimate and illegal business practices in which customer interests are secondary to strategies for market dominance.”
This doesn't mean that therapy bot makers “will inevitably engage in nefarious activities that are contrary to the interests of users in their quest for market dominance,” Fullam writes.
But he notes that the success of AI therapy rests on two inextricable desires: to make money and to treat people. In this logic, exploitation and therapy feed off each other: each digital therapy session generates data, and that data feeds a system that profits from unpaid users seeking help. The more effective the therapy seems, the more the cycle reinforces itself, making it harder to disentangle healing from commodification. “The more users benefit from an app, in terms of its therapeutic or other mental health benefits,” he writes, “the more exploited they become.”
This sense of an economic and psychological ouroboros – a snake eating its own tail – serves as a central metaphor in Sike, the debut novel by Fred Lunzer, an author with a background in artificial intelligence research.
Described as “the story of a boy who meets a girl and an AI therapist,” Sike follows Adrian, a young Londoner who writes rap lyrics for a living, through his romance with Mackie, a business professional with a knack for finding profitable technologies in beta testing.
Sike
Fred Lunzer
CELADON BOOKS, 2025
The title refers to the heavily advertised AI therapist named Sike, loaded into the smart glasses that Adrian uses to ask questions about his myriad worries. “When I signed up for Sike, we installed my dashboard, a wide black panel like an airplane cockpit that displayed my daily ‘vital signs,’” Adrian says. “Sike can analyze the way you walk, the way you make eye contact, what you say, what you wear, how often you pee, poop, laugh, cry, kiss, lie, whine and cough.”