The chatbot will see you now: how AI is being trained to spot mental health issues in any language | Global development

When patients call Butabika Hospital in Kampala, Uganda, seeking help with mental health problems, they are also helping future patients by contributing to the creation of a therapeutic chatbot.

Calls to the clinic's hotline are used to train an artificial intelligence algorithm that the researchers hope will eventually power a chatbot offering therapy in local African languages.

One in 10 people in Africa is struggling with mental health problems, but the continent faces a serious shortage of mental health professionals, and stigma is a huge barrier to care in many places. Experts believe AI can help address these problems where resources are scarce.

Professor Joyce Nakatumba-Nabende, research director of the Makerere Artificial Intelligence Laboratory at Makerere University. Photograph: courtesy of the Kampala Geopolitical Conference.

Professor Joyce Nakatumba-Nabende is the Research Director of the Makerere Artificial Intelligence Laboratory at Makerere University. Her team works with Butabika Hospital and Mirembe Hospital in Dodoma in neighboring Tanzania.

Some callers simply want factual information about opening hours or staff availability, but others report suicidal thoughts or other warning signs about their mental health.

“Someone probably won't say the word 'suicide' or won't say the word 'depression' because some of these words don't even exist in our local languages,” says Nakatumba-Nabende.

After stripping patient-identifying information from the call recordings, Nakatumba-Nabende's team uses artificial intelligence to analyse them and learn how people speaking Swahili or Luganda — or another of Uganda's dozens of languages — might describe specific mental health conditions such as depression or psychosis.

Over time, the recorded conversations could be run through an artificial intelligence model that would determine that “based on this conversation and the keywords, maybe there is a tendency towards depression, there is a tendency towards suicide, [and so] can we escalate the call or call the patient back for follow-up,” says Nakatumba-Nabende.

She says current chatbots typically do not understand the context of how care is provided or what is available in Uganda, and are only available in English. The ultimate goal is to “bring mental health care and services to the patient” and identify as early as possible when people need the more specialised care offered by psychiatrists.

According to Nakatumba-Nabende, the service could even be provided through SMS messages to people who do not have a smartphone or internet access.

According to her, the chatbot has many advantages. “When you automate, it happens faster. You can easily provide more services to people and get results faster than if you train someone to get a medical degree and then specialize in psychiatry and then do internships and training.”

Promoting World Mental Health Day in Kampala, Uganda, in 2020. Photograph: Xinhua/Alamy

Scale and availability are also important: an AI tool is easily accessible at any time. And, Nakatumba-Nabende says, people don't want to be seen in clinics seeking mental health help because of the stigma. A digital intervention gets around this.

She hopes the project will mean the existing workforce can “help more people” and “reduce the burden of mental illness in the country.”

Miranda Wolpert, director of mental health at the Wellcome Trust, which funds various projects using AI for mental health around the world, says the technology opens up big opportunities in diagnostics. “Right now we're very reliant on people filling out what are essentially paper and pencil questionnaires, and maybe AI can help us think more effectively about how we can identify those who are struggling,” she says.

Technology-assisted treatment can also be very different from traditional mental health options such as talking therapy or medication, Wolpert says, citing a Swedish study on how playing Tetris can help relieve symptoms of post-traumatic stress disorder.

However, regulators are still grappling with the implications of the increased use of AI in healthcare. For example, the South African Health Products Regulatory Authority (SAHPRA) and the medical NGO Path are using Wellcome funding to develop a regulatory framework.

Bilal Mateen, director of artificial intelligence at Path, says it is important for countries to develop their own regulations. “‘Does this thing work well in Zulu?’ is a question that matters to South Africa, but one I don’t think the FDA [US Food and Drug Administration] has ever considered,” he says.

Christelna Reinecke, SAHPRA's chief operating officer, wants users of a mental health AI algorithm to have the same confidence that it is tested and safe as someone taking a medicine. “It won't cause you to hallucinate, give you weird results, or do more harm than good,” she says.

Staff from Butabika Hospital in Uganda, which launched an artificial intelligence call center in collaboration with Makerere University and Mirembe Hospital in Tanzania. Photo: Courtesy of Butabika Hospital.

In the background is the spectre of chatbot-related suicides and cases where AI seems to have caused psychosis.

Reinecke wants to develop an advanced monitoring system that can identify “risky” results from generative artificial intelligence tools in real time. “It can't be something 'post-event,' so far removed from the event that you might have put other patients at risk because you didn't intervene quickly enough,” she says.

The UK regulator, the Medicines and Healthcare products Regulatory Agency (MHRA), has a similar initiative and works with technology companies to understand how best to regulate artificial intelligence in medical devices.

Regulators must decide which risks are important to monitor, Mateen says. Sometimes the benefits outweigh the potential harms to the point where we have “an incentive to put it in people's hands because it will help them.”

It is hoped that the chatbot will help reduce the mental health workforce gap not only in Africa but throughout the world. Illustration: Getty Images

While most of the talk about artificial intelligence revolves around chatbots such as Google Gemini and ChatGPT, Mateen suggests that “AI and generative AI… could be used to do much more,” such as training peer counsellors to provide better care or matching people with the best treatment faster.

“A billion people around the world today are experiencing mental health problems,” he says. “We don't just have a workforce shortage in sub-Saharan Africa; we have workforce shortages everywhere – talk to someone in the UK about how long they have to wait to access talking therapy.

“Unmet needs everywhere could be met more effectively if we had better access to safe and effective technologies.”
