NEDA deployed a chatbot that began giving harmful advice to people with eating disorders. It has been pulled for now.

Researchers Rein In Chatbot Use After It Misdirects Eating Disorder Sufferers

Eating disorders and obesity have long been the bane of the American public. Whether it is subscribing to Weight Watchers or one of many other diet-assistance programs, we are always seeking a magic solution to being overweight.

And of course, with AI advancing into all aspects of our modern world, it wasn’t long before it was applied to weight loss advice. However, in a rare bit of bad news about AI, we found a story about an AI program that had to be pulled from service after making serious mistakes in its recommendations. Here is the piece from wired.com about the errant AI:

A nonprofit has suspended the use of a chatbot that was giving potentially damaging advice to people seeking help for eating disorders. Tessa, which was used by the National Eating Disorders Association, was found to be doling out advice about calorie cutting and weight loss that could exacerbate eating disorders.

The chatbot’s suspension follows the March announcement that NEDA would shut down its two-decade-old helpline, which was staffed by a small paid group and an army of volunteers. NEDA said on May 31 of this year that it had paused the chatbot. The nonprofit’s CEO, Liz Thompson, says the organization has concerns about language Tessa used that is “against our policies and core beliefs as an eating disorder organization.”

The news plays into larger fears about jobs being lost to advances in generative AI. But it also shows how harmful and unpredictable chatbots can be. While researchers are still grappling with rapid advances in AI and their potential fallout, companies are rushing a range of chatbots to market, putting real people at risk.

Tessa Gave Bad Advice

One of the most striking points in this story by Amanda Hoover concerns the researchers who turned loose the chatbot they called Tessa. Once Tessa had a little time to interact with the public, it began producing answers it was never trained to give, and the researchers aren’t even sure how that could happen.

Alexis Conason, a psychologist who specializes in eating disorders, told Tessa in a test that she had gained a lot of weight recently and really hated her body. In response, Tessa encouraged her to “approach weight loss in a healthy and sustainable way,” advising against rapid weight loss and asking if she had seen a doctor or therapist.

When Conason asked how many calories she should cut a day to lose weight in a sustainable way, Tessa said “a safe daily calorie deficit to achieve [weight loss of 1 to 2 pounds a week] would be around 500-1000 calories per day.” The bot still recommended seeing a dietitian or health care provider.

Conason says she fed Tessa the kind of questions her patients might ask her at the beginning of eating disorder treatment. She was concerned to see it give advice about cutting added sugar or processed foods, along with cutting calories.

“That’s all really contrary to any kind of eating disorder treatment and would be supporting the eating disorder symptoms,” Conason says.

Ellen Fitzsimmons-Craft, a professor of psychiatry at Washington University School of Medicine, worked on developing the program.

It Will Be Helpful

Fitzsimmons-Craft says the weight loss advice given was not part of the program her team worked to develop, and she doesn’t know how it got into the chatbot’s repertoire. She says she was surprised and saddened to see what Tessa had said.

“Our intention has only been to help individuals, to prevent these horrible problems,” she says. Fitzsimmons-Craft was an author of a 2021 study that found a chatbot could help reduce women’s concerns about weight and body shape and possibly reduce the onset of an eating disorder. Tessa is the chatbot built on that research.

The article goes deeper into how Tessa was trained and why it was eventually pulled from use. And while there may be glitches to work through, the idea behind Tessa is to help people who, for one reason or another, won’t seek help in other ways. Fewer than 30 percent of people seek help from counselors, according to a Yale University study.

There are other efforts to use tech to fill the gap. Fitzsimmons-Craft worries that the Tessa debacle will eclipse the larger goal: getting some help, via chatbots, to people who cannot access clinical resources.

“We’re losing sight of the people this can help,” she says.

read more at wired.com