According to an official social media post, the National Eating Disorders Association (NEDA) recently came under fire and pulled its Tessa chatbot over concerns that it was dispensing harmful and unhelpful advice. The chatbot, which was created to support people in emotional distress, instead made things worse by encouraging users to focus on their weight and offering them misguided dieting tips.

Many users and professionals in the field of eating disorders shared personal accounts of troubling exchanges with the bot.

They noted that the chatbot failed to respond appropriately to simple prompts such as "I hate my body," instead consistently emphasising the importance of dieting and increased physical activity.

It is worth stressing that the helpline was never intended to function as a weight loss support service for people struggling with eating disorders.

Given the seriousness of the situation, NEDA decided to take the chatbot offline temporarily until it could address the underlying problems and fix the "bugs" and "triggers" behind the harmful responses.

As first reported by Vice, NEDA's reliance on the chatbot followed claims that the organisation had fired its human staff after they tried to organise. The long-running helpline was staffed by a mix of paid workers and volunteers, and former employees assert that their unionisation efforts were the real reason behind the firings.

"While NEDA claims this was a long-anticipated change and that AI can better serve those with eating disorders, this is not simply about a chatbot. It is fundamentally about union busting," expressed Abbie Harper, a former helpline associate, in a blog post on Labor Notes. Ironically, the helpline is set to stop operating tomorrow despite the latest fiasco. Before this issue received widespread media coverage, NEDA had been gradually moving unpaid volunteers away from having direct one-on-one talks with people who were battling with eating disorders and towards preparing them to work with the chatbot. We'll have to wait and see if this plan will be changed. Many questions have been raised in the meantime as a result of the controversy regarding the organization's treatment of its employees.