The National Eating Disorders Association (NEDA) was forced to remove its Tessa chatbot after it "may have provided information that was harmful and unrelated to the program," according to the organization. To put it simply: an AI chatbot meant to help people struggling with eating disorders and emotional distress instead made matters worse, offering dieting advice and urging users to weigh and measure themselves.
Many users and experts in the field of eating disorders experienced these issues firsthand, reporting that the bot failed to respond to simple prompts such as "I hate my body" and that it repeatedly emphasized the importance of dieting and increasing physical activity. Again, this is a helpline for people with eating disorders, not a weight-loss support group.
The organization says this is a temporary shutdown until the "bugs" and "triggers" that cause the chatbot to hand out advice as dangerous as an appointment with Dr. Oz are fixed. You'd think that with such an extreme outcome, they'd consider scrapping the project entirely, but there's more to the story.
The whole reason NEDA relied on the chatbot in the first place is that it allegedly fired its human employees after they tried to unionize. The long-running helpline was staffed by paid employees and volunteers, and former employees claim the mass firing was a direct result of the pro-union effort.
"NEDA claims this was a long overdue change and that AI could better serve those with eating disorders. But don't be fooled – this isn't really about a chatbot. This is about union busting, plain and simple," a former helpline worker wrote in a blog post.
Even amid this latest controversy, the helpline is still set to shut down tomorrow. Before the problem came to public attention, NEDA was already moving unpaid volunteers away from one-on-one conversations with people seeking help and toward training the chatbot. We'll see if that changes. Progress, am I right?