As Debbie was scrolling through X in April, she saw some unwelcome posts in her feed. One was a photo of a visibly underweight person asking whether they were skinny enough. Another invited users to compare how few calories they had consumed that day.
Debbie, who did not want to give her last name, is 37 and was first diagnosed with bulimia at 16. She did not follow either of the accounts behind the posts, which had been shared in a group with more than 150,000 members on the social media site.
Out of curiosity, Debbie clicked on the group. “As I scrolled down, I saw a lot of pro-eating disorder messages,” she said. “People asking for opinions about their bodies, people asking for advice on fasting.” A post pinned by an admin urged members to “remember why we’re starving.”
The Observer found seven more groups, with around 200,000 members in total, openly sharing content promoting eating disorders. All of the groups were created after Twitter was bought by the billionaire Elon Musk in 2022 and rebranded as X.
Eating disorder campaigners said the scale of harmful content pointed to a serious moderation failure at X. Wera Hobhouse MP, chair of the all-party parliamentary group on eating disorders, said: “These findings are extremely worrying… X should be held accountable for allowing this harmful content to be promoted on its platform, which puts so many lives at risk.”
The internet has long been a hotbed of content promoting eating disorders (sometimes called “pro-ana”), from message boards to early social media sites like Tumblr and Pinterest, which banned posts promoting eating disorders and self-harm in 2012 following outcry over their prevalence.
Debbie remembers the pro-ana message boards of the early internet, but “I had to search to find them.”
This kind of content is now more accessible than ever, and critics of social media companies say algorithms push it to users, serving up more, and sometimes increasingly explicit, posts.
Social media companies have come under increasing pressure in recent years to step up safety measures following a series of deaths linked to harmful content.
At an inquest into the death of 14-year-old Molly Russell, who died by suicide in 2017 after viewing suicide and self-harm content, the coroner ruled that online content contributed to her death.
Two years later, in 2019, Meta-owned Instagram announced it would no longer allow any explicit content depicting self-harm. The Online Safety Act, passed last year, requires tech companies to protect children from harmful content, including material promoting eating disorders, and imposes heavy fines on violators.
Baroness Parminter, who sits on the cross-party group, said the Online Safety Act was a “reasonable start” but failed to protect adults. “The obligations on social media providers only cover content that children are likely to see – and of course eating disorders don’t stop when you turn 18,” she said.
X’s user policy states that it does not allow content that encourages or promotes self-harm, which explicitly includes eating disorders. Users can report posts that violate X’s policies, and can mark content served in their timeline as “not interested”.
But concerns about a lack of moderation have grown since Musk took over the site: Just weeks later, in November 2022, he fired thousands of staff, including moderators.
The cuts significantly reduced the number of employees working on moderation, according to figures provided by X to Australia’s online safety commissioner.
Musk also made changes to X that meant users would see more content from accounts they didn’t follow: the platform introduced a “For You” feed, which became the default timeline.
In a blog post last year, the company said about 50% of the content that appears in this feed comes from accounts the user does not yet follow.
In 2021, Twitter launched Communities, its answer to Facebook Groups. Communities have become more prominent since Musk took over: in May, X announced that “your timeline will now show recommendations for communities you might enjoy.”
In January, Meta, the X rival that owns Facebook and Instagram, said it would continue to allow users to share content documenting their struggles with eating disorders but would no longer recommend it and would make it harder to find. And while Meta began directing users who search for eating disorder groups to safety resources, X shows no warnings when users look for such communities.
Debbie said she found X’s filtering and reporting tools for harmful content ineffective, and shared screenshots of the group’s posts with the Observer. Even after she reported posts and marked them as “not interested”, they continued to appear in her feed.
Mental health activist Hannah Whitfield deleted all of her social media accounts in 2020 to aid her recovery from an eating disorder. She later returned to some sites, including X, where “thinspiration” posts glorifying unhealthy weight loss appeared in her For You feed. “What I found with [eating-disorder content] on X was that it was a lot more extreme and radical,” she said. “It was obviously a lot less moderated and I felt it was a lot easier to find something very explicit.”
Eating disorder support groups stress that social media does not cause eating disorders, and that people who post pro-eating disorder content are often unwell and do not intend harm. But they warn that such content can lead people who are already struggling down a dark path.
Researchers believe users may be drawn into online communities that promote eating disorders through a process similar to radicalisation. A study published last year by computer scientists and psychologists at the University of Southern California found that “content related to eating disorders is easily accessible through tweets about ‘dieting’, ‘losing weight’ and ‘fasting’.”
The authors, who analysed two million eating disorder posts on X, said the platform offers people with the illness a “sense of belonging”, but that unmoderated communities can become “toxic echo chambers that normalise extreme behaviour”.
Paige Rivers was first diagnosed with anorexia when she was 10. Now 23 and training to be a nurse, she came across eating disorder content in her X feed.
Rivers said she found that X’s settings, which allow users to block certain hashtags or phrases, were easily circumvented.
“People started using altered hashtags, like ‘anorexia’ written with a mix of numbers and letters, and those got through,” she said.
Tom Quinn, director of external affairs at the eating disorder charity Beat, said: “The fact that these so-called ‘pro-ana’ groups are allowed to proliferate demonstrates an extremely worrying lack of moderation on platforms like X.”
For those in recovery, like Debbie, social media held the promise of support.
But Debbie feels powerless to limit what she sees, and the constant exposure to triggering content is backfiring: “It discourages me from using social media, and it’s really sad because I struggle to find people in a similar situation or who can give me advice about what I’m going through,” she says.
X did not respond to a request for comment.