Instagram’s recommendation algorithms bring pro-anorexia and eating disorder content to millions of users, including those whose bios identify them as under 13, according to a new report by Fairplay, an advocacy organization focused on the digital well-being of children.
The report states that nearly 20 million Instagram users are “fed content from Instagram’s Pro-Eating Disorder bubble,” and that many of them are in their teens or younger.
To understand the reach of pro-eating disorder content, the researchers identified 153 “initial accounts” that were public, had more than 1,000 followers, and expressly advocated for eating disorders. They calculated that approximately 1.6 million Instagram users followed at least one of these accounts, and that 88,655 users followed three or more. The researchers found that nearly 20 million Instagram users followed at least one of those 88,655 users, and they may be prompted to follow the initial accounts because they have a mutual connection.
“One of the things that struck me was how profoundly easy it was to identify this pro-eating disorder bubble,” said Rys Farthing, director of data policy at advocacy group Reset Australia and leader of the research.
Farthing said exposure to the content was primarily driven by suggestions from Instagram about which users to follow. Test accounts that expressed an interest in weight loss or disordered eating were quickly inundated with recommendations from the platform to follow other users with these interests, including those who openly encourage disordered eating.
“All you would have to do is put some safeguards around that ‘follow recommendations’ algorithm and you could burst that bubble,” Farthing said.
In response to questions from BuzzFeed News about the Fairplay report, Meta spokesperson Liza Crenshaw said in a statement: “Reports like this often misunderstand that completely removing content related to people’s journeys with or recovery from eating disorders can exacerbate difficult moments and isolate people. Experts and safety organizations have told us that it’s important to strike a balance and allow people to share their personal stories while removing any content that encourages or promotes eating disorders.”
Crenshaw said by email that when users search for or post pro-eating disorder content, the company highlights organizations that can provide help, and users have the option to specifically report eating disorder-related content. She added that accounts sharing self-harm content will not be recommended and that Instagram is working to restrict search results related to known self-harm hashtags and accounts.
Researchers, journalists, and advocates have raised the alarm about eating disorder content on Instagram for years, culminating in the fall of 2021 when internal Facebook documents provided by whistleblower Frances Haugen showed that Instagram made teenage girls feel worse about their bodies. This new report suggests that Meta still struggles to curb this type of harm.
But Farthing and others hope change is just around the corner: US Senators Richard Blumenthal and Marsha Blackburn recently introduced the Kids Online Safety Act, which would create a duty for platforms to “act in the best interest of a minor” using their services. The California legislature is considering a similar measure inspired by the UK Age Appropriate Design Code, which would require companies to consider the “best interests” of children when building or modifying their algorithms.
“If we can muster the courage to hold tech companies to account, we might be able to pass some of this legislation,” Farthing said. “And maybe when we have this conversation next year, it might actually have put me out of business.”