And as with other kinds of harmful content on the internet, platforms hosting pro-ED communities have long struggled with how to moderate them. In 2001, Yahoo removed more than 100 pro-ED sites from its servers, saying they violated its terms of service. A decade later, a HuffPost exposé about teenage girls creating “thinspiration” blogs on Tumblr prompted that site and others to ban explicitly pro-ED communities.
New research published last month in the peer-reviewed journal New Media & Society highlights how pro-ED groups continue to evade attempts at moderation. The study also found that sites like Pinterest and Instagram sometimes suggest more pro-ED content to users via their recommendation algorithms. It isn’t an isolated problem; researchers have found that recommendation engines on platforms like YouTube also suggest problematic content, like conspiracy theories. But unlike those who spread fake news, users who share pro-eating disorder content may be suffering from a serious illness like anorexia or bulimia. Companies need to weigh not just the content itself, but also the effect that removing it might have on the vulnerable people who share it.
“It’s very, very difficult to tease out what would fit under the category of toxic, or pro-eating disorder content,” says Claire Mysko, the CEO of the National Eating Disorders Association, which has worked with social media sites to help moderate pro-ED communities. “The people who are posting it and who are engaged in these communities are really struggling. You don’t want to set it up as this is good and bad, demonizing the users who are posting this content.”
The Hashtag Dilemma
The New Media & Society study focuses on hashtags as a tool for content moderation, and underscores how difficult it can be for tech companies to find problematic content and to decide what should and shouldn’t be removed.
After the HuffPost article was published in 2012, Instagram, Pinterest, and Tumblr began moderating pro-ED hashtags and search terms.
Now when users search for tags related to eating disorders, such as #bulimia, the sites either block results entirely or surface a pop-up message asking if they want to seek help.
It’s an understandable strategy: hashtags, unlike images, can be easily categorized and flagged by automated detection systems and human moderators. They’re also how many people find new content, so blocking certain hashtags limits how many people a post may reach. Historically, tagged content has also been over-emphasized by social science researchers for many of the same reasons. It’s easier to analyze a large data set of specific hashtags than to manually comb through untagged content.
But specific hashtags aren’t permanent fixtures of social media, and they can easily morph to suit the communities’ needs. A separate 2016 study from the Georgia Institute of Technology found that pro-ED users simply began to intentionally misspell or alter terms: “#thinspiration” became “#thynspiration,” “#thinspire,” or “#thinspirational,” for example. In order to evade moderation, many pro-ED accounts and blogs don’t use any hashtags at all, making them harder for platforms to find and researchers to study.
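The evasion dynamic is easy to see in code. The sketch below is illustrative only, not any platform’s actual system: the blocklist, the tags, the similarity threshold, and the use of Python’s difflib are all assumptions. It shows how an exact-match hashtag filter misses a one-letter misspelling, how fuzzy matching can recover it, and how a shortened variant still slips through.

```python
# Illustrative sketch only: how exact-match hashtag blocking is evaded by
# misspellings, and how fuzzy matching recovers some (not all) variants.
# The blocklist, threshold, and tags below are hypothetical examples.
from difflib import SequenceMatcher

BLOCKLIST = {"thinspiration", "proana", "bulimia"}

def flag_tag(tag: str, threshold: float = 0.85) -> bool:
    """Return True if a tag matches or closely resembles a blocked term."""
    tag = tag.lstrip("#").lower()
    if tag in BLOCKLIST:  # exact match: the easy case
        return True
    # Fuzzy comparison catches one-letter swaps like "thynspiration" ...
    return any(SequenceMatcher(None, tag, blocked).ratio() >= threshold
               for blocked in BLOCKLIST)

for t in ["#thinspiration", "#thynspiration", "#thinspire", "#fitness"]:
    print(t, flag_tag(t))
# ... but a shortened variant like "#thinspire" still falls below the
# 0.85 threshold, illustrating the cat-and-mouse problem moderators face.
```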
“This group is really savvy, because they know what they’re doing is unacceptable in society’s eyes, in the platform’s eyes, in trolls’ eyes,” says Ysabel Gerrard, the author of the recent study and a lecturer at the University of Sheffield. “They’re so aware that their account might be taken down.”
How the Study Was Conducted
Gerrard began her research by creating new accounts on Instagram, Pinterest, and Tumblr on an iPhone. All three sites’ Community Guidelines state that they don’t permit users to post content “encouraging or urging people to embrace self-injury,” or that “actively promotes ... eating disorders.”
She immediately found that Instagram’s pro-ED hashtag ban has an easy workaround: users can’t search for banned terms as hashtags, but they can search for accounts that include those keywords in their usernames. She identified 74 public accounts that had terms like “proana,” “proanorexia,” or “thighgap” in their names or bios and that also posted pro-ED content. She then analyzed 1,612 of their posts, only 561 of which had hashtags, cataloguing the content of each image and its caption.
On Tumblr, Gerrard followed a number of terms related to pro-ED content, like “thinspo,” “proana,” and “bulimic.” Tumblr lets users follow topics rather than specific accounts; you can follow “movies,” for example, without following anyone who posts about movies. Through this method, she found 50 pro-ED blogs and analyzed 20 posts from each, or 1,000 posts total. Only 218 of those posts were tagged.
Examining the 2,612 Instagram and Tumblr posts, Gerrard uncovered a complicated lexicon of signals that would likely evade any platform’s moderation efforts. For example, she found a number of posts related to diet plans, like the Ana Boot Camp Diet, which promotes drastically lowering your caloric intake. Advertised as the ABC Diet, it’s difficult to distinguish from much of the mainstream fitness and diet content that proliferates on apps like Instagram. Some of those diets, like the popular ketogenic diet, are also incredibly restrictive, indicating how blurry the line is between pro-ED communities and others that ostensibly don’t violate a site’s rules.
The Role of Recommendations
One of the most troubling findings of the study is the role that recommendation algorithms play. Gerrard found that after viewing pro-ED posts on Pinterest, the site suggested she might “love” other “ideas” that were all “connected to death and suicide, such as ‘how to die’ and ‘wanting to die quotes.’” Eating disorders often co-occur with other forms of mental illness, like depression and anxiety. The site also emailed Gerrard recommendations for “popular Pins for you,” which included topics like “hip bones.”
“We have policies against saving this type of content from other sites and social media to our platform,” a Pinterest spokesperson said in a statement. “If this content enters our system, we have a number of automated and manual processes in place to remove it or make it harder to find, and we will not show results if someone searches for these terms. We take this content very seriously and are always working to get better.”
Instagram and Tumblr’s recommendation algorithms behaved similarly, according to Gerrard’s research. On Instagram, for example, Gerrard never liked or commented on any posts, actions that other users could see and that might influence their behavior. But she did “save” pro-ED content to her saved-posts folder, which doesn’t notify the poster. After she began saving posts, Instagram’s Explore tab was flooded with other pro-ED content. Instagram and Tumblr did not respond to repeated requests for comment.
Recommendation algorithms on other platforms have also inadvertently promoted problematic content to users. Both the New York Times and the Wall Street Journal found earlier this year that YouTube’s recommendation algorithm amplifies conspiracy theories and other extremist content.
“Platforms have not yet algorithmically reconciled their moral stances on eating disorders and self-harm, meaning they simultaneously push and deny problematic content to their users,” Gerrard wrote in her study.
What Platforms Can Do
Not all experts agree that pro-ED content should be taken down wholesale. One study found that these groups can be useful places for people to articulate their illnesses, sometimes for the first time. Another concern is that if platforms remove pro-ED users, those users will simply migrate to harder-to-reach places, like private groups or ephemeral apps such as Snapchat.
At the same time, there’s no doubt that pro-eating disorder groups can be toxic and harmful. “It was one of the worst exacerbators of the illness for me, those websites,” says one woman who has recovered from anorexia and asked not to have her real name used. “A forum is a perfect place to spend all your time and your energy on your disorder. You’re getting validation for your sickness.”
Mysko, the CEO of NEDA, and Gerrard agree that social media companies can do more to help users who are struggling with eating disorders.
Gerrard suggests that platforms refrain from deleting a person’s entire account and instead consider removing individual posts. That way, the person isn’t abruptly cut off from other users who may be supportive. Many pro-ED accounts use pseudonyms, making it difficult to reestablish contact if an account is suspended.
Gerrard also recommends platforms employ trained moderators who specialize in identifying harmful pro-ED content. Companies’ Community Guidelines are often vague; it’s not easy for moderators or users to know when they’ve crossed the line.
Mysko says platforms should also consider working directly with already-popular fitness and health influencers to promote healthier messages about eating and body image. Pro-eating disorder groups and blogs have been online since at least the early 2000s. What’s changed is that they now sit alongside a sea of personalized, often gendered content that reinforces the same ideal of thinness, just not as explicitly.
“There’s a tendency to compare. A lot of people are really perfectionist, and that really gets amplified in a social media space,” says Mysko. “It’s a constant reinforcement of those insecurities that are at the heart of many eating disorders.”
If you or someone you care about is struggling with an eating disorder, you can call the National Eating Disorders Association’s hotline at (800) 931-2237 or text “NEDA” to 741741. More information about available resources is on NEDA’s website.