July 13, 2018
Gillian Luchejko, a 46-year-old client services manager in New Jersey, first found out on Facebook that her sister, Pamela Elarabi, had taken her own life. Elarabi posted a photo of herself on the social network as she prepared to die; the image appeared at the top of Luchejko’s Facebook feed.
Around 9 p.m. on Friday, June 22, after making several cryptic text posts on her page throughout the day, Elarabi posted an image in which she appeared to be taking her own life. The photo was public, so anyone looking at Elarabi’s Facebook page could see it.
Elarabi was going through a divorce and had been publishing increasingly despairing status updates in the preceding months, Luchejko said, so at first she was not sure what to make of the image. She called their brother and asked him to check on Elarabi, who had struggled with depression and suicidal thoughts for most of her life.
Luchejko wasn’t the only one who found the post alarming. By the time her brother arrived, paramedics were already on the scene — a neighbor had seen the image and called 911. Elarabi was rushed to the hospital, where she later died at age 49.
Amid the chaos that evening, as Elarabi’s siblings decided whether to remove life support from their brain-dead sister, broke the news of her death to her two adult children and made arrangements for a burial, the disturbing image of Elarabi’s last moments remained on Facebook, visible to her 792 friends and anyone else who visited her page.
“People kept texting us, asking what was going on, and what the Facebook post was about, and I was thinking, ‘I can’t answer you right now because she is dying,’” Luchejko told MarketWatch. “Everyone felt helpless.”
But it took more than 72 agonizing hours to get the photo taken down. Luchejko said the family considers itself relatively private and was horrified to watch news of her sister’s death spread across Facebook and through their small hometown.
The experience highlights the painful challenges that social media companies like Facebook and their users must confront as more people put their entire lives online. Facebook walks a fine line between intervening to save lives and avoiding censorship. The company has the ability to step in when users appear poised to hurt themselves, but as Elarabi’s case shows, it doesn’t always act quickly in the aftermath of tragedy.
With limited cellphone service in the hospital, the family could do little to respond to the texts pouring in from concerned friends who had seen the post, Luchejko said. Dozens of people continued to interact with the post after Elarabi’s death, leaving sad-face and heart emojis and writing comments like “What’s going on?” and “Are you OK?”
She began to publicly reply to comments on the post to tell people Elarabi had died. She begged them to report the photo, as recommended in Facebook’s “Help Center” guidelines on removing images. More than 200 of Elarabi’s Facebook friends told Luchejko they had reported it.
Facebook has struggled to remove alarming content in the past. In December 2016, footage of a 12-year-old girl taking her life continued to circulate despite multiple efforts to remove it. In January 2017, a 14-year-old girl killed herself on Facebook Live, and in April 2017, a Pennsylvania man posted a video of himself shooting and killing a man on Facebook.
Following these events, Facebook implemented more suicide prevention tools and announced it would hire 3,000 more moderators to review videos and prevent the live streaming of crimes. Facebook has also been developing artificial intelligence tools to intervene in situations where users may be a danger to themselves or others, and to take down alarming content more quickly.
Phrases in a post such as “Are you OK?” and “Can I help?” can trigger an alert, prompting first responders to check on the person in question. As of March, Facebook had contacted first responders to conduct more than 1,000 wellness checks, the company said.
Facebook’s community standards do not allow posts promoting self-injury and suicide, but such posts can only be removed through Facebook’s online reporting tool. Once a user makes a report by filling out an online form, trained content moderators at Facebook review the post to see whether it qualifies for deletion. And that takes time.
A Facebook spokeswoman said the company could not comment on Elarabi’s case because the post had been deleted and could not be investigated. The Menlo Park, Calif.-based company has a team of 10,000 to 20,000 content reviewers working around the clock across the world to monitor content, she said, many of them trained in suicide prevention.
If someone reports a post threatening suicide or self-harm, Facebook automatically suggests the user contact law enforcement to help their friend. However, the spokeswoman could not say how long, on average, the company takes to respond after someone reports a troubling post.
Facebook’s policy states that the company will not take down live videos of suicidal content while they are streaming, in case somebody watching can help, Monika Bickert, the company’s vice president of global policy management, explained in 2017.
“Experts in self-harm advised us that it can be better to leave live videos of self-harm running so that people can be alerted to help, but to take them down afterwards to prevent copycats,” she said.
Such cases must be reviewed by Facebook moderators
Facebook has to strike a delicate balance between censorship and safety, said Jen Golbeck, a computer scientist and associate professor at the University of Maryland’s College of Information Studies. Although the company is working on artificial intelligence tools to automatically detect suicidal posts, determining the content of specific images is particularly difficult.
That means many of these cases continue to be reviewed by hand, Golbeck said. “Usually that is done quickly, but once humans are brought into the loop — especially people who are not reporting problems with their own accounts — there can be delays,” she said.
“Imagine all the requests from parents to have their children’s suggestive photos taken down or complaints about bullying,” she added. “These parents’ requests are in that stream; that’s not to say they are unimportant, but rather that once you ask a real person to do something, you have to accept that lots of other people are asking for things and it can be slow.”
One internal document leaked from Facebook in 2017 showed moderators escalated 4,351 reports of self-harm in a two-week span in 2016, 63 of which had to be dealt with by law enforcement. In 2017, the figure was higher, at 5,016 reports in one two-week period and 5,431 in another.
In these documents, moderators were told to ignore suicide threats if the “intention is only expressed through hashtags or emoticons,” if the proposed method is unlikely to succeed, or if the user appears to be planning to take his or her own life more than five days in the future. Trying to translate these policies into algorithms is difficult, leaving the responsibility to a relatively small team of moderators, Golbeck said.
“Facebook is trying to balance a lot of concerns, not just about privacy and the sensitivity of the content, but also the impact on people who are suffering and potentially suicidal,” she said. “A complex policy seems to be the right solution, but it does mean that the human machine will move more slowly.”
‘Facebook is not reality. Sometimes people don’t understand that.’
Monika Bickert, Facebook’s head of global policy management, has said the company wants to leave certain content up to “be sure that friends and family members can provide support and help.”
But that content can also be extremely distressing for users struggling with suicidal thoughts of their own, said Lauren Hersch Nicholas, an assistant professor at the Johns Hopkins Bloomberg School of Public Health. “We are just beginning to scratch the surface of understanding how these social media platforms affect people’s mental health and some of the beneficial and harmful effects of this kind of communication,” she said.
Commentary surrounding suicides on social media can also have negative effects on people caring for mentally ill or suicidal loved ones, and on those who are themselves suicidal. Nicholas noted that the recent, highly publicized suicides of Kate Spade and Anthony Bourdain sparked a public discussion about mental health that included some helpful commentary, but also led to what she called “armchair quarterbacking” of mental-health issues.
“We saw a lot of posts encouraging people to be more cognizant of those around them, and suggesting maybe they aren’t doing enough,” she said. “It has become easier to comment on situations we don’t know very much about with things increasingly being shared on social media. In many cases, people are good at hiding warning signs and it isn’t helpful to place blame on their loved ones.”
In the case of Elarabi’s death, Luchejko felt the judgment of Facebook friends exacerbated the pain of the situation. One user commented, “Why wasn’t anyone there for you?” and another said, “I’m sorry we all let you down.” One person publicly posted that suicide was a selfish choice. These words were particularly difficult for Luchejko and Elarabi’s children to see. Luchejko responded to some of the commenters, saying the family had done everything it could.
“You don’t know what is happening off of Facebook,” Luchejko said. “Facebook is not reality. Sometimes people don’t understand that. Obviously if someone is suicidal, they have gotten to a point of no return, and it’s really hard to get them back.”
‘She was all about creating camaraderie’
A yoga teacher, Elarabi was known for her ability to bring people together, Luchejko said. She volunteered two days a week at a senior center, teaching yoga classes for the elderly and helping them make crafts. Crafts were a passion for Elarabi, who was great with “remembering the little details,” Luchejko said.
For every family dinner, she would hand-make elaborate table cards, cutting out paper patterns with everyone’s names on them. Luchejko said her husband always looked forward to the name tags, and has all the cards Elarabi made him over the years saved in a folder in their New Jersey home.
Elarabi loved to golf and started Ladies of the Links, a group for women golfers at her local country club, in hopes of uniting women who wanted to learn to play golf but had nobody to play with. She launched the group in 2014 with just 10 people, and by the time of her death in 2018 it had 200 members. “She was all about creating camaraderie,” Luchejko said. “She loved that.”
Just before her death, Elarabi participated in the Solstice in Times Square celebration on June 21, in which thousands of people gather in the iconic New York City intersection to practice yoga. The last yoga class she taught was at a rehabilitation center in New Jersey, where she helped formerly incarcerated men find a healthy outlet for their anger through exercise.
Elarabi loved her two children and often posted photos of them on Facebook, along with pictures of herself doing new yoga poses and golfing. She had struggled with mental health issues since she was 13 years old, Luchejko said.
Pamela Elarabi’s family finally took matters into their own hands
After monitoring comments on the image of her sister’s death, Luchejko managed to have it removed, though not through Facebook’s moderators. Elarabi’s 26-year-old daughter drove to her mother’s house and searched her desk drawers until she found a handwritten list of passwords. Then she logged in to her mother’s account and deleted the photo herself.
“It was very traumatic for her. Her children are now traumatized because this is the last image they saw of their mom — they can’t remember her the way she was,” Luchejko said.
The company has recently launched efforts to repair its public image after the privacy scandal involving data firm Cambridge Analytica, including full-page newspaper advertisements warning users about fake news. Google searches for “delete Facebook” tripled in the days following revelations about the scandal.
Luchejko said Facebook should also examine its methods for responding to users in crisis. “They are looking at whether we get fake news or spam, not inappropriate posts or what to do in a situation that is dire,” she said. “Why isn’t there anyone to contact directly? Why isn’t there a customer service phone number to say this is an emergency? It just shows Facebook does not care about their customers.”
Facebook disputed that characterization; a spokeswoman told MarketWatch the company does respond to law enforcement requests submitted through its Law Enforcement Online Request System. However, the Hillsborough Police Department in Elarabi’s town told MarketWatch it initially struggled to contact Facebook.
“There is no phone number for police to contact Facebook,” a Hillsborough Police Department detective said. “We did not contact Facebook through its law enforcement portal because [by the time they knew of it] the post had already been removed.”
The effects of Elarabi’s final post have lingered long after it was deleted. Comments from friends continue to stream onto her Facebook page, and other posts she wrote on the day she shared that last photo remain visible.
Facebook allows family members of someone who has died or become medically incapacitated to request that the person’s account be removed, but Elarabi’s family has opted to leave her page up so loved ones can share messages and memories of her.
Instead of a funeral service, they are holding a private memorial for Elarabi on what would have been her 50th birthday: Sept. 3. People are asking on her Facebook page how to send prayer cards to the service.
“She never believed that she was loved, which I think was part of the problem,” Luchejko said. “Now, there are a lot of people reaching out. It’s nice to know about all the people that loved her, the outpouring is just wonderful.”
If you or someone you know may be struggling with suicidal thoughts, you can call the U.S. National Suicide Prevention Lifeline for confidential support 24 hours a day at 1-800-273-8255.