In the wake of the suicide of a 31-year-old Taiwanese woman who told friends on Facebook she was planning to kill herself, Facebook's managers tell ABC News they have plans to work with other leading websites to provide more robust suicide prevention resources to web users.
“We’re working with other internet companies at formulating a list of best practices, so that there’s an understanding and a consensus, along with experts in the suicide prevention community, for online properties dealing with this issue,” Frederic Wolens, a spokesman for Facebook, told ABC News.
Wolens said the suicide of Claire Lin, who killed herself on her 31st birthday on March 18, highlighted a problem that social networks have been grappling with for years: suicidal individuals often reveal their despair on their social networking profiles, chatting with friends about it or leaving other signs.
“More and more, as Facebook becomes more widespread and pervasive, it’s becoming a better and better mirror for what’s going on in the real world,” Wolens said. “With suicides going on in the real world, the suicide touches some part of Facebook, whether it’s the signs leading up to it, or people who wrote things on their Facebook.”
In the case of Lin, the connection to Facebook was particularly gruesome. Lin chatted with nine Facebook friends on the website while she slowly killed herself by asphyxiation, inhaling the fumes from a charcoal barbecue in a closed room and typing messages about her slow death.
The friends begged her to open a window and put the fire out, but did not call police.
In other instances, individuals have written Facebook “status updates” confessing they wanted to kill themselves, or written messages to friends expressing suicidal thoughts. Rutgers University freshman Tyler Clementi brought widespread media and public attention to the issue after he killed himself in 2010. Moments before, he had posted a Facebook message saying, “Jumping of the george washington bridge. Sorry.”
Currently, Facebook offers resources to users in the U.S. who ask for them. If a person planning suicide mentions it on Facebook, and friends report it to administrators, Facebook will send messages to the person and his or her friends, offering help.
A private, one-on-one Facebook chat with a suicide prevention counselor would pop open on the person’s Facebook page, offering counseling free of charge. The person would also be offered local resources that could be found offline, Wolens said.
For a user who reports suicidal postings by a friend, Facebook offers resources on how to help a friend through that crisis or whom they could recommend the friend contact for help.
“So in the U.S. specifically, we already have a system where when we receive a report of a user that’s in distress, that goes into our safety team, which reviews the report to make sure it’s an authentic report, and after we’ve verified it, we reach out to the person who has reported it and the distressed user,” Wolens said. Facebook then offers the specific chat and local resources, a model the company plans to duplicate abroad.
Facebook also already houses helpline phone numbers and other resources in its Help Center.
What the company won’t do is scan users’ online activity for warning signs or mentions of suicidal thoughts, Wolens said. The sheer volume of messages posted by hundreds of millions of users each day, coupled with the nuance and context required to interpret phrases like “kill myself,” would make sorting through the data impractical.
For Facebook and other social websites, including Twitter, the potential to help suicidal users is great, while the challenge of implementing a practical system has been enormous, Wolens said.
“Facebook has a tremendous opportunity. There are over 850 million users on the website, creating the largest community watch group ever,” he said. “So while we provide ways to report these possible problems to us, we want to make sure we have the correct processes in place and are doing exactly what experts advocate. Regardless of the size of the problem, we have a tremendous opportunity to help.”
Wolens declined to name the other companies Facebook is working with on the issue, though he said they are leading Internet and tech companies. The group first met in January 2011 to begin talking about the problem and possible solutions, and met again in January 2012, he said.
The group’s goal is to standardize the best ways a website or Internet company can deal with suicidal users, primarily by offering resources and one-on-one help to those who ask for it. The companies have consulted with suicide prevention specialists and with organizations representing groups at greater risk for suicide, such as the LGBT community and veterans, Wolens said.
“Eventually we’ll be able to have best practices that we can go out and distribute to other internet companies and work with the online community on adoption,” he said.
While the loose coalition works to formalize its plans, Wolens said Facebook will continue to work with suicide prevention groups to implement resources and raise awareness through its own site.