
Facebook’s suicide prevention tools will now be available to all users

Facebook has updated its suicide prevention tools and is now making them available worldwide.

The tools, which let people flag posts from friends who may be at risk for self-harm or suicide, were previously available only for some English-language users. Other users could report posts through a form, but the new tools make the process quicker and less complicated.

In an announcement, Facebook said its suicide prevention resources will be available in all languages supported by the platform. The company’s global head of safety Antigone Davis and researcher Jennifer Guadagno wrote that the tools were “developed in collaboration with mental health organizations and with input from people who have personal experience with self-injury and suicide.”

The tools were first made available to some users in the United States last year with the help of Forefront, Lifeline, and Save.org. Facebook said it will continue to partner with suicide prevention and mental health organizations in different countries.

Users everywhere will soon be able to flag a friend’s post from a drop-down menu if they are worried about self-harm or suicide. Facebook gives them several options. For example, a list of resources, including numbers for suicide prevention organizations, can be shared anonymously, or a message of support can be sent (Facebook suggests wording).

The post may also be reviewed by Facebook’s global community operations team, which may then “reach out to this person with information that might be helpful to them,” according to its Help Center. If someone is at immediate risk of hurting themselves, however, Facebook warns that police should be contacted.

Facebook’s suicide prevention tools may help save lives—or at least raise awareness of an important issue. Rising suicide rates around the world mean that suicide has become a public health crisis in many countries. In the U.S., suicide rates are at their highest in three decades, particularly among men of all ages and women aged 45 to 64.

The company, however, has to balance suicide prevention with the privacy concerns of its 1.65 billion monthly active users—especially since Facebook posts are already seen as a treasure trove of research data by many psychologists. Facebook itself was forced to apologize in July 2014 for conducting psychological experiments on users.

In fall 2014, the United Kingdom charity Samaritans suspended its suicide prevention app just one week after launch. The app, which let users monitor their friends’ Twitter feeds for signs of depression, drew concerns about privacy and potential misuse by online bullies.

TechCrunch has contacted Facebook for comment on how it will balance helping people with respecting their privacy.
http://techcrunch.com/2016/06/14/facebook-suicide-prevention/
 
That's great, but this part is kind of weird:

The post may also be reviewed by Facebook’s global community operations team, which may then “reach out to this person with information that might be helpful to them,” according to its Help Center. If someone is at immediate risk of hurting themselves, however, Facebook warns that police should be contacted.

I mean, if anyone's writing anything that could be considered seriously suicidal on Facebook, shouldn't the police be alerted immediately? Seems strange that Facebook would only reach out to them with information, not take it one step further and notify people who could actually help in a situation.
 
I mean, if anyone's writing anything that could be considered seriously suicidal on Facebook, shouldn't the police be alerted immediately? Seems strange that Facebook would only reach out to them with information, not take it one step further and notify people who could actually help in a situation.

It's meant to address varying degrees of concern. Some individuals are not at immediate risk and would still benefit from help.

My concern is how accurately people, even friends, can perceive that someone's at risk of suicide, but it doesn't need to be perfect to help a lot of people, I guess.
 

X-TREME GAFFER

Neo Member
That's great, but this part is kind of weird:

I mean, if anyone's writing anything that could be considered seriously suicidal on Facebook, shouldn't the police be alerted immediately? Seems strange that Facebook would only reach out to them with information, not take it one step further and notify people who could actually help in a situation.

Maybe this is referring to comments/posts that are a little more ambiguous but still hint that the poster is not in a good state of mind, or that self-harm could be a possibility. I mean, if somebody says "I'm going to hurt myself," that's obviously a call to the police. But maybe you would flag something if a post just leaves you feeling unsettled.
 

Ashby

Member
That's great, but this part is kind of weird:

I mean, if anyone's writing anything that could be considered seriously suicidal on Facebook, shouldn't the police be alerted immediately? Seems strange that Facebook would only reach out to them with information, not take it one step further and notify people who could actually help in a situation.

It's probably to guard against the cops showing up and murdering the suicidal person.
 

Sliver

Member
That's great, but this part is kind of weird:

I mean, if anyone's writing anything that could be considered seriously suicidal on Facebook, shouldn't the police be alerted immediately? Seems strange that Facebook would only reach out to them with information, not take it one step further and notify people who could actually help in a situation.

It'd be pretty abusable by trolls otherwise, is my guess. Sounds like a feature NeoGAF could use, given its trend of suicide-related threads.
 