Social Media

How Facebook Is Working To Stop Livestreamed Suicides

An unintended use of Facebook Live has the platform scrambling to introduce protections

Mar 01, 2017 at 4:04 PM ET

Following the livestreamed suicides of several people using the Facebook Live feature, the social media giant has announced that it will be rolling out new suicide prevention tools.

“Facebook is in a unique position — through friendships on the site — to help connect a person in distress with people who can support them,” its blog post reads. The company notes that, according to experts, the people closest to someone at risk can be the most effective at preventing suicide.

Now, people watching a live video that concerns them will be able to report it to Facebook or reach out to the broadcaster directly, with resources provided by the site to guide them. The option to report live videos that violate community standards is not new, but the site is now testing a streamlined process that uses artificial intelligence to recognize videos and posts that may contain suicide or self-injury-related content, making the reporting option more prominent and comprehensive for viewers.

The announcement follows a related post made by Facebook CEO Mark Zuckerberg last month.

“There have been terribly tragic events — like suicides, some live streamed — that perhaps could have been prevented if someone had realized what was happening and reported them sooner,” he wrote. “These stories show we must find a way to do more.”

Zuckerberg’s post also notes that Facebook will begin exploring how AI can identify posts that glorify or promote terrorism.


Facebook has been working with mental health organizations for a decade. Last year it introduced a feature that lets users report a friend’s post, or reach out to that friend directly, if they believe the post suggests the friend may be at risk.

Instagram, which Facebook owns, has also developed systems that work to keep users from seeing content relating to self-harm and suicide, though some experts question whether its pop-up warnings use unnecessarily extreme language.

U.S.-based Facebook users will also be able to contact crisis support organizations directly through Messenger.