Facebook’s suicide prevention tools are being extended to Facebook Live in an attempt to decrease the number of people taking their own lives on the platform.
In January this year, aspiring actor Jay Bowdy used Facebook Live to threaten suicide. Despite friends and family contacting police, Bowdy took his own life an hour after the livestream.
In the same month, a 12-year-old and a 14-year-old livestreamed their deaths in two separate incidents.
Facebook has offered suicide prevention tools for 10 years, but the Live platform presents a fresh set of issues to tackle.
“Our suicide prevention tools for Facebook posts will now be integrated into Facebook Live. People watching a live video have the option to reach out to the person directly and to report the video to us. We will also provide resources to the person reporting the live video to assist them in helping their friend,” an announcement yesterday read.
The person broadcasting the livestream will also be provided with resources, including an option to reach out to a friend, a list of helplines and a set of tips on how to pull themselves away from suicidal thoughts.
“Experts say that one of the best ways to prevent suicide is for those in distress to hear from people who care about them,” Facebook asserts. “Facebook is in a unique position — through friendships on the site — to help connect a person in distress with people who can support them.”
Facebook Live has launched several tools to help suicidal people and their loved ones, but the implementation leaves a little to be desired
Though this may be the case, the report feature is very unintuitive.
Right now, a user has to report the post, select that they do not think it belongs on Facebook, click “see more options”, and only then report that it shows someone considering or attempting self-harm. This convoluted process could see many concerned friends abandoning the support system in frustration.
Facebook says it is working to simplify this process.
But waiting for friends to see the post and report it could mean the difference between life and death. Because of this, the platform is developing an algorithm to detect suicidal thoughts without relying on third-party reports.
“We’re also testing pattern recognition to identify posts as very likely to include thoughts of suicide. Our Community Operations team will review these posts and, if appropriate, provide resources to the person who posted the content, even if someone on Facebook has not reported it yet.”
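Facebook has not published how its pattern recognition works. Purely as an illustration of the idea described above — a system that scores posts and routes high-risk ones to human reviewers without waiting for a user report — here is a toy sketch. Every phrase, weight, and threshold below is invented for demonstration and bears no relation to Facebook’s actual model.

```python
# Toy illustration only: a keyword-weighted scorer that flags posts for
# human review. The phrases, weights, and threshold are hypothetical.

RISK_PHRASES = {
    "want to die": 3.0,
    "end it all": 3.0,
    "no reason to live": 2.5,
    "goodbye everyone": 1.5,
    "can't go on": 2.0,
}

REVIEW_THRESHOLD = 2.5  # scores at or above this go to the review queue


def risk_score(post: str) -> float:
    """Sum the weights of every risk phrase found in the post."""
    text = post.lower()
    return sum(w for phrase, w in RISK_PHRASES.items() if phrase in text)


def needs_review(post: str) -> bool:
    """Flag a post for human review, with no user report required."""
    return risk_score(post) >= REVIEW_THRESHOLD
```

A production system would use a trained classifier rather than a phrase list, but the routing principle matches the announcement: the model only flags posts, and Facebook’s Community Operations team makes the final call.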
The algorithm will only be tested in the US for now, and Facebook has not specified whether it intends to roll it out globally.