Using artificial intelligence and algorithms, Facebook is now looking into ways of reaching out to those most at risk of suicide. The algorithms spot warning signs and alert Facebook’s human review team to a potential situation. The social media platform then contacts the user with ways to seek help, including reaching out to a friend or family member, links to hotlines, and tips for support.

Previously, Facebook offered advice to users, but they had to take the first step, such as clicking on articles in their feed. Now Facebook will send a message after looking at patterns in behavior or key words and phrases such as ‘pain’ or ‘sadness.’ Comments from friends such as ‘Are you OK?’ or ‘I’m worried about you’ are also taken into account by the AI technology. Once a post is flagged by the AI system, it is sent for rapid review by the Operations team. If deemed appropriate, the team will then prompt the user with options for seeking help.
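To illustrate the kind of signal the article describes, here is a minimal, purely hypothetical sketch of keyword-based flagging. Facebook has not disclosed its actual model, which is far more sophisticated than this; the phrase lists and the function name `flag_for_review` are assumptions for demonstration only.

```python
# Illustrative sketch only: a naive keyword matcher, NOT Facebook's actual
# (undisclosed) system, which uses machine learning over many signals.

# Hypothetical phrase lists based on the examples mentioned in the article.
POST_SIGNALS = {"pain", "sadness"}
COMMENT_SIGNALS = {"are you ok", "i'm worried about you"}

def flag_for_review(post_text, comments):
    """Return True if the post should be queued for human review."""
    text = post_text.lower()
    # Check the post itself for warning-sign keywords.
    if any(word in text for word in POST_SIGNALS):
        return True
    # Friends' replies are also treated as warning signs.
    return any(sig in c.lower() for c in comments for sig in COMMENT_SIGNALS)

flag_for_review("So much sadness lately", [])   # flags on the post text
flag_for_review("Great day!", ["Are you OK?"])  # flags on a friend's comment
```

In practice, flagged items would go to a human queue rather than triggering any automated action, mirroring the rapid-review step the article describes.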

Facebook has worked with several mental health institutions on the topic.

This topic presents questions of whether Facebook is overstepping its bounds, or whether it should, in fact, be doing more to help those at risk. Dr. John Draper, director of the US National Suicide Prevention Lifeline, wants to see Facebook taking a more active role, with the option of alerting friends or family.

He says,

“It’s something that we have been discussing with Facebook. The question is how we can do that in a way that doesn’t feel invasive. I would say though that what they are now offering is a huge step forward.”

While this technology is currently only being trialed in the USA, Facebook is also looking to help via its new Live broadcast tool.

Suicide Prevention Facebook Algorithm

This comes after a 14-year-old Florida girl live-streamed her suicide in January. Facebook said it was already taking action on suicide prevention and help networks, but going forward a Facebook staff member would be alerted, and support prompts would appear during such a video.

“Some might say we should cut off the stream of the video the moment there is a hint of somebody talking about suicide,” said Jennifer Guadagno, Facebook’s lead researcher on the project. “But what the experts emphasized was that cutting off the stream too early would remove the opportunity for people to reach out and offer support. So, this opens up the ability for friends and family to reach out to a person in distress at the time they may really need it the most.”

With this feature rolling out globally, Facebook is now looking into contacts across the globe that users could be put in touch with in such a case.


