Facebook using artificial intelligence to help suicidal users

Toni Houston
March 3, 2017

Facebook is also running a test of live chat support from crisis-support organizations through Messenger; participating organizations include Crisis Text Line, the National Eating Disorders Association and the National Suicide Prevention Lifeline.

Facebook's suicide prevention tools have existed for 10 years, but the company is now improving them and integrating them into newer features of the network, such as Facebook Live.

The social media giant has announced it is expanding its suicide prevention tools to Facebook Live, giving Australian support groups the chance to reach young people in the moment of their distress. Any viewer watching a friend's livestream will be able to report the video if the broadcaster says something worrisome.

This move comes amid rising suicide rates worldwide: one death by suicide is reported roughly every 40 seconds, and suicide is among the leading causes of death for people aged 15 to 29.

Suicide rates jumped 24 percent in the United States between 1999 and 2014 after a period of almost consistent decline, according to a National Center for Health Statistics study.

Mark Zuckerberg has noted that people have died by suicide on live video, deaths that might have been prevented had someone reported the streams in time. Separately, the person broadcasting will see a set of resources pop up on their screen so they can contact a friend or a help line.

In addition, Facebook is launching a handful of new suicide prevention tools across video and Facebook Messenger.

Given its role in connecting people around the world, Facebook has for more than ten years offered tools and resources developed in collaboration with mental health and other specialized organizations with the aim of helping to prevent suicide.

Noting concerns about the previous reporting process, Facebook is streamlining it and using AI to identify potentially worrying posts. A team will review flagged posts to determine whether the person appears to need help and, if so, provide resources directly. In one incident, videos of one such death found their way onto Facebook and took the company two weeks to purge.

The feature will not only make suicide-related options more prominent in the reporting flow, but also help Facebook review posts that other community members have not flagged as suicidal.
