In yet another attempt to prevent suicides, Facebook is starting to roll out Artificial Intelligence (AI)-based tools to help identify when someone might be expressing thoughts of suicide, including on Facebook Live.
The initiative, which will use pattern recognition to detect such posts or live videos and help authorities respond faster, will eventually be available worldwide except in the European Union, Facebook said in a blog post on Tuesday.
"Facebook is a place where friends and family are already connected and we are able to help connect a person in distress with people who can support them.
"It's part of our ongoing effort to help build a safe community on and off Facebook," wrote Guy Rosen, Vice President of Product Management at Facebook.
In October, Facebook worked with first responders on over 100 wellness checks based on reports it received via its proactive detection efforts.
"We use signals like the text used in the post and comments (for example, comments like "Are you ok?" and "Can I help?" can be strong indicators).
"In some instances, we have found that the technology has identified videos that may have gone unreported," Rosen said.
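The comment signals Rosen describes can be illustrated with a minimal keyword-matching sketch. This is an assumption-laden toy, not Facebook's model: the phrase list, weights, and function names here are hypothetical, and the real system reportedly uses broader pattern recognition over post text and video.

```python
# Illustrative sketch of comment-signal scoring. The phrases and
# weights below are invented for the example, not Facebook's.
CONCERN_PHRASES = {
    "are you ok": 0.6,
    "can i help": 0.6,
}

def concern_score(comments):
    """Return the strongest concern signal matched across a post's comments."""
    score = 0.0
    for comment in comments:
        text = comment.lower()
        for phrase, weight in CONCERN_PHRASES.items():
            if phrase in text:
                # Keep the highest-weight signal seen so far.
                score = max(score, weight)
    return score
```

A post whose comments include "Are you OK?" would score higher than one with only neutral replies, which is the kind of signal the post describes feeding into review.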
Facebook has a team that includes a dedicated group of specialists with specific training in suicide and self-harm.
"We are also using AI to prioritise the order in which our team reviews reported posts, videos and live streams. This ensures we can get the right resources to people in distress and, where appropriate, we can more quickly alert first responders," the blog post read.
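Prioritising reports by severity, as the blog post describes, amounts to ordering a review queue. A minimal sketch using a max-heap follows; the severity scores and report structure are assumptions for illustration, not details Facebook has disclosed.

```python
import heapq

def build_review_queue(reports):
    """reports: list of (severity, report_id) pairs.

    Returns a heap ordered so the most severe report is reviewed first.
    Python's heapq is a min-heap, so severities are negated.
    """
    heap = [(-severity, report_id) for severity, report_id in reports]
    heapq.heapify(heap)
    return heap

def next_report(heap):
    """Pop the most severe pending report."""
    neg_severity, report_id = heapq.heappop(heap)
    return report_id, -neg_severity
```

With this ordering, a report scored 0.9 is surfaced to reviewers ahead of ones scored 0.2 or 0.5, mirroring the "most serious reports first" behaviour the post describes.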
In addition to those tools, Facebook is using automation so the team can more quickly access the appropriate first responders' contact information.
"We have teams working around the world, 24/7, who review reports that come in and prioritise the most serious reports. We provide people with a number of support options, such as to reach out to a friend," Rosen said.
Facebook in September said it was working with suicide prevention partners in India to collect phrases, hashtags and group names associated with online challenges encouraging self-harm or suicide.
Started on World Suicide Prevention Day on September 10, the initiative would also connect people in India with information about supportive groups and suicide prevention tools in News Feed.