Breaking News

Facebook to use technology to prevent suicide | Facebook, artificial intelligence, Mark Zuckerberg | Facebook rolling out AI tools to help prevent suicides

In yet another attempt to prevent suicides, Facebook is starting to roll out Artificial Intelligence (AI)-based tools to help identify when someone might be expressing thoughts of suicide, including on Facebook Live.
The initiative, which will use pattern recognition to detect posts or live videos where someone might be expressing thoughts of suicide and help authorities respond faster, will eventually be available worldwide, except in the European Union, Facebook said in a blog post on Tuesday.
“Facebook is a place where friends and family are already connected and we are able to help connect a person in distress with people who can support them.
“It’s part of our ongoing effort to help build a safe community on and off Facebook,” wrote Guy Rosen, Vice President of Product Management at Facebook.
In October, Facebook worked with first responders on over 100 wellness checks based on reports it received via its proactive detection efforts.
“We use signals like the text used in the post and comments (for example, comments like ‘Are you ok?’ and ‘Can I help?’ can be strong indicators).
“In some instances, we have found that the technology has identified videos that may have gone unreported,” Rosen said.
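The kind of text signal Rosen describes can be illustrated with a simple keyword-scoring pass. This is a minimal sketch, not Facebook's actual classifier; the phrase list, weights and threshold here are entirely hypothetical:

```python
# Minimal sketch of scoring comment text for signals of concern.
# This is NOT Facebook's classifier; phrases and weights are illustrative only.
CONCERN_PHRASES = {
    "are you ok": 1.0,
    "can i help": 1.0,
    "please talk to someone": 1.5,
}

def concern_score(comments):
    """Sum the weights of concern phrases found across a post's comments."""
    score = 0.0
    for comment in comments:
        text = comment.lower()
        for phrase, weight in CONCERN_PHRASES.items():
            if phrase in text:
                score += weight
    return score

def should_flag_for_review(comments, threshold=2.0):
    """Flag a post for human review when the combined score passes a threshold."""
    return concern_score(comments) >= threshold
```

In practice a system like the one described would use trained models over many signals rather than fixed phrase lists, but the flag-for-human-review step is the key point: the technology surfaces candidates, and people make the call.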
Facebook has a team that includes a dedicated group of specialists with specific training in suicide and self-harm.
“We are also using AI to prioritise the order in which our team reviews reported posts, videos and live streams. This ensures we can get the right resources to people in distress and, where appropriate, we can more quickly alert first responders,” the blog post read.
In addition to those tools, Facebook is using automation so the team can more quickly access the appropriate first responders’ contact information.
“We have teams working around the world, 24/7, who review reports that come in and prioritise the most serious reports. We provide people with a number of support options, such as to reach out to a friend,” Rosen informed.
Facebook in September said it was working with suicide prevention partners in India to collect phrases, hashtags and group names associated with online challenges encouraging self-harm or suicide.
Started on World Suicide Prevention Day on September 10, the initiative would also connect people in India with information about supportive groups and suicide prevention tools in News Feed.
Facebook founder Mark Zuckerberg announced on his blog that the social network has embedded artificial intelligence, with special algorithms that help identify users at risk of suicide.
The company said the step is part of its effort to help build a safe community on Facebook.
Over the past month, the new tools helped Facebook specialists reach more than 100 people who were at risk of suicide, underscoring the company’s point that artificial intelligence can actually save lives.
How Facebook does it
The social network’s algorithms quickly identify potential warning signals (for example, in public posts and comments) and pass the relevant data to Facebook specialists who are on call 24 hours a day worldwide.

The artificial intelligence is not limited to analysing text: it also scans images and users’ live broadcasts.
The company’s Community Operations team includes professionals with special training in mental health issues, who review the flagged data on possible suicides.
According to Dr. Dan Reidenberg, if friends or family members notice disturbing behaviour during a Facebook Live broadcast and see that a person intends to harm themselves or is threatening to do so, they can respond by leaving comments.
“The platform uses these resources to help the streamer and to tell the people watching the stream what they should do. A message appears on the screen of the user threatening suicide, listing contact information for the nearest psychological support centre and advice on how to cope with stress. That is, the user is given support,” the consultant said.
