I’ve written before about how bad social media can be for your health. It can be terrible for anyone, but particularly young people. Unfortunately, there is research suggesting that social media may be contributing to a rise in teenage suicides, and that it is almost certainly contributing to increased depression and anxiety among teenagers. Those findings are even stronger for women than for men, and teenage girls have also seen a larger increase in suicide rates (please keep in mind, correlation does not equal causation).
There’s some good news on the horizon: It seems that Facebook is unveiling new tools to catch users who may be at risk of attempting suicide. According to Facebook’s website, it will be doing three things:
- Using pattern recognition to detect posts or live videos where someone might be expressing thoughts of suicide, and to help respond to reports faster
- Improving how we identify appropriate first responders
- Dedicating more reviewers from our Community Operations team to review reports of suicide or self harm
As noted by the Washington Post, Facebook will be using artificial intelligence to scan posts and comments for signs of suicidal intent, allowing concerning posts to be found sooner and escalated to reviewers and first responders:
Facebook said that it will use pattern recognition to scan all posts and comments for certain phrases to identify whether someone needs help. Its reviewers may call first responders. It will also apply artificial intelligence to prioritize user reports of a potential suicide. The company said phrases such as “Are you ok?” or “Can I help?” can be signals that a report needs to be addressed quickly.
In the case of live video, users can report the video and contact a helpline to seek aid for their friend. Facebook will also provide broadcasters with the option to contact a helpline or another friend.
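To make the idea concrete, here is a minimal sketch of phrase-based flagging and report prioritization in the spirit of the description above. Everything here is hypothetical: the phrase list, the function names, and the scoring are illustrative toys, not Facebook's actual system, which presumably uses far more sophisticated machine-learned models.

```python
# Toy sketch of phrase-based concern detection (NOT Facebook's real system).
# The phrase list below is an illustrative assumption.
CONCERN_PHRASES = ["are you ok", "can i help", "please talk to someone"]

def flag_comment(text):
    """Return True if the comment contains a concern phrase."""
    lowered = text.lower()
    return any(phrase in lowered for phrase in CONCERN_PHRASES)

def prioritize(reports):
    """Sort reported comments so flagged ones are reviewed first."""
    return sorted(reports, key=lambda r: not flag_comment(r))

reports = ["nice photo!", "Are you OK? I'm worried about you."]
print(prioritize(reports))
```

Running this prints the worried comment first, mirroring the article's point that phrases like “Are you ok?” signal a report that needs to be addressed quickly.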
This…well, this is actually great. I have repeatedly come down pretty hard on technology in terms of its impact on mental health, but this is unquestionably a good thing. What’s most interesting to me is that Facebook is using artificial intelligence to try to reduce suicides; technology causes a problem, and technology is then used to limit that problem.
There are, of course, limits to the effectiveness of this new initiative. Yes, it can potentially catch a person in crisis and stop them from hurting themselves. But it won’t do anything to stop a person from reaching that point in the first place. Social media can still do enormous harm to individuals from a mental health perspective, and that’s why it is so important that anyone using social media do so responsibly and in a manner that protects their own mental health.
Still, it’s good to see Facebook acknowledge this issue and try to do something to fix it.