CU Anschutz Medical Campus: Study Shows Promising New Web Approach to Prevent Firearm Suicide. “Access to firearms and other lethal methods of suicide during periods of risk can make it more likely that a suicide attempt will end in death. Yet many patients with suicidal thoughts or behaviors receive no counseling about this from healthcare providers, and many have questions about options for firearm or medication storage. To address the issue, clinicians and researchers at the University of Colorado School of Medicine at the Anschutz Medical Campus partnered with Grit Digital Health. The team created Lock to Live, a web resource to help suicidal adults – and family, friends or providers – make decisions about reducing access to firearms, medications, and other potential suicide methods.”

BBC: Facebook removes 11.6 million child abuse posts. “Facebook has released the latest figures in its efforts to remove harmful content from its platforms. They reveal 11.6 million pieces of content related to child nudity and child sexual exploitation were taken down between July and September 2019. For the first time, Facebook is also releasing figures for Instagram and including numbers for posts related to suicide and self-harm.”

University of New Mexico Health Sciences, and I really really really hate this headline: The Devil is in the Data. “In a paper published last month in the Journal of the American Medical Informatics Association, the team reported their finding that instances of self-harm among people with major mental illness seeking medical care might actually be as much as 19 times higher than what is reported in the billing records.”

BBC: The woman who tracks ‘dark’ Instagram accounts. “Intervening to help suicidal Instagram users is not a role Ingebjørg [Blindheim] would have chosen for herself. She doesn’t work for the social media site, and she isn’t paid for what she does. Nor is she formally qualified to offer help, having received no training in mental healthcare. Instead she feels compelled to act, realising she’s often the last chance of help for those posting their despair online.”

BBC: Molly Russell: Instagram extends self-harm ban to drawings. “Instagram has pledged to remove images, drawings and even cartoons showing methods of self-harm or suicide. The move is its latest response to the public outcry over the death of British teenager Molly Russell.”

The Next Web: Pinterest says AI reduced self-harm content on its platform by 88%. “Yesterday, on international World Mental Health Day, Pinterest announced in a blogpost that for the past year, it’s been using machine learning techniques to identify and automatically hide content that displays, rationalizes, or encourages self-injury. Using this technology, the social networking company says it has achieved an 88 percent reduction in reports of self-harm content by users, and it’s now able to remove harmful content three times faster than ever before.”

TechCrunch: Facebook tightens policies around self-harm and suicide. “Timed with World Suicide Prevention Day, Facebook is tightening its policies around some difficult topics, including self-harm, suicide and eating disorder content after consulting with a series of experts on these topics. It’s also hiring a new Safety Policy Manager to advise on these areas going forward. This person will be specifically tasked with analyzing the impact of Facebook’s policies and its apps on people’s health and well-being, and will explore new ways to improve support for the Facebook community.”