Gizmodo: In Troubling Experiment, UK University To Monitor Students’ Social Media To Prevent Suicide. “A university in the UK announced that it will surveil student social media posts, among other data, to try and determine whether they are suicidal. The project is part of a pilot program and will reportedly be deployed across all British institutions if it works as intended.”
BBC: What to do if you see an Instagram post about suicide. “Malaysian police say a 16-year-old girl killed herself earlier this week, after she asked her Instagram followers whether she should live or die. The Malaysian teenager had hosted a poll on her Instagram story, with the question: ‘Really Important, Help Me Choose D / L’, where D stood for death, while L stood for life, according to police. Some of her followers voted for ‘death’.”
Mashable: Instagram co-launches a mental health awareness campaign to help people find support. “The #RealConvo Campaign — spearheaded by both Instagram and the American Foundation for Suicide Prevention (AFSP), an organization that helps those affected by suicide — encourages people to use the hashtag to share their own personal mental health experiences and speak more openly about their struggles.”
University of Washington: Suicidal thoughts? Therapy-oriented website might help. “Researchers asked more than 3,000 website visitors how they felt before they got to the website compared with a few minutes after arriving. Nearly one-third were significantly less suicidal, and the intensity of their negative emotions had also decreased. Findings were published in the Journal of Medical Internet Research, an open-access publication.” This site apparently launched in 2014, but it’s new to me.
The Next Web: Deep learning can help us eradicate suicide – but only if we let it. “Humanity’s mental health crisis has reached pandemic proportions. Bluntly put: we don’t seem capable of solving the problem on our own. Cutting edge AI research shows a clear path forward, but society as a whole will have to accept the fact that mental health is real in order for us to take the first steps.”
Berkeley Lab: Berkeley Lab Team Uses Deep Learning to Help Veterans Administration Address Suicide Risks. “Researchers in the Computational Research Division (CRD) at Lawrence Berkeley National Laboratory (Berkeley Lab) are applying deep learning and analytics to electronic health record (EHR) data to help the Veterans Administration (VA) address a host of medical and psychological challenges affecting many of the nation’s 700,000 military veterans.”
Ars Technica: Suicide instructions spliced into kids’ cartoons on YouTube and YouTube Kids. “Tips for committing suicide are appearing in children’s cartoons on YouTube and the YouTube Kids app. The sinister content was first flagged by doctors on the pediatrician-run parenting blog pedimom.com and later reported by the Washington Post. An anonymous ‘physician mother’ initially spotted the content while watching cartoons with her son on YouTube Kids as a distraction while he had a nosebleed. Four minutes and forty-five seconds into a video, the cartoon cut away to a clip of a man, who many readers have pointed out resembles Internet personality Joji (formerly Filthy Frank). He walks onto the screen and simulates cutting his wrist. ‘Remember, kids, sideways for attention, longways for results,’ he says and then walks off screen. The video then quickly flips back to the cartoon.”