WIRED: This Chatbot Aims to Steer People Away From Child Abuse Material. “There are huge volumes of child sexual abuse photos and videos online—millions of pieces are removed from the web every year. These illegal images are often found on social media websites, image hosting services, dark web forums, and legal pornography websites. Now a new tool on one of the biggest pornography websites is trying to interrupt people as they search for child sexual abuse material and redirect them to a service where they can get help.”

NBC News: Wickr, Amazon’s encrypted chat app, has a child sex abuse problem — and little is being done to stop it. “Wickr Me, an encrypted messaging app owned by Amazon Web Services, has become a go-to destination for people to exchange images of child sexual abuse, according to court documents, online communities, law enforcement and anti-exploitation activists.”

Monash University: Crowdsourcing to combat child abuse. “Launched today, AiLECS researchers are asking persons aged 18 and above to contribute photographs of themselves as children through the My Pictures Matter crowdsourcing campaign. These pictures will be used to train AI models to recognise the presence of children in ‘safe’ situations, to help identify ‘unsafe’ situations and potentially flag child exploitation material.”

ITV: Online child sexual abuse at record high levels – with some exploited within minutes. “An Internet Watch Foundation (IWF) report says the greatest threat to children online is self-generated content where perpetrators groom and coerce children into creating images and videos of themselves. The offender records that content and shares it on the web. The IWF, which searches and removes vile abuse, says it has seen an ‘explosion’ in this type of crime over the past two years, with an increase of 374%.”

BBC: How child sex abuse rose during pandemic in India. “Although the publication, transmission and possession of CSAM is banned under Indian law, it is still widespread. And the problem has been exacerbated by the coronavirus pandemic. According to activists and police officials, there has been a surge in the online demand and dissemination of child abuse imagery in the country since last year, as lockdowns imposed to contain Covid-19 confined people to their homes.”

Yahoo News UK: Experts ‘finding 15 times as much child abuse material online as a decade ago’. “The amount of child sexual abuse material being found online by expert analysts is fifteen times higher than a decade ago, according to new figures from the Internet Watch Foundation (IWF). The online safety organisation has said its analysts are facing a ‘tidal wave’ of abuse material, as it called for the Government to ensure the Online Safety Bill is used to protect children online.”

Fox 8 Cleveland: Missing teen girl rescued after using TikTok hand gestures to let driver know she was in danger. “Investigators rescued a missing North Carolina teen and arrested the man with her during a traffic stop in Kentucky Thursday afternoon. According to the Laurel County Sheriff’s Office, a caller told 911 that the female passenger in the silver Toyota in front of them on I-75 was making hand gestures that are known on TikTok to represent ‘violence at home,’ ‘I need help’ and ‘domestic violence.’”

PRNewswire: The Coalition of Abused Scouts for Justice Secures Commitment From Boy Scouts of America to Appoint a Survivor on National Executive Board (PRESS RELEASE). “Additionally, the Coalition announced the launch of its new website, scoutingabusesurvivors.com, to share critical information and updates to the survivor community as they vote from now until December 14, 2021 to approve the Reorganization Plan, which includes the largest sexual abuse settlement fund in history – $1.887 billion and growing.”

Mashable: Apple delays controversial plan to check iPhones for child exploitation images. “Apple said Friday that it is delaying the previously announced system that would scan iPhone users’ photos for digital fingerprints that indicated the presence of known Child Sexual Abuse Material (CSAM). The change is in response to criticism from privacy advocates and public outcry against the idea.”

9to5Mac: Apple already scans iCloud Mail for CSAM, but not iCloud Photos. “Apple has confirmed to me that it already scans iCloud Mail for CSAM, and has been doing so since 2019. It has not, however, been scanning iCloud Photos or iCloud backups. The clarification followed me querying a rather odd statement by the company’s anti-fraud chief: that Apple was ‘the greatest platform for distributing child porn.’”

Vice: Team Trump’s ‘Free Speech’ Platform Has a Child Abuse Problem. “Gettr, the pro-Trump Twitter alternative launched last month by close Trump adviser Jason Miller, is allowing users to share child exploitation images. New research from the Stanford Internet Observatory’s Cyber Policy Center has laid bare the dangers of the platform’s almost complete lack of moderation and identified more than a dozen child abuse images being shared by Gettr users.”