CNN: ‘Watchdog moms’ on TikTok are trying to keep minors safe. “Seara Adair, a mother of two young daughters from Atlanta, revealed in a TikTok video last year that she was sexually abused by a family member as a child. In the next few weeks, her follower count grew by tens of thousands, many of whom appeared to be minors. After that, she started using the popular short-form video app to educate her followers about various digital dangers. She posted about the risks of being approached by strangers online and the problematic content found hidden in the deep corners of TikTok and other platforms.”

NBC News: Wickr, Amazon’s encrypted chat app, has a child sex abuse problem — and little is being done to stop it. “Wickr Me, an encrypted messaging app owned by Amazon Web Services, has become a go-to destination for people to exchange images of child sexual abuse, according to court documents, online communities, law enforcement and anti-exploitation activists.”

The Conversation: Virtual child sexual abuse material depicts fictitious children – but can be used to disguise real abuse. “Child sexual abuse material specifically refers to the possession, viewing, sharing, and creation of images or videos containing sexual or offensive material involving children. But less publicised is another form of child sexual abuse material: virtual child sexual abuse material (VCSAM).”

Monash University: Crowdsourcing to combat child abuse. “Launched today, AiLECS researchers are asking persons aged 18 and above to contribute photographs of themselves as children through the My Pictures Matter crowdsourcing campaign. These pictures will be used to train AI models to recognise the presence of children in ‘safe’ situations, to help identify ‘unsafe’ situations and potentially flag child exploitation material.”

Reuters: Google, Meta must find and remove online child porn, say EU draft rules. “Companies that fail to comply with the rules face fines up to 6% of their annual income or global turnover, which will be set by EU countries. The EU executive said its proposal announced on Wednesday aimed to replace the current system of voluntary detection and reporting by companies which has proven to be insufficient to protect children.”

ITV: Online child sexual abuse at record high levels – with some exploited within minutes. “An Internet Watch Foundation (IWF) report says the greatest threat to children online is self-generated content where perpetrators groom and coerce children into creating images and videos of themselves. The offender records that content and shares it on the web. The IWF, which searches and removes vile abuse, says it has seen an ‘explosion’ in this type of crime over the past two years, with an increase of 374%.”

Ars Technica: TikTok under US government investigation over child sexual abuse material. “TikTok is under investigation by US government agencies over its handling of child sexual abuse material, as the burgeoning short-form video app struggles to moderate a flood of new content. Dealing with sexual predators has been an enduring challenge for social media platforms, but TikTok’s young user base has made it vulnerable to being a target.”

Fox 8 Cleveland: Missing teen girl rescued after using TikTok hand gestures to let driver know she was in danger. “Investigators rescued a missing North Carolina teen and arrested the man with her during a traffic stop in Kentucky Thursday afternoon. According to the Laurel County Sheriff’s Office, a caller told 911 that the female passenger in the silver Toyota in front of them on I-75 was making hand gestures that are known on TikTok to represent ‘violence at home,’ ‘I need help’ and ‘domestic violence.’”

Mashable: Apple delays controversial plan to check iPhones for child exploitation images. “Apple said Friday that it is delaying the previously announced system that would scan iPhone users’ photos for digital fingerprints that indicated the presence of known Child Sexual Abuse Material (CSAM). The change is in response to criticism from privacy advocates and public outcry against the idea.”

9to5Mac: Apple already scans iCloud Mail for CSAM, but not iCloud Photos. “Apple has confirmed to me that it already scans iCloud Mail for CSAM, and has been doing so since 2019. It has not, however, been scanning iCloud Photos or iCloud backups. The clarification followed me querying a rather odd statement by the company’s anti-fraud chief: that Apple was ‘the greatest platform for distributing child porn.’”

The Independent: Apple responds to growing alarm over iPhone photo scanning feature. “Apple has responded to growing alarm over its new iPhone scanning feature from privacy experts and competitors. Last week, the company announced that it would be rolling out new tools that would be able to look through the files on a user’s phone and check whether they included child sexual abuse material, or CSAM.”