Boing Boing: Announcement of Tumblr’s sale to WordPress classified as pornography by Tumblr’s notorious “adult content” filter

Boing Boing: Announcement of Tumblr’s sale to WordPress classified as pornography by Tumblr’s notorious “adult content” filter. “The filter is a piece of unadulterated, unsalvageable garbage. Its awfulness is hard to overstate, but it can be neatly illustrated by this Bruce Sterling post, which reveals that the Tumblr porn filter blocked Sterling’s post of a screenshot of a news story about the acquisition, which includes the happy coda, ‘This decision cannot be appealed.’”

CNET: Dating app 3Fun had users’ data, location and pictures exposed, report says

CNET: Dating app 3Fun had users’ data, location and pictures exposed, report says. “Dating app 3Fun, which describes itself as an app designed ‘for meeting local kinky, open-minded people for 3some & swinger lifestyle’ and claims over 1.5 million users, appears to have been open to more than just relationships. In a new post from security firm Pen Test Partners this week, it also apparently exposed sensitive user data including the ‘near real time location’ of its members, their photos and information including birthdates, sexual preferences and chats.”

The Verge: Facebook open-sources algorithms for detecting child exploitation and terrorism imagery

The Verge: Facebook open-sources algorithms for detecting child exploitation and terrorism imagery. “Facebook will open-source two algorithms it uses to identify child sexual exploitation, terrorist propaganda, and graphic violence, the company said today. PDQ and TMK+PDQF, a pair of technologies that store files as digital hashes and compare them with known examples of harmful content, have been released on Github, Facebook said in a blog post.”
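The matching approach the article describes (storing files as digital hashes and comparing them with known examples of harmful content) can be sketched roughly as follows. This is an illustrative sketch, not Facebook's actual PDQ code: it assumes hex-encoded perceptual hashes and treats two hashes as a match when their Hamming distance falls under a tunable threshold, which is how perceptual-hash banks are generally queried. The function names and the toy 16-bit hashes are invented for the example; real PDQ hashes are 256 bits.

```python
# Hedged sketch of perceptual-hash matching (NOT Facebook's PDQ implementation).
# A perceptual hash changes only slightly when an image is resized or re-encoded,
# so near-duplicates are found by counting differing bits (Hamming distance)
# rather than requiring an exact hash match.

def hamming_distance(hash_a: str, hash_b: str) -> int:
    """Count the differing bits between two equal-length hex-encoded hashes."""
    if len(hash_a) != len(hash_b):
        raise ValueError("hashes must be the same length")
    xor = int(hash_a, 16) ^ int(hash_b, 16)
    return bin(xor).count("1")

def matches_known_content(candidate: str, hash_bank: list[str], threshold: int) -> bool:
    """Return True if the candidate is within `threshold` bits of any banked hash."""
    return any(hamming_distance(candidate, known) <= threshold
               for known in hash_bank)

# Toy 16-bit example (real PDQ hashes are 256 bits, i.e. 64 hex characters):
bank = ["f0f0"]  # hash of a known harmful image
print(matches_known_content("f0f1", bank, threshold=2))  # near-duplicate -> True
print(matches_known_content("0f0f", bank, threshold=2))  # unrelated image -> False
```

In a production system the hash bank would hold many entries and be indexed for fast nearest-neighbor lookup; the linear scan here is only to make the comparison step visible.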

Washington Post: Social media companies are outsourcing their dirty work to the Philippines. A generation of workers is paying the price.

Washington Post: Social media companies are outsourcing their dirty work to the Philippines. A generation of workers is paying the price. “A year after quitting his job reviewing some of the most gruesome content the Internet has to offer, Lester prays every week that the images he saw can be erased from his mind. First as a contractor for YouTube and then for Twitter, he worked on a high-up floor of a mall in this traffic-clogged Asian capital, where he spent up to nine hours each day weighing questions about the details in those images. He made decisions about whether a child’s genitals were being touched accidentally or on purpose, or whether a knife slashing someone’s neck depicted a real-life killing — and if such content should be allowed online.”

New York Times: Facebook and Google Trackers Are Showing Up on Porn Sites

New York Times: Facebook and Google Trackers Are Showing Up on Porn Sites. “Trackers from tech companies like Google and Facebook are logging your most personal browsing details, according to a forthcoming New Media & Society paper, which scanned 22,484 pornography websites. Where that data ultimately goes is not always clear.”

Ars Technica: Deepfake revenge porn distribution now a crime in Virginia

Ars Technica: Deepfake revenge porn distribution now a crime in Virginia. “The new law amends existing law in the Commonwealth that defines distribution of nudes or sexual imagery without the subject’s consent⁠—often called revenge porn⁠—as a Class 1 misdemeanor. The new bill updated the law by adding a category of ‘falsely created videographic or still image’ to the text.”

Motherboard: This Horrifying App Undresses a Photo of Any Woman With a Single Click

Motherboard: This Horrifying App Undresses a Photo of Any Woman With a Single Click. “The software, called DeepNude, uses a photo of a clothed person and creates a new, naked image of that same person. It swaps clothes for naked breasts and a vulva, and only works on images of women. When Motherboard tried using an image of a man, it replaced his pants with a vulva. While DeepNude works with varying levels of success on images of fully clothed women, it appears to work best on images where the person is already showing a lot of skin. We tested the app on dozens of photos and got the most convincing results on high resolution images from Sports Illustrated Swimsuit issues.” The app has since been taken down.