Quartz: Facebook has been thinking about moderation all wrong

Quartz: Facebook has been thinking about moderation all wrong. “With 2.3 billion monthly users, Facebook constitutes the world’s biggest single entity—larger than the population of China and the total number of Christians. But China and Christianity are different from Facebook because their members have a sense of shared identity. And Facebook’s deficiencies on this point are directly linked to its failure to ethically moderate content on its platform.”

New York Times: Inside Facebook’s Secret Rulebook for Global Political Speech

New York Times: Inside Facebook’s Secret Rulebook for Global Political Speech. “In a glass conference room at its California headquarters, Facebook is taking on the bonfires of hate and misinformation it has helped fuel across the world, one post at a time. The social network has drawn criticism for undermining democracy and for provoking bloodshed in societies small and large. But for Facebook, it’s also a business problem.”

CNET: Facebook’s and social media’s fight against fake news may get tougher

CNET: Facebook’s and social media’s fight against fake news may get tougher. “The shift toward ephemeral content and messaging could fundamentally alter how we use Facebook and other social media, while also making it harder to combat misinformation, election interference and hate speech, some experts say. After all, it’s hard for companies to crack down when they can’t see what’s being shared in encrypted messages, or when photos and videos disappear after 24 hours. And while Facebook and others are investing in AI to spot and remove messages that violate their online rules, they still face a tough road ahead.”

Fstoppers: Nipples Are Banned, but Animal Abuse and Brutal Violence Are OK: Instagram Is Broken

Fstoppers: Nipples Are Banned, but Animal Abuse and Brutal Violence Are OK: Instagram Is Broken. “As a photographer absorbed with curating my profile and admiring the work of some amazing artists, it’s not always apparent how much of Instagram is filled with truly terrible things. I’ve written before about how Instagram is a cesspit of populist content that is driven by clicks as opposed to quality. I’ve also complained at length about Instagram’s clear reluctance to combat freebooting on its platform, happy to see content stolen as long as users stay in the app, consuming its adverts. What I failed to realize was how much of Instagram is violent, graphic, and seemingly free of moderation. Around the world, thousands of 13-year-olds will be receiving new electronic devices this Christmas, many of them no doubt opening new Instagram accounts. Terrifyingly, those children, with all the parental controls in place, could in just a few clicks be watching footage of animals being abused, or, as I just discovered, people being executed. In [Mason] Gentry’s experience, reporting this content seems to make little difference.”

Nieman Lab: “So many times we forget to listen”: How Spaceship Media moderated a Facebook group of 400 political women without it going off the rails

Nieman Lab: “So many times we forget to listen”: How Spaceship Media moderated a Facebook group of 400 political women without it going off the rails. “When I spoke with Spaceship Media’s cofounders a year ago, they were about to embark on creating arguably the most ambitious news-centric Facebook group in existence: A goal of 5,000 women with diverse views in one group, talking about politics without everything self-imploding.” Forum moderation is an incredibly tough and thankless job, but when it’s done well, it’s amazing.

Techdirt: Google Moderation Team Decides My Piece About The Impossible Nature Of Content Moderation Is ‘Dangerous Or Derogatory’

Techdirt: Google Moderation Team Decides My Piece About The Impossible Nature Of Content Moderation Is ‘Dangerous Or Derogatory’. “Well, well. A few weeks back I had a big post all about the impossibility of moderating large content platforms at scale. It got a fair bit of attention, and has kicked off multiple discussions that are continuing to this day. However, earlier this week, it appears that Google’s ad content moderation team decided to help prove my point about the impossibility of moderating content at scale when… it decided that post was somehow ‘dangerous or derogatory.’” Hmmm. I think ResearchBuzz needs a “womp womp” tag.