BuzzFeed News, and I beg your pardon for the headline: Facebook Won’t Remove This Woman’s Butthole As A Business Page. “The exact street address of the so-called business isn’t listed, but the pin on the map shows the precise location of her former home (she and her family no longer live there). What has really vexed [Samantha] Jespersen is that she’s been unable to get it taken down. Since she discovered the Page in 2015, she’s reported it several times — but Facebook has said it isn’t in violation of its community standards (Facebook removed the Page after this article was published).” What drives me crazy about this is that Facebook removes legitimate businesses at the drop of a hat, but this lady had to endure what is (intentionally or not) essentially harassment for years.
CNET: Facebook’s oversight board will offer you another way to appeal content removals. “Facebook on Tuesday unveiled more details about the likely workings of a new independent board that’ll oversee content-moderation decisions, outlining a new appeals process users would go through to request an additional review of takedowns. Users of Facebook and its Instagram photo service can ask the board to review their case after appealing to the social network first. You’ll have 15 days to fill out a form on the board’s website after Facebook’s decision.”
The Verge: The Terror Queue. “Peter, who has done this job for nearly two years, worries about the toll that the job is taking on his mental health. His family has repeatedly urged him to quit. But he worries that he will not be able to find another job that pays as well as this one does: $18.50 an hour, or about $37,000 a year. Since he began working in the violent extremism queue, Peter noted, he has lost hair and gained weight. His temper is shorter. When he drives by the building where he works, even on his off days, a vein begins to throb in his chest.” This is about moderating content at Google and YouTube. It left me in tears. A disturbing but important read.
BuzzFeed News: Running A Neighbourhood Facebook Group Has Become A Seriously Complicated Job. “Local Facebook groups increasingly serve as a local area’s town square, classifieds section, Neighbourhood Watch, and emergency information centre all rolled into one. But, for the most part, they are run by volunteers who in 2019 are devoting huge chunks of time figuring out how to enforce rules, referee disputes, and avoid getting sued in the process.” No matter what the platform, content moderation is no joke. Be kind to your local moderators.
Tubefilter: TikTok Moderators Were Told To Suppress Videos Made By Marginalized Users Because They Might Be Bullied. “In a section of its moderation guidelines called ‘Imagery Depicting A Subject Highly Vulnerable To Cyberbullying,’ the platform instructs moderators to mark people ‘susceptible to harassment or cyberbullying based on their physical or mental condition’ as ‘Risk 4,’ a level that restricts their content to only being viewed by users within their own countries. That’s the geographic level.”
Ars Technica: Why can’t Internet companies stop awful content? “Many of us are baffled by the degradation of the Internet. We have the ingenuity to put men on the Moon (unfortunately, only men so far), so it defies logic that the most powerful companies on Earth can’t fix this. With their wads of cash and their smart engineers, they should nerd harder. So why does the Internet feel like it’s getting worse, not better? And, more importantly, what do we do about it?”
NBC News: Inside Facebook’s efforts to stop revenge porn before it spreads. “In interviews with NBC News, members of Facebook’s team tasked with clamping down on revenge porn spoke publicly about their work for the first time. They recounted a number of missteps, including a poorly communicated pilot program inviting people to pre-emptively submit their nude photos to Facebook.”