Medium: How to Handle Toxic People as an Online Community Manager. “Internet reality plays by its own rules and you’re obliged to know them and understand what to do when dealing with people online. I have been working as a community support manager for more than two years now. It’s a massive period of time measured by the amount of communication I initiate every single day, sending cooperation offers to designers, replying to comments and reviews, creating social media posts and updates.”
E&T: Google faces challenge of ‘brittle’ and opaque AI, says internet pioneer. “Internet pioneer and Google vice-president Vint Cerf has appeared before a House of Lords committee, defending the approach Google takes towards search ranking and content moderation.”
Neowin: Facebook publishes white paper focused on online content regulation. “The paper aims to establish some guidelines for how regulation around online content needs to be created and what factors need to be taken into account. CEO Mark Zuckerberg had already called for internet regulation last year, but this paper includes more tangible guidelines for how this can be done.”
BuzzFeed News, and I beg your pardon for the headline: Facebook Won’t Remove This Woman’s Butthole As A Business Page. “The exact street address of the so-called business isn’t listed, but the pin on the map shows the precise location of her former home (she and her family no longer live there). What has really vexed [Samantha] Jespersen is that she’s been unable to get it taken down. Since she discovered the Page in 2015, she’s reported it several times — but Facebook has said it isn’t in violation of its community standards (Facebook removed the Page after this article was published).” What drives me crazy about this is that Facebook removes legitimate businesses at the drop of a hat, yet this woman had to endure what was (intentionally or not) essentially harassment for years.
CNET: Facebook’s oversight board will offer you another way to appeal content removals. “Facebook on Tuesday unveiled more details about the likely workings of a new independent board that’ll oversee content-moderation decisions, outlining a new appeals process users would go through to request an additional review of takedowns. Users of Facebook and its Instagram photo service can ask the board to review their case after appealing to the social network first. You’ll have 15 days to fill out a form on the board’s website after Facebook’s decision.”
The Verge: The Terror Queue. “Peter, who has done this job for nearly two years, worries about the toll that the job is taking on his mental health. His family has repeatedly urged him to quit. But he worries that he will not be able to find another job that pays as well as this one does: $18.50 an hour, or about $37,000 a year. Since he began working in the violent extremism queue, Peter noted, he has lost hair and gained weight. His temper is shorter. When he drives by the building where he works, even on his off days, a vein begins to throb in his chest.” This is about moderating content at Google and YouTube. It left me in tears. A disturbing but important read.
BuzzFeed News: Running A Neighbourhood Facebook Group Has Become A Seriously Complicated Job. “Local Facebook groups increasingly serve as a local area’s town square, classifieds section, Neighbourhood Watch, and emergency information centre all rolled into one. But, for the most part, they are run by volunteers who in 2019 are devoting huge chunks of time figuring out how to enforce rules, referee disputes, and avoid getting sued in the process.” No matter what the platform, content moderation is no joke. Be kind to your local moderators.