Slate: Confederate Groups Are Thriving on Facebook. What Does That Mean for the Platform?. “In the wake of Black Lives Matter protests, demands for Facebook to address hate speech have escalated, coinciding with a nationwide movement to remove Confederate statues and flags from cities, states, and institutions long imbued with Confederate symbolism…. These movements, intertwined and mutually reinforcing, pose a particular threat to those who consider themselves present-day Confederates. From their perspective, Facebook has become more essential than ever to amplifying their message at a critical moment in history—just as Facebook has shown a new willingness to police their speech.”

Los Angeles Times: Reddit moderators spent years asking for help fighting hate. The company may finally be listening. “When [Jefferson] Kelley, a Reddit moderator, booted hateful users off threads where Black people discussed sensitive personal experiences, racial slurs piled up in his inbox. Crude remarks about women filled the comment sections under his favorite ‘Star Trek’ GIFs. The proliferation of notorious forums, including one that perpetuated a vicious racist stereotype about Black fathers, stung Kelley, a Black father himself. Kelley and other moderators repeatedly pleaded with the company to back them up and take stronger action against harassment and hate speech. But Reddit never quite came through. Then, all of a sudden, that seemed to change.”

CNN: Facebook’s future keeps getting murkier. “It’s not the first time Facebook’s content moderation policies have been under the microscope, but this time feels different. Voices inside the company have publicly expressed dismay over its actions, and hundreds of corporations are using the power of their ad dollars to lobby for change from the outside. The pressure could challenge CEO Mark Zuckerberg’s long-held desire to preserve free expression on the platform, especially by public figures. But Facebook’s size and power — and Zuckerberg’s outsized influence within the company — mean it’s not yet clear to what extent things might change.”

E&E News: Denial expands on Facebook as scientists face restrictions. “A climate scientist says Facebook is restricting her ability to share research and fact-check posts containing climate misinformation. Those constraints are occurring as groups that reject climate science increasingly use the platform to promote misleading theories about global warming. The groups are using Facebook to mischaracterize mainstream research by claiming that reduced consumption of fossil fuels won’t help address climate change. Some say the planet and people are benefiting from the rising volume of carbon dioxide that’s being released into the atmosphere.”

The Drum: We’re having the wrong conversation to fix social media. “Debates are raging around social media boycotts, algorithmic biases, and content moderation. While most people seem to agree that they want ‘bad content’ removed, it’s less clear what ‘bad’ actually is and what the consequence of that removal would be. Clearly things need to change and systemic reforms are needed, yet the problem is, we’re all debating the wrong issue. We need to stop arguing about freedom of speech vs. content moderation. The real problem is freedom of reach.”

CNET: Coronavirus, BLM protest conspiracy theories collide on Facebook and Twitter. “A pandemic, societal protests and a contentious election have created an especially challenging environment for Facebook, Twitter and other social networks. Content moderators and fact-checkers are struggling to prevent the spread of obvious misinformation while giving users space to voice their opinions. The problem has gotten knottier for the online platforms as false claims about both the health crisis and [George] Floyd’s killing collide, making content moderation decisions — taxing in the best of situations — even tougher.”

Techdirt: Just Like Every Other Platform, Parler Will Take Down Content And Face Impossible Content Moderation Choices. “Like Gab before it, the hot new Twitter-wannabe service for assholes and trolls kicked off of Twitter is Parler. The President and a bunch of his supporters have hyped it up, and the latest is that Senator Ted Cruz (and Rep. Devin Nunes) have recently joined it, and like others before them they have hyped up the misleading claim that Parler supports free speech unlike Twitter…. But, I did want to take a closer look at the claims that Parler supports free speech, because it does so in basically the same way as every other platform — including the way Twitter, YouTube and Facebook do: by saying that they can remove your content for any reason they want.”

Slate: Why We Should Care That Facebook Accidentally Deplatformed Hundreds of Users. “The deplatforming incident comes as social media companies have increased their efforts to regulate content in response to the dual pressures of the presidential election and, especially, the coronavirus pandemic. Just last November, Facebook was criticized for refusing to ban white nationalists and other hate groups despite promises to do so. And while the company hasn’t exactly abandoned its laissez-faire approach to content moderation, Facebook, among other platforms, has culled and flagged misinformation, hate speech, and harmful content at unprecedented rates in the months since. Last week, for instance, Facebook removed nearly 200 accounts tied to white supremacist groups. Anti-racist skinheads and musicians are just the latest victims of these policies.”

Washington Post: The Technology 202: NYU report calls on social media titans to stop outsourcing content moderation. “The report says big social media companies like Facebook, Twitter and YouTube need to use more of their own employees – instead of the outside contractors on which they currently largely depend – to make calls about what posts and photos should be removed. Misinformation is becoming an increasingly big problem on tech platforms during the protests against racial injustice and the novel coronavirus pandemic, and both are happening during an election year in which the industry is already braced for action by bad actors.”

NPR: In Settlement, Facebook To Pay $52 Million To Content Moderators With PTSD. “Facebook will pay $52 million to thousands of current and former contract workers who viewed and removed graphic and disturbing posts on the social media platform for a living, and consequently suffered from post-traumatic stress disorder, according to a settlement agreement announced on Tuesday between the tech giant and lawyers for the moderators.”

Medium: How to Handle Toxic People as an Online Community Manager. “Internet reality plays by its own rules and you’re obliged to know them and understand what to do when dealing with people online. I have been working as a community support manager for more than two years now. It’s a massive period of time measured by the amount of communication I initiate every single day, sending cooperation offers to designers, replying to comments and reviews, creating social media posts and updates.”

E&T: Google faces challenge of ‘brittle’ and opaque AI, says internet pioneer. “Internet pioneer and Google vice-president Vint Cerf has appeared before a House of Lords committee, defending the approach Google takes towards search ranking and content moderation.”

Neowin: Facebook publishes white paper focused on online content regulation. “The paper aims to establish some guidelines for how regulation around online content needs to be created and what factors need to be taken into account. CEO Mark Zuckerberg had already called for internet regulation last year, but this paper includes more tangible guidelines for how this can be done.”

BuzzFeed News, and I beg your pardon for the headline: Facebook Won’t Remove This Woman’s Butthole As A Business Page. “The exact street address of the so-called business isn’t listed, but the pin on the map shows the precise location of her former home (she and her family no longer live there). What has really vexed [Samantha] Jespersen is that she’s been unable to get it taken down. Since she discovered the Page in 2015, she’s reported it several times — but Facebook has said it isn’t in violation of its community standards (Facebook removed the Page after this article was published).” What drives me crazy about this is that Facebook removes legitimate businesses at the drop of a hat, but this lady had to endure what is (intentionally or not) essentially harassment for years.

CNET: Facebook’s oversight board will offer you another way to appeal content removals. “Facebook on Tuesday unveiled more details about the likely workings of a new independent board that’ll oversee content-moderation decisions, outlining a new appeals process users would go through to request an additional review of takedowns. Users of Facebook and its Instagram photo service can ask the board to review their case after appealing to the social network first. You’ll have 15 days to fill out a form on the board’s website after Facebook’s decision.”