ABC News (Australia): Chris sorted through the ‘blood and gore’ on social media. Now he’s suing Facebook over PTSD. “They see the worst the internet serves up so, in theory, you don’t have to. People post shocking violence, sexual abuse, and vitriol-filled hate speech. But those whose job it is to sift through and scrutinise the dark side of social media say the work is taking a heavy toll.”

Washington Post: Father of slain journalist Alison Parker takes on YouTube over alleged refusal to remove graphic videos. “It has been more than four years since journalist Alison Parker, doing a live television interview in southern Virginia, was killed when a former colleague walked up and shot her and videographer Adam Ward. Despite repeated requests from her father and others, videos of the slaying remain on YouTube, as do countless other graphic videos that show people dying or that promote various outlandish hoaxes.”

BBC: Facebook and YouTube moderators sign PTSD disclosure. “Content moderators are being asked to sign forms stating they understand the job could cause post-traumatic stress disorder (PTSD), according to reports. The Financial Times and The Verge reported moderators for Facebook and YouTube, hired by the contractor Accenture, were sent the documents.”

The Verge: YouTube’s CEO explains why it leaves up ‘controversial or even offensive’ videos. “The concerns around YouTube moderation aren’t going away anytime soon. YouTube is still developing and revising policies to prevent major issues — its updated creator-on-creator harassment policy is still in the works, for instance — and bad actors will continue to push against the limits of those rules.”

Tubefilter: YouTube Updates Child Safety Policies To Remove Adult-Themed Videos Aimed At Kids. “YouTube is updating its child safety policies to ban videos that contain ‘mature or violent themes’ and yet explicitly target minor or family viewers. Prior to this update, YouTube simply age-restricted videos that were found to contain things like sex, death, and graphic violence, but indicated (via the title, description, and/or tags) that they were made for kids or intended for families to watch together.” Wow, this took long enough.

New York Times: How to Force 8Chan, Reddit and Others to Clean Up. “Though it may seem that there is little that platforms and politicians can do to stop the spread of online hatred, a great deal could be accomplished with one simple tweak to the existing Communications Decency Act: revise the safe harbor provisions of the law.”

Washington Post: Social media companies are outsourcing their dirty work to the Philippines. A generation of workers is paying the price. “A year after quitting his job reviewing some of the most gruesome content the Internet has to offer, Lester prays every week that the images he saw can be erased from his mind. First as a contractor for YouTube and then for Twitter, he worked on a high-up floor of a mall in this traffic-clogged Asian capital, where he spent up to nine hours each day weighing questions about the details in those images. He made decisions about whether a child’s genitals were being touched accidentally or on purpose, or whether a knife slashing someone’s neck depicted a real-life killing — and if such content should be allowed online.”

Mashable: Instagram won’t say why teen’s dead body still shows up in profile pics despite image-blocking tech. “Instagram is taking steps to block users from posting horrific photos of a teen’s body after her brutal murder, but the platform’s filters seem to have blind spots: profile pictures and videos. Days after heavy backlash for not taking down gruesome photos of Bianca Devins’ body quickly enough, Instagram still struggles to get a handle on content moderation.”

Mashable: Instagram can’t stop flood of grisly photos from teen’s murder so users step up. “Instagram users are stepping up to stanch the flow of photos showing a popular teen e-girl’s murder as the platform fails to quickly remove the images. As bad actors upload grisly photos showing the teen’s slit neck with certain hashtags, some users are working to bury those posts under images of pink clouds and cats wearing flower crowns with those same hashtags and tagging the victim’s account. The inventive approach makes it harder to search for pictures of Bianca Devins’ dead body.”

Heavy: Bianca Devins: Photos of Utica Teen’s Body Posted on Instagram After Murder. “Bianca Devins was a 17-year-old girl from Utica, New York, who was murdered on July 14. Photos of Bianca’s body were posted on social media after her death by the suspected killer, a 21-year-old New York man identified by his family as Brandon Clark, who also goes by Brandon Kuwaliski. Heavy is not publishing the gruesome photos posted to Instagram or linking to them. The photos remained on Clark’s Instagram page for several hours before they were removed.”

The Verge: Twitch is closing in on its Christchurch trolls. “For just over a month, Twitch has been trying to track down a group of anonymous trolls who spammed the platform with violent footage of the Christchurch shooting in the wake of the attack. That hunt kicked off in earnest when Twitch filed suit against the trolls earlier in June, but new filings show the company has more clues to the perpetrators’ identity than anyone suspected, including specific email addresses for at least three people and Discord logs where the attack was organized.”

The Verge: Bodies in Seats. “In May, I traveled to Florida to meet with these Facebook contractors. This article is based on interviews with 12 current and former moderators and managers at the Tampa site. In most cases, I agreed to use pseudonyms to protect the employees from potential retaliation from Facebook and Cognizant. But for the first time, three former moderators for Facebook in North America agreed to break their nondisclosure agreements and discuss working conditions at the site on the record.” I want you to read this article. At the same time I don’t want you to read this article because just reading it made me nauseated.

Tubefilter: Twitch Suing 100 Users Who Spammed ‘Artifact’ Game Category With Pornographic And Violent Content. “Twitch is suing 100 people for flooding its site with pornography, gore, and graphic imagery, including livestreams of the Christchurch massacre footage. There’s only one problem: the Amazon-owned platform has no idea who these 100 people are.”

New York Times: Facebook’s A.I. Whiz Now Faces the Task of Cleaning It Up. Sometimes That Brings Him to Tears. “Mike Schroepfer, Facebook’s chief technology officer, was tearing up. For half an hour, we had been sitting in a conference room at Facebook’s headquarters, surrounded by whiteboards covered in blue and red marker, discussing the technical difficulties of removing toxic content from the social network. Then we brought up an episode where the challenges had proved insurmountable: the shootings in Christchurch, New Zealand.”

CNN: Seven weeks later, videos of New Zealand attack still circulating on Facebook and Instagram. “Almost seven weeks after a terrorist attack on a New Zealand mosque was streamed live on Facebook, copies of the video are still circulating on Facebook and Instagram. The existence of the videos, some of which have been on the platforms since the day of the attack, is indicative of the challenge tech companies face in combating the spread of white supremacist and other terror-related content on their platforms and raises questions about the effectiveness of Facebook’s efforts to do so in particular.”