Tubefilter: Former Algorithm Engineer Guillaume Chaslot Calls YouTube’s Decision To Stop Recommending Conspiracy Videos “A Historic Victory”. “Former YouTube engineer Guillaume Chaslot is praising his one-time employer’s decision to stop recommending conspiracy theory videos. ‘YouTube’s announcement is a great victory which will save thousands,’ he tweeted as part of a lengthy thread. ‘It’s only the beginning of a more humane technology. Technology that empowers all of us, instead of deceiving the most vulnerable.'”

BuzzFeed News: We Followed YouTube’s Recommendation Algorithm Down The Rabbit Hole. “How many clicks through YouTube’s ‘Up Next’ recommendations does it take to go from an anodyne PBS clip about the 116th United States Congress to an anti-immigrant video from a designated hate organization? Thanks to the site’s recommendation algorithm, just nine.”

Motherboard: Why Did YouTube Mass Recommend That People Watch News Footage of the 9/11 Attacks?. “Earlier this month, a two-hour newscast from CNN on the morning of the 9/11 World Trade Center attacks started showing up in the recommended section of many users’ feeds, prompting people to question, ‘What did I watch for this to be recommended to me?’ The video itself was uploaded more than five years ago by an account exclusively full of other videos from Sept. 11, 2001 and news coverage of the attacks from that day.”

Washington Post: Searching for news on RBG? YouTube offered conspiracy theories about the Supreme Court justice instead. “Conspiracy theories about the health of Supreme Court Justice Ruth Bader Ginsburg have dominated YouTube this week, illustrating how the world’s most popular video site is failing to prevent its algorithm from helping popularize viral hoaxes and misinformation. More than half of the top 20 search results for her initials, ‘RBG,’ on Wednesday pointed to false far-right videos, some claiming doctors are using mysterious illegal drugs to keep her alive, according to a review by The Washington Post.”

Washington Post: Two years after #Pizzagate showed the dangers of hateful conspiracies, they’re still rampant on YouTube. “A year after YouTube’s chief executive promised to curb ‘problematic’ videos, it continues to harbor and even recommend hateful, conspiratorial videos, allowing racists, anti-Semites and proponents of other extremist views to use the platform as an online library for spreading their ideas.”

Washington Post: Facebook, Twitter crack down on AI babysitter-rating service. “Predictim, a California-based start-up, analyzes babysitters’ online histories, including on Facebook and Twitter, and offers ratings of whether they are at risk of drug abuse, bullying or having a ‘bad attitude.’ Facebook said it dramatically limited Predictim’s access to users’ information on Instagram and Facebook a few weeks ago for violating a ban on developers’ use of personal data to evaluate a person for decisions on hiring or eligibility.”

Fortune: Dealers Are Using Social Media to Sell Illegal Drugs — And Getting Away With It. “The social platforms can’t keep up with their own algorithms, which actively promote the problematic content once users express interest by following a drug dealer or liking a drug-related image, according to the report. The system is meant to advertise accounts and provide new content personalized to the user’s interests, but this can backfire when the interests are illegal.”