Wired: How Recommendation Algorithms Run the World. “What should you watch? What should you read? What’s news? What’s trending? Wherever you go online, companies have come up with very particular, imperfect ways of answering these questions. Everywhere you look, recommendation engines offer striking examples of how values and judgments become embedded in algorithms and how algorithms can be gamed by strategic actors.”

BuzzFeed News: YouTube’s New Fact-Check Tool Flagged Notre Dame Fire Coverage And Attached An Article About 9/11. “As the Notre Dame Cathedral went up in flames on Monday, YouTube flagged livestreams of the incident as possible sources of misinformation and then started showing people articles about the 9/11 attacks.”

New York Times: YouTube’s Product Chief on Online Radicalization and Algorithmic Rabbit Holes. “The recommendation engine is a growing liability for YouTube, which has been accused of steering users toward increasingly extreme content. After the recent mass shooting in Christchurch, New Zealand — the work of a gunman who showed signs of having been radicalized online — critics asked whether YouTube and other platforms were not just allowing hateful and violent content to exist but actively promoting it to their users.”

Wired: How Amazon’s Algorithms Curated a Dystopian Bookstore. “Once relegated to tabloids and web forums, health misinformation and conspiracies have found a new megaphone in the curation engines that power massive platforms like Amazon, Facebook, and Google. Search, trending, and recommendation algorithms can be gamed to make fringe ideas appear mainstream. This is compounded by an asymmetry of passion that leads truther communities to create prolific amounts of content, resulting in a greater amount available for algorithms to serve up … and, it seems, resulting in real-world consequences.”

CNET: YouTube recommendations for ‘alt-right’ videos have dropped dramatically, study shows. “Google has made ‘major changes’ to its recommendations system on YouTube that have reduced the amount of ‘alt-right’ videos recommended to users, according to a study led by Nicolas Suzor, an associate professor at Queensland University of Technology. During the first two weeks of February, alt-right videos appeared in YouTube’s ‘Up Next’ recommendations sidebar 7.8 percent of the time (roughly one in 13). From Feb. 15 onward, that number dropped to 0.4 percent (roughly one in 250).”

Wired: When Algorithms Think You Want to Die. “Social media platforms not only host this troubling content, they end up recommending it to the people most vulnerable to it. And recommendation is a different animal than mere availability. A growing academic literature bears this out: Whether it’s self-harm, misinformation, terrorist recruitment, or conspiracy, platforms do more than make this content easily found—in important ways they help amplify it.”

Tubefilter: Former Algorithm Engineer Guillaume Chaslot Calls YouTube’s Decision To Stop Recommending Conspiracy Videos “A Historic Victory”. “Former YouTube engineer Guillaume Chaslot is praising his one-time employer’s decision to stop recommending conspiracy theory videos. ‘YouTube’s announcement is a great victory which will save thousands,’ he tweeted as part of a lengthy thread. ‘It’s only the beginning of a more humane technology. Technology that empowers all of us, instead of deceiving the most vulnerable.’”