The Next Web: Novel ways to find what book to read next, curated by a huge word nerd

The Next Web: Novel ways to find what book to read next, curated by a huge word nerd. “Hi, I’m a huge word nerd. And I’m sure you’ll agree: reading is great! It’s like a word-movie in your mind, but without the crushing self-doubt that comes from looking at people too attractive to be right. Thing is, how do you know what to read next? You can only read so many books in your lifetime; how do you know they’re gonna be good? Well, I’ve got some tips to help you find your next favorite word splurge. Here are the best ways to do just that.”

Wired: How Amazon’s Algorithms Curated a Dystopian Bookstore

Wired: How Amazon’s Algorithms Curated a Dystopian Bookstore. “Once relegated to tabloids and web forums, health misinformation and conspiracies have found a new megaphone in the curation engines that power massive platforms like Amazon, Facebook, and Google. Search, trending, and recommendation algorithms can be gamed to make fringe ideas appear mainstream. This is compounded by an asymmetry of passion that leads truther communities to create prolific amounts of content, resulting in a greater amount available for algorithms to serve up … and, it seems, resulting in real-world consequences.”

CNET: YouTube recommendations for ‘alt-right’ videos have dropped dramatically, study shows

CNET: YouTube recommendations for ‘alt-right’ videos have dropped dramatically, study shows. “Google has made ‘major changes’ to its recommendations system on YouTube that have reduced the amount of ‘alt-right’ videos recommended to users, according to a study led by Nicolas Suzor, an associate professor at Queensland University of Technology. During the first two weeks of February, alt-right videos appeared in YouTube’s ‘Up Next’ recommendations sidebar 7.8 percent of the time (roughly one in 13). From Feb. 15 onward, that number dropped to 0.4 percent (roughly one in 250).”

Wired: When Algorithms Think You Want to Die

Wired: When Algorithms Think You Want to Die. “Social media platforms not only host this troubling content, they end up recommending it to the people most vulnerable to it. And recommendation is a different animal than mere availability. A growing academic literature bears this out: Whether it’s self-harm, misinformation, terrorist recruitment, or conspiracy, platforms do more than make this content easily found—in important ways they help amplify it.”

Tubefilter: Former Algorithm Engineer Guillaume Chaslot Calls YouTube’s Decision To Stop Recommending Conspiracy Videos “A Historic Victory”

Tubefilter: Former Algorithm Engineer Guillaume Chaslot Calls YouTube’s Decision To Stop Recommending Conspiracy Videos “A Historic Victory”. “Former YouTube engineer Guillaume Chaslot is praising his one-time employer’s decision to stop recommending conspiracy theory videos. ‘YouTube’s announcement is a great victory which will save thousands,’ he tweeted as part of a lengthy thread. ‘It’s only the beginning of a more humane technology. Technology that empowers all of us, instead of deceiving the most vulnerable.’”

New York Times: YouTube Moves to Make Conspiracy Videos Harder to Find

New York Times: YouTube Moves to Make Conspiracy Videos Harder to Find. “Whether it is a video claiming the earth is flat or the moon landing was faked, conspiracy theories are not hard to find on Google’s YouTube. But in a significant policy change, YouTube said on Friday that it planned to stop recommending them.”

Motherboard: Why Did YouTube Mass Recommend That People Watch News Footage of the 9/11 Attacks?

Motherboard: Why Did YouTube Mass Recommend That People Watch News Footage of the 9/11 Attacks?. “Earlier this month, a two-hour newscast from CNN on the morning of the 9/11 World Trade Center attacks started showing up in the recommended section of many users’ feeds, prompting people to question, ‘What did I watch for this to be recommended to me?’ The video itself was uploaded more than five years ago by an account exclusively full of other videos from Sept. 11, 2001 and news coverage of the attacks from that day.”