Reclaim the Net: New tool “De-Mainstream” allows YouTube users to blacklist mainstream media for a more authentic experience. “The extension blocks certain media outlets from YouTube search and recommendations while also making YouTube Trending showcase the most popular videos based on view counts.” The extension doesn’t have many Chrome users yet. I did check, and it limits its data processing to YouTube sites only. The project is also on GitHub.
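The core idea — drop videos from a blocklist of outlets, then rank what remains purely by view count — can be sketched in a few lines. This is only an illustration of the concept, not De-Mainstream's actual code (which runs as a browser content script); the channel names and data structure here are hypothetical.

```python
# Hypothetical sketch of a blocklist-style filter plus view-count ranking.
# BLOCKED_CHANNELS and the video dicts are illustrative examples only.

BLOCKED_CHANNELS = {"CNN", "MSNBC", "Fox News"}

def filter_recommendations(videos):
    """Drop any video whose channel is on the blocklist."""
    return [v for v in videos if v["channel"] not in BLOCKED_CHANNELS]

def rank_trending(videos):
    """Rank purely by raw view count, highest first."""
    return sorted(videos, key=lambda v: v["views"], reverse=True)

videos = [
    {"title": "Clip A", "channel": "CNN", "views": 900_000},
    {"title": "Clip B", "channel": "IndieCreator", "views": 1_200_000},
    {"title": "Clip C", "channel": "SmallVlog", "views": 50_000},
]

trending = rank_trending(filter_recommendations(videos))
print([v["title"] for v in trending])  # → ['Clip B', 'Clip C']
```

The point of the second function is what makes the Trending change notable: ordering by nothing but views removes any editorial or engagement-based weighting.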
Engadget: China internet rules call for algorithms that recommend ‘positive’ content. “China is once more tightening its grip on internet content, and this time algorithms are in the spotlight. The Cyberspace Administration of China has published upcoming rules that dictate how internet companies manage content, including a push for recommendation algorithms that promote ‘positive’ ideas (read: government policies) while excluding ‘bad’ material.”
The Guardian: Uncovered: reality of how smartphones turned election news into chaos. “Ask the average 2019 voter where the problems with political news lie, and you might hear a few familiar claims: fake news. Russian interference. The biased BBC. But take a look at their smartphones, and you might discover a different, more chaotic world – in which news is being shaped less by publishers or foreign agents than by social media algorithms and friendship groups.”
CNET: YouTube CEO defends site’s recommendation system amid scrutiny. “As YouTube deals with an onslaught of controversies, from the spread of extremism to child sexual exploitation issues, critics have called out the site’s powerful recommendation system, which uses algorithms to drive people to new content.”
CNET: Google partners with publishers to bring audio news feeds to the Assistant. “Google on Tuesday said it’s bringing personalized audio news playlists to its Assistant software. The new feature will use the search giant’s algorithms and vast amounts of user data to tee up a feed of news stories tailor-made for individual people, based on their interests.”
CNET: Mozilla is sharing YouTube horror stories to prod Google for more transparency. “Mozilla is publishing anecdotes of YouTube viewing gone awry — anonymous stories from people who say they innocently searched one thing but eventually ended up in a dark rabbit hole of videos. It’s a campaign aimed at pressuring Google’s massive video site to make itself more accessible to independent researchers trying to study its algorithms.”
Ars Technica: WSJ: Amazon changed search results to boost profits despite internal dissent. “The goal was to favor Amazon-made products as well as third-party products that rank high in ‘what the company calls “contribution profit,” considered a better measure of a product’s profitability because it factors in non-fixed expenses such as shipping and advertising, leaving the amount left over to cover Amazon’s fixed costs,’ the WSJ said.”
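As the quote describes it, “contribution profit” is just revenue minus variable (non-fixed) costs such as shipping and advertising — the amount left over to cover fixed costs. A minimal sketch of that arithmetic, with illustrative numbers and field names that are in no way Amazon's internal schema:

```python
# "Contribution profit" as the WSJ quote defines it: revenue minus
# non-fixed (variable) expenses like shipping and advertising.
# All figures below are made-up examples for illustration.

def contribution_profit(revenue, shipping, advertising, other_variable=0.0):
    """Return what's left of revenue after variable costs."""
    return revenue - (shipping + advertising + other_variable)

# Two products at the same price can rank very differently on this metric
# if one is cheaper to ship and promote.
print(contribution_profit(30.00, 4.50, 2.00))  # → 23.5
print(contribution_profit(30.00, 9.00, 5.50))  # → 15.5
```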
Search Engine Land: Apple accused of favoring its own properties in App Store results. “Apple has helped make developers billions of dollars but it has also been accused of favoring its own apps in search results to the detriment of competitors. A New York Times analysis (performed by Sensor Tower) generally confirms this ‘search bias.’ A Wall Street Journal analysis (using App Annie) found something similar in July.”
BuzzFeed News: Google Is Promoting Climate Change Denialism On Its Apps And Mobile Homepage. “In July, Tommaso Boggia, a climate activist turned programmer, swiped to the Google Discover tab on his phone to scan the headlines the company had algorithmically selected for him. He was shocked to find a climate change denial website prominently featured in his feed. The next day, it happened again.”
Tubefilter: The YouTube Radicalization Pipeline Exists, And It’s Driving Users Toward Increasingly Alt-Right Content (Study). “A new study out of Cornell University has found ‘strong evidence for radicalization among YouTube users,’ and concludes that viewers who consume ‘mild’ radical right-wing content (it cites Joe Rogan, a YouTuber with a talk show and 6.1 million subscribers, as a creator of such ‘mild’ content) often migrate to viewing much more radical alt-right content.”
Poynter: Netflix’s algorithms seem to be a new entry point for conspiracy theories. Be aware! “When the spread of disinformation became a major topic of debate in late 2016, it was discussed mainly in reference to social networks such as Facebook and Twitter. In the following months, serious problems related to the diffusion of pseudoscientific beliefs, conspiracy theories and disinformation emerged on YouTube and WhatsApp. Until now, the popular video streaming service Netflix had managed to stay out of the picture. Not anymore.”
Wired: AI Algorithms Need FDA-style Drug Trials. “Intelligent systems at scale need regulation because they are an unprecedented force multiplier for the promotion of the interests of an individual or a group. For the first time in history, a single person can customize a message for billions and share it with them within a matter of days. A software engineer can create an army of AI-powered bots, each pretending to be a different person, promoting content on behalf of political or commercial interests. Unlike broadcast propaganda or direct marketing, this approach also uses the self-reinforcing qualities of the algorithm to learn what works best to persuade and nudge each individual.”
Motherboard: TikTok Users Are Inventing Wild Theories to Explain Its Mysterious Algorithm. “TikTok users, without verifiable information from TikTok, are aggressively postulating their theories about how the For You page actually works on the platform. Speculation about the For You page has become so prevalent that it’s practically adopted status as a meme on the platform. If users aren’t theorizing about it, then they’re making irreverent jokes about it.” I find it fascinating that users are speculating about and trying to game this algorithm from the get-go. Algorithmic recommendation systems are no longer mysterious or nerdy concepts. They just are.
Ars Technica: YouTube should stop recommending garbage videos to users. “Ostensibly, YouTube’s recommendation algorithms are politically neutral. However, they’re optimized to boost ‘engagement,’ and in practice that means promoting videos with extremist and conspiratorial points of view. In Brazil, that has meant bringing a cadre of far-right social media stars to prominence, ultimately helping them gain power in national politics.”
CNET: YouTube tweaks kids videos’ algorithm to favor ‘quality’ content. “YouTube changed its recommendation algorithm for kid-oriented videos to prioritize ‘quality’ content, the company said Wednesday. The tweak last month diverted traffic away from some channels and flooded others, according to a Bloomberg article that first reported the news.”