Wired: AI Algorithms Need FDA-style Drug Trials. “Intelligent systems at scale need regulation because they are an unprecedented force multiplier for the promotion of the interests of an individual or a group. For the first time in history, a single person can customize a message for billions and share it with them within a matter of days. A software engineer can create an army of AI-powered bots, each pretending to be a different person, promoting content on behalf of political or commercial interests. Unlike broadcast propaganda or direct marketing, this approach also uses the self-reinforcing qualities of the algorithm to learn what works best to persuade and nudge each individual.”
Motherboard: TikTok Users Are Inventing Wild Theories to Explain Its Mysterious Algorithm. “TikTok users, without verifiable information from TikTok, are aggressively postulating their theories about how the For You page actually works on the platform. Speculation about the For You page has become so prevalent that it’s practically adopted status as a meme on the platform. If users aren’t theorizing about it, then they’re making irreverent jokes about it.” I find it fascinating that users are speculating and trying to game this algorithm from the get-go. Algorithmic recommendation systems are no longer mysterious or nerdy concepts. They just are.
Ars Technica: YouTube should stop recommending garbage videos to users. “Ostensibly, YouTube’s recommendation algorithms are politically neutral. However, they’re optimized to boost ‘engagement,’ and in practice that means promoting videos with extremist and conspiratorial points of view. In Brazil, that has meant bringing a cadre of far-right social media stars to prominence, ultimately helping them gain power in national politics.”
CNET: YouTube tweaks kids videos’ algorithm to favor ‘quality’ content. “YouTube changed its recommendation algorithm for kid-oriented videos to prioritize ‘quality’ content, the company said Wednesday. The tweak last month diverted traffic away from some channels and flooded others, according to a Bloomberg article that first reported the news.”
BBC: YouTube: ‘We don’t take you down the rabbit hole’. “On Thursday, a BBC report explored how YouTube had helped the Flat Earth conspiracy theory spread. But the company’s new managing director for the UK, Ben McOwen Wilson, said YouTube ‘does the opposite of taking you down the rabbit hole’.”
The Next Web: ‘YouTube recommendations are toxic,’ says dev who worked on the algorithm. “…as YouTube has become the place for videos on the web, it’s led to a raft of new problems. Content moderation is a constant struggle and YouTube can do better, but there will likely always be some amount of offensive videos that people can seek out. However, the real issue is the videos we don’t seek out: YouTube’s recommendations.”
Lifehacker: How To Outsmart Algorithms And Take Control Of Your Information Diet. This is essentially a roundup of other useful Lifehacker articles, but it’s still good. “‘Certain algorithms,’ says Tim Cook, ‘pull you toward the things you already know, believe or like, and they push away everything else. Push back.’ In a commencement speech to Tulane University, the Apple CEO tells graduates to take charge of their information diet. And much as we want to sneer at the irony of a phone maker telling us to beware of algorithms, we have to admit that Apple’s Screen Time app is one good tool for improving your tech habits. Here are the best posts we’ve already written on pushing back against the algorithms.”