Mother Jones: Facebook Manipulated the News You See to Appease Republicans, Insiders Say

Mother Jones: Facebook Manipulated the News You See to Appease Republicans, Insiders Say. “To be perfectly clear: Facebook used its monopolistic power to boost and suppress specific publishers’ content—the essence of every Big Brother fear about the platforms, and something Facebook and other companies have been strenuously denying for years. It’s also, ironically, what conservatives have consistently accused Facebook of doing to them, with the perverse but entirely intended effect of causing it to bend over backward for them instead.”

NiemanLab: Is Facebook too big to know? The Markup has a plan (and a browser) to wrap its arms around it

NiemanLab: Is Facebook too big to know? The Markup has a plan (and a browser) to wrap its arms around it. “The Citizen Browser Project will pay 1,200 Americans to let The Markup monitor the choices that tech company algorithms are making for them. ‘What are they choosing to amplify? And what are they choosing not to amplify?'”

New Zealand Herald: How Facebook, Google algorithms feed on hate speech, rage

New Zealand Herald: How Facebook, Google algorithms feed on hate speech, rage. “Notice how those unsavoury posts liked by some long-forgotten friend always seem to float to the top of your curated social media feeds? Wonder how an incitement to violence can stay on your screen for days? What about that infuriating conspiracy that keeps getting forced down your throat? According to an Australian digital security researcher, it’s no bug. It’s a feature. It’s a subliminal mechanism designed to extract maximum revenue out of your inbox.”

Engadget: Facebook and Instagram reveal content ‘recommendation guidelines’

Engadget: Facebook and Instagram reveal content ‘recommendation guidelines’. “The guidelines are essentially Facebook’s internal rulebook for determining what type of content is ‘eligible’ to appear prominently in the app, such as in Instagram’s Explore section or in Facebook’s recommendations for groups or events. The suggestions are algorithmically generated and have been a source of speculation and scrutiny.”

First Draft News: Why we need a Google Trends for Facebook, Instagram, Twitter, TikTok and Reddit

First Draft News: Why we need a Google Trends for Facebook, Instagram, Twitter, TikTok and Reddit. “When it comes to data voids, a distinction is usually drawn between search engines and social media platforms. Whereas the primary interface of search engines is the search bar, the primary interface of social media platforms is the feed: algorithmic encounters with posts based on general interest, not a specific question you’re searching to answer. It’s therefore easy to miss the fact that data voids exist here, too: Even though search isn’t the primary interface, it’s still a major feature. And with billions of users, they may be creating major social vulnerabilities.”

CNET: Teens have figured out how to mess with Instagram’s tracking algorithm

CNET: Teens have figured out how to mess with Instagram’s tracking algorithm. “Like about a billion other people, 17-year-old Samantha Mosley spent her Saturday afternoon perusing Instagram…. But unlike many of Instagram’s users, Mosley and her high school friends in Maryland had figured out a way to fool tracking by the Facebook-owned social network. On the first visit, her Explore tab showed images of Kobe Bryant. Then on a refresh, cooking guides, and after another refresh, animals.”

Search Engine Land: Senate bill seeks to compel tech giants to offer ‘unfiltered’ versions of their content

Search Engine Land: Senate bill seeks to compel tech giants to offer ‘unfiltered’ versions of their content. “There’s a new bill circulating in the Senate that would require large internet companies to disclose that their results are using ‘opaque algorithms’ and offer consumers an option to see non-personalized search results or content, the Wall Street Journal (WSJ) first reported. It’s called ‘The Filter Bubble Transparency Act.'”

TechCrunch: Facebook isn’t free speech, it’s algorithmic amplification optimized for outrage

TechCrunch: Facebook isn’t free speech, it’s algorithmic amplification optimized for outrage. “The problem is that Facebook doesn’t offer free speech; it offers free amplification. No one would much care about anything you posted to Facebook, no matter how false or hateful, if people had to navigate to your particular page to read your rantings, as in the very early days of the site. But what people actually read on Facebook is what’s in their News Feed … and its contents, in turn, are determined not by giving everyone an equal voice, and not by a strict chronological timeline.”

Boing Boing: A plugin to force Twitter to respect your settings and stop showing you “top” tweets

Boing Boing, with a couple bad words, which I am censoring because I’d like this to actually get to your inbox: A plugin to force Twitter to respect your settings and stop showing you “top” tweets. “Twitter has a setting that (nominally) allows you to turn off its default of showing you ‘top’ tweets (as selected by its engagement-maximizing, conflict-seeking algorithm), but periodically, Twitter just ignores that setting…”

Nieman Journalism Lab: Should Facebook have a “quiet period” of no algorithm changes before a major election?

Nieman Journalism Lab: Should Facebook have a “quiet period” of no algorithm changes before a major election? “Several Facebook News Feed updates leading up to the 2016 U.S. election disadvantaged traditional news sources and favored less reliable information shared by your uncle. Should regulation keep the playing field static?”

Lifehacker: How To Outsmart Algorithms And Take Control Of Your Information Diet

Lifehacker: How To Outsmart Algorithms And Take Control Of Your Information Diet. This is essentially a roundup of other useful Lifehacker articles, but it’s still good. “‘Certain algorithms,’ says Tim Cook, ‘pull you toward the things you already know, believe or like, and they push away everything else. Push back.’ In a commencement speech to Tulane University, the Apple CEO tells graduates to take charge of their information diet. And much as we want to sneer at the irony of a phone maker telling us to beware of algorithms, we have to admit that Apple’s Screen Time app is one good tool for improving your tech habits. Here are the best posts we’ve already written on pushing back against the algorithms.”

TechCrunch: New Facebook tool answers the question ‘Why am I seeing this post?’

TechCrunch: New Facebook tool answers the question ‘Why am I seeing this post?’ “Facebook announced today that it is adding to News Feeds a feature called ‘Why am I seeing this post?’ Similar to ‘Why am I seeing this ad?,’ which has appeared next to advertisements since 2014, the new tool has a drop-down menu that gives users information about why that post appeared in their News Feed, along with links to personalization controls.”

CNN: How Twitter’s algorithm is amplifying extreme political rhetoric

CNN: How Twitter’s algorithm is amplifying extreme political rhetoric. “Imagine opening up the Twitter app on your phone and scrolling through your feed. Suddenly, you come across a hyper-partisan tweet calling Hillary Clinton the ‘godmother of ISIS.’ It’s from a user you do not follow, and it’s not in your feed by virtue of a retweet from a user you do follow. So how did it get there?”