National Security Archive: Exploring the Russian Social Media Campaign in Charlottesville. “The existence of social media campaigns connected by U.S. intelligence to the Russian Government and aimed at destabilizing American politics continues to be the topic of much discussion and study, but case studies accessible to most social media users in America are difficult to produce given the scope of these operations. This posting seeks to provide such a case study as it relates to the IRA’s tactic of playing up both sides of a critical issue. While Russian support to the Trump campaign on social media and through the release of information obtained through cyberattack is well recognized, less well known is IRA amplification of political beliefs and voices specifically selected to increase polarization in American discourse.”
Nieman Lab: A little knowledge is a dangerous thing — no, seriously, it is, according to this new research. “People who’ve scanned Facebook for news gain a little knowledge. Why do some of them think they’ve gained a lot? Consider statements like ‘I feel that I need to experience strong emotions regularly’ and ‘I feel like I need a good cry every now and then.’ How much do these statements apply to you?”
Virginia Gazette: W&M professor studies polarizing effects of social media. “Jaime Settle is an assistant professor of government at the College of William and Mary. She is co-director of the Social Science Research Methods Center; she founded and directs the Social Networks and Political Psychology Lab at the college. She is also the author of the path-breaking new book, ‘Frenemies: How Social Media Polarizes America.’”
Nieman Lab: Republicans who follow liberal Twitter bots actually become more conservative. “Social media companies have been big on injecting ‘alternative views’ into users’ feeds — the idea, seemingly, being that exposing people to values and beliefs that conflict with their own will expand their worldviews or make them more tolerant. (See also: a zillion different ‘burst your bubble’ efforts.) In some ways, this makes all the sense in the world. On the other hand, changing people’s minds is hard.” There are limitations to this study, and I’m not here to make RB political. However, I have severe problems with those folks who say, “All you have to do is explain your side and people will understand.” Would that it were true, but it’s not.
MIT Technology Review: This is what filter bubbles actually look like. “American public life has become increasingly ideologically segregated as newspapers have given way to screens. But societies have experienced extremism and fragmentation without the assistance of Silicon Valley for centuries. And the polarization in the US began long ago, with the rise of 24-hour cable news. So just how responsible is the internet for today’s divisions? And are they really as bad as they seem?”
EurekAlert: 2.7 billion tweets confirm: Echo chambers on Twitter are very real. “A recent study of more than 2.7 billion tweets between 2009 and 2016 confirms that Twitter users are exposed mainly to political opinions that agree with their own. It is the largest study to characterise echo chambers by both the content in them and the networks they comprise. The findings indicate a strong correlation between biases in the content people both produce and consume. In other words, echo chambers are very real on Twitter.”
The Verge: The Mueller indictment exposes the danger of Facebook’s focus on Groups. “A year ago this past Friday, Mark Zuckerberg published a lengthy post titled ‘Building a Global Community.’ It offered a comprehensive statement from the Facebook CEO on how he planned to move the company away from its longtime mission of making the world ‘more open and connected’ to instead create ‘the social infrastructure … to build a global community.’ He identified a number of challenges to realizing his mission, and ranking high among them was the political polarization of his user base.”