The Conversation: I watched hundreds of flat-Earth videos to learn how conspiracy theories spread. “By studying how flat Earthers talk about their beliefs, we can learn how they make their arguments engaging to their audience, and in turn, learn what makes disinformation spread online. In a recent study, my colleague Tomas Nilsson at Linnaeus University and I analysed hundreds of YouTube videos in which people argue that the Earth is flat. We paid attention to their debating techniques to understand the structure of their arguments and how they make them appear rational.”
NPR: Their mom died of COVID. They say conspiracy theories are what really killed her. “As America approaches a million deaths from COVID-19, many thousands of families have been left wondering whether available treatments and vaccines could have saved their loved ones. According to the Kaiser Family Foundation, more than 230,000 deaths could have been avoided if individuals had gotten vaccinated. Not everyone who refuses a vaccine believes in elaborate conspiracy theories, but many likely do. Anti-vaccine advocates have leveraged the pandemic to sow mistrust and fear about the vaccines. Local papers across the country are dotted with stories of those who refused vaccination, only to find themselves fighting for their very lives against the disease.”
NiemanLab: A new magazine delves into the ways that people consume wrong information. “There’s a new magazine in town, one dedicated to pieces about misinformation, disinformation, conspiracy theories, and other ways that people consume wrong information. OpenMind Magazine (whose tagline is ‘tackling science controversies and deceptions’) was officially launched in mid-March and was really the result of old friends wanting to launch a magazine together.” The article notes that everything published in OpenMind Magazine is made available under a Creative Commons license, but I can’t find that information on the site itself.
Simon Fraser University: Google autocomplete helps mislead public, legitimize conspiracy theorists: SFU study. “Google algorithms place innocuous subtitles on prominent conspiracy theorists, which mislead the public and amplify extremist views, according to Simon Fraser University researchers.”