Fake news ‘vaccine’: Online game may ‘inoculate’ by simulating propaganda tactics (Phys.org)

Phys.org: Fake news ‘vaccine’: Online game may ‘inoculate’ by simulating propaganda tactics. “A new online game puts players in the shoes of an aspiring propagandist to give the public a taste of the techniques and motivations behind the spread of disinformation—potentially ‘inoculating’ them against the influence of so-called fake news in the process.”

Columbia Journalism Review: Erasing history

Columbia Journalism Review: Erasing history. “In the 21st century, more and more information is ‘born digital’ and will stay that way, prone to decay or disappearance as servers, software, Web technologies, and computer languages break down. The task of internet archivists has developed a significance far beyond what anyone could have imagined in 2001, when the Internet Archive first cranked up the Wayback Machine and began collecting Web pages; the site now holds more than 30 petabytes of data dating back to 1996. (One gigabyte would hold the equivalent of 30 feet of books on a shelf; a petabyte is a million of those.) Not infrequently, the Wayback Machine and other large digital archives, such as those in the care of the great national and academic libraries, find themselves holding the only extant copy of a given work on the public internet. This responsibility is increasingly fraught with political, cultural, and even legal complications.”

TechCrunch: Fake news is an existential crisis for social media

TechCrunch: Fake news is an existential crisis for social media. “The claim and counter claim that spread out around ‘fake news’ like an amorphous cloud of meta-fakery, as reams of additional ‘information’ — some of it equally polarizing but a lot of it more subtle in its attempts to mislead (e.g., the publicly unseen ‘on background’ info routinely sent to reporters to try to invisibly shape coverage in a tech firm’s favor) — are applied in equal and opposite directions in the interests of obfuscation; using speech and/or misinformation as a form of censorship to fog the lens of public opinion. This bottomless follow-up fodder generates yet more FUD in the fake news debate. Which is ironic, as well as boring, of course. But it’s also clearly deliberate.” One of those articles that deserves a better headline than it gets. A deep dive with lots of links to other news articles and background. Very good stuff.

Wired: Inside The Two Years That Shook Facebook—and The World

Wired: Inside The Two Years That Shook Facebook—and The World. “[Benjamin] Fearnow, a recent graduate of the Columbia Journalism School, worked in Facebook’s New York office on something called Trending Topics, a feed of popular news subjects that popped up when people opened Facebook. The feed was generated by an algorithm but moderated by a team of about 25 people with backgrounds in journalism. If the word ‘Trump’ was trending, as it often was, they used their news judgment to identify which bit of news about the candidate was most important. If The Onion or a hoax site published a spoof that went viral, they had to keep that out. If something like a mass shooting happened, and Facebook’s algorithm was slow to pick up on it, they would inject a story about it into the feed.”

Quartz: Facebook “likes” are a powerful tool for authoritarian rulers, court petition says

Quartz: Facebook “likes” are a powerful tool for authoritarian rulers, court petition says. “A Cambodian opposition leader has filed a petition in a California court against Facebook, demanding the company disclose its transactions with his country’s authoritarian prime minister, whom he accuses of falsely inflating his popularity through purchased ‘likes’ and spreading fake news.”

LMT Online: U of I team studying spread of information on social media

LMT Online: U of I team studying spread of information on social media. “University of Illinois researchers are using a $4 million grant to study how information moves across social media, affecting people’s beliefs and shaping events. Computer Science Professor Tarek Abdelzaher (TAR’-ek AHB’-del-zah-hair) is leading a team that received a five-year grant from the Defense Advanced Research Projects Agency. The team has been modeling information spread on Instagram and Twitter and will study other platforms.”

TechCrunch: The real consequences of fake porn and news

TechCrunch: The real consequences of fake porn and news. “For a democratic society in which the presumption of truth is generally the default response to most content, we will quite soon live in a world where everything must be considered fake without evidence to the contrary. It’s as if we suddenly moved to an authoritarian country and needed to constantly dismiss the propaganda we see every day. When it comes to policy problems facing startups, tech companies, political parties and governments together, this challenge is about as thorny as they come.”