Nieman Lab: It’s not me, it’s you: Our Facebook fears are mostly about all those other gullible types. “A number of prominent figures have called for some sort of regulation of Facebook — including one of the company’s co-founders and a venture capitalist who was one of Facebook’s early backers. Much of the criticism of Facebook relates to how the company’s algorithms target users with advertising, and the “echo chambers” that can show users ideologically slanted content. Despite the public criticism, the company has continued to post record profits. And billions of people — including more than two-thirds of American adults — continue to use the unregulated version of Facebook that exists now.”

Hacker Noon: The Instagram of Trust: How to Redesign the Architecture of Trust in Products. “More technology requires us to give up our privacy in exchange for better personalization. But how do we fix the ever-growing lack of trust in our society? More and more brands are asking people for trust based on their promises and by being transparent about their policies. But the psychology of trust works quite differently. There have been many attempts and debates around the black box of algorithms and transparency about how the algorithms work. But I would like to ask: Is transparency enough? Is it an effective way to build a long-lasting relationship with a customer? Will it build trust in a brand and in a product?”

Phys.org: People more likely to trust machines than humans with their private information. “Not everyone fears our machine overlords. In fact, according to Penn State researchers, when it comes to private information and access to financial data, people tend to trust machines more than people, which could lead to both positive and negative online behaviors.”

Washington Post: Our devices steal our attention. We need to take it back. “To explain why we should refocus our attention, [Jenny] Odell notes the tension between being connected online and disconnected in the real world. We tend to stay online too much, she suggests, because digital platforms are structured to keep us connected for their own profit. It is necessary to escape to engage in sensitive, actual human interaction. Though these are not necessarily new observations, it’s worthwhile to reiterate that, for all the social unity and disunity social media sites promote, the profit motive is the reason most of them exist.”

The Ohio State University: Tech fixes can’t protect us from disinformation campaigns. “More than technological fixes are needed to stop countries from spreading disinformation on social media platforms like Facebook and Twitter, according to two experts. Policymakers and diplomats need to focus more on the psychology behind why citizens are so vulnerable to disinformation campaigns, said Erik Nisbet and Olga Kamenchuk of The Ohio State University.”

Wired: The Rise and Fall of Facebook’s Memory Economy. “Facebook’s Memories feature—where it shows you pictures and posts from a day in the recent or far-gone past—used to be my favorite thing about the platform. I mean, I have posted some hilarious things that my son said when he was little, and that time I went on a reporting trip to Area 51 was seriously cool. Heck, I’ve reposted it three years in a row. Now, though, I think Memories is the platform’s most cynical element. It’s a cheap ploy to keep us creating new posts, keep us interested, at a time when our interest is starting to drift away.”

Northeastern University: It’s Time To Study Machines The Way We Study Humans. “Artificial intelligence and machine learning models can be found in almost every aspect of modern life. News-ranking algorithms determine which information we see online, compatibility algorithms influence the people we date, and ride-hailing algorithms affect the way we travel. Despite the pervasiveness of these life-changing algorithms, we don’t have a universal understanding of how they work or how they’re shaping our world. So, a team of researchers—including two Northeastern University professors—says that it’s time to study artificially intelligent machines the way we study humans.”