New York Times: Where Do Vaccine Doses Go, and Who Gets Them? The Algorithms Decide. “The algorithms are intended to speed Covid-19 shots from pharmaceutical plants to people’s arms. The formulas generally follow guidelines from the Centers for Disease Control and Prevention recommending that frontline health care workers, nursing home residents, senior citizens and those with major health risks be given priority for the vaccines. Yet federal agencies, states, local health departments and medical centers have each developed different allocation formulas, based on a variety of ethical and political considerations. The result: Americans are experiencing wide disparities in vaccine access.”
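
The article does not publish any jurisdiction's actual formula, but the priority-tier logic it describes can be illustrated with a toy sketch. Everything below (group names, tier order, population counts) is an assumption for illustration only, not the CDC schedule or any state's allocation rule.

```python
# Illustrative only: a toy priority-tier allocation, NOT any jurisdiction's real formula.
# Tier ordering loosely follows the groups named in the article; numbers are made up.

from collections import OrderedDict

# Hypothetical priority tiers (listed in order of priority) and population counts.
tiers = OrderedDict([
    ("frontline health care workers", 21_000),
    ("nursing home residents",        9_000),
    ("seniors 65+",                   53_000),
    ("high-risk medical conditions",  40_000),
])

def allocate(doses_available, tiers):
    """Hand out doses tier by tier until supply runs out."""
    allocation = {}
    remaining = doses_available
    for group, population in tiers.items():
        given = min(population, remaining)
        allocation[group] = given
        remaining -= given
    return allocation

print(allocate(50_000, tiers))
# {'frontline health care workers': 21000, 'nursing home residents': 9000,
#  'seniors 65+': 20000, 'high-risk medical conditions': 0}
```

The disparities the Times describes arise precisely because different agencies plug different tiers, weights, and carve-outs into this kind of scheme.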

HealthImaging: New database of FDA-cleared algorithms helps radiologists quickly navigate complex AI environment. “The American College of Radiology on Monday announced a new, searchable database of federally cleared algorithms to help radiologists navigate the complex artificial intelligence environment. The ACR Data Science Institute’s catalog includes 111 class 2 medical imaging AI algorithms cleared by the U.S. Food & Drug Administration. Radiologists can search for tools according to company, subspeciality, body area, modality, and clearance date to find what may best fit their clinical needs.”
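
The ACR catalog itself is a web tool; the sketch below only illustrates the kind of faceted filtering the article describes. The records, field names, and search function are invented for illustration and are not the ACR Data Science Institute's data or API.

```python
# Illustrative sketch of faceted search over a catalog of cleared algorithms.
# Records and field names are invented placeholders.

from dataclasses import dataclass
from typing import Optional

@dataclass
class ClearedAlgorithm:
    name: str
    company: str
    subspecialty: str
    body_area: str
    modality: str
    clearance_date: str  # ISO date, e.g. "2020-11-03"

catalog = [
    ClearedAlgorithm("ExampleLungCAD", "Acme AI", "Thoracic", "Chest", "CT", "2020-06-15"),
    ClearedAlgorithm("ExampleStrokeTriage", "Beta Imaging", "Neuroradiology", "Head", "CT", "2019-09-30"),
]

def search(catalog, *, modality: Optional[str] = None, body_area: Optional[str] = None,
           cleared_after: Optional[str] = None):
    """Return entries matching every facet that was supplied."""
    results = catalog
    if modality:
        results = [a for a in results if a.modality == modality]
    if body_area:
        results = [a for a in results if a.body_area == body_area]
    if cleared_after:
        results = [a for a in results if a.clearance_date > cleared_after]
    return results

print(search(catalog, modality="CT", cleared_after="2020-01-01"))
```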

TNW: Study shows how AI exacerbates recruitment bias against women. “A new study from the University of Melbourne has demonstrated how hiring algorithms can amplify human gender biases against women. Researchers from the University of Melbourne gave 40 recruiters real-life resumés for jobs at UniBank, which funded the study. The resumés were for roles as a data analyst, finance officer, and recruitment officer, which Australian Bureau of Statistics data shows are respectively male-dominated, gender-balanced, and female-dominated positions.”

Online Journalism Blog: “There are still many questions that are not answered” – Nicolas Kayser-Bril on investigating algorithmic discrimination on Facebook. “In a special guest post for OJB, Vanessa Fillis speaks to AlgorithmWatch’s Nicolas Kayser-Bril about his work on how online platforms optimise ad delivery, including his recent story on how Facebook draws on gender stereotypes.”

ScienceBlog: When Algorithms Compete, Who Wins? “James Zou, Stanford assistant professor of biomedical data science and an affiliated faculty member of the Stanford Institute for Human-Centered Artificial Intelligence, says that as algorithms compete for clicks and the associated user data, they become more specialized for subpopulations that gravitate to their sites. And that, he finds in a new paper with graduate student Antonio Ginart and undergraduate Eva Zhang, can have serious implications for both companies and consumers.”
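
The specialization dynamic can be seen in a toy simulation: when each competing service only learns from the users it wins, each one drifts toward one subpopulation. This is a minimal sketch under assumed parameters, not the model from the Stanford paper.

```python
# Toy illustration (not the paper's actual model): two services compete for users
# from two subpopulations; each service only learns from users it wins, so each
# tends to specialize toward one group over time.

import random
random.seed(0)

true_pref = {"A": 0.2, "B": 0.8}      # hypothetical subpopulation preferences
estimates = [0.45, 0.55]              # each service's current estimate
counts = [1, 1]                       # how many users each service has seen

for _ in range(5000):
    group = random.choice(["A", "B"])
    user_value = true_pref[group] + random.gauss(0, 0.05)
    # The user picks whichever service currently predicts them better.
    winner = min((0, 1), key=lambda s: abs(estimates[s] - user_value))
    # Only the winning service gets this user's data (running-mean update).
    counts[winner] += 1
    estimates[winner] += (user_value - estimates[winner]) / counts[winner]

print(estimates)  # each estimate drifts toward one subpopulation, roughly [0.2, 0.8]
```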

Vice: How to Game Spotify and Instagram’s Algorithms to Help Artists. “Now that in-person live music is no longer a reality, there are few ways to directly support musicians. You can subscribe to artist Patreons and donate through links on Spotify artist pages, but most importantly, you should be buying music and merch, especially through Bandcamp, during their monthly Bandcamp Friday 100 percent commission days. These are necessary and important steps to take to ensure touring artists can weather the pandemic. But there are also ways to give them a boost that don’t require spending any money: Simply follow the artists you like and save their songs on your streaming platform.”

Mother Jones: Facebook Manipulated the News You See to Appease Republicans, Insiders Say. “To be perfectly clear: Facebook used its monopolistic power to boost and suppress specific publishers’ content—the essence of every Big Brother fear about the platforms, and something Facebook and other companies have been strenuously denying for years. It’s also, ironically, what conservatives have consistently accused Facebook of doing to them, with the perverse but entirely intended effect of causing it to bend over backward for them instead.”

NiemanLab: Is Facebook too big to know? The Markup has a plan (and a browser) to wrap its arms around it. “The Citizen Browser Project will pay 1,200 Americans to let The Markup monitor the choices that tech company algorithms are making for them. ‘What are they choosing to amplify? And what are they choosing not to amplify?’”

The Conversation: Do social media algorithms erode our ability to make decisions freely? The jury is out. “Social media algorithms, artificial intelligence, and our own genetics are among the factors influencing us beyond our awareness. This raises an ancient question: do we have control over our own lives? This article is part of The Conversation’s series on the science of free will.”

MIT Technology Review: Why kids need special protection from AI’s influence. “Algorithms are also increasingly used to determine what their education is like, whether they’ll receive health care, and even whether their parents are deemed fit to care for them. Sometimes this can have devastating effects: this past summer, for example, thousands of students lost their university admissions after algorithms—used in lieu of pandemic-canceled standardized tests—inaccurately predicted their academic performance. Children, in other words, are often at the forefront when it comes to using and being used by AI, and that can leave them in a position to get hurt.”

PC Magazine: Want to Get Verified on Instagram? A Huge Follower Count Isn’t Enough. “Instagram says it noticed that people were turning to the platform to raise awareness and promote the causes they were invested in, especially in the midst of the pandemic, racial tensions, and the 2020 election. So it created a new Instagram Equity team ‘that will focus on better understanding and addressing bias in our product development and people’s experiences on Instagram’—including fairness in algorithms.”

The Conversation: Not just A-levels: unfair algorithms are being used to make all sorts of government decisions. “Algorithmic systems tend to be promoted for several reasons, including claims that they produce smarter, faster, more consistent and more objective decisions, and make more efficient use of government resources. The A-level fiasco has shown that this is not necessarily the case in practice. Even where an algorithm provides a benefit (fast, complex decision-making for a large amount of data), it may bring new problems (socio-economic discrimination).”

New Zealand Herald: How Facebook, Google algorithms feed on hate speech, rage. “Notice how those unsavoury posts liked by some long-forgotten friend always seem to float to the top of your curated social media feeds? Wonder how an incitement to violence can stay on your screen for days? What about that infuriating conspiracy that keeps getting forced down your throat? According to an Australian digital security researcher, it’s no bug. It’s a feature. It’s a subliminal mechanism designed to extract maximum revenue out of your inbox.”

Engadget: Facebook and Instagram reveal content ‘recommendation guidelines’. “The guidelines are essentially Facebook’s internal rulebook for determining what type of content is ‘eligible’ to appear prominently in the app, such as in Instagram’s Explore section or in Facebook’s recommendations for groups or events. The suggestions are algorithmically generated and have been a source of speculation and scrutiny.”

EurekAlert: QUT algorithm could quash Twitter abuse of women. “Online abuse targeting women, including threats of harm or sexual violence, has proliferated across all social media platforms but [Queensland University of Technology] researchers have developed a statistical model to help drum it out of the Twittersphere. Associate Professor Richi Nayak, Professor Nicolas Suzor and research fellow Dr Md Abul Bashar from QUT have developed a sophisticated and accurate algorithm to detect these posts on Twitter, cutting through the raucous rabble of millions of tweets to identify misogynistic content.”
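
The QUT team's model is not published in the announcement; as a generic illustration of the underlying task, a bag-of-words baseline for flagging abusive tweets might look like the sketch below. The library choice (scikit-learn), example tweets, and labels are assumptions for illustration, not the researchers' method.

```python
# Generic baseline for abusive-tweet detection, NOT the QUT team's model.
# Requires scikit-learn; example tweets and labels are invented placeholders.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_tweets = [
    "you are brilliant, great talk today",
    "get back to the kitchen, nobody wants your opinion",
    "loved this thread, thanks for sharing",
    "women like you should be silenced",
]
train_labels = [0, 1, 0, 1]  # 1 = misogynistic/abusive (toy labels)

# TF-IDF features over unigrams and bigrams, then a logistic regression classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(train_tweets, train_labels)

print(model.predict(["what a great presentation"]))   # likely [0]
print(model.predict(["nobody wants your opinion"]))   # likely [1]
```

A production system like the one described would need far larger labeled datasets and context-aware modeling; this sketch only shows the shape of the classification problem.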