Brookings: Assessing employer intent when AI hiring tools are biased

Brookings: Assessing employer intent when AI hiring tools are biased. “In this paper, I discuss how hiring is a multi-layered and opaque process and how it will become more difficult to assess employer intent as recruitment processes move online. Because intent is a critical aspect of employment discrimination law, I ultimately suggest four ways upon which to include it in the discussion surrounding algorithmic bias.”

Stanford News: Search results not biased along party lines, Stanford scholars find

Stanford News: Search results not biased along party lines, Stanford scholars find. “According to newly published research by Stanford scholars, there appears to be no political favoritism for or against either major political party in the algorithm of a popular search engine.”

IFL Science: This Is Why Women Are Setting Their Gender To Male On Instagram

IFL Science: This Is Why Women Are Setting Their Gender To Male On Instagram. “The Instagram community guidelines state that nudity and inappropriate content is not allowed on the platform. ‘This includes photos, videos, and some digitally-created content that show sexual intercourse, genitals, and close-ups of fully-nude buttocks. It also includes some photos of female nipples, but photos of post-mastectomy scarring and women actively breastfeeding are allowed.’ However back in April, the Instagram algorithm changed to demote certain posts, even if they don’t technically break the rules set by the platform itself, HuffPost reports.”

The Sociable: NIST research effort to measure bias in results we get from search engines: ‘Fair Ranking’

The Sociable: NIST research effort to measure bias in results we get from search engines: ‘Fair Ranking’. “As part of its long-running Text Retrieval Conference (TREC), which is taking place this week at NIST’s Gaithersburg, Maryland, campus, NIST has launched the Fair Ranking track this year, which is an incubator for a new area of study that aims to bring fairness in research. The track has been proposed and organized by researchers from Microsoft, Boise State University and NIST, who hope to find strategies for removing bias, by finding apt ways to measure the amount of bias in data and search techniques.”
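The core idea behind fair ranking — that items near the top of a results list receive far more attention than items further down, so fairness can be quantified by comparing how much position-discounted "exposure" different groups receive — can be illustrated with a toy metric. This is a minimal sketch of one common exposure-style measure, not the actual metric used by the TREC Fair Ranking track; the group labels and data are hypothetical.

```python
import math

def exposure(ranking, group):
    """Sum the position-discounted attention (1 / log2(rank + 2)) received
    by items in `ranking` that belong to `group`. Rank 0 gets weight 1.0;
    lower positions get progressively less."""
    return sum(
        1.0 / math.log2(pos + 2)
        for pos, item in enumerate(ranking)
        if item["group"] == group
    )

def exposure_gap(ranking, group_a, group_b):
    """Absolute difference in exposure between two groups.
    0.0 means the ranking distributes attention equally between them."""
    return abs(exposure(ranking, group_a) - exposure(ranking, group_b))

# Toy ranked search results, each tagged with a hypothetical group label.
# Both groups appear twice, but group A holds the top two slots.
results = [
    {"doc": "d1", "group": "A"},
    {"doc": "d2", "group": "A"},
    {"doc": "d3", "group": "B"},
    {"doc": "d4", "group": "B"},
]

print(exposure_gap(results, "A", "B"))
```

Even though both groups have the same number of results, the gap is nonzero because group A occupies the higher-attention positions — which is exactly the kind of disparity a fair-ranking evaluation is designed to surface.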

ZDNet: Google’s new AI tool could help decode the mysterious algorithms that decide everything

ZDNet: Google’s new AI tool could help decode the mysterious algorithms that decide everything. “While most people come across algorithms every day, not that many can claim that they really understand how AI actually works. A new tool unveiled by Google, however, hopes to help common humans grasp the complexities of machine learning.”

Harvard Business Review: When Algorithms Decide Whose Voices Will Be Heard

Harvard Business Review: When Algorithms Decide Whose Voices Will Be Heard. “Are we giving up our freedom of expression and action in the name of convenience? While we may have the perceived power to express ourselves digitally, our ability to be seen is increasingly governed by algorithms — with lines of codes and logic — programmed by fallible humans. Unfortunately, what dictates and controls the outcomes of such programs is more often than not a black box.”

The Daily Dot: Amazon’s facial recognition misidentified Boston athletes as criminals

The Daily Dot: Amazon’s facial recognition misidentified Boston athletes as criminals. “Amazon’s facial recognition technology falsely matched nearly 30 professional athletes to individuals in a mugshot database, the Massachusetts chapter of the American Civil Liberties Union (ACLU) said.”