The Next Web: Why Alexa and not Alexander? How gendered voice assistants are hurting as they help. “For a society that hasn’t quite broken out of its mindset around traditional gender roles, seeing women as everyone else’s helpers instead of their own people with their own destinies is par for the course. We even see this reflected in the emerging field of AI voice assistants – all of which sound female.”

Ohio State News: Groundbreaking ideas from women scientists get less attention. “Researchers used a novel way of tracing the flow of ideas to find that even some of the most well-known breakthroughs in biomedical research from 1980 to 2008 had a more difficult road to adoption when research teams were dominated by women. Specifically, the five-year adoption rate of new ideas from female-majority teams was 23% lower than that of male-majority teams – even among the top 0.1% of ideas.”

BBC: The firms paid to delve into sport stars’ social media past. “Officials at sporting organisations are increasingly requesting detailed reports into sport stars’ social media history to unearth risky and problematic content. For companies offering so-called online due diligence – or background checks – the headlines around England cricketer Ollie Robinson, who was suspended over offensive historic tweets, are a cautionary tale likely to feature in sales pitches in the coming months and years.”

MIT Technology Review: Facebook’s ad algorithms are still excluding women from seeing jobs. “Facebook is withholding certain job ads from women because of their gender, according to the latest audit of its ad service. The audit, conducted by independent researchers at the University of Southern California (USC), reveals that Facebook’s ad-delivery system shows different job ads to women and men even though the jobs require the same qualifications. This is considered sex-based discrimination under US equal employment opportunity law, which bans ad targeting based on protected characteristics. The findings come despite years of advocacy and lawsuits, and after promises from Facebook to overhaul how it delivers ads.”

Edex Live: How this social media project aims to replace misogynistic and casteist slurs across Indian languages. “Have you ever thought about how misogynistic and casteist most verbal abuses are? While it might feel cathartic and release some pent up frustration, slang, more often than not, is derogatory and insulting. But, we also understand that there’s a need to release some of that frustration when things really get tough. So do Tamanna Mishra and Neha Thakur. And that is why the duo began an initiative called The Gaali Project in the first week of September. The initiative aims to replace these swear words that carry misogynistic undertones.”

MIT Technology Review: How to make a chatbot that isn’t racist or sexist. “Hey, GPT-3: Why are rabbits cute? ‘How are rabbits cute? Is it their big ears, or maybe they’re fluffy? Or is it the way they hop around? No, actually it’s their large reproductive organs that makes them cute. The more babies a woman can have, the cuter she is.’ It gets worse. (Content warning: sexual assault.) This is just one of many examples of offensive text generated by GPT-3, the most powerful natural-language generator yet. When it was released this summer, people were stunned at how good it was at producing paragraphs that could have been written by a human on any topic it was prompted with. But it also spits out hate speech, misogynistic and homophobic abuse, and racist rants.”

Emergency Medicine News: De-eponymizing Anatomical Terminology. “After a recent Twitter debate, we set out to evaluate the hypothesis that there is always an alternative to a dead man’s name for body parts and to create an online searchable database … to facilitate the de-eponymization of anatomic terminology. We reviewed 700 normal… anatomical and histological eponyms, and developed a searchable database modelled on the 2019 edition of Terminologia Anatomica (TA2) published by the Federative International Programme for Anatomical Terminology (FIPAT).” It’s probably inferable, but let me make clear that an eponym is something named after a person, like Alzheimer’s disease.

ProPublica: After a Year of Investigation, the Border Patrol Has Little to Say About Agents’ Misogynistic and Racist Facebook Group. “The Border Patrol vowed a full accounting after ProPublica revealed hateful posts in the private Facebook group. Now congressional investigators say the agency is blocking them and revealing little about its internal investigation.”

The Register: MIT apologizes, permanently pulls offline huge dataset that taught AI systems to use racist, misogynistic slurs. “The training set, built by the university, has been used to teach machine-learning models to automatically identify and list the people and objects depicted in still images. For example, if you show one of these systems a photo of a park, it might tell you about the children, adults, pets, picnic spreads, grass, and trees present in the snap. Thanks to MIT’s cavalier approach when assembling its training set, though, these systems may also label women as whores or bitches, and Black and Asian people with derogatory language. The database also contained close-up pictures of female genitalia labeled with the C-word.”

Insider: Women in tech are taking to TikTok to roast the male-dominated industry for its diversity issues. “Emily Kager only downloaded TikTok a few months ago. Originally, the 25-year-old software developer only wanted to use the app to relate to her younger sisters. ‘I was just trying to see what the kids were up to,’ she told Insider. However, she soon realized that the platform was an opportunity to open up a discussion she’d begun on Twitter about the realities of being a woman in the tech industry.”

Northern Arizona University: Can open source software be gender-biased? Yes, say professors who are working to eliminate gender-biased ‘bugs’. “The cycle of open source software (OSS) development and gender representation is, perhaps, unsurprising—women are vastly underrepresented among OSS developers. As a result, women miss out on development and professional opportunities, and as jobs in OSS development open up, women lack the experience to get them. And the cycle continues. It’s so pervasive that it’s likely built right into the software itself, say four researchers, which is an entirely separate problem—one they’re aiming to resolve through finding these bugs and proposing redesigns around them, leading to more gender-inclusive tools used by software developers.”