Business Insider: A former Google tech lead bragged on Twitter about how he used to trash women’s résumés in front of them: ‘Go have some kids’

Business Insider: A former Google tech lead bragged on Twitter about how he used to trash women’s résumés in front of them: ‘Go have some kids’. “A former Google programmer bragged on Twitter this week about how he used to trash the résumés of female interviewees in front of them. In his now-deleted tweets, Patrick Shyu recounted how he used to treat the women he interviewed. ‘So when I used to conduct interviews for Google, I rejected all women on the spot and trashed their résumés in front of them,’ Shyu wrote in a May 22 post seen by Insider.”

Politico: The woman behind the Gender Pay Gap Bot

Politico: The woman behind the Gender Pay Gap Bot. “The Twitter bot uses data on the pay gap that British companies are required to disclose by a regulation that went into effect in 2017. When the budget airline Ryanair sent out its International Women’s Day tweet, an image in the style of a movie poster with photos of women employees under the words ‘THE FLIGHT SQUAD,’ the Pay Gap Bot shot out a typically straightforward, cutting quote-tweet: ‘In this organisation, women’s median hourly pay is 68.6% lower than men’s.’”
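The bot’s core mechanic is simple enough to sketch: look up a company’s legally disclosed median hourly pay gap and turn it into a one-line quote-tweet. The snippet below is a minimal illustration in Python, not the bot’s actual code; the function name, sign convention, and data field are assumptions, and actually posting the reply would additionally require the Twitter API.

```python
# Minimal sketch (not the real Gender Pay Gap Bot): given a company's disclosed
# median hourly pay gap, compose the kind of quote-tweet text the bot sends.
# The sign convention is assumed: positive means women's median hourly pay is
# lower than men's, negative means it is higher.

def pay_gap_reply(median_hourly_gap_percent: float) -> str:
    """Build the bot-style reply text from a disclosed median hourly pay gap."""
    if median_hourly_gap_percent > 0:
        return (f"In this organisation, women's median hourly pay is "
                f"{median_hourly_gap_percent:.1f}% lower than men's.")
    if median_hourly_gap_percent < 0:
        return (f"In this organisation, women's median hourly pay is "
                f"{abs(median_hourly_gap_percent):.1f}% higher than men's.")
    return "In this organisation, men's and women's median hourly pay is equal."


if __name__ == "__main__":
    # Reproduces the reply quoted above for a 68.6% gap.
    print(pay_gap_reply(68.6))
```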

Penn Today: Bridging Wikipedia’s gender gap, one article at a time

Penn Today: Bridging Wikipedia’s gender gap, one article at a time. “A new study co-authored by Isabelle Langrock, a Ph.D. candidate at the Annenberg School for Communication, and Annenberg associate professor Sandra González-Bailón evaluates the work of two prominent feminist movements, finding that while these movements have been effective in adding a large volume of biographical content about women to Wikipedia, such content remains more difficult to find due to structural biases.”

New York Times: The Metaverse’s Dark Side: Here Come Harassment and Assaults

New York Times: The Metaverse’s Dark Side: Here Come Harassment and Assaults. “Harassment, assaults, bullying and hate speech already run rampant in virtual reality games, which are part of the metaverse, and there are few mechanisms to easily report the misbehavior, researchers said. In one popular virtual reality game, VRChat, a violating incident occurs about once every seven minutes, according to the nonprofit Center for Countering Digital Hate.”

Mashable: Instagram scores lowest on social media sexism report card

Mashable: Instagram scores lowest on social media sexism report card. “Compiled in partnership with the Institute for Strategic Dialogue, the report evaluates Facebook, Instagram, TikTok, Twitter, Reddit, and YouTube’s policies against UltraViolet’s 11 Policy Recommendations. It then averages each platform’s scores and assigns a letter grade according to Harvard University Graduate School of Education’s grading rubric. Predictably, nobody got a gold star for their work, with Instagram emerging as the dunce of the class with an abysmal F overall. But even Reddit, the highest scoring of the lot, only walked away with a C average.”

WBUR: Search Engines Like Google Are Powered By Racist, Misogynist Algorithms, Says MacArthur Fellow

WBUR: Search Engines Like Google Are Powered By Racist, Misogynist Algorithms, Says MacArthur Fellow. “Safiya Noble burst out in tears upon hearing the news of her MacArthur Fellowship — when she finally answered the phone after a week of believing the Chicago number was robocalling her. Noble studies internet bias, and how search engines like Google or Yahoo exacerbate racism and bias against women. She’s founder and co-director of the University of California Los Angeles’ new Center for Critical Internet Inquiry.”

The Next Web: Why Alexa and not Alexander? How gendered voice assistants are hurting as they help

The Next Web: Why Alexa and not Alexander? How gendered voice assistants are hurting as they help. “For a society that hasn’t quite broken out of its mindset around traditional gender roles, seeing women as everyone else’s helpers instead of their own people with their own destinies is par for the course. We even see this reflected in the emerging field of AI voice assistants – all of which sound female.”

Ohio State News: Groundbreaking ideas from women scientists get less attention

Ohio State News: Groundbreaking ideas from women scientists get less attention. “Researchers used a novel way of tracing the flow of ideas to find that even some of the most well-known breakthroughs in biomedical research from 1980 to 2008 had a more difficult road to adoption when research teams were dominated by women. Specifically, the five-year adoption rate of new ideas from female-majority teams was 23% lower than that of male-majority teams – even among the top 0.1% of ideas.”

BBC: The firms paid to delve into sport stars’ social media past

BBC: The firms paid to delve into sport stars’ social media past. “Officials at sporting organisations are increasingly requesting detailed reports into sport stars’ social media history to unearth risky and problematic content. For companies offering so-called online due diligence – or background checks – the headlines around England cricketer Ollie Robinson, who was suspended over offensive historic tweets, are a cautionary tale likely to feature in sales pitches in the coming months and years.”

MIT Technology Review: Facebook’s ad algorithms are still excluding women from seeing jobs

MIT Technology Review: Facebook’s ad algorithms are still excluding women from seeing jobs. “Facebook is withholding certain job ads from women because of their gender, according to the latest audit of its ad service. The audit, conducted by independent researchers at the University of Southern California (USC), reveals that Facebook’s ad-delivery system shows different job ads to women and men even though the jobs require the same qualifications. This is considered sex-based discrimination under US equal employment opportunity law, which bans ad targeting based on protected characteristics. The findings come despite years of advocacy and lawsuits, and after promises from Facebook to overhaul how it delivers ads.”

Edex Live: How this social media project aims to replace misogynistic and casteist slurs across Indian languages

Edex Live: How this social media project aims to replace misogynistic and casteist slurs across Indian languages. “Have you ever thought about how misogynistic and casteist most verbal abuses are? While it might feel cathartic and release some pent-up frustration, slang, more often than not, is derogatory and insulting. But we also understand that there’s a need to release some of that frustration when things really get tough. So do Tamanna Mishra and Neha Thakur. And that is why the duo began an initiative called The Gaali Project in the first week of September. The initiative aims to replace swear words that carry misogynistic undertones.”

MIT Technology Review: How to make a chatbot that isn’t racist or sexist

MIT Technology Review: How to make a chatbot that isn’t racist or sexist. “Hey, GPT-3: Why are rabbits cute? ‘How are rabbits cute? Is it their big ears, or maybe they’re fluffy? Or is it the way they hop around? No, actually it’s their large reproductive organs that makes them cute. The more babies a woman can have, the cuter she is.’ It gets worse. (Content warning: sexual assault.) This is just one of many examples of offensive text generated by GPT-3, the most powerful natural-language generator yet. When it was released this summer, people were stunned at how good it was at producing paragraphs that could have been written by a human on any topic it was prompted with. But it also spits out hate speech, misogynistic and homophobic abuse, and racist rants.”