The Register: MIT apologizes, permanently pulls offline huge dataset that taught AI systems to use racist, misogynistic slurs

The Register: MIT apologizes, permanently pulls offline huge dataset that taught AI systems to use racist, misogynistic slurs. “The training set, built by the university, has been used to teach machine-learning models to automatically identify and list the people and objects depicted in still images. For example, if you show one of these systems a photo of a park, it might tell you about the children, adults, pets, picnic spreads, grass, and trees present in the snap. Thanks to MIT’s cavalier approach when assembling its training set, though, these systems may also label women as whores or bitches, and Black and Asian people with derogatory language. The database also contained close-up pictures of female genitalia labeled with the C-word.”

Insider: Women in tech are taking to TikTok to roast the male-dominated industry for its diversity issues

Insider: Women in tech are taking to TikTok to roast the male-dominated industry for its diversity issues. “Emily Kager only downloaded TikTok a few months ago. Originally, the 25-year-old software developer only wanted to use the app to relate to her younger sisters. ‘I was just trying to see what the kids were up to,’ she told Insider. However, she soon realized that the platform was an opportunity to open up a discussion she’d begun on Twitter about the realities of being a woman in the tech industry.”

Northern Arizona University: Can open source software be gender-biased? Yes, say professors who are working to eliminate gender-biased ‘bugs’

Northern Arizona University: Can open source software be gender-biased? Yes, say professors who are working to eliminate gender-biased ‘bugs’. “The cycle of open source software (OSS) development and gender representation is, perhaps, unsurprising—women are vastly underrepresented among OSS developers. As a result, women miss out on development and professional opportunities, and as jobs in OSS development open up, women lack the experience to get them. And the cycle continues. It’s so pervasive that it’s likely built right into the software itself, say four researchers, which is an entirely separate problem—one they’re aiming to resolve through finding these bugs and proposing redesigns around them, leading to more gender-inclusive tools used by software developers.”

IFL Science: This Is Why Women Are Setting Their Gender To Male On Instagram

IFL Science: This Is Why Women Are Setting Their Gender To Male On Instagram. “The Instagram community guidelines state that nudity and inappropriate content is not allowed on the platform. ‘This includes photos, videos, and some digitally-created content that show sexual intercourse, genitals, and close-ups of fully-nude buttocks. It also includes some photos of female nipples, but photos of post-mastectomy scarring and women actively breastfeeding are allowed.’ However back in April, the Instagram algorithm changed to demote certain posts, even if they don’t technically break the rules set by the platform itself, HuffPost reports.”

Newswise: Women Have Substantially Less Influence on Twitter than Men in Academic Medicine

Newswise: Women Have Substantially Less Influence on Twitter than Men in Academic Medicine. “Women who are health policy or health services researchers face a significant disparity in social media influence compared to their male peers, according to a new study from researchers in the Perelman School of Medicine at the University of Pennsylvania. Although the average number of tweets among all researchers tends to be consistent, women trail behind men in follower counts, regardless of how active they are on Twitter.”

Techdirt: Content Moderation At Scale Is Impossible: Facebook Still Can’t Figure Out How To Deal With Naked Breasts

Techdirt: Content Moderation At Scale Is Impossible: Facebook Still Can’t Figure Out How To Deal With Naked Breasts. “Like a teenaged heterosexual boy, it appears that Facebook has no clue how to deal with naked female breasts. Going back over a decade, the quintessential example used to show the impossibility of coming up with clear, reasonable rules for content moderation at scale is Facebook and breasts.”

Reveal News: CFPB moves to limit home loan data

Reveal News: CFPB moves to limit home loan data. “Want to know which banks target people of color for loans with high interest rates, steep fees, or reverse mortgages? Or which banks deny home loans to African Americans and Latinos even when their income shows they could easily cover the monthly payment? You won’t be able to find out if new regulations proposed by the Consumer Financial Protection Bureau go through.”

Business Wire via AP: Pantene Launches S.H.E. – Search. Human. Equalizer. – to Shine a Light on Bias in Search (PRESS RELEASE)

Business Wire via AP: Pantene Launches S.H.E. – Search. Human. Equalizer. – to Shine a Light on Bias in Search (PRESS RELEASE). “S.H.E. is a search tool that shows us what a more equal world could look like by removing the bias in search. Available via Chrome extension, S.H.E. operates on the search back end by filtering results to produce less biased and more balanced results, ultimately giving the women behind some of the world’s greatest accomplishments and transformations the visibility they deserve.” I was COMPLETELY gobsmacked by this until I saw a little further down that it’s limited in what it can search. Still, though… The Chrome extension wants really wide permissions, and I can’t figure out which search engines specifically this is targeting.

The Conversation: Google’s algorithms discriminate against women and people of colour

The Conversation: Google’s algorithms discriminate against women and people of colour. “At the start of Black History Month 2019, Google designed its daily-changing homepage logo to include an image of African-American activist Sojourner Truth, the great 19th-century abolitionist and women’s rights activist. But what would Truth say about Google’s continual lack of care and respect toward people of colour? While bringing more attention to Sojourner Truth is venerable, Google can do better. As a professor and researcher of digital cultures, I have found that a lack of care and investment by tech companies towards users who are not white and male allows racism and sexism to creep into search engines, social networks and other algorithmic technologies.”

Business Insider: YouTube’s algorithm is under fire for boosting a sexist conspiracy theory about black hole researcher Katie Bouman

Business Insider: YouTube’s algorithm is under fire for boosting a sexist conspiracy theory about black hole researcher Katie Bouman. “As news of Dr. Katie Bouman’s role in capturing the first image of a black hole went viral earlier this week, another group was creating their own version of the story that accused Bouman of profiting off the hard work of a male colleague on the Event Horizon Telescope team. That false narrative quickly found its way to social media, and YouTube. Earlier this afternoon, people began to notice that the top result when searching Bouman’s name on YouTube produced a video by a user named Mr. Obvious.”

News@Northeastern: Your Gender And Race Might Be Determining Which Facebook Ads You See

News@Northeastern: Your Gender And Race Might Be Determining Which Facebook Ads You See. “The research was troubling. It showed that the group of users to whom Facebook chose to show ads can be skewed along gender and racial lines, in potential violation of federal laws that prevent discrimination in ads for employment, housing, and credit. A Northeastern team tested Facebook’s advertising system with a series of online advertisements. As the researchers tweaked the images, Facebook’s system presented the ads more predominantly to specific racial and gender groups.” Note that this is not the researchers intentionally microtargeting; this is Facebook’s own ad-delivery algorithm doing the skewing.
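The audit logic described above — vary the ad creative, then compare who was actually shown each version — boils down to a test of independence on delivery counts. A minimal sketch, using hypothetical numbers rather than anything from the Northeastern study:

```python
# Sketch of an ad-delivery audit: given impression counts for two ad
# creatives broken down by gender, test whether delivery skewed beyond
# chance using a 2x2 chi-square statistic (hypothetical counts below).

def chi_square_2x2(a, b, c, d):
    """Chi-square statistic for the 2x2 contingency table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Rows: two ad creatives; columns: impressions delivered to women vs. men.
stat = chi_square_2x2(700, 300,   # creative A: 700 women, 300 men
                      480, 520)   # creative B: 480 women, 520 men

# 3.841 is the 5% critical value for 1 degree of freedom.
skewed = stat > 3.841
print(round(stat, 1), skewed)  # → 100.0 True
```

With counts this lopsided the statistic is far past the critical value, i.e. the delivery difference between the two creatives is very unlikely to be chance — which is the shape of the evidence the researchers reported.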

Quartz: There’s a way to find out if your Twitter interactions are sexist

Quartz: There’s a way to find out if your Twitter interactions are sexist. “Conversations in patriarchal societies tend to amplify men’s voices more than women’s, and Twitter is no exception to this rule. A study last year found male US political reporters retweet other men three times more than they do their female colleagues, while Harvard Business School researchers in 2009 found men in general were almost twice as likely to follow another man as a woman. Many Twitter users may not be intentionally ignoring women’s input, but these small biases add up. And there’s now a tool that shows whether you retweet and respond to men more than women on Twitter.” Just a note that the domain name in the article, at this writing, has an extra period at the end and will give you an error. Remove the period and you’ll get to the tweet analysis tool.
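At its core, what a tool like this computes is just a proportion over your interactions. A rough illustration with made-up handles and gender labels (the real tool infers gender from profile names and bios, which is itself an imperfect heuristic):

```python
# Hypothetical sketch: what share of your retweets/replies go to
# accounts inferred as women? Handles and labels below are invented.
from collections import Counter

interactions = [  # (handle, inferred_gender)
    ("@alice", "f"), ("@bob", "m"), ("@carol", "f"),
    ("@dan", "m"), ("@erin", "f"), ("@frank", "m"), ("@greg", "m"),
]

counts = Counter(gender for _, gender in interactions)
share_women = counts["f"] / len(interactions)
print(f"{share_women:.0%} of interactions were with women")  # → 43%
```

The interesting part of the real tool isn't this arithmetic but the gender inference and the Twitter API plumbing that feed it; the proportion itself is the whole output.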