NBC News: Facial recognition’s ‘dirty little secret’: Millions of online photos scraped without consent. “As the algorithms get more advanced — meaning they are better able to identify women and people of color, a task they have historically struggled with — legal experts and civil rights advocates are sounding the alarm on researchers’ use of photos of ordinary people. These people’s faces are being used without their consent, in order to power technology that could eventually be used to surveil them. That’s a particular concern for minorities who could be profiled and targeted, the experts and advocates say.”
Biometric Update: Site using facial recognition to match photos from Russian social media network sued. “A new website enabling users to search the image database of Russian social media site VKontakte with facial biometrics has been discovered, and then threatened with legal action, prompting it to switch off some functions, TOL.org reports.”
CNET: Chinese facial recognition company left database of people’s locations exposed. “A Chinese facial recognition company left its database exposed online, revealing information about millions of people, a security researcher discovered. SenseNets, a company based in Shenzhen, China, offers facial recognition technology and crowd analysis, which the company boasted in a promotional video could track people across cities and pick them out in large groups.”
From the South China Morning Post: How a 14-year-old Hongkonger built an app to help Alzheimer’s patients connect with their loved ones. “At the age of 14, the Hong Kong-born [Emma] Yang has already created her own mobile app for Alzheimer’s patients, which has impressed the likes of Microsoft Corp founder Bill Gates and Alibaba Group Holding executive vice-chairman Joseph Tsai. The Timeless app, which Yang spent two years developing and refining, comes with several core features. It uses an artificial intelligence-powered facial recognition system, from Miami-based start-up Kairos, to help Alzheimer’s patients identify people in photos and remember who they are.” Thank you, Emma Yang.
CNET: IBM hopes 1 million diverse faces can reduce bias in AI. “IBM Research on Tuesday released a new data set that contains 1 million images of diverse human faces, with an aim to help advance fairness and accuracy in facial recognition technology.”
New York Times: Amazon Is Pushing Facial Technology That a Study Says Could Be Biased. “In the study, published Thursday, Rekognition made no errors in recognizing the gender of lighter-skinned men. But it misclassified women as men 19 percent of the time, the researchers said, and mistook darker-skinned women for men 31 percent of the time. Microsoft’s technology mistook darker-skinned women for men just 1.5 percent of the time.”
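The figures quoted in that study are per-group misclassification rates: within each demographic slice, the fraction of faces whose predicted gender disagrees with the true label. A minimal sketch of that kind of audit, using invented toy data rather than anything from the study itself, might look like:

```python
# Hypothetical per-group error-rate audit in the spirit of the study above.
# The groups, labels, and data below are invented for illustration only.
from collections import defaultdict

# (group, true_gender, predicted_gender) for a toy batch of predictions
predictions = [
    ("lighter-skinned men", "male", "male"),
    ("lighter-skinned men", "male", "male"),
    ("darker-skinned women", "female", "male"),   # one misclassification
    ("darker-skinned women", "female", "female"),
    ("darker-skinned women", "female", "female"),
]

def error_rates(preds):
    """Return the fraction of misclassified faces in each demographic group."""
    totals, errors = defaultdict(int), defaultdict(int)
    for group, truth, predicted in preds:
        totals[group] += 1
        if truth != predicted:
            errors[group] += 1
    return {group: errors[group] / totals[group] for group in totals}

print(error_rates(predictions))
```

Large gaps between the resulting per-group rates, like the 0% vs. 31% split the researchers report, are exactly what audits of this kind are designed to surface.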
BBC: Facial recognition tool tackles illegal chimp trade. “Wildlife criminals had better watch out! The same software that recognises you in a friend’s social media post is being adapted to tackle the illegal trade in chimpanzees.”