Washington Post: Opinion: Doctors should be allowed to give priority to vaccinated patients when resources are scarce

Washington Post: Opinion: Doctors should be allowed to give priority to vaccinated patients when resources are scarce. “This conflicts radically with accepted medical ethics, I recognize. And under ordinary circumstances, I agree with those rules. The lung cancer patient who’s been smoking two packs a day for decades is entitled to the same treatment as the one who never took a puff. The drunk driver who kills a family gets a team doing its utmost to save him — although, not perhaps, a liver transplant if he needs one. Doctors are healers, not judges. But the coronavirus pandemic, the development of a highly effective vaccine, and the emergence of a core of vaccine resisters along with an infectious new variant have combined to change the ethical calculus.”

New York Times: Calls Grow to Discipline Doctors Spreading Virus Misinformation

New York Times: Calls Grow to Discipline Doctors Spreading Virus Misinformation. “The Federation of State Medical Boards, which represents the groups that license and discipline doctors, recommended last month that states consider action against doctors who share false medical claims, including suspending or revoking medical licenses. The American Medical Association says spreading misinformation violates the code of ethics that licensed doctors agree to follow.”

BNN Bloomberg: Fired From Google After Critical Work, AI Researcher Mitchell to Join Startup

BNN Bloomberg: Fired From Google After Critical Work, AI Researcher Mitchell to Join Startup. “The former co-head of Google’s Ethical AI research group, Margaret Mitchell, who was fired in February after a controversy over a critical paper she co-authored, will join artificial intelligence startup Hugging Face to create tools that help companies make sure their algorithms are fair.”

Mashable: The TikTok controversy over collecting human bones, explained

Mashable: The TikTok controversy over collecting human bones, explained. “Human bone collector and distributor Jon Ferry built a TikTok following of nearly 457,000 for his videos sharing facts about human anatomy, showing viewers how forensic anthropologists use bones in their research, and displaying his (literal) bone-chilling collection of human remains. Ferry’s pièce de résistance, which he refers to as his ‘pride and joy,’ is a corner stacked floor to ceiling with human spines.”

Harvard Business Review: How to Practice Responsible AI

Harvard Business Review: How to Practice Responsible AI. “From predictive policing to automated credit scoring, algorithms applied on a massive scale, gone unchecked, represent a serious threat to our society. Dr. Rumman Chowdhury, director of Machine Learning Ethics, Transparency and Accountability at Twitter, joins Azeem Azhar to explore how businesses can practice responsible AI to minimize unintended bias and the risk of harm.” A podcast episode of just under 50 minutes. Unfortunately I did not see any reference to a transcript. I tweeted Harvard Business Review and I’ll update this if I hear anything back. UPDATE: Transcripts available only to paying subscribers.

BBC: AI: Ghost workers demand to be seen and heard

BBC: AI: Ghost workers demand to be seen and heard. “Artificial intelligence and machine learning exist on the back of a lot of hard work from humans. Alongside the scientists, there are thousands of low-paid workers whose job it is to classify and label data – the lifeblood of such systems. But increasingly there are questions about whether these so-called ghost workers are being exploited. As we train the machines to become more human, are we actually making the humans work more like machines?”

CNN: Google offered a professor $60,000, but he turned it down. Here’s why

CNN: Google offered a professor $60,000, but he turned it down. Here’s why. “[Professor Luke] Stark is among a growing number of people in academia who are citing the exits of [Timnit] Gebru and [Margaret] Mitchell for recent decisions to forfeit funding or opportunities provided by the company. Some AI conference organizers are rethinking having Google as a sponsor. And at least one academic who has received a big check from Google in the past has since declared he won’t seek its financial support until changes are made at the company.”

WVIR: Online Center for Ethics now calls UVA home

WVIR: Online Center for Ethics now calls UVA home. “The University of Virginia’s School of Engineering and Applied Science is now the new home of the nationally-renowned Online Ethics Center, a digital library of resources focusing on how to use technology for good. The center hosts free information for the public to use, hoping to provide ethical insight to hard topics like how algorithms impact our politics, or the impact of plastic use on our environment.”

Genealogy’s Star: Genealogy: Ethics, Ownership, Work Product, Plagiarism, and Privacy, Part One

Genealogy’s Star: Genealogy: Ethics, Ownership, Work Product, Plagiarism, and Privacy, Part One. “Over the past almost 40 years of doing genealogical research and interacting with the genealogical community, I have encountered the same issues over and over again. These issues are summarized by concerns involving ethics, ownership, work product, plagiarism, and privacy…. This post is intended to explore all five of these issues. Over the course of this series, I hope to address some of the day-to-day considerations involving genealogists and the four interrelated topics.”

The Next Web: Study: It might be unethical to force AI to tell us the truth

The Next Web: Study: It might be unethical to force AI to tell us the truth. “…it’s easy to see how building robots that can’t lie could make them patsies for humans who figure out how to exploit their honesty. If your client is negotiating like a human and your machine is bottom-lining everything, you could lose a deal over robo-human cultural differences, for example. None of that answers the question as to whether we should let machines lie to humans or each other. But it could be pragmatic.”

CNN: How one employee’s exit shook Google and the AI industry

CNN: How one employee’s exit shook Google and the AI industry. “[Timnit Gebru’s] ousting, and the fallout from it, reignites concerns about an issue with implications beyond Google: how tech companies attempt to police themselves. With very few laws regulating AI in the United States, companies and academic institutions often make their own rules about what is and isn’t okay when developing increasingly powerful software. Ethical AI teams, such as the one Gebru co-led at Google, can help with that accountability. But the crisis at Google shows the tensions that can arise when academic research is conducted within a company whose future depends on the same technology that’s under examination.”