University at Buffalo: How to spot deepfakes? Look at light reflection in the eyes. “University at Buffalo computer scientists have developed a tool that automatically identifies deepfake photos by analyzing light reflections in the eyes. The tool proved 94% effective with portrait-like photos in experiments described in a paper accepted at the IEEE International Conference on Acoustics, Speech and Signal Processing to be held in June in Toronto, Canada.”
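
The idea is intuitive enough to sketch: a face lit by a real scene produces nearly identical specular highlights on both corneas, while GAN-generated faces often don't. Below is a minimal, illustrative Python version of that consistency check, not the researchers' actual pipeline; it assumes the two eye regions have already been detected, aligned, and cropped to the same size, and the function names and threshold are my own.

```python
import numpy as np

def highlight_mask(eye_gray: np.ndarray, thresh: float = 0.9) -> np.ndarray:
    """Binarize the brightest pixels in a grayscale eye crop (values in [0, 1]).

    Corneal specular highlights are near-saturated, so a simple
    brightness threshold is enough for this illustration.
    """
    return eye_gray >= thresh * eye_gray.max()

def highlight_iou(left_eye: np.ndarray, right_eye: np.ndarray) -> float:
    """Intersection-over-union of the two eyes' highlight masks.

    Both crops are assumed aligned and resized to the same shape.
    A face lit by one real scene tends to score high; GAN-generated
    faces often show mismatched highlights and score low.
    """
    a, b = highlight_mask(left_eye), highlight_mask(right_eye)
    union = np.logical_or(a, b).sum()
    if union == 0:
        return 0.0
    return np.logical_and(a, b).sum() / union

# Toy usage: two 32x32 crops, each with a bright 3x3 "highlight".
left = np.zeros((32, 32)); left[10:13, 10:13] = 1.0
right = np.zeros((32, 32)); right[10:13, 11:14] = 1.0  # slightly shifted
print(f"highlight IoU: {highlight_iou(left, right):.2f}")
```

The real system locates the corneas via facial landmarks before comparing them and reports the 94% figure on portrait-style photos; the toy score above only conveys the core check.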

CyberScoop: FBI alert warns of Russian, Chinese use of deepfake content. “The FBI warned in an alert Wednesday that malicious actors ‘almost certainly’ will be using deepfakes to advance their influence or cyber-operations in the coming weeks. The alert notes that foreign actors are already using deepfakes or synthetic media — manipulated digital content like video, audio, images and text — in their influence campaigns.”

Mashable: Listen to deepfake Gucci Mane read classic literature. “Mark Twain once said that the mark of a classic is that everyone wants to have read it but no one wants to actually read it. It makes sense: Classics must provide some artistic or cultural value to be considered ‘classic’ — but they’re just so boring. MSCHF just made the Western canon more exciting with Project Gucciberg. It’s Project Gutenberg (a collection of public domain Western literature) meets the rapper Gucci Mane. Using Artificial Intelligence, MSCHF recreated his voice to read classics from Pride and Prejudice to Don Quixote.” And Little Women. I think you might need headphones to appreciate this completely.

Gizmodo: ‘Deep Nostalgia’ Can Turn Old Photos of Your Relatives Into Moving Videos. “It’s hard to feel connected to someone who’s gone through a static photo. So a company called MyHeritage, which provides automatic AI-powered photo enhancements, is now offering a new service that can animate people in old photos, creating a short video that looks like it was recorded while they posed and prepped for the portrait.”

MIT Technology Review: The year deepfakes went mainstream. “The vast majority of them are still used for fake pornography. A female investigative journalist was severely harassed and temporarily silenced by such activity, and more recently, a female poet and novelist was frightened and shamed. There’s also the risk that political deepfakes will generate convincing fake news that could wreak havoc in unstable political environments. But as the algorithms for manipulating and synthesizing media have grown more powerful, they’ve also given rise to positive applications—as well as some that are humorous or mundane. Here is a roundup of some of our favorites in rough chronological order, and why we think they’re a sign of what’s to come.”

The Register: US Senate approves deepfake bill to defend against manipulated media. “Introduced last year by US Senators Catherine Cortez Masto (D-NV) and Jerry Moran (R-KS), the Identifying Outputs of Generative Adversarial Networks Act (IOGAN Act) aims to promote research to detect and defend against realistic-looking fakery that can be used for purposes of deception, harassment, or misinformation.”

BuzzFeed News: Thousands Of Women Have No Idea A Telegram Network Is Sharing Fake Nude Images Of Them. “Over 680,000 women have no idea their photos were uploaded to a bot on the messaging app Telegram to produce photo-realistic simulated nude images without their knowledge or consent, according to tech researchers. The tool allows people to create a deepfake, a computer-generated image, of a victim from a single photo.”

CNET: Your phone may help you fight off deepfakes before they’re even made. “Truepic, a San Diego startup, says it’s found a way to prevent deepfakes and doctored images before they can even show up online: by verifying the authenticity of videos and images at the time they’re captured. Now the company is working to put the technology, which it calls Truepic Foresight, in millions of smartphones around the globe by having it embedded in the Qualcomm processors that power the majority of the world’s Android phones.”
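
Truepic’s actual technology is proprietary and hardware-backed, but the general pattern of capture-time verification is easy to illustrate: hash the image bytes at the moment of capture and sign the digest with a device key, so that any later pixel edit invalidates the signature. Here is a minimal sketch of that pattern using the Python `cryptography` package; the software-generated key and function names are stand-ins, since a real deployment keeps the key in secure hardware and signs capture metadata as well.

```python
# Illustrative capture-time signing, not Truepic's actual protocol.
# In a real deployment the private key lives in secure hardware;
# here it is generated in software purely for the demo.
import hashlib

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

device_key = Ed25519PrivateKey.generate()

def sign_capture(image_bytes: bytes) -> bytes:
    """Sign a digest of the image at the moment of capture."""
    return device_key.sign(hashlib.sha256(image_bytes).digest())

def verify_capture(image_bytes: bytes, signature: bytes) -> bool:
    """Re-hash the image and check it against the capture-time signature."""
    try:
        device_key.public_key().verify(
            signature, hashlib.sha256(image_bytes).digest()
        )
        return True
    except InvalidSignature:
        return False

original = b"...raw sensor data..."
sig = sign_capture(original)
print(verify_capture(original, sig))              # True
print(verify_capture(original + b"edit", sig))    # False: any change breaks it
```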

Fast Company: Fake video threatens to rewrite history. Here’s how to protect it. “In an age of very little institutional trust, without a firm historical context that future historians and the public can rely on to authenticate digital media events of the past, we may be looking at the dawn of a new era of civilization: post-history. We need to act now to ensure the continuity of history without stifling the creative potential of these new AI tools.”

Gizmodo: A New Tool for Detecting Deepfakes Looks for What Isn’t There: an Invisible Pulse. “In the endlessly escalating war between those striving to create flawless deepfake videos and those developing automated tools that make them easy to spot, the latter camp has found a very clever way to expose videos that have been digitally modified by looking for literal signs of life: a person’s heartbeat.”
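
The signal such detectors look for is the one remote photoplethysmography (rPPG) measures: a heartbeat subtly modulates skin color, most visibly in the green channel, at frequencies in the human heart-rate band. The Python sketch below is a toy illustration of that core idea, not the detector from the article; it assumes you already have a per-frame mean green value for a skin region, and the band edges and function name are my own choices.

```python
import numpy as np

def pulse_band_ratio(green_means: np.ndarray, fps: float) -> float:
    """Fraction of signal power inside the human heart-rate band.

    green_means: mean green-channel value of a skin region, one per frame.
    A real face shows a spectral peak around 0.7-4 Hz (42-240 bpm);
    a synthesized face tends to lack a coherent pulse there.
    """
    signal = green_means - green_means.mean()       # remove DC offset
    spectrum = np.abs(np.fft.rfft(signal)) ** 2     # power spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 4.0)
    total = spectrum[1:].sum()                      # ignore the DC bin
    return spectrum[band].sum() / total if total > 0 else 0.0

# Toy usage: 10 s of video at 30 fps with a faint 1.2 Hz (72 bpm) pulse.
fps, t = 30.0, np.arange(300) / 30.0
real = 0.5 + 0.01 * np.sin(2 * np.pi * 1.2 * t) + 0.005 * np.random.randn(300)
fake = 0.5 + 0.005 * np.random.randn(300)           # noise, no pulse
print(f"real: {pulse_band_ratio(real, fps):.2f}, "
      f"fake: {pulse_band_ratio(fake, fps):.2f}")
```

Production systems build richer spatial-temporal PPG representations and feed them to a classifier; the point here is only that a genuine face carries a periodic signal a synthesized one tends to lack.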

Lawfare: Thirty-Six Hours of Cheapfakes. “In the last days of August, with the clock ticking down until Election Day, senior Republican officials pulled off a disinformation hat trick: Over the course of two short days, figures affiliated with the GOP published three different deceptively edited videos on social media.” Not familiar with the term “cheapfakes”? Here’s some background.