NiemanLab: I create “convincing” manipulated images and videos — but quality may not matter much. “I’m proud of the work we’ve done, and hope it will help people keep track of the truth in a media-flooded world. But we’ve found that a key element of the battle between truth and propaganda has nothing to do with technology. It has to do with how people are much more likely to accept something if it confirms their beliefs.”
Meduza: Cops and corgis: New Russian social media bot puts riot police where you least expect them. “FreeOmon is a bot account based in the social network Telegram. Its sole function is to add Russian riot police to preexisting images. Any user who sends the bot an image file immediately receives a new-and-improved, highly secured version in return.”
Boing Boing: This AI turns your headshot into a portrait painted by a master. “AI Portraits does an amazing job of creating original portraits based on photos of faces.” I tried this with several images. Sometimes it was good and sometimes it was… not kind.
Ars Technica: Behind the 12-year-old Wii Sports hoax that briefly fooled the Internet. “Before his resignation in late 2017, Uber’s then-CEO Travis Kalanick faced more than his fair share of scandals. But by far the most (read: least) important of these was Kalanick’s oft-repeated claim that, at one point, he ‘held the world’s second-highest score for the Nintendo Wii Tennis video game,’ as a New York Times profile confidently stated without qualification.”
The Next Web: The world isn’t ready for deepfakes. Here’s what we need to do. “But here’s the thing: deepfakes are getting so impossibly convincing, even the best discerners aided with the right technology are having trouble telling the difference between what’s faked and what’s real. This isn’t a parlor trick. In the right hands, deepfakes have the potential to destabilize entire societies—and we’re nowhere near ready to deal with the threat.”
Nieman Journalism Lab: Can you spot a fake photo online? Your level of experience online matters a lot more than contextual clues. “My collaborators and I recently studied how people evaluate the credibility of images that accompany online stories and what elements figure into that evaluation. We found that you’re far less likely to fall for fake images if you’re more experienced with the internet, digital photography, and online media platforms — if you have what scholars call ‘digital media literacy.'”
Motherboard: This Horrifying App Undresses a Photo of Any Woman With a Single Click. “The software, called DeepNude, uses a photo of a clothed person and creates a new, naked image of that same person. It swaps clothes for naked breasts and a vulva, and only works on images of women. When Motherboard tried using an image of a man, it replaced his pants with a vulva. While DeepNude works with varying levels of success on images of fully clothed women, it appears to work best on images where the person is already showing a lot of skin. We tested the app on dozens of photos and got the most convincing results on high resolution images from Sports Illustrated Swimsuit issues.” The app has since been taken down.