Medical Xpress: Using lung X-rays to diagnose COVID-19. “Researchers from the Department of Computer Architecture and Technology at the University of Seville’s School of Computer Engineering (ETSII) are working on a system that uses X-ray images of patients’ lungs to help diagnose COVID-19. This system uses deep learning to train a neural network model that can distinguish between healthy patients, pneumonia patients and COVID-19 patients. This has been achieved using a freely accessible online database that medical professionals from around the world have been feeding with lung X-rays since the onset of the pandemic.”
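
The article describes the classifier only at a high level, but the basic shape of such a system is easy to picture: a convolutional network that maps a chest X-ray to one of three labels. Here's a minimal PyTorch sketch of that idea; the architecture, image size, and toy training step are my own illustrative assumptions, not the Seville group's actual model.

```python
# Minimal sketch of a three-way chest X-ray classifier in PyTorch (illustrative;
# not the Seville group's actual model or training pipeline).
import torch
import torch.nn as nn
import torch.nn.functional as F

class XRayClassifier(nn.Module):
    def __init__(self, num_classes=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

model = XRayClassifier()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# One illustrative training step on random stand-in data (in practice: batches
# of labelled X-rays from the shared database the article mentions).
images = torch.randn(8, 1, 224, 224)      # grayscale chest X-rays
labels = torch.randint(0, 3, (8,))        # 0 = healthy, 1 = pneumonia, 2 = COVID-19
loss = F.cross_entropy(model(images), labels)
loss.backward()
optimizer.step()
```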

Geekologie: Fleischer Studios ‘Superman’ Upscaled To 4k Using Neural Networks. “YouTuber Jose Argumedo took the 1941 Fleischer Studios Superman cartoon ‘The Bulleteers’ and upscaled it using Waifu2x, an image upscaler that uses deep convolutional neural networks. Waifu2x is trained on anime (as evidenced by the name) and it works remarkably well for any animation and even pixel art.”
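
Waifu2x's own code isn't shown in the post, but the general recipe of a convolutional upscaler can be sketched quickly: interpolate the frame up to the target resolution, then let a small CNN add back detail. This is a generic SRCNN-style sketch, not Waifu2x's actual architecture; the layer sizes and 2x scale factor are illustrative.

```python
# Generic SRCNN-style upscaler sketch (illustrative, not Waifu2x's architecture):
# interpolate the frame to the target size, then let a small CNN restore detail.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleUpscaler(nn.Module):
    def __init__(self, scale=2):
        super().__init__()
        self.scale = scale
        self.refine = nn.Sequential(
            nn.Conv2d(3, 64, kernel_size=9, padding=4), nn.ReLU(),
            nn.Conv2d(64, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv2d(32, 3, kernel_size=5, padding=2),
        )

    def forward(self, frame):
        big = F.interpolate(frame, scale_factor=self.scale,
                            mode="bicubic", align_corners=False)
        return big + self.refine(big)            # predict a residual correction

frame = torch.rand(1, 3, 240, 320)               # one low-resolution cartoon frame
print(SimpleUpscaler()(frame).shape)             # torch.Size([1, 3, 480, 640])
```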

EurekAlert: New learning algorithm should significantly expand the possible applications of AI. “The high energy consumption of artificial neural networks’ learning activities is one of the biggest hurdles for the broad use of Artificial Intelligence (AI), especially in mobile applications. One approach to solving this problem can be gleaned from knowledge about the human brain. Although it has the computing power of a supercomputer, it only needs 20 watts, which is only a millionth of the energy of a supercomputer. One of the reasons for this is the efficient transfer of information between neurons in the brain. Neurons send short electrical impulses (spikes) to other neurons – but, to save energy, only as often as absolutely necessary.”
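
The press release doesn't include the new learning algorithm itself, but the spiking behaviour it builds on is simple to simulate: a leaky integrate-and-fire neuron accumulates input and emits a spike only when it crosses a threshold. The sketch below is a generic NumPy illustration of that sparse-communication idea, not the researchers' method; all constants are made up.

```python
# Generic leaky integrate-and-fire neuron in NumPy: it integrates input current,
# leaks charge over time, and emits a spike only when its membrane potential
# crosses a threshold, so communication stays sparse. All constants are made up.
import numpy as np

def simulate_lif(input_current, threshold=1.0, leak=0.9, reset=0.0):
    """Return a binary spike train for a stream of input currents."""
    potential, spikes = 0.0, []
    for current in input_current:
        potential = leak * potential + current   # leaky integration
        if potential >= threshold:               # fire only when necessary
            spikes.append(1)
            potential = reset
        else:
            spikes.append(0)
    return np.array(spikes)

rng = np.random.default_rng(0)
spikes = simulate_lif(rng.uniform(0.0, 0.3, size=100))
print(f"{spikes.sum()} spikes in 100 time steps")    # most steps stay silent
```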

Rice University: Early Bird uses 10 times less energy to train deep neural networks. “Researchers from Rice and Texas A&M University unveiled Early Bird April 29 in a spotlight paper at ICLR 2020, the International Conference on Learning Representations. A study by lead authors Haoran You and Chaojian Li of Rice’s Efficient and Intelligent Computing (EIC) Lab showed Early Bird could use 10.7 times less energy to train a DNN to the same level of accuracy or better than typical training. EIC Lab director Yingyan Lin led the research along with Rice’s Richard Baraniuk and Texas A&M’s Zhangyang Wang.”
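
The paper's exact procedure isn't reproduced in the article, but the gist of the "early-bird ticket" idea, as publicly described, is to derive a pruning mask repeatedly during the first epochs and stop full-size training once that mask stops changing. Here's a toy sketch of that mask-stability check; the pruning criterion, thresholds, and fake batch-norm values are placeholders, not the authors' settings.

```python
# Sketch of the "early-bird ticket" check: build a channel-pruning mask each
# epoch (here: keep the half of the channels with the largest batch-norm scale
# factors) and stop full-size training once the mask stops changing.
import torch

def channel_mask(bn_gammas, keep_ratio=0.5):
    """Binary mask keeping the top fraction of channels by |gamma|."""
    k = int(len(bn_gammas) * keep_ratio)
    threshold = bn_gammas.abs().sort(descending=True).values[k - 1]
    return bn_gammas.abs() >= threshold

def mask_distance(mask_a, mask_b):
    """Fraction of channels on which two pruning masks disagree."""
    return (mask_a != mask_b).float().mean().item()

# Fake batch-norm scale factors that settle down as training progresses.
torch.manual_seed(0)
previous = None
for epoch in range(10):
    gammas = torch.linspace(-1, 1, 256) + torch.randn(256) * 0.3 ** epoch
    current = channel_mask(gammas)
    if previous is not None and mask_distance(previous, current) < 0.02:
        print(f"early-bird ticket found at epoch {epoch}: prune now, train the subnetwork")
        break
    previous = current
```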

Towards Data Science: Shakespeare Meets Google’s Flax. “Google researchers introduced Flax, a new rising star in machine learning, a few months ago. A lot has happened since then and the pre-release has improved tremendously. My own experiments with CNNs on Flax are bearing fruit, and I am still amazed by its flexibility compared to TensorFlow. Today I will show you an application of RNNs in Flax: a character-level language model.”
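
The post's own Flax code isn't reproduced here; as a frame of reference, the same character-level language-modelling setup looks like this in PyTorch (a deliberate swap, since Flax's pre-release API keeps changing; sizes and the single toy training step are illustrative).

```python
# Character-level language model sketch in PyTorch (not the post's Flax code;
# layer sizes and the single toy training step are illustrative).
import torch
import torch.nn as nn
import torch.nn.functional as F

text = "To be, or not to be, that is the question."
vocab = sorted(set(text))
char_to_idx = {ch: i for i, ch in enumerate(vocab)}

class CharLM(nn.Module):
    def __init__(self, vocab_size, hidden_size=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.rnn = nn.LSTM(hidden_size, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, vocab_size)

    def forward(self, tokens):
        hidden, _ = self.rnn(self.embed(tokens))
        return self.head(hidden)                      # next-character logits

tokens = torch.tensor([[char_to_idx[ch] for ch in text]])
inputs, targets = tokens[:, :-1], tokens[:, 1:]       # predict each next character

model = CharLM(len(vocab))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss = F.cross_entropy(model(inputs).transpose(1, 2), targets)
loss.backward()
optimizer.step()
print(float(loss))
```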

The Serious Computer Vision Blog: Training a Rap Machine. “In my previous post, I gave a short tutorial on how to use the Google AI platform for small garage projects. In this post, I am going to follow up and talk about how I built (or, more accurately, attempted to build) my holiday project, a machine that completes your rap lyrics using the ‘Transformer’ neural network.” I played with it a little using lyrics from G YAMAZAWA’s “North Cack”. It was… pretty good?
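
The author's model was trained from scratch on Google's AI platform, which isn't reproduced here. As a rough stand-in for what "completing your rap lyrics" means in code, a pretrained GPT-2 from the Hugging Face transformers library can continue a prompt in a few lines; the model choice and prompt are illustrative, not the post's setup.

```python
# Rough stand-in for a lyric-completion machine: a pretrained GPT-2 from the
# Hugging Face transformers library continuing a prompt. This is not the post's
# own Transformer (which the author trained on Google's AI platform).
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
prompt = "Started rhyming in my hometown, now"      # illustrative opening line
for candidate in generator(prompt, max_length=40, num_return_sequences=3, do_sample=True):
    print(candidate["generated_text"])
```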

Neowin: Neural networks are now being used to track exotic particles at CERN. “Research within the domain of physics has profited from the rise of artificial neural networks and deep learning. In the past, we’ve seen them being applied to study dark matter and massive galaxies. Continuing this pattern, we now have artificial neural networks being used in the study of exotic particles.”

The Next Web: Reuters built a prototype for automated news videos using Deepfakes tech. “The Reuters news company and an AI startup named Synthesia today unveiled a new project they’ve partnered on that uses Deepfakes-style technology to generate automated news reports in real time.”

Boing Boing: Neural network restores and colorizes old movies. “From the excellent ‘Two Minute Papers’ YouTube channel, a discussion of a paper titled ‘DeepRemaster: Temporal Source-Reference Attention Networks for Comprehensive Video Enhancement,’ which demonstrates the results of a neural network that fixes and colorizes aged, blurry, scratchy films.” My husband and I watched this last night. I’m kind of a snob about AI-based colorizing, so that part was eh, but the restoration of old/degraded video was absolutely remarkable.

The Register: Facebook mulls tagging pics with ‘radioactive’ markers to trace the origin of photos used to build image-recog AI. “Facebook researchers have developed a digital watermarking technique that allows developers to tell if a particular machine-learning model was trained using marked images.”
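
The Register piece stays high-level, but the publicly described idea is that marked images carry an imperceptible perturbation aligned with a secret random direction, and a model trained on them ends up with classifier weights unusually aligned with that direction. Here's a toy NumPy sketch of that detection test; the dimensions and the strength of the injected component are made up, not Facebook's actual procedure.

```python
# Toy sketch of the detection side of "radioactive" marking: marked images nudge
# a model's features toward a secret direction u, so a classifier trained on them
# ends up with weights unusually aligned with u. All values here are made up.
import numpy as np

rng = np.random.default_rng(0)
dim = 512
u = rng.normal(size=dim)
u /= np.linalg.norm(u)                        # secret marking direction

def alignment(classifier_weights, direction):
    """Cosine similarity between a class's weight vector and the mark."""
    w = classifier_weights / np.linalg.norm(classifier_weights)
    return float(w @ direction)

clean_weights = rng.normal(size=dim)               # model trained on unmarked data
marked_weights = rng.normal(size=dim) + 8.0 * u    # model trained on marked data

print(alignment(clean_weights, u))   # close to 0: no evidence of marked images
print(alignment(marked_weights, u))  # clearly positive: marked images were used
```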

TechCrunch: Facebook speeds up AI training by culling the weak. “Training an artificial intelligence agent to do something like navigate a complex 3D world is computationally expensive and time-consuming. In order to better create these potentially useful systems, Facebook engineers derived huge efficiency benefits from, essentially, leaving the slowest of the pack behind.”
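
TechCrunch doesn't spell out the mechanics, but "leaving the slowest of the pack behind" amounts to ending each experience-collection round once most workers have reported in rather than waiting for stragglers. The toy simulation below shows why that helps; the worker timings and the 60% cutoff are invented for illustration, not Facebook's actual system.

```python
# Toy illustration of leaving stragglers behind: end an experience-collection
# round once most workers have reported in instead of waiting for the slowest.
# Worker timings and the 60% cutoff are invented for illustration.
import random

def collection_round(num_workers=16, done_fraction=0.6):
    # Simulated per-worker rollout times (seconds); a few workers are very slow.
    times = sorted(random.uniform(1.0, 3.0) + (random.random() < 0.2) * 10.0
                   for _ in range(num_workers))
    wait_for_all = times[-1]                              # synchronous round
    preempt_stragglers = times[int(num_workers * done_fraction)]
    return wait_for_all, preempt_stragglers

random.seed(0)
sync_cost, preempt_cost = collection_round()
print(f"wait for all: {sync_cost:.1f}s, preempt stragglers: {preempt_cost:.1f}s")
```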

Ars Technica: How Google researchers used neural networks to make weather forecasts. “The researchers say their results are a dramatic improvement over previous techniques in two key ways. One is speed. Google says that leading weather forecasting models today take one to three hours to run, making them useless if you want a weather forecast an hour in the future. By contrast, Google says its system can produce results in less than 10 minutes—including the time to collect data from sensors around the United States.”

News@Northeastern: He’s Training Computers To Find New Molecules With The Machine Learning Algorithms Used By Facebook And Google. “For more than a decade, Facebook and Google algorithms have been learning as much as they can about you. It’s how they refine their systems to deliver the news you read, those puppy videos you love, and the political ads you engage with. These same kinds of algorithms can be used to find billions of molecules and catalyze important chemical reactions that are currently induced with expensive and toxic metals, says Steven A. Lopez, an assistant professor of chemistry and chemical biology at Northeastern.”