Synced: AraNet: New Deep Learning Toolkit for Arabic Social Media

Synced: AraNet: New Deep Learning Toolkit for Arabic Social Media. “The performance of natural language processing (NLP) systems has dramatically improved on tasks such as reading comprehension and natural language inference, and with these advances have come many new application scenarios for the tech. Unsurprisingly, English is where most NLP R&D has been focused. Now, a team of researchers from the Natural Language Processing Lab at the University of British Columbia in Canada have proposed AraNet, a deep learning toolkit designed for Arabic social media processing.”

Phys.org: Deep learning enables real-time imaging around corners

Phys.org: Deep learning enables real-time imaging around corners. “Researchers have harnessed the power of a type of artificial intelligence known as deep learning to create a new laser-based system that can image around corners in real time. With further development, the system might let self-driving cars ‘look’ around parked cars or busy intersections to see hazards or pedestrians. It could also be installed on satellites and spacecraft for tasks such as capturing images inside a cave on an asteroid.”

Morning Brew: Finland Expands AI Basics Course to EU

Morning Brew: Finland Expands AI Basics Course to EU. “Finland will relinquish the rotating presidency of the Council of the EU at the end of the year. Its outgoing gift = expanding Elements of AI to 1% of the EU population by 2021. Starting next year, the course will be available in all 24 official EU languages. But since there are no restrictions on who can take the course, this is basically a Christmas present to anyone who speaks one of those languages. Since it launched, over 220,000 people from 110 countries have signed up to take the class (it was available online in English).” I signed up, said I lived in the United States, no problem.

Ars Technica: Deep learning breakthrough made by Rice University scientists

Ars Technica: Deep learning breakthrough made by Rice University scientists. “In an earlier deep learning article, we talked about how inference workloads—the use of already-trained neural networks to analyze data—can run on fairly cheap hardware, but running the training workload that the neural network ‘learns’ on is orders of magnitude more expensive. In particular, the more potential inputs you have to an algorithm, the more out of control your scaling problem gets when analyzing its problem space. This is where MACH, a research project authored by Rice University’s Tharun Medini and Anshumali Shrivastava, comes in.”
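The core MACH idea (merged-average classification via hashing) is to tame that scaling problem by hashing an enormous label space down to a small number of buckets several times independently, training one small classifier per repetition, and merging the averaged bucket scores at inference. Here is a minimal sketch of that idea; the scikit-learn logistic-regression classifier, the toy random data, and all of the sizes are illustrative assumptions, not the authors’ actual implementation.

```python
# Minimal sketch of the MACH idea: replace one K-way classifier with
# R small B-way classifiers over hashed labels (B << K). Illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression

K = 1000   # original number of classes (huge in real extreme classification)
B = 32     # buckets per meta-classifier
R = 16     # independent hash repetitions

rng = np.random.default_rng(0)
# R independent hash tables mapping each of the K classes to one of B buckets.
hashes = rng.integers(0, B, size=(R, K))

# Toy training data (random, just to make the sketch runnable).
X = rng.normal(size=(5000, 20))
y = rng.integers(0, K, size=5000)

# Train R small B-way classifiers on the hashed (collapsed) labels.
models = []
for r in range(R):
    clf = LogisticRegression(max_iter=200)
    clf.fit(X, hashes[r, y])   # labels collapsed from K classes to B buckets
    models.append(clf)

def predict(x):
    """Score each original class by the average probability of its bucket
    across the R meta-classifiers, then take the argmax."""
    scores = np.zeros(K)
    for r, clf in enumerate(models):
        proba = np.zeros(B)
        proba[clf.classes_] = clf.predict_proba(x.reshape(1, -1))[0]
        scores += proba[hashes[r]]   # look up each class's bucket probability
    return int(np.argmax(scores / R))

print(predict(X[0]))   # near-chance on random data; the point is the memory math
```

The memory win is that the sketch stores R small B-way models instead of one K-way model, so the parameter count scales with R·B rather than K.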

The Verge: AI R&D is booming, but general intelligence is still out of reach

The Verge: AI R&D is booming, but general intelligence is still out of reach. “Trying to get a handle on the progress of artificial intelligence is a daunting task, even for those enmeshed in the AI community. But the latest edition of the AI Index report — an annual rundown of machine learning data points now in its third year — does a good job confirming what you probably already suspected: the AI world is booming in a range of metrics covering research, education, and technical achievements.”

Ars Technica: Cloudy with a chance of neurons: The tools that make neural networks work

Ars Technica: Cloudy with a chance of neurons: The tools that make neural networks work. “Artificial Intelligence—or, if you prefer, Machine Learning—is today’s hot buzzword. Unlike many buzzwords that have come before it, though, this stuff isn’t vaporware dreams—it’s real, it’s here already, and it’s changing your life whether you realize it or not.” Deep dive with lots of resources.

Ars Technica: How neural networks work—and why they’ve become a big business

Ars Technica: How neural networks work—and why they’ve become a big business. “Computer scientists have been experimenting with neural networks since the 1950s. But two big breakthroughs—one in 1986, the other in 2012—laid the foundation for today’s vast deep learning industry. The 2012 breakthrough—the deep learning revolution—was the discovery that we can get dramatically better performance out of neural networks with not just a few layers but with many. That discovery was made possible thanks to the growing amount of both data and computing power that had become available by 2012. This feature offers a primer on neural networks. We’ll explain what neural networks are, how they work, and where they came from. And we’ll explore why—despite many decades of previous research—neural networks have only really come into their own since 2012.”
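Since the feature is a primer on the two ingredients behind the breakthroughs it names—layered networks and the 1986 backpropagation result—here is a compact NumPy sketch of both, learning XOR, the classic function a network without a hidden layer cannot represent. The layer sizes, learning rate, and task are illustrative choices, not anything from the article.

```python
# Minimal sketch: a one-hidden-layer network trained by backpropagation.
import numpy as np

rng = np.random.default_rng(0)

# XOR: solvable with a hidden layer, impossible without one.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)   # hidden layer, 8 units
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)   # output layer

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(5000):
    # Forward pass: compute activations layer by layer.
    h = sigmoid(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)

    # Backward pass: propagate the error gradient from output to input.
    dp = (p - y) / len(X)            # gradient of mean cross-entropy at the output
    dW2 = h.T @ dp; db2 = dp.sum(0)
    dh = dp @ W2.T * h * (1 - h)     # chain rule through the hidden sigmoid
    dW1 = X.T @ dh; db1 = dh.sum(0)

    # Gradient-descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(p.round(2))   # approaches [0, 1, 1, 0]
```

The 2012 revolution the article describes was, in essence, discovering that stacking many such layers—given enough data and compute—keeps paying off.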