IEEE Spectrum: Natural Language Processing Dates Back to Kabbalist Mystics. “While specific technologies have changed over time, the basic idea of treating language as a material that can be artificially manipulated by rule-based systems has been pursued by many people in many cultures and for many different reasons. These historical experiments reveal the promise and perils of attempting to simulate human language in non-human ways—and they hold lessons for today’s practitioners of cutting-edge NLP techniques. The story begins in medieval Spain.”
Search Engine Land: Why you may not have noticed the Google BERT update. “Google introduced the BERT update to its Search ranking system last week. The addition of this new algorithm, designed to better understand what’s important in natural language queries, is a significant change. Google said it impacts 1 in 10 queries. Yet, many SEOs and many of the tracking tools did not notice massive changes in the Google search results while this algorithm rolled out in Search over the last week. The question is, Why?” Mr. Schwartz with a great explainer on the BERT update.
CNET: Google search engine will better understand natural speech, not just keywords. “Google’s search engine will now better understand your confusing search queries, the company said Friday. Google said it’s updating the tool to improve analysis of natural language. The idea is to let people type in queries that reflect how they speak in real life, instead of entering a string of keywords they think the software is more likely to understand.” I’m a little nonplussed by this; natural language searching has been a thing for a long time. Remember Ask Jeeves? Remember Electric Monk?
VentureBeat: ProBeat: Wolfram’s natural language understanding looks incredibly useful. “Wolfram Research yesterday launched Wolfram Alpha Notebook Edition for Windows, Mac, and Linux. The news largely flew under the radar, which is frankly a shame. The new tool combines Wolfram Alpha and Mathematica to give students (and teachers) a new way to build through whole computations. But it’s the natural language understanding (NLU) examples that really caught my eye.”
New York Times: A Breakthrough for A.I. Technology: Passing an 8th-Grade Science Test. “The world’s top research labs are rapidly improving a machine’s ability to understand and respond to natural language. Machines are getting better at analyzing documents, finding information, answering questions and even generating language of their own.”
EurekAlert: One class in all languages. “Now anyone from around the world can listen live to a Nobel Prize Laureate lecture or earn credits from the most reputable universities with nothing more than internet access. However, the possible information to be gained from watching and listening online is lost if the audience cannot understand the language of the lecturer. To solve this problem, scientists at the Nara Institute of Science and Technology (NAIST), Japan, presented a solution with new machine learning at the 240th meeting of the Special Interest Group of Natural Language Processing, Information Processing Society of Japan (IPSJ SIG-NL).”
Ars Technica: Microsoft open sources algorithm that gives Bing some of its smarts. “Microsoft has released today the SPTAG [Space Partition Tree and Graph] algorithm as MIT-licensed open source on GitHub. This code is proven and production-grade, used to answer questions in Bing. Developers can use this algorithm to search their own sets of vectors and do so quickly: a single machine can handle 250 million vectors and answer 1,000 queries per second. There are some samples and explanations in Microsoft’s AI Lab, and Azure will have a service using the same algorithms.”
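For readers curious what "searching sets of vectors" means in practice, here is a minimal brute-force sketch of nearest-neighbor search by cosine similarity, in plain Python. This is an illustration of the underlying problem only, not SPTAG's actual API; SPTAG's contribution is replacing this linear scan with space-partitioning trees and graphs so a single machine can serve hundreds of millions of vectors.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def nearest_vectors(query, vectors, k=3):
    """Return indices of the k vectors most similar to the query.

    Brute-force scan: O(n) per query. Approximate-nearest-neighbor
    libraries such as SPTAG avoid this cost by partitioning the
    vector space ahead of time.
    """
    ranked = sorted(
        range(len(vectors)),
        key=lambda i: cosine_similarity(query, vectors[i]),
        reverse=True,
    )
    return ranked[:k]

# Toy example: three 3-dimensional vectors.
vectors = [
    [1.0, 0.0, 0.0],
    [0.9, 0.1, 0.0],
    [0.0, 1.0, 0.0],
]
query = [1.0, 0.05, 0.0]
print(nearest_vectors(query, vectors, k=2))  # -> [0, 1]
```

In a real search engine the vectors would be learned embeddings of documents or queries, and the index would be built once offline so each lookup stays fast.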