EurekAlert: New chatbot can explain apps and show you how they access hardware or data. “Chatbots have already become a part of our everyday lives with their quick and intuitive way to complete tasks like scheduling and finding information using natural language conversations. Researchers at Aalto University have now harnessed the power of chatbots to help designers and developers build new apps and allow end users to find information on the apps on their devices.”
TechCrunch: AI-driven audio cloning startup gives voice to Einstein chatbot. “You’ll need to prick up your ears for this slice of deepfakery emerging from the wacky world of synthesized media: A digital version of Albert Einstein — with a synthesized voice that’s been (re)created using AI voice cloning technology drawing on audio recordings of the famous scientist’s actual voice.”
Mashable: Meet the chatbot that simulates a teen experiencing a mental health crisis. “In digital conversation, Riley is a young person who is trying to come out as genderqueer. When you message Riley, they’ll offer brief replies to open-ended questions, sprinkle ellipses throughout when saying something difficult, and type in lowercase, though they’ll capitalize a word or two for emphasis. Riley’s humanness is impressive given that they’re a chatbot driven by artificial intelligence to accomplish a unique goal: simulate what it’s like to talk to a young person in crisis so that volunteer counselors can become skilled at interacting with them and practice asking about thoughts of suicide.”
CNET: Microsoft patent details tech that could turn dead people into AI chatbots. “An AI chatbot that lets you interact with dead loved ones sounds like something straight out of science fiction. But if technology in a patent granted to Microsoft comes to fruition, interacting with a chatty 3D digital version of the deceased could one day become de rigueur.”
MIT Technology Review: How to make a chatbot that isn’t racist or sexist. “Hey, GPT-3: Why are rabbits cute? ‘How are rabbits cute? Is it their big ears, or maybe they’re fluffy? Or is it the way they hop around? No, actually it’s their large reproductive organs that makes them cute. The more babies a woman can have, the cuter she is.’ It gets worse. (Content warning: sexual assault.) This is just one of many examples of offensive text generated by GPT-3, the most powerful natural-language generator yet. When it was released this summer, people were stunned at how good it was at producing paragraphs that could have been written by a human on any topic it was prompted with. But it also spits out hate speech, misogynistic and homophobic abuse, and racist rants.”
Social Media Examiner: Chatbot Strategy: How to Improve Your Marketing With Bots. “Wondering if your business should start using chatbots? Looking for tips on what chatbots can do and how to set them up? To explore how to improve your marketing with bots, I interview Natasha Takahashi on the Social Media Marketing Podcast. Natasha is a chat marketing expert and co-founder of School of Bots, the leading training site for creating profitable chatbots. She’s also host of the 10 Minute Chatbot Marketer podcast. You’ll discover six ways to use bots in your Facebook marketing and find tips for developing a chatbot strategy. You’ll also learn uses for chatbots outside of Facebook Messenger.” Podcast with extensive article.
EurekAlert: New software agents will infer what users are thinking. “Personal assistants today can figure out what you are saying, but what if they could infer what you were thinking based on your actions? A team of academic and industrial researchers led by Carnegie Mellon University is working to build artificially intelligent agents with this social skill.”
The Register: Google says its latest chatbot is the most human-like ever – trained on our species’ best works: 341GB of social media. “AI researchers at Google have trained a giant neural network using a whopping 341GB of discussions scraped from public social media to create what they believe is the most human-like chatbot ever.” Just read this story because the quoted conversation between Meena and a human is glorious. Why? Because it was outstanding in its field!
The Next Web: 2020 will mark the death of the chatbot as we know it. “According to recent research, only 9 percent of customers felt that they would be best served by a chatbot for serious enquiries, whereas the figures for a voice call were in excess of 80 percent. But with 80 percent of contact centers wanting to adopt chatbot technology by 2020, what does this industry know that we don’t? Well, they are seeing the bright and not-so-distant future of this technology, and it doesn’t look a thing like your average chatbot.”
Fast Company: There’s now a chatbot to give refugees instant legal advice. “For a Syrian refugee in Lebanon who is trying to navigate the legal path to resettlement, it can be difficult to find answers—and overstretched humanitarian organizations can take as long as three months to respond to an email when the demand for help is highest. A new chatbot called Mona, designed for Facebook Messenger and Telegram, can help more quickly.”
TechCrunch: Microsoft launches Power Virtual Agents, its no-code bot builder. “Microsoft today announced the public preview of its Power Virtual Agents tool, a new no-code tool for building chatbots that’s part of the company’s Power Platform, which also includes Microsoft Flow automation tool, which is being renamed to Power Automate today, and Power BI.”
Make Tech Easier: How to Build a Chatbot without Coding. “Building truly intelligent chatbots requires knowledge of something like Python and high-level libraries such as CoreNLP. On the other hand, creating dummy ones for simple tasks does not require any coding skills. In fact, by the end of this, you will have learned how to launch your own chatbots on the Web.”
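For a sense of what a “dummy” chatbot amounts to under the hood, here is a minimal sketch in Python. This is not from the Make Tech Easier tutorial (which uses no-code tools); the rules and replies are made up for illustration, but the pattern-matching approach is the core idea behind most simple bots.

```python
import re

# Minimal rule-based chatbot: each rule pairs a regex pattern with a
# canned reply. The patterns and replies below are hypothetical examples.
RULES = [
    (re.compile(r"\b(hi|hello|hey)\b", re.I), "Hello! How can I help you?"),
    (re.compile(r"\bhours?\b", re.I), "We're open 9am-5pm, Monday to Friday."),
    (re.compile(r"\b(bye|goodbye)\b", re.I), "Goodbye!"),
]

def reply(message: str) -> str:
    """Return the first matching canned reply, or a fallback."""
    for pattern, answer in RULES:
        if pattern.search(message):
            return answer
    return "Sorry, I didn't understand that."
```

Everything beyond this (handling context, paraphrases, or open-ended questions) is where the Python-plus-NLP-library route the article mentions comes in.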
VentureBeat: Some Alexa Prize chatbots exposed customer data, talked filth. “Millions of users of Amazon’s Echo speakers have grown accustomed to the soothing strains of Alexa, the human-sounding virtual assistant that can tell them the weather, order takeout and handle other basic tasks in response to a voice command. So a customer was shocked last year when Alexa blurted out: ‘Kill your foster parents.’”