Translate at Hyper Speed with Facebook

I wish the language barrier did not exist, but since it never will, the next best thing would be a quick, accurate translation device. We live in a time when such a device is no longer science fiction, but perhaps only years away. Companies are experimenting with new algorithms, AI, and other processes to improve translation speed and accuracy. Bitext relies on breakthrough Deep Linguistic Analysis, while Digital Trends reports that Facebook is focused on speed in “Facebook Is Using AI to Make Language Translation Nine Times Faster.”

Facebook reports that its artificial intelligence translates foreign languages nine times faster than traditional translation software. What is even more remarkable is that the code is open source, so anyone can download and use it. Facebook translates with convolutional neural networks (CNNs). CNNs are not new, but this is reportedly the first time they have beaten the recurrent networks usually used for translation. How does it work?

The report highlights the use of convolutional neural networks (CNN) as opposed to recurrent neural networks (RNN), which translate sentences one word at a time in a linear order. The new architecture, however, can take words further down in the sentence into consideration during the translation process, which helps make the translation far more accurate. This actually marks the first time a CNN has managed to outperform an RNN in language translation, and Facebook now hopes to expand it to cover more languages.

I think the open source aspect is the most important part. Language translation software relies on a lot of data in order to develop decent comprehension skills, and open source users tend to share and share alike, so we can count on the community to feed huge piles of data into the code.
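To make the speed claim concrete, here is a minimal PyTorch sketch, an illustration rather than Facebook’s actual open source code, of why the architecture matters: a convolutional encoder sees every position of a sentence in one parallel pass, while a recurrent encoder must step through the tokens one at a time. The vocabulary size, dimensions, and toy sentence are arbitrary values chosen for the example.

```python
# Minimal PyTorch sketch (not Facebook's released code) contrasting how a
# convolutional encoder and a recurrent encoder consume a sentence.
import torch
import torch.nn as nn

vocab_size, embed_dim, seq_len = 1000, 256, 20
tokens = torch.randint(0, vocab_size, (1, seq_len))      # one toy "sentence"
embed = nn.Embedding(vocab_size, embed_dim)
x = embed(tokens)                                         # (1, seq_len, embed_dim)

# CNN encoder: one convolution covers every position at once, so all time
# steps are computed in parallel -- the source of the reported speedup.
conv = nn.Conv1d(embed_dim, embed_dim, kernel_size=3, padding=1)
cnn_states = conv(x.transpose(1, 2)).transpose(1, 2)      # (1, seq_len, embed_dim)

# RNN encoder: each step depends on the previous hidden state, so the
# tokens must be processed strictly one after another.
rnn = nn.GRU(embed_dim, embed_dim, batch_first=True)
rnn_states, _ = rnn(x)                                    # (1, seq_len, embed_dim)

print(cnn_states.shape, rnn_states.shape)
```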

Whitney Grace, May 30, 2017

 

Amazon Phone Thinly Disguised as Alexa Calling

Amazon once tried to release its own phone, but the device proved a failure. Now the company is edging back into the space with an Alexa feature called “Alexa Calling,” which lets users make free voice calls and send messages through Echo devices. It comes as no surprise, however, that there is a design flaw, this one related to privacy. The Tech Portal explains the issue in “Privacy Flaw: Why You Should Think Twice Before Enabling Alexa Calling?”

It seems like enabling Alexa Calling would be a good idea. You could call or text anyone for free without even picking up a phone. The problem is that once the feature is enabled, anyone with access to the speaker can access your contact list, and they can also listen to calls and messages. Even worse, Alexa Calling does not allow users to curate their contact list; instead, every contact is added automatically.

The absolute worst problem is this:

What’s even more creepy (at least I think it is) is the fact that if you have blocked someone’s number on your phone, it won’t matter when s/he calls on the Echo device. YES, they can call and you can do absolutely nothing as of now to prevent this from happening. This is because Alexa just uses your number and not your phone and the settings for voice calls. Hence, you cannot block the incoming calls at all.

When Amazon was asked about blocking unwanted callers on Alexa Calling, its reply was that there is currently no way to do it. Good job, Amazon, another failure in communication.

Whitney Grace, May 25, 2017

Bots Speak Better Than Humans

While chatbots’ language and comprehension skills remain less than ideal, AI and algorithms are making them sound more human every day. Quartz proposes that chatbots are becoming more human in their actions than their creators in the article “Bots Are Starting To Sound More Human Than Most Humans.” The article makes a thoughtful argument that while humans enjoy thinking their actions are original, in reality humans are predictable and their actions can be “botified.”

Computers and bots are becoming more human, but at the same time human communication is devolving. Why is this happening?

It might be because we have a limited amount of time and an unlimited amount of online relationships. In an effort to solve this imbalance, we are desperately trying to be more efficient with the time we put into each connection. When we don’t have the time to provide the necessary thought and emotional depth that are hallmarks of human communication, we adopt the tools and linguistic simplicity of bots. But when our communication is focused on methods of scaling relationships instead of true connection, the process removes the thought, emotion, and volition that makes that communication human.

The article uses examples from LinkedIn, Facebook, and email to show how many human interactions have become automated. Limited time is why we have come to rely on automated communication; the hope is to free up time for more valuable interactions. Computers still have not passed the Turing Test, but it is likely only a matter of time before one does. Companies like Bitext, with its breakthrough computational linguistics technology, are narrowing the gap. The article ends on a platitude: we need to turn off the bot aspects of our personality and return to genuine communication. Yes, this is true, but it also seems like a Luddite response.

The better assertion is that humans need to remember their uniqueness and value true communication.

Whitney Grace, May 25, 2017

Alexa Denies She Is a Spy

According to Amazon, it was just a glitch, but Alexa’s response to one user’s queries raised suspicions. In “Alexa, Are You Connected to the CIA?”, New York Magazine reports one Alexa user posed that very question to the virtual assistant. Writer Madison Malone Kircher reveals:

First, the person asks if Alexa would lie: Alexa says she always tries to tell the truth …. Next, the person asks what the CIA is: Alexa gives a boilerplate definition. And finally, they ask if Alexa is connected to the CIA. Alexa’s response: crickets.

There was apparently a video, but it has since been removed. Don’t bother trying to reproduce the experiment; an update to the story explains Alexa now responds with, “No, I work for Amazon.” Good answer.

Cynthia Murrell, May 24, 2017

Users Discuss Health with Alexa

For better or worse, many people now turn to WebMD for health information. We learn from Forbes’ article, “Amazon Alexa Can Now Be Your Doctor” that Amazon has worked with that site to develop a skill enabling Alexa to answer basic health questions. For those who want to get it in writing, contributor Lee Bell mentions:

In addition to providing answers via voice, the new WebMD integration gives users the chance to request additional information sent in text form to their Alexa app.

Though her voice may tempt us to think otherwise, Alexa’s involvement does nothing to combat the problem of misdiagnosis-via-internet. Still, for those determined to research their symptoms before calling the doctor, this skill could save some time.
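As a rough illustration of what “sent in text form to their Alexa app” amounts to, here is a minimal Python sketch of the kind of JSON response an Alexa skill returns, pairing a spoken answer with a simple text card. The health copy and card title are placeholders, not WebMD’s actual content or endpoint.

```python
# Minimal sketch of an Alexa skill response that pairs a spoken answer with
# a text card shown in the Alexa app. The health text is a placeholder.
import json

def build_response(spoken_answer, card_title, card_text):
    """Return the JSON structure the Alexa service expects from a skill."""
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": spoken_answer},
            "card": {"type": "Simple", "title": card_title, "content": card_text},
            "shouldEndSession": True,
        },
    }

if __name__ == "__main__":
    reply = build_response(
        spoken_answer="Common cold symptoms include a runny nose and sore throat.",
        card_title="WebMD: Common Cold",
        card_text="Typical symptoms: runny nose, sore throat, cough, congestion.",
    )
    print(json.dumps(reply, indent=2))
```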

Cynthia Murrell, May 23, 2017

Microsoft Does Its Share to Make Chatbots Smarter

Chatbots are like a popular new toy that everyone must have, but once they are played with, the glamor wears off and you realize they are not that great. For lack of a better term, chatbots are dumb. They have minimal comprehension and can only respond with canned phrases. Chatbots are getting better, though, because companies are investing in linguistic resources and sentiment analysis. InfoQ tells us about Microsoft’s contribution to chatbots’ knowledge in “Microsoft Releases Dialogue Dataset To Make Chatbots Smarter.”

Maluuba, a Microsoft-owned company, released a new chatbot dialogue dataset about booking vacations in hopes of making chatbots more intelligent. Maluuba built it by having two humans communicate via a chat window; no vocal dialogue was exchanged. One human tried to find the best price for a flight, while the other, playing the chatbot, used a database to find the information. Travel-related chatbots are some of the dumber of the species, because travel planning requires a lot of details, context comprehension, and digesting multiple information sources.

What makes travel planning more difficult is that users often change the topic of their conversation. Simultaneously you might discuss your plan to go to Waterloo, Montreal, and Toronto. We humans have no trouble with keeping apart different plans people make while talking. Unfortunately, if users explore multiple options before booking, computers tend to run into problems. Most chatbots forget everything you talked about when you suddenly enter a new destination.

Maluuba’s dataset is like enrolling a chatbot in a travel agent course, and it will benefit anyone interested in a travel planning chatbot. Maluuba is one of many companies, not all of them Microsoft-owned, sharing their expertise and building specialized chatbot datasets. One such company is Bitext, though its expertise lies in the number of languages it can teach a chatbot.
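The forgetfulness described in the quote above is essentially a state-tracking problem. As a toy sketch, assuming nothing about Maluuba’s or Microsoft’s actual code, a travel bot can keep a separate “frame” of slots for each destination and switch between them instead of overwriting a single global state:

```python
# Toy sketch (not Maluuba's code) of frame tracking: keep one slot set per
# travel plan, so switching from Montreal to Toronto does not erase the
# Montreal plan the user started earlier.
from dataclasses import dataclass, field

@dataclass
class Frame:
    destination: str
    slots: dict = field(default_factory=dict)   # e.g. {"budget": 800, "nights": 3}

class DialogueState:
    def __init__(self):
        self.frames = {}          # destination -> Frame
        self.active = None        # destination currently under discussion

    def mention_destination(self, destination):
        """Switch focus to a destination, creating a frame if it is new."""
        self.frames.setdefault(destination, Frame(destination))
        self.active = destination

    def fill_slot(self, name, value):
        """Attach a detail (price, dates, ...) to the plan under discussion."""
        self.frames[self.active].slots[name] = value

state = DialogueState()
state.mention_destination("Montreal")
state.fill_slot("budget", 800)
state.mention_destination("Toronto")     # new frame; Montreal's budget survives
state.fill_slot("nights", 3)
state.mention_destination("Montreal")    # switch back; earlier details intact
print(state.frames["Montreal"].slots)    # {'budget': 800}
```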

Whitney Grace, May 23, 2017

Wake up and Smell the Social Equality, Silicon Valley

The article on Digital Trends titled “Alexa, Why Aren’t You a Dude? How Female Digital Assistants Reinforce Stereotypes” investigates the overwhelming tendency toward female personas for digital assistants, from Alexa to Siri to Cortana to OK Google. Mansplainers beware: the author notes that Siri allows for different genders and accents, but the more important point is the default equation of women with subservience. The article explains,

Apple and Google have both stated a desire to make their digital assistants more sophisticated, giving users a sense of a relationship rather than a device. It’s a potentially troublesome phenomenon as the makers of anthropomorphic assistants accent non-threatening and subservient qualities to achieve social acceptance. Scarier still is the idea that digital assistants are not only reflecting gender bias, but causing it. Kids are already anthropomorphizing their robot friends, and also bossing them around…

The article is chock full of quotes from smart people calling for an end to defaulting to female voices, or for improving the design with social equality in mind. But who is really designing these digital assistants? We know that women are vastly underrepresented in Silicon Valley, and it is an unfortunate reality that the people driving these huge cultural influences might have no concept of their own bias. They have created a dream woman for men and a waking nightmare for women.

Chelsea Kerwin, May 22, 2017

How Voice Assistants Are Shaping Childhood

The Straits Times examines how AI assistants may affect child development and family interactions in “When Alexa the Voice Assistant Becomes One of the Kids.” As these devices make their way into homes, children are turning to them (instead of parents) for things like homework help or satisfying curiosity. Some experts warn the “relationships” kids experience with AIs could have unintended consequences. Writer Michael S. Rosenwald cites the University of Maryland’s Allison Druin when he notes:

This emotional connection sets up expectations for children that devices cannot or were not designed to meet, causing confusion, frustration and even changes in the way kids talk or interact with adults.

The effects could go way beyond teaching kids they need not use “please” and “thank you.” How will developers address this growing concern?

Cynthia Murrell, May 19, 2017

Fun with Alexa

Developers are having fun casting Alexa’s voice onto other talking objects. On the heels of the Alexa Billy Bass, there is now a skull version aptly named “Yorick,” we learn from CNet’s write-up, “Fear Alexa With This Macabre Talking Skull Voice Assistant.” Reporter Amanda Kooser explains that the project uses:

… A three-axis talking skull robot (with moving eyes), powered speakers, Raspberry Pi, and AlexaPi software that turns the Raspberry Pi into an Alexa client.

Kooser finds it unsettling to hear a weather forecast in Alexa’s voice emerge from a robotic skull, but you can see the effect for yourself in the article’s embedded video. Developer ViennaMike has posted instructions for replicating his Yorick Project. I wonder what robotic knickknack Alexa’s voice will emanate from next?

Cynthia Murrell, May 18, 2017

 

Amazon Controls the Voice Controlled

Voice-controlled speakers that can answer questions, schedule appointments, play music, order products, and handle many other tasks are a luxury product. Google Home, Lenovo, LG, Harman Kardon, and Mattel have their own fans, but Amazon remains the top seller in the market with its Echo Dot products loaded with Alexa. TechCrunch explained how Amazon dominates the market in the article “Amazon To Control 70 Percent Of The Voice-Controlled Speaker Market This Year.”

Amazon controls an astonishing 70.6 percent of the voice-controlled speaker market, and current trends show that consumers prefer to stick with one type of speaker instead of buying a competitor’s. Compatibility issues apparently weigh heavily on their minds. Google Home is predicted to grow from its 23.8 percent share as it reaches more people, but for now, Amazon will remain in control. Amazon’s marketing plan will certainly be hard to beat:

Amazon’s strategy with Alexa is to allow the assistant to extend beyond just voice-controlled speakers it manufactures. The company has also included the assistant in its Amazon mobile shopping app, and has made it available to third parties for use in their own hardware and software applications.

Google, however, is known for making decent products, like phones and laptops, that cost less than those from most bigger-name companies. One other thing to consider is the quality of Alexa’s conversation skills. Bitext, one of the best-kept secrets in sentiment analytics, has many major clients, including a popular search engine. Bitext’s clients deploy its technology to improve a chatbot’s skills.

Whitney Grace, May 18, 2017