Translate at Hyper Speed with Facebook

I wish the language barrier did not exist, but since it will never go away, I wish a quick, accurate translation device existed.  We live in a time when such a device is not science fiction, but only decades, possibly only years, away.  Companies are experimenting with new algorithms, AI, and other techniques to improve both translation speed and accuracy.  Bitext relies on breakthrough Deep Linguistic Analysis, while Digital Trends reports that Facebook is working on speed in “Facebook Is Using AI To Make Language Translation Nine Times Faster.”

Facebook reports that its artificial intelligence translates foreign languages nine times faster than traditional language software.  What is even more astonishing is that the code is open source, so anyone can download and use it.  Facebook uses convolutional neural networks (CNNs) to translate.  CNNs are not new, but this is the first time one has outperformed the usual recurrent approach in translation.  How does it work?

The report highlights the use of convolutional neural networks (CNN) as opposed to recurrent neural networks (RNN), which translate sentences one word at a time in a linear order. The new architecture, however, can take words further down in the sentence into consideration during the translation process, which helps make the translation far more accurate. This actually marks the first time a CNN has managed to outperform an RNN in language translation, and Facebook now hopes to expand it to cover more languages.
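The speed gain follows from the architecture: an RNN cannot start on word t+1 until it has finished word t, while a CNN’s fixed-width windows can all be computed at once.  A minimal sketch of the difference in plain Python (the vectors, weights, and kernel here are invented for illustration, not Facebook’s model):

```python
import math
import random

random.seed(0)
sent_len, emb_dim = 6, 4
# One embedded sentence: a list of word vectors (values invented).
x = [[random.uniform(-1, 1) for _ in range(emb_dim)] for _ in range(sent_len)]

# RNN-style encoder: step t needs the hidden state from step t-1,
# so the words must be processed strictly one after another.
w_h = [random.uniform(-1, 1) for _ in range(emb_dim)]
h, rnn_states = [0.0] * emb_dim, []
for word in x:
    h = [math.tanh(hi * wi + xi) for hi, wi, xi in zip(h, w_h, word)]
    rnn_states.append(h)

# CNN-style encoder: each position looks only at a fixed window of
# neighboring words, so every position can be computed independently
# (and therefore in parallel) -- the source of the reported speed-up.
k = 3  # kernel width
padded = [[0.0] * emb_dim] * (k // 2) + x + [[0.0] * emb_dim] * (k // 2)
kernel = [random.uniform(-1, 1) for _ in range(k)]
cnn_states = [
    [math.tanh(sum(kernel[j] * padded[t + j][d] for j in range(k)))
     for d in range(emb_dim)]
    for t in range(sent_len)
]

print(len(rnn_states), len(cnn_states))  # one state per word in both encoders
```

The RNN loop carries `h` across iterations; the CNN list comprehension has no such dependency, which is what lets GPUs process every window at once.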

I think the open source aspect is the most important part.  Language translation software relies on a lot of data in order to develop decent comprehension skills.  Open source users tend to share and share alike, so we can rely on them feeding huge piles of data to the code.

Whitney Grace, May 30, 2017


Bots Speak Better Than Humans

While chatbots’ language and comprehension skills remain less than ideal, AI and algorithms are making them sound more human each day.  Quartz proposes that chatbots are becoming more human in their actions than their creators in the article, “Bots Are Starting To Sound More Human Than Most Humans.”  The article makes a thoughtful argument that while humans enjoy thinking that their actions are original, in reality, humans are predictable and their actions can be “botified.”

Computers and bots are becoming more human while, at the same time, human communication is devolving.  Why is this happening?

It might be because we have a limited amount of time and an unlimited amount of online relationships. In an effort to solve this imbalance, we are desperately trying to be more efficient with the time we put into each connection. When we don’t have the time to provide the necessary thought and emotional depth that are hallmarks of human communication, we adopt the tools and linguistic simplicity of bots. But when our communication is focused on methods of scaling relationships instead of true connection, the process removes the thought, emotion, and volition that makes that communication human.

The article uses examples from LinkedIn, Facebook, and email to show how many human interactions have become automated.  Limited time is why we have come to rely on automated communication; the hope is to free up time for more valuable interactions.  Computers still have not passed the Turing Test, but it is only a matter of time before one does.  Companies like Bitext, with their breakthrough computational linguistics technology, are narrowing the margins.  The article ends on a platitude: we need to turn off the bot aspects of our personality and return to genuine communication.  Yes, this is true, but it also seems like a Luddite response.

The better assertion to make is that humans need to remember their human uniqueness and value true communication.

Whitney Grace, May 25, 2017

Microsoft Does Their Share to Make Chatbots Smarter

Chatbots are like a new, popular toy that everyone must have, but once they are played with, the glamor wears off and you realize they are not that great.  For lack of a better term, chatbots are dumb.  They have minimal comprehension and can only respond with canned phrases.  Chatbots are getting better, though, because companies are investing in linguistic resources and sentiment analysis.  InfoQ tells us about Microsoft’s contributions to chatbots’ knowledge in “Microsoft Releases Dialogue Dataset To Make Chatbots Smarter.”

Maluuba, a Microsoft company, released a new chatbot dialogue dataset about booking vacations, in hopes of making chatbots more intelligent.  Maluuba accomplished this by having two humans communicate via a chatbox; no vocal dialogue was exchanged.  One human tried to find the best price for a flight, while the other, playing the chatbot, used a database to find the information.  Travel-related chatbots are some of the dumber of the species, because travel planning requires a lot of details, content comprehension, and digesting multiple information sources.

What makes travel planning more difficult is that users often change the topic of their conversation. Simultaneously you might discuss your plan to go to Waterloo, Montreal, and Toronto. We humans have no trouble with keeping apart different plans people make while talking. Unfortunately, if users explore multiple options before booking, computers tend to run into problems. Most chatbots forget everything you talked about when you suddenly enter a new destination.
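One way to keep those plans apart is frame-based state tracking: one “frame” of constraints per destination, so switching topics does not erase what the user already said.  The sketch below is a hypothetical illustration of the idea, not Maluuba’s actual system:

```python
# Hypothetical frame tracker for a travel chatbot. One frame of
# constraints is kept per destination, so earlier plans survive
# when the user switches topics. (Illustration only.)

class FrameTracker:
    def __init__(self):
        self.frames = {}    # destination -> collected constraints
        self.active = None  # destination currently under discussion

    def mention(self, destination, **constraints):
        # A new destination opens a fresh frame; a repeated destination
        # reactivates its old frame with every earlier constraint intact.
        frame = self.frames.setdefault(destination, {})
        frame.update(constraints)
        self.active = destination
        return frame

tracker = FrameTracker()
tracker.mention("Waterloo", budget=700)
tracker.mention("Montreal", budget=500)
tracker.mention("Waterloo")                 # switch back mid-conversation
print(tracker.frames["Waterloo"])           # {'budget': 700} -- not forgotten
```

A single-state tracker would have overwritten the Waterloo budget the moment Montreal came up; the per-destination frames are what let the bot “remember” both plans.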

Maluuba’s dataset is like enrolling a chatbot in a travel agent course, and it will benefit anyone interested in a travel planning chatbot.  Maluuba is one of many companies, not all Microsoft owned, that are sharing their expertise and building specific chatbot datasets.  One such company is Bitext, whose expertise lies in the number of languages it can teach a chatbot.

Whitney Grace, May 23, 2017

Amazon Controls the Voice Controlled

Voice-controlled speakers that can answer questions, schedule appointments, play music, order products, and do many more activities are a luxury product.  Google Home, Lenovo, LG, Harman Kardon, and Mattel have their own fans, but Amazon remains the top seller in the market with Echo Dot products loaded with Alexa.  TechCrunch explained how Amazon dominates the market in the article, “Amazon To Control 70 Percent Of The Voice-Controlled Speaker Market This Year.”

Amazon controls an astonishing 70.6 percent of the voice-controlled speaker market, and current trends show that consumers prefer to stick with one type of speaker instead of buying a competitor’s.  Compatibility issues apparently weigh heavily on their minds.  Google Home is predicted to grow from its 23.8 percent share as it reaches more people, but for now, Amazon will remain in control.  Amazon’s marketing plan will certainly be hard to beat:

Amazon’s strategy with Alexa is to allow the assistant to extend beyond just voice-controlled speakers it manufactures. The company has also included the assistant in its Amazon mobile shopping app, and has made it available to third-parties for use in their own hardware and software applications.

Google, however, is known to make decent, less expensive products than most of the bigger-name companies, such as phones and laptops.  One other thing to consider is the quality of Alexa’s conversation skills.  Bitext, one of the best-kept secrets in sentiment analysis, has many major clients, including a popular search engine.  Bitext’s clients deploy its technology to improve a chatbot’s skills.

Whitney Grace, May 18, 2017

Los Angeles Relies on Chatbot for City Wide Communication

Many people groan at the thought of having to deal with any government entity.  It is hard to get a simple question answered, because red tape, outdated technology, and disinterested workers run the show.  But what if there were a way to receive accurate information from a chipper city employee?  I bet you are saying that is impossible, but Government Technology explains that “Los Angeles, Microsoft Unveil Chip: New Chatbot Project Centered On Streamlining.”

LA’s chipper new employee is a chatbot named Chip (pun intended) that stands for “City Hall Internet Personality.”  Developed by Microsoft, Chip assists people through the Los Angeles Business Assistance Virtual Network (BAVN).  “He” has helped more than 180 people in twenty-four hours and answered more than 1,400 queries.  So far Chip has researched contract opportunities, searched for North American Industry Classification System (NAICS) codes, and more.

Chip can be trained to “learn,” and has already been backloaded with knowledge more than tripling his answer base from around 200 to roughly 700 questions. He “curates” the answers from what he knows.  Through an extensible platform and Application Program Interface (API) programming, the bot can connect to any data or back-end system…and in the future will likely take on new languages.
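In outline, a bot that “curates” answers from a fixed knowledge base can be approximated as retrieval: match the user’s question against stored questions and return the closest answer.  The knowledge base and word-overlap scoring below are invented for illustration, not Microsoft’s implementation:

```python
# Toy sketch of answer "curation": the question base and the
# word-overlap scoring are invented for illustration only.

KNOWLEDGE_BASE = {
    "how do i find contract opportunities": "Search open contracts on BAVN.",
    "where do i look up naics codes": "Use the NAICS code search tool.",
    "how do i register my business": "Register through the BAVN portal.",
}

def answer(question):
    """Return the stored answer whose question shares the most words."""
    q_words = set(question.lower().rstrip("?.!").split())
    best = max(KNOWLEDGE_BASE, key=lambda k: len(q_words & set(k.split())))
    return KNOWLEDGE_BASE[best]

print(answer("Where can I find NAICS codes?"))  # Use the NAICS code search tool.
```

Expanding Chip’s answer base from 200 to 700 questions, as the quote describes, amounts to growing this lookup table; the “learning” is in adding entries and improving the matching, not in any deeper understanding.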

Chip’s developers are well aware that voice-related technology coupled with artificial intelligence is the way most computers appear to be headed.  Users want a sleeker interaction between themselves and a computer, especially as life speeds up.  Natural-sounding conversation and learning are the biggest challenges for AI, but companies like Bitext that develop the technology to improve computer communication are there to help.

Whitney Grace, May 16, 2017

Amazon Aims to Ace the Chatbots

Amazon aims to insert itself into every aspect of daily life, and the newest way it does so is through the digital assistant Alexa.  Reuters reports, in “Amazon Rolls Out Chatbot Tools In Race To Dominate Voice-Powered Tech,” how Amazon plans to expand Alexa’s development.  The retail giant recently released the technology behind Alexa to developers so they can build chat features into their apps.

Amazon is eager to gain dominance in voice-controlled technology.  Apple and Google both reign supreme when it comes to talking computers, chatbots, and natural language processing.  Amazon has a huge reach, perhaps even greater than Apple’s and Google’s, because people have come to rely on it for shopping.  Chatbots have a notorious history of being useless, and Microsoft’s Tay even turned into a racist, chauvinist program.

The new development tool is called Amazon Lex, and it is hosted in the cloud.  Alexa is already deployed in millions of homes, and it is fed a continuous data stream that is crucial to the AI’s learning:

Processing vast quantities of data is key to artificial intelligence, which lets voice assistants decode speech. Amazon will take the text and recordings people send to apps to train Lex – as well as Alexa – to understand more queries.

That could help Amazon catch up in data collection. As popular as Amazon’s Alexa-powered devices are, such as Echo speakers, the company has sold an estimated 10 million or more.

Amazon Alexa is a competent digital assistant, able to respond to vocal commands and even offer voice-only shopping via Amazon.  As noted, Alexa’s power rests in its data collection and its natural language processing ability.  Bitext uses a similar method, but relies instead on trained linguists to build its analytics platform.

Whitney Grace, May 11, 2017

How I Learned to Stop Typing and Love Vocal Search

Search Engine Watch reported a mind-blowing fact that is not hard to fathom, but still amazing, in the article, “Top Tips On Voice Search: Artificial Intelligence, Location, And SEO.”  The fact is that by 2020 it is projected there will be 21 billion Internet-connected devices.  Other neat facts are that 94 percent of smartphone users frequently carry their phones with them and 82 percent never turn their phones off.

As one can imagine, users are growing more reliant on voice search rather than typing their queries.  Another habit: users do not want to scroll through results; instead, they want one immediate answer delivered with 100 percent accuracy.  This has increased reliance on digital assistants, which are equipped to handle voice search.  Why, though, is voice search on the rise?

Mary Meeker’s Internet Trends Report looked at the reasons why customers use voice search, as well as which device settings are the most popular. The report indicated that the usefulness of voice search when a user’s hands or vision were otherwise occupied was the top reason that people enjoyed the technology, followed by a desire for faster results and difficulty typing on certain devices.

Where do users access voice search? It turns out that, more often than not, consumers are opting to use voice-activated devices at home, followed by the car and on-the-go.

The power behind voice search is artificial intelligence using natural language processing, semantics, search history, user proclivities, and other indicators.  Voice search is still an imperfect technology and needs improvement, such as being able to speak human language fluently.  It is not hard to make a computer speak, but it is hard to make it comprehend what it says.  Companies invested in voice search should consider looking at Bitext’s linguistics platform and other products.
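A rough sketch of how such indicators might combine: score candidate intents by keyword overlap with the transcript, then break ties with the user’s history standing in for “proclivities.”  The intents and keywords below are invented, and no vendor’s system is this simple:

```python
# Invented intents and keywords; real voice assistants are far more complex.
INTENTS = {
    "weather": {"weather", "rain", "temperature", "forecast"},
    "navigation": {"directions", "route", "traffic", "near"},
    "shopping": {"buy", "order", "price", "deal"},
}

def interpret(transcript, history):
    """Pick one intent: keyword overlap first, user history as tiebreak."""
    words = set(transcript.lower().split())
    def score(intent):
        return (len(words & INTENTS[intent]), history.count(intent))
    return max(INTENTS, key=score)

print(interpret("will it rain today", history=["weather", "shopping"]))  # weather
```

Collapsing a query to a single intent is exactly the “one immediate answer” behavior voice interfaces push toward, and it is also where fluency gaps hurt most: a misheard or misclassified word leaves no result list to fall back on.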

The article shares some tips on how to improve search, including using specific language and keywords, using markup to make sure content is ready to be displayed in Google, not neglecting apps, and other helpful advice.

Whitney Grace, May 9, 2017

Alexa Wears More Than One Robot Hub

Amazon Alexa is a digital assistant that can be accessed on the Amazon Echo or the Echo Dot hubs.  While Alexa can be accessed on other devices, none of them has gained much traction yet.  Tech Moran explains that LG might have the next hub users will want to load Alexa onto: “LG’s Robot Hub & Amazon’s Alexa Tie Up The Entire Smart home Together.”

One thing people love to do with Alexa is program their home to be a smart home, but it usually requires buying extra technology to make a home smart.  LG’s solution was to build the LG Robot Hub that unites all LG appliances into one single user interface: Alexa.  LG is interested in making their appliances compatible with the next generation of home design.

‘Our Hub Robot has enabled us to transform our home appliances to be smart, useful products for everyday consumers – thanks to Amazon Alexa’s voice control technology. So far the technology has been the game changer in IoT connectivity. It’s flexible and innovation-driven,’ LG East Africa Marketing Manager Moses Marji commented.  Pointing to the LG Hub robots, Marji noted that LG wants to bring consumers the possibility of not just a smart home, but a truly intelligent home – one that enables consumers to automate their home and control it as they want.

LG is an example of how companies are augmenting their products for the Internet of Things; that is, old analog items are upgraded to have wifi, go digital, include speech recognition technology, and so on.  Speech recognition technology, used in chatbots among other things, is notoriously bad, but companies like Bitext specialize in software that uses text analysis and linguistics to make bots as smart as the smart homes they power.

Whitney Grace, May 4, 2017

Voice Technology Gives Computers a Face Lift

One thing evolving faster than organic life is a computer.  Fortune recently spoke with head of Amazon Alexa Paul Cutsinger about how computers are changing, “This Major Change Is Coming To Computer Design, Says Head Of Amazon Alexa.”

The article starts with a brief description of how computers have changed since the 1970s, from punch cards to terminals to touch screens, but the next big thing will be voice technology and artificial intelligence.  Paul Cutsinger predicts this next computer evolution based on his work as the head of Alexa voice education.

Cutsinger believes we are not far from holding deep and rich conversations with personal technology.  He thinks getting there means reevaluating how we text and write, and that computer design needs to start over again:

‘Applying what we learned from responsive design for mobile won’t work for voice. If you apply the old principles, you’ll have a command-and-control system,’ says Cutsinger, explaining that command-and-control makes sense for touchscreen design but not for voice-based interactions.

Part of this redesign centered on voice technology comes from Antonio Valderrabanos and his company Bitext, based in Madrid.  Bitext’s specialization is deep language analysis geared toward improving text analytics and content processing, a big part of voice technology.

Whitney Grace, May 2, 2017

Deep Linguistics for More Savvy Chatbots

Existing chatbots are missing one key ingredient, Bitext explains in their blog post, “Linguistics to Create a Human-Like Chatbot.” Writer Clara García outlines the three linguistic factors an AI requires for human-like conversation—syntax, semantics, and pragmatics. The first two have been covered, but algorithms still struggle with pragmatics; that is, context and cultural knowledge. García illustrates:

Without pragmatics, our bot will never sound like a human, and what’s more important, it will not understand the user when she talks like one. If the user uses an idiom, cracks a joke, or uses the word ‘it’ referring to the skirt she was trying to buy, the bot will not understand her.
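The “it” example is a coreference problem, and it shows why naive heuristics fall short of pragmatics.  The toy resolver below binds “it” to the most recently mentioned noun, which happens to work here but breaks as soon as real context or world knowledge is needed (the noun list is invented for illustration):

```python
# Naive pronoun resolution: replace "it" with the last noun seen.
# A crude heuristic for illustration, not a real coreference system.

NOUNS = {"skirt", "shop", "card"}  # invented mini-lexicon

def resolve_it(sentences):
    """Rewrite each sentence, binding 'it' to the most recent known noun."""
    last_noun, resolved = None, []
    for sent in sentences:
        words = []
        for w in sent.split():
            base = w.lower().strip(".,!?")
            if base == "it" and last_noun:
                words.append(last_noun)       # substitute the antecedent
            else:
                words.append(w)
                if base in NOUNS:
                    last_noun = base          # remember the latest noun
        resolved.append(" ".join(words))
    return resolved

print(resolve_it(["I tried the skirt on.", "Does it come in blue?"]))
```

Recency works for the skirt example, but a sentence like “I left the skirt at the shop because it was closing” defeats it: picking the right antecedent requires knowing that shops close and skirts do not, which is exactly the pragmatic knowledge García says chatbots lack.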

Before chatbots achieve that level of nuance, machine learning must improve. For more on how linguistics can help train smarter bots, download Bitext’s Deep Linguistic Analysis Benchmark.

Cynthia Murrell, April 27, 2017