Editor’s note: This article is from the WeChat official account QbitAI, compiled by Li Shan, 36Kr.

It’s been seven years since Apple launched Siri, and three since Jeff Bezos launched Alexa, inspired by Star Trek.

AI-based interfaces have been around for decades. In 1966, Massachusetts Institute of Technology professor Joseph Weizenbaum introduced ELIZA, widely regarded as the prototype for today’s conversational artificial intelligence.

Decades later, in a Wired story, Andrew Leonard declared that “bots will be hot” and argued that the nascent technology would soon “help me find the best deal on CDs, pick out flowers for my mother, and cover the latest developments in Mozambique.” The article was published in 1996, so its examples, CDs among them, are long out of date.

Today, companies like Slack, Starbucks, Mastercard, and Macy’s have all turned to conversational interfaces in areas as diverse as customer service, the connected home, and online flower ordering.

If you doubt the value or promise of the technology, take a look at Gartner’s forecast for 2019, in which the market research firm argues that virtual personal assistants “will change the way users interact with their devices and become a universally accepted part of life.”

Not all conversational AI is created equal, nor should it be. Conversational AI falls into two categories: virtual personal assistants (Alexa, Siri, Cortana, and Google Home) and professional assistants (X.ai and Skipflag). Either kind can be built on rules engines or use machine learning techniques. Use cases range from specific, narrow tasks (Taco Bell’s TacoBot) to broad, general services (Alexa, Siri, Cortana, Google Home).

Many organizations are also considering deploying conversational interfaces in the personal or professional arena, often relying on technology provided by partners. But there is far more to consider than the technology itself. While it’s too early to call this a “field guide,” here are a few things organizations should weigh when trying out or deploying conversational AI:

Start with a small area where there is a clear answer

“It’s about the product or the brand,” says Amir Shevat, director of developer relations at Slack. “Don’t think, ‘I’m building a bot.’ Think, ‘What kind of service am I providing?’”

Beyond that, Shevat and others say, the best place to start is with hard problems that can be mitigated or solved with plenty of data. That is not to say every successful bot should do the same thing; the key is to start with a small area where there is a clear answer, then design an experience users often don’t even know they can have.

Goals determine patterns of interaction

Some dialogues are well suited to voice interaction, such as while driving a car or turning on the heat at home. Others, such as checking a bank balance, may require private information to be entered as text. And there are still other ways to help users interact with bots. The figure below shows two examples of successful interactions.

“A lot of people still have a misconception that bots can only talk or type,” says Chris Mullins of Microsoft. In fact, bots can interact with people and convey information in many ways (or modalities):

  • Voice (Alexa, Siri, Google Home)

  • Typing (bots in chat apps)

  • Keyboard support that offers cues to narrow the range of input options

  • Cards that display visual information

“In the most successful cases,” Mullins says, “we see a hybrid form win out. Voice is perfect at the right moment. At other times, typing is perfect. Sometimes you’ll want card or keyboard support, too. Figuring out how to have a conversation is an incredibly difficult problem that no one has quite figured out yet.”
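The hybrid turn Mullins describes can be pictured as a single reply object that carries several modalities at once. Here is a minimal sketch in Python; the class and field names are illustrative, not any vendor’s actual schema:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Card:
    """A visual card: a title plus optional image and action buttons."""
    title: str
    image_url: Optional[str] = None
    buttons: List[str] = field(default_factory=list)

@dataclass
class BotReply:
    """One bot turn can mix modalities: typed text, a spoken variant,
    quick-reply chips that narrow the user's options, and cards."""
    text: str
    speak: Optional[str] = None                       # text-to-speech variant
    quick_replies: List[str] = field(default_factory=list)
    cards: List[Card] = field(default_factory=list)

# A hybrid turn: the bot types a question, speaks it aloud, and offers
# keyboard chips so the user can tap instead of typing a free answer.
reply = BotReply(
    text="Which coffee would you like?",
    speak="Which coffee would you like?",
    quick_replies=["Latte", "Cold brew", "Drip"],
    cards=[Card(title="Latte", buttons=["Order"])],
)
print(len(reply.quick_replies))
```

A rendering layer can then pick whichever modalities the current channel supports: a smart speaker reads `speak`, while a chat app shows `text`, the chips, and the card.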

Plan carefully and design explicitly for multiple contexts

If a customer asks a retailer, “Where can I find a power drill in my local store?” the developer has to think about where that customer is. Is she in the store right now? Is she on her phone or her home computer? Developers must design for multiple scenarios and experiences.

This process is challenging because different interaction patterns must be anticipated during scoping. “Interacting with humans is very complex, and identifying patterns of conversation is difficult,” Mullins says. To get the best results, the project team must make these choices from the start.
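The power-drill question above makes the point concrete: the same query should be answered differently depending on the customer’s situation. A minimal sketch, where the context keys (`in_store`, `device`) and the answers are invented for illustration:

```python
def route_answer(query: str, context: dict) -> str:
    """Answer the same question differently depending on context.
    The context keys and replies here are illustrative only."""
    if context.get("in_store"):
        # She is standing in the store: give an in-aisle answer.
        return "Power drills are in aisle 12, next to the hand tools."
    if context.get("device") == "phone":
        # On a phone, offer navigation to the nearest store.
        return "Your nearest store has drills in stock. Tap for directions."
    # At a home computer: show a broader overview.
    return "Here are nearby stores that carry power drills, with stock levels."

print(route_answer("Where can I find a power drill in my local store?",
                   {"in_store": True}))
```

Real systems infer this context from location, device, and account signals rather than receiving it as a ready-made dictionary, but the branching structure is the same.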

Sustained interaction requires a sustained understanding of context

It’s one thing to understand a single command like “Play Beyonce’s Lemonade” or “Check my bank balance.” It’s quite another to program multi-turn interactions between humans and chatbots. This is what makes multiple exchanges (“turns”) between humans and bots so complex and difficult to develop: they require a solid understanding of context.

Here is an example from Kasisto that illustrates the complexity inherent in a simple payment interaction.

Round 1:

  • The user asks Kai (the chatbot) to pay Emily $5.

  • Kai finds two people named Emily in the user’s contact list and asks which one the user means.

Round 2:

  • The user changes the subject and asks how much money is in their account.

  • Kai answers, then says, “Now, where were we?” and returns to the original topic of paying Emily $5.

At first glance, this seems like a very simple interaction, but from an engineering perspective it requires a deep understanding of context and language:

  • First, Kai must recognize and track the user’s goal: in this case, paying someone.

  • Second, Kai must determine who to pay. On discovering that the user has two contacts named Emily, it must ask which one is meant.

  • Third, Kai must understand that the standalone reply “Neubig” refers back to the earlier question, meaning pay Emily Neubig.

  • Fourth, Kai must interpret “How much money is in my account?” as a new request, distinct from the previous two exchanges.

  • Finally, it must answer the new request, then continue the conversation and complete the original request: pay Emily $5.

This conversation demonstrates why a clear purpose, a narrow scope, and deep domain expertise are all crucial to chatbot development: understanding users’ intentions when they are expressed naturally is complex, but essential to an effective experience.
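The bookkeeping behind the Kai dialogue can be sketched as a parked-goal pattern: the bot holds the pending payment aside while it answers the interrupting question, then resumes. This is a toy illustration with an invented balance and replies, not Kasisto’s actual implementation:

```python
class PaymentBotSketch:
    """Toy multi-turn state tracking: park a goal across an interruption."""

    def __init__(self):
        self.pending = None  # a payment goal awaiting clarification

    def handle(self, utterance: str) -> str:
        u = utterance.lower()
        if u.startswith("pay "):
            # New goal: pay someone. Two contacts match, so park it and ask.
            _, name, amount = utterance.split()
            self.pending = {"payee": name, "amount": amount}
            return f"I see two contacts named {name}. Which one did you mean?"
        if "how much" in u:
            # Interruption: answer it, then steer back to the parked goal.
            reply = "You have $250 in your account."  # placeholder balance
            if self.pending:
                reply += (" Now, where were we? You wanted to pay "
                          f"{self.pending['payee']} {self.pending['amount']}.")
            return reply
        if self.pending:
            # A bare surname like "Neubig" resolves the pending payee.
            goal, self.pending = self.pending, None
            return f"OK, paying {goal['payee']} {utterance} {goal['amount']}."
        return "Sorry, I didn't catch that."

bot = PaymentBotSketch()
print(bot.handle("pay Emily $5"))
print(bot.handle("How much money is in my account?"))
print(bot.handle("Neubig"))
```

Production systems replace the keyword checks with trained intent classifiers and entity resolvers, but the core idea survives: the bot keeps a memory of unfinished goals so a digression does not erase the original request.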

EQ is as important as IQ

Superior intelligence and clear user intent are not the only ingredients of a successful bot. Detecting emotion and choosing the right words and tone are also key to a comfortable conversational experience. As a result, labs and startups are developing software that can detect emotional states from images, speech, text, or video.

SRI International’s Speech Technology and Research lab has developed the SenSay Analytics platform, which it says can sense a speaker’s emotions from voice signals. It can tell when a user is confused and hand them off to a human, or tell when they are receptive and serve them relevant content.
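Once an emotion classifier produces a label, the routing logic itself is straightforward. A minimal sketch; the labels (`confused`, `receptive`) and the hand-off behavior are assumptions for illustration, not SenSay’s actual API:

```python
def respond(base_reply: str, emotion: str) -> str:
    """Adapt a reply to a detected emotional state.
    The emotion labels and behaviors here are illustrative only."""
    if emotion == "confused":
        # Escalate to a person rather than frustrate the user further.
        return "Let me connect you with someone who can help."
    if emotion == "receptive":
        # A receptive user can be offered relevant extra content.
        return base_reply + " Would you like to see related offers?"
    return base_reply

print(respond("Your balance is $250.", "confused"))
```

The hard part, of course, is the classifier that produces the `emotion` label from voice, text, or video; the routing shown here is the easy last step.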

Branding is a small but effective opportunity

Branding is a key factor in a bot’s success. An ineffective bot can damage a reputation, while strong brand touchpoints can help a bot succeed. “I think there’s less branding opportunity in a conversational interface,” says Lars Trieloff of Adobe, “so leverage the brand in everyday interactions. Make sure it does one thing really well, exactly what the customer wants.”

Conversational interfaces are still in their early days, and there is a long way to go. But conversational AI, programs that better mimic natural human ways of interacting with machines, will take root. It may be primitive now, but advances in data science, natural language technology, machine learning, and related fields will eventually create the conditions for more fluid human-machine interaction.

Will conversational interaction ever equal or surpass human interaction? Some types of interaction may never translate easily to a machine. But for some uses it may: we have already seen a great deal of innovation, and it is just the tip of the iceberg.

One thing is certain, as futurist and creative strategist Monika Bielskyte says: “We are entering a screen-free future.” “In the future, the world will be our desktop,” she predicts.