Build a Conversational AI Virtual Agent in Three Steps

Guest Post from Qalius Consulting

Jul 20, 2021

What matters to customers today? Short answer: convenience. 

Customers are showing that they will take their business where they can get convenient self-service solutions built on conversational artificial-intelligence (AI) platforms. 

One digital bank claims to execute 82% of its transactions via conversational AI. Research suggests that up to 80% of service requests can be fulfilled with automation. Conversational AI solutions complement live-agent customer service by diverting routine processes to automation.

When conversational AI takes care of simple tasks, live agents can focus on complex and more delicate cases. This sharpens live-agent focus and is a win for the bottom line. According to the management consultancy McKinsey, AI chatbots are one of the ways organizations are unlocking a trillion dollars of value from cloud computing.

To work efficiently with customers, the conversational AI solution needs to solve a set of straightforward problems without fuss. It should have a natural-language interface so people can speak to it – no “press 5 for balance inquiry.” It must understand the variety of ways customers might express their intentions. It also needs to integrate with the customer-service technology ecosystem so it can execute tasks effectively. Finally, it needs to handle escalations to live-agent customer service seamlessly, without dropping any of the context.

At Qalius, teams achieve these objectives using Amazon Lex, a service from Amazon Web Services (AWS) for building chatbots powered by natural-language AI. Amazon Lex is ready to use out of the box, with no need to train machine-learning algorithms yourself, and it supports voice and text interfaces in many languages. Implementation is straightforward if you follow these simple steps.

Step 1: Define the Customer Intentions

The first thing to consider when setting up a conversational AI solution is that the system needs to recognize and fulfil specific customer requests, known as intents. Getting feedback from your agents or reviewing the questions your customers ask most frequently is a great place to start.

We recommend starting with a handful of common intents that are currently handled by live agents. Choose routine support requests where the customer will value efficiency and speed of service over human contact. Avoid requests that occur in emotionally charged situations and save those for live agents to handle. Prioritize requests that you fulfil using up-to-date information from other systems, because these are the requests where automation excels.

Let’s imagine that your customers often call your contact center to check their order status. You can configure the virtual agent to recognize the “check order status” intent.

We configure the virtual agent to recognize this intent by providing it with examples of what a customer might say when they want to check order status. These examples are called utterances. The utterances “Where is my order?” and “Has my stuff shipped?” could indicate customer intent to check order status. The power of Amazon Lex is that you simply give it a few representative utterances; the natural-language conversational AI will then recognize related statements that mean the same thing. You don’t need to capture all the possible wordings: Amazon Lex recognizes them automatically.
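To make this concrete, here is a minimal sketch of how such an intent could be defined programmatically with the Amazon Lex V2 model-building API. The bot ID, version, and locale values are placeholders for illustration, not values from a real deployment.

```python
# Sketch: building the request payload for the Lex V2 CreateIntent call.
# Bot ID, version, and locale below are illustrative placeholders.

def build_check_order_status_intent(bot_id: str) -> dict:
    """Assemble a CreateIntent payload for a 'check order status' intent."""
    return {
        "botId": bot_id,
        "botVersion": "DRAFT",
        "localeId": "en_US",
        "intentName": "CheckOrderStatus",
        # A few representative utterances; Lex generalizes to similar phrasings.
        "sampleUtterances": [
            {"utterance": "Where is my order?"},
            {"utterance": "Has my stuff shipped?"},
            {"utterance": "Check order status"},
        ],
    }

payload = build_check_order_status_intent("EXAMPLEBOTID")
# In a real deployment you would hand this to boto3, e.g.:
#   boto3.client("lexv2-models").create_intent(**payload)
print(payload["intentName"])  # prints CheckOrderStatus
```

Note how short the utterance list is: the point of the service is that you supply a few examples and let the natural-language model cover the rest.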

Step 2: Structure the Information Collection

In Amazon Lex, you can configure the conversational AI to seek the information required to fulfil the customer intention. Each piece of information is called a slot. The natural-language AI in Amazon Lex will conduct the conversation to gather the information it needs for each slot.

As Amazon Lex converses it keeps track of the conversation context, a feature that is very important when fulfilment requires several slots. This step in configuring the virtual agent is easy because we just need to define the slots, and then we can trust Amazon Lex to manage the conversation from there.

Let’s look again at the example of customer intent to check order status. You can configure the virtual agent with a single slot for order number. If the customer offers the order number with an utterance such as “Has order 1234 shipped?” then Amazon Lex will recognize the intention to check order status, and that the customer has provided order number 1234. In this case Amazon Lex needs no additional information, so it proceeds with fulfilling the request. If the customer’s utterance is “Has my order shipped?” then Amazon Lex will prompt the customer to provide their order number.
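If you attach a Lambda dialog code hook to the bot, the slot-checking logic described above looks roughly like the sketch below, which follows the Lex V2 Lambda event shape. The intent and slot names (“CheckOrderStatus”, “OrderNumber”) are our running example’s assumptions.

```python
# Sketch of a Lambda dialog code hook (Amazon Lex V2 event format).
# If the OrderNumber slot is empty, ask Lex to elicit it; otherwise
# delegate back to Lex to continue its configured flow.

def dialog_hook(event: dict) -> dict:
    intent = event["sessionState"]["intent"]
    slot = (intent.get("slots") or {}).get("OrderNumber")
    if slot is None or slot.get("value") is None:
        # Slot missing: prompt the customer for their order number.
        return {
            "sessionState": {
                "dialogAction": {"type": "ElicitSlot",
                                 "slotToElicit": "OrderNumber"},
                "intent": intent,
            }
        }
    # Slot filled (e.g. from "Has order 1234 shipped?"): let Lex proceed.
    return {
        "sessionState": {
            "dialogAction": {"type": "Delegate"},
            "intent": intent,
        }
    }
```

In practice you rarely need even this much: if you define the slot as required in the Lex console, Lex manages the elicitation on its own, and the code hook is only useful for custom validation.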

Step 3: Build the Fulfilment

Each intent is associated with a way to fulfil the request. For example, the system can send the order number to a business operations system, get back the shipping status of the order, and provide that information to the customer. Don’t shy away from fulfilment that requires one or more integrations with other systems: the AWS services are a great toolbox for implementing a custom solution. Your fulfilment could involve a sequence of queries and transactions orchestrated on AWS and implemented with connections to corporate or third-party systems.
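A fulfilment Lambda for our order-status example could be sketched as follows. The `look_up_shipping_status` function here is a hypothetical stand-in for the real integration with your business operations system; the response shape follows the Lex V2 Lambda format.

```python
# Sketch of a fulfilment Lambda for the CheckOrderStatus intent
# (Amazon Lex V2 event format).

def look_up_shipping_status(order_number: str) -> str:
    """Hypothetical lookup; replace with a query to your order system."""
    return "shipped" if order_number == "1234" else "processing"

def fulfilment_hook(event: dict) -> dict:
    intent = event["sessionState"]["intent"]
    order_number = intent["slots"]["OrderNumber"]["value"]["interpretedValue"]
    status = look_up_shipping_status(order_number)
    intent["state"] = "Fulfilled"
    # Close the dialog and return a plain-text answer to the customer.
    return {
        "sessionState": {"dialogAction": {"type": "Close"}, "intent": intent},
        "messages": [
            {"contentType": "PlainText",
             "content": f"Order {order_number} is {status}."},
        ],
    }
```

The same pattern extends to multi-step fulfilment: the Lambda can fan out to several corporate or third-party systems before composing its closing message.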

One other integration is critical. Make sure your virtual agent can pass complicated customer intentions to your live agents. Do this by building your virtual agent to work alongside live agents in LiveHelpNow. In this architecture, handoffs from the virtual agent to a live agent work like any other escalation within LiveHelpNow.

What Next?

That gets you started! Configure your virtual agent by providing examples of the utterances that reflect various customer intentions. Define the information (the slots) it will need to fulfil the intent. Use the broader AWS platform to implement the fulfilment. Most importantly, at every stage keep it simple. Dedicate your virtual agent to cases that deliver an efficient solution to your customers while relieving your live agents of tedium. After all, the main function of conversational AI is to help live agents, not replace them.