Last year Google announced its open source platform for machine learning, giving developers access to one of the most powerful machine learning platforms ever created. Google is positioning itself as a machine learning company.
The core of Contact Center AI is conversational: "human-like" interactions that improve conversations and the customer experience.
Google's development suite provides the tools for building natural, "conversational" experiences across multiple contact center channels (voice, chat, etc.). More than one million developers now use this suite in multiple areas of AI, including the contact center.
There are two areas in particular that are growing in prominence within a business contact center environment:
- Virtual Agent
When you call in for help you may get a "Virtual Agent". This type of agent gives customers 24/7 access to immediate conversational self-service, with easy handoffs to human agents for more complex issues. Instead of navigating a traditional calling tree (e.g. "press 1 for payroll, press 2 for…") you can simply say "I need help with payroll" and the Virtual Agent will route you to a live agent who specializes in that function.
- Agent Assist
When you are talking to a live human agent, "Agent Assist" supports that agent throughout the call by identifying intent and providing real-time assistance. You (the customer) may say a word or phrase that prompts Agent Assist to look up relevant information and present it to the agent live, while you are still talking.
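The routing behavior described above can be sketched in miniature. Production systems such as Dialogflow use trained natural-language models to detect intent; the keyword matcher below, with its hypothetical intent-to-queue mapping, is only a toy illustration of the flow from utterance to specialist queue.

```python
# Toy sketch of intent-based call routing -- the idea behind a Virtual Agent
# replacing a "press 1 for payroll" menu. The intents, queue names, and
# keyword matching are illustrative assumptions, not a real Google API.

# Hypothetical mapping from an intent keyword to the agent queue that
# specializes in that function.
INTENT_ROUTES = {
    "payroll": "payroll-specialists",
    "billing": "billing-team",
    "password": "it-helpdesk",
}

def route_utterance(utterance: str) -> str:
    """Return the queue for the first matched intent, or a general queue."""
    text = utterance.lower()
    for keyword, queue in INTENT_ROUTES.items():
        if keyword in text:
            return queue
    # No intent recognized: hand off to a general human agent.
    return "general-agents"

print(route_utterance("I need help with payroll"))  # payroll-specialists
```

A real Virtual Agent would also carry the recognized intent along with the transfer, so the specialist sees why the caller was routed to them.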
Currently in contact centers, analytics are done after the call. The call is recorded, sent to a server where it is transcribed for analytic purposes, and then reported on. This process pulls out trending keywords, checks for tone, and records the metrics requested by the business, then presents an overall readout with sentiment and keyword analysis. Looking at this information after the call is over is not nearly as helpful as doing it in real time.
On November 14, 2019, Google announced that its Cloud Contact Center AI is now Generally Available (GA). Genesys, Avaya, Cisco, inContact, Five9, and Mitel have all announced interconnections or partnerships with Google's Cloud Contact Center AI platform.
The current thinking is that, for the foreseeable future, AI will help agents in the call center, but it won't replace them.
When a call comes in, the AI will listen to what is being said and learn from what it hears. It will also transition the call smoothly to a human agent when needed, all of which improves the customer experience.
Example: AI Anticipating Customer Needs
What this looks like in practice:
A call comes into a parts supplier from a long-time customer. The AI automatically pulls up recent purchases. It may recognize that this caller buys a certain number of parts every six months, so it pulls up the current rates for a potential new order… all before the human agent even answers the phone.
If the customer mentions a new type of part during the conversation with the live agent, the AI will immediately search for and present to the agent the inventory, available shipping dates, and prices, without the agent having to type anything. The AI hears the phrase from the customer, performs the search, and presents the information. All of this "assists" the agent and improves the customer's experience.
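The "reorders every six months" pattern above can be illustrated with a short sketch. Everything here is an assumption for illustration: the function names, the six-month gap heuristic, and the tolerance window are invented, and a real assistant would use trained models over far richer purchase data, not a fixed-interval check.

```python
# Illustrative sketch: spot a recurring order pattern in a customer's
# purchase history so a quote can be pre-fetched before the agent answers.
# The heuristic (roughly six months between orders, +/- three weeks) is a
# made-up assumption, not a real Contact Center AI feature.
from datetime import date

def is_recurring(purchase_dates, expected_gap_days=182, tolerance_days=21):
    """True if consecutive purchases are roughly expected_gap_days apart."""
    dates = sorted(purchase_dates)
    if len(dates) < 2:
        return False
    gaps = [(later - earlier).days for earlier, later in zip(dates, dates[1:])]
    return all(abs(gap - expected_gap_days) <= tolerance_days for gap in gaps)

history = [date(2018, 1, 10), date(2018, 7, 5), date(2019, 1, 8), date(2019, 7, 2)]
if is_recurring(history):
    # In a real system this would trigger a live rate lookup for a new order.
    print("Customer reorders roughly every six months; pre-fetch current rates.")
```

The same idea extends to the in-call case: a phrase like a new part number would trigger an inventory and pricing lookup instead of a history scan.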
This technology is already available. For example, Vonage's communications API platform works with IBM Watson to provide just this type of virtual in-call assistant. Within a few years, it will be commonplace in the contact center.
The contact center of the future will anticipate a customer’s inquiry and predict what they’ll want to talk about. It will even provide appropriate support throughout the interaction, all thanks to artificial intelligence (AI).