Episode Summary: As Senior Director and World Wide Head of the Cognitive Innovation Group at Nuance Communications, Mark Hanson works on bringing Nuance lab innovations to business applications, with the guiding goals of improving customer experience and business efficiency. In this episode, Hanson speaks about natural language processing (NLP) and virtual assistant services, where he believes this technology is headed in the future and where it’s driving value now, and how companies are applying NLP in Silicon Valley and elsewhere.
Expertise: Artificial intelligence in customer service
Brief Recognition: As World Wide Head of Nuance’s Cognitive Innovation Group (CIG), Mark is responsible for setting Nuance’s AI Strategy and leading product development efforts in collaboration with Nuance’s Definitional Customers and Nuance’s AI Lab. Currently, Mark is focused on machine learning advancements in analytics, prediction, knowledge, and human assisted learning. Prior to leading CIG, Mark led product management and design for Nuance’s Nina product. Mark began his career in AI focusing on business and design strategy for the Automotive, Telco, and Finance verticals.
Current Affiliations: World Wide Head of Nuance’s Cognitive Innovation Group (CIG)
Q: What are…some of the exciting areas where (Nuance) is plugged in and making a difference already?
Mark Hanson: Nuance probably touches a lot of people's lives in ways that they don't realize; Charlie mentioned the automobile: if you're driving a car and you want to say, "Hey, play me YouTube"…it's our technology, for the most part, that's driving that experience. Underneath that experience is core technology that includes speech recognition, natural language processing (NLP), text-to-speech, and dialogue management, and what we've done in enterprise is take that core IP and create a platform so that enterprises can start to build these smart applications as well…
If you're a USAA member, USAA was one of our first customers to adopt Nina, which is our Nuance interactive natural assistant; think of it as a smart virtual assistant that is able to do all of the things that an enterprise needs it to do. This is a virtual assistant that sits within the mobile applications of USAA, and rather than trying to traverse multiple screens or escalate to a contact center…you can just ask your questions and tell the virtual assistant what you want it to do. That was one of our first implementations, and since then we've seen some really cool implementations…
It's really a seamless conversational experience that we're trying to drive, one that simplifies the systems we're providing to the user…more and more power is going into smaller spaces…now, within a banking application, you can do almost anything, and as you increase features you don't want to increase complexity and friction…you don't need to be taught how to have a conversation; that's something we all know how to do. You don't need to learn the interface when you come into contact with a virtual assistant (VA), you just talk to it the same way you and I are talking.
Q: What are some other examples, brands people might recognize, applications that people don't realize you play a role in?
MH: I think it's the ones that you don't hear about a lot that oftentimes are really unique and cool; so, when you think about travel for example, we've done Jetstar; this is a web implementation, a VA on the web…tremendous improvements in customer satisfaction, and in addition to that, they start to learn what customers are asking for. For example, a customer might come in and ask a particular question, and we get this learning all the time, which is, "Hey, I didn't know my customers actually wanted to know that piece of information." And because they're talking to a VA, and it's the same VA everyone's talking to (think of it as a centralized brain), whether it's on mobile or the web…it's the same intelligent system that you're having that conversation with, and so we can start to centralize the intelligence and learn all sorts of new things about our customers…
Q: It sounds like part of the value proposition for a company like Jetstar is better customer service…but it sounds like you're also filtering and pooling questions and patterns into some sort of visual format, one that can make sense of and unearth reams of questions that they (businesses) should have better answers for…
MH: We have tooling today…so that when a VA is asked something it doesn't know the answer to…we actually capture the information and apply NLP to it so that we can start to group those unknowns together; within that tooling we can say, "Hey, here's a cluster of questions that the system has identified as very similar, and they're all about this one particular thing."
Now I can go in and teach it the answer to that question, or the action needed to resolve the particular intent, and harvest that across all of your customers, so you are getting that information in a much more digestible, actionable way than we have in the past…businesses are not only getting the increase in automation and in customer service; they're also getting deep insight into what exactly customers want.
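The workflow Hanson describes, grouping a VA's unanswered questions by similarity so a human can teach one answer per cluster, can be sketched roughly as below. This is an illustrative sketch only, not Nuance's tooling: the token-overlap (Jaccard) similarity, the greedy grouping, and the 0.5 threshold are all simplifying assumptions; a production system would use far richer NLP.

```python
# Illustrative sketch: cluster unanswered VA questions by similarity.
# Not Nuance's tooling; tokenizer, Jaccard similarity, and the 0.5
# threshold are assumptions for demonstration.

def tokens(text):
    """Lowercase word set, ignoring trailing punctuation."""
    return {w.strip(".,?!").lower() for w in text.split()}

def jaccard(a, b):
    """Overlap between two token sets, from 0.0 to 1.0."""
    return len(a & b) / len(a | b) if a | b else 0.0

def cluster_unknowns(questions, threshold=0.5):
    """Greedily group questions whose token overlap meets the threshold."""
    clusters = []  # each cluster is a list of (question, token_set)
    for q in questions:
        t = tokens(q)
        for c in clusters:
            # Compare against the cluster's first member as its "centroid".
            if jaccard(t, c[0][1]) >= threshold:
                c.append((q, t))
                break
        else:
            clusters.append([(q, t)])
    return [[q for q, _ in c] for c in clusters]

unknowns = [
    "Can I bring my surfboard on the plane?",
    "Can I bring a surfboard on my flight?",
    "How do I change my seat?",
]
for group in cluster_unknowns(unknowns):
    print(group)
```

Here the two surfboard questions land in one cluster, so an operator could teach a single answer that covers both phrasings.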
Q: In terms of future applications…where do you see exciting progress; what are some areas that most people don't think of as something that'll be changed in a couple of years, given where (Nuance) is applying this into the future?
MH: The way in which we think about this is, you train a VA to do a specific thing today; that can have a very meaningful impact on the experience from a customer's perspective and on the business drivers you're trying to achieve from a business's perspective, but ultimately it only knows what you teach it. There are systems out there where we've said, "Oh, we don't need to be explicit, we can just point it at a website"…we've gone down that path too, and what we've found is that even though there are websites out there…for the questions that are being asked, the answer isn't neatly packaged in some paragraph on a website…so you come back to this question: well then, how do we teach a system to do new things?
Today, the answer is we build it, we explicitly teach it; the answer tomorrow is that it's going to learn, but the question is how it's going to learn. One of the things Nuance is doing that's very unique is we're actually developing a way for a VA to learn through the observation of human agents. Rather than explicitly programming "here's an intent…and here's how to resolve that intent"…we're trying to emulate the intelligence of the humans inside your contact center or company; why not just observe what they do when they get those questions?
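The observation-based learning Hanson outlines, deriving candidate answers from what human agents actually say, could be approximated in a much-simplified form by mining agent transcripts. Everything below is an assumption for illustration: the transcript format, the stop-word normalization into a crude intent key, and the majority-vote rule are not Nuance's method.

```python
# Much-simplified sketch of "learning by observing human agents":
# record what agents replied to similar customer questions, and propose
# the most frequent reply as a candidate VA answer. Transcript format,
# normalization, and majority vote are illustrative assumptions.
from collections import Counter, defaultdict

def normalize(text):
    """Crude intent key: sorted content words of the question."""
    stop = {"a", "an", "the", "i", "my", "do", "can", "how", "is"}
    words = [w.strip(".,?!").lower() for w in text.split()]
    return " ".join(sorted(w for w in words if w and w not in stop))

def propose_answers(transcripts):
    """transcripts: list of (customer_question, agent_reply) pairs."""
    replies = defaultdict(Counter)
    for question, reply in transcripts:
        replies[normalize(question)][reply] += 1
    # For each observed question pattern, propose the agents' most
    # frequent reply as a candidate answer for the VA.
    return {key: counts.most_common(1)[0][0] for key, counts in replies.items()}

logs = [
    ("Can I change my seat?", "Yes, under Manage Booking."),
    ("How do I change my seat?", "Yes, under Manage Booking."),
    ("Can I bring a surfboard?", "Surfboards go as oversized baggage."),
]
print(propose_answers(logs))
```

A real system would of course need human review before promoting a proposed answer, which matches the human-assisted learning focus mentioned in Hanson's bio above.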
Q: Nuance is developing a lot of applications in this space of conversational interfaces…where do you see the business contest for proliferation and dominance within NLP – do you see it in the development of narrow applications…is it advances in hardware…is it licensing…where is the contest in this business world of NLP?
MH: I think the competitive landscape is being developed along the specific applications of technology. A lot of machine learning is public domain knowledge; it's out there in the journals and articles of academia, and also within the work that companies are doing. But where the value is, is in how that technology is applied to very specific problems; it's our belief that no one company will own AI, and what we're seeing is an emergence of companies that are really good at very specific things.
Our strategy is…those companies that can develop a good platform for AI, machine learning, and the development of conversational interfaces that are intelligent and able to access knowledge, those platforms that can do that and incorporate some of the technologies that come from these other players…an AI ecosystem…that's where you're going to see a lot of the competitive differentiation.