Nuance is a developer of conversational and cognitive technologies. Continuing just-auto/QUBE's series of interviews at the CES this week, we spoke to Arnd Weil, VP & General Manager Automotive & Consumer Electronics, Nuance Communications about the company's Dragon Drive and other technologies. 

While giving instructions in our cars is nothing new, putting questions to the likes of Alexa and Cortana while on the road is. Is this the way things are going … having more conversations with our cars?  

Having conversations, yes. But they need to be meaningful for the driver or passengers, such as recommendations for restaurants, POIs or agenda entries. We're experts in building highly customised, domain-specific solutions – designed for both today and the future – that hear, understand, learn, and leverage language to interact with their customers, providing personalised experiences and new levels of convenience, engagement, security and satisfaction.

Although a number of cars have some sort of voice recognition set up, some drivers are frustrated about how challenging it is to use voice commands on the road. How does your artificial intelligence-based automotive assistant help reduce that frustration?

As technology advances, the interaction between drivers and vehicles grows richer. The earlier, more archaic systems struggled with numerous restrictions, such as limited on-board computing power and user interfaces (HMI) that were not geared towards spoken dialogue. However, newer systems are exposing what intelligent assistants can do and the potential they hold. These assistants should be easy to control by voice, making journeys as comfortable as possible. Three technologies now make the difference:

  • Deep learning: Self-learning systems based on deep neural networks have been successful in many speech-recognition processes. They could be used not only for cloud-based technologies, but also for embedded systems integrated in cars.
  • Natural language understanding (NLU): Previously, speech-recognition system developers needed to define rules specifying what could be recognised at which position within a dialogue. Modern systems are trained using real user data and use statistical methods and language models.
  • Reasoning & personalisation: Reasoning enriches speech recognition and comprehension to provide logical consequences and conclusions using a knowledge-based system. The technology accesses a wide variety of background knowledge to provide the best possible response to a request. This can be seen in action in an intelligent parking assistant that understands the command "Find a cheap car park near the opera that accepts credit cards and is still open after the concert", whilst taking additional information into account.
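To make the parking example concrete, here is a toy sketch of the structured output an NLU system produces from such a request. Real systems like those described above use trained statistical models; this keyword matcher and its slot names are invented purely for illustration.

```python
# Toy illustration of NLU slot filling for a parking request.
# The slot names and matching rules are hypothetical, not Nuance's.

def parse_parking_request(utterance: str) -> dict:
    """Extract illustrative slots from a parking-related utterance."""
    text = utterance.lower()
    slots = {}
    if "cheap" in text:
        slots["price"] = "low"
    if "credit card" in text:
        slots["payment"] = "credit card"
    if "near the opera" in text:
        slots["location"] = "opera"
    if "still open" in text or "after the concert" in text:
        slots["constraint"] = "open late"
    return slots

request = ("Find a cheap car park near the opera that accepts "
           "credit cards and is still open after the concert")
print(parse_parking_request(request))
```

A reasoning layer would then combine these slots with background knowledge (concert end times, car park tariffs) to rank candidate answers.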

Could you tell us a little about Dragon Drive? 

Powered by artificial intelligence, Dragon Drive's conversational automotive assistant listens, understands and responds to drivers, and empowers automakers to make the AI-powered connected car experience a reality. Dragon Drive understands and learns the needs and preferences of drivers and passengers over time to provide a personalised experience, including access to entertainment, navigation, points of interest, news feeds and in-car features such as heating and air conditioning. Dragon Drive's innovative new features, which were honoured with a CES 2018 Innovation Award in the Vehicle Intelligence and Self-Driving Technology category, include multi-modal usage, enhanced interoperability via cognitive arbitration, and AI integration with car sensors.

Dragon Drive powers more than 200 million cars on the road today across more than 40 languages, creating conversational experiences for Toyota, Audi, BMW, Daimler, Fiat, Ford, GM, Hyundai, SAIC, and more.

From your perspective, how will conversational and cognitive technologies change the look and feel of the cockpit?

There will definitely be fewer buttons! As driverless car technology becomes commonplace, voice-activated assistants will only become more influential. Our goal is to create a seamless and easy driving experience, in which drivers can be as productive as possible, in the safest possible manner. Nuance's vision for the future of the connected car is fuelled by the passion of its customers and partners to make people's connected lives even more productive and fulfilling.

A while ago, IT in cars was seen as more of an enabler but nowadays it is viewed as a core element, linking the home and personal devices. What opportunities does this present Nuance?

By 2020, there will be 26 billion intelligent, capable, connected devices armed with conversational virtual assistants that manage nearly every possible consumer experience. These assistants all have strengths and specialties, but today, they rarely communicate with each other or work together across devices – and it's the consumer who loses out. The consumer is expected to know which assistant is available in each of the eco-systems and how to get each task accomplished. 

At CES we have introduced the Cognitive Arbitrator for Dragon Drive. With this new AI-based arbitration capability, the driver or passenger can simply articulate a need and the automotive assistant figures out which assistant is best at solving the specific task – whether the automotive assistant itself or a third-party assistant or chatbot. A driver can talk to their in-car assistant to request driving directions and streaming music, but also make requests that will be routed to other third-party assistants that handle tasks such as shopping, food ordering, personal banking, and more. All of this can be requested by voice while at the wheel, making life easier and more intuitive for drivers in today's 24/7 society.
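The routing idea described here can be sketched in a few lines: score each registered assistant against the utterance and hand the request to the best match. The assistant names, vocabularies and scoring rule below are invented for illustration and bear no relation to how the Cognitive Arbitrator is actually implemented.

```python
# Minimal sketch of assistant arbitration: route an utterance to the
# assistant whose (hypothetical) vocabulary best covers the request.

ASSISTANTS = {
    "automotive": {"directions", "navigate", "music", "climate"},
    "shopping":   {"order", "buy", "shopping"},
    "banking":    {"balance", "transfer", "account"},
}

def arbitrate(utterance: str) -> str:
    """Return the name of the best-matching assistant."""
    words = set(utterance.lower().split())
    scores = {name: len(words & vocab) for name, vocab in ASSISTANTS.items()}
    return max(scores, key=scores.get)

print(arbitrate("navigate home and play some music"))  # "automotive"
print(arbitrate("check my account balance"))           # "banking"
```

A production arbitrator would of course use trained intent classifiers rather than keyword overlap, but the single-entry-point, best-assistant-wins pattern is the same.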

Our goal is to create a seamless and easy driving experience, in which drivers can be as productive as possible, in the safest possible manner. As driverless car technology becomes commonplace, voice-activated assistants will only become more influential. Therefore, ensuring those in the car have access to multiple assistants through one, easy-to-use hub is important.

Given the appetite for voice-enabled intelligent technologies in cars, how is your automotive business shaping up? 

Dragon Drive powers more than 200 million cars on the road today across more than 40 languages, creating conversational experiences for Toyota, Audi, BMW, Daimler, Fiat, Ford, GM, Hyundai, SAIC, and more. In Q4, we delivered outstanding customer deployments (such as Audi, Porsche and Range Rover) to make the connected car a reality for leading automakers, again making automotive an excellent growth business for Nuance.

We understand that about 63% of the revenue from your Mobile segment comes from automotive. Will your automotive business break out from your Mobile segment as a separate category?

Yes, automotive now comprises the lion's share of the segment's revenue, approximately 63%, with a double-digit growth rate in our Mobile business. Other areas of our Mobile business are Emerging Markets and Communication Service Providers. As you see in the recent launch of our Cognitive Arbitrator, we see a need for virtual assistants to work together. The next generation of our solutions for connected (and eventually autonomous, electric and shared) cars and smart homes, as well as for enterprise customer service centres and doctors, will be powered by conversational AI, an evolution of technology that will allow for more dynamic and collaborative interactions with assistants that will complete more complex, realistic tasks with higher accuracy. By acting as the 'hub', this new technology ensures the virtual assistants are able to talk to each other, sharing vital information such as calendar clashes and user preferences – easing the process for the end-user and maximising the power of the assistants.

We hear that you are opening a DRIVE lab in Detroit to study innovation in the automotive sector. Could you tell us more about this centre?  

The Nuance DRIVE Lab will be a hub for research and learning that will advance user interfaces in cars, ultimately providing drivers with safer, smarter, and more delightful in-vehicle experiences and giving automakers the ability to differentiate with excellent user experiences. Over the coming months, Nuance will also open DRIVE labs in Europe and Asia to address specific local UX needs.