While giving instructions in our cars is nothing new, putting questions to the likes of Alexa and Cortana while on the road is. Carmakers are fast adopting virtual assistants, confirming that speech is becoming the preferred interface for tomorrow's cockpit. Continuing just-auto/QUBE's series of research snapshots, this one takes a look at virtual driver assistants and who is embracing whom.

Amazon's Alexa hits the road

Earlier this year, we learned that Ford and Amazon have teamed up to offer drivers the ability to access their car from home and call up other features from inside their vehicle via Alexa – Amazon's cloud-based voice service. While Amazon has managed to introduce this voice-control system into pretty much everything, its move into cars marks another major milestone. Using Amazon Echo, a hands-free speaker and voice command device that interfaces with Alexa, Ford owners could request assistance with various functions of their car. From inside the car, a driver would access Alexa through the steering wheel-mounted voice recognition button, allowing them to control connected smart devices or call up Alexa functions such as weather reports, music and shopping lists.
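For readers curious how such integrations hang together, assistant features like this are typically built as cloud 'skills': the assistant recognises an intent from the spoken phrase and passes it to a handler that calls the carmaker's connected-vehicle backend. The Python sketch below shows the rough shape of an AWS Lambda handler for an Alexa custom skill; the intent name, endpoint URL and payload are illustrative assumptions, not Ford's actual implementation.

```python
import urllib.request

# Hypothetical connected-car endpoint; the real Ford/Alexa backend is proprietary.
VEHICLE_API = "https://example.com/vehicle/remote-start"

def lambda_handler(event, context):
    """Rough shape of an Alexa custom-skill handler running on AWS Lambda."""
    request = event["request"]
    if request["type"] == "IntentRequest" and request["intent"]["name"] == "RemoteStartIntent":
        # Forward the recognised request to the (assumed) vehicle cloud service.
        req = urllib.request.Request(VEHICLE_API, data=b"{}", method="POST")
        urllib.request.urlopen(req)
        speech = "Okay, starting your car."
    else:
        speech = "Sorry, I can't help with that yet."
    # Alexa expects a JSON response describing what to say back to the driver.
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": True,
        },
    }
```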

No escape from Windows 10

Ford is not the only carmaker offering voice assistance. Nissan and BMW stole some of Ford's media thunder at the most recent Consumer Electronics Show (CES) by revealing that they will introduce Microsoft's Cortana virtual assistant into cars, using Windows 10. This means the voice-controlled capabilities already offered by Cortana on a home PC or smartphone will in future also be available on board a BMW. For example, BMW Connected can provide a reminder en route of an upcoming appointment for which no location has yet been fixed, and Cortana can be used to recommend a suitable restaurant and reserve a table.

Cars are just part of the picture for Cortana. Microsoft is looking to extend its assistant into as many smart home devices as possible, such as kitchen appliances and thermostats.

Okay Google, start my car

US owners of 2016 and 2017 Mercedes-Benz models can now use both Google Assistant on Google Home and Amazon Alexa to 'talk' to their cars. In addition to using Google to remotely lock, unlock and start their car, those with Alexa devices can say, "Alexa, ask Mercedes me to send an address to the car" for remote navigation input and point of interest requests. "We want to offer our customers a broad range of services 24/7, not just when they are in our cars," said Nils Schanz, head of IoT and wearable integration at the automaker's US research arm. "[Our] goal is creating an intelligent ecosystem around cars and developing cutting-edge technology to make everyday life more convenient for our customers."

For its part, Hyundai Motor America is offering customers a three-year contract for remote door lock, engine start and a monthly 'vehicle health' report as a standard part of its Blue Link (the carmaker's infotainment software) service via smartphone, smartwatch and Amazon Alexa. "Remote start has always been our most popular remote service," said Manish Mehrotra, director, digital business planning and connected operations, Hyundai Motor America. "Hyundai owners request more than one million remote starts a month during the winter and perform more than a million remote locks annually."

Toyota unveiled a concept car at the most recent CES, dubbed the Concept-i. At its heart is a powerful artificial intelligence (AI) system, named Yui, which learns with the driver and builds a 'relationship' (a term Toyota used no fewer than four times). Concept-i goes beyond mere driving patterns and schedules, the carmaker maintains, making use of multiple technologies to measure emotion, mapped against where and when the driver travels.

Hey Siri

Apple's Siri is already playing a smart assistant role in some cars via the company's CarPlay software. For example, Vauxhall's facelifted and renamed Mokka X features Apple CarPlay, which allows drivers to make calls, send and receive messages and listen to music through the touchscreen or by voice via Siri.

A vision of tomorrow's empathetic car

Using our eyes, voice and hand gestures, it is possible to eliminate buttons from an infotainment system. In giving us her vision of this touch-free user experience, Rose Ryntz, Vice President, Advanced Development & Material Engineering, International Automotive Components (IAC) Group, told just-auto: "Currently, we see the first distraction-free interfaces that allow the driver and passengers to control system functions of the car. These existing scenarios for autonomous levels 1, 2 and 3 see hand gestures as technical solutions for these features. But they are cumbersome and not accurate, as well as forcing the person to double-check by looking at the information to reaffirm, and therefore distracting them. An alternative could be functions that are voice activated, i.e. Alexa, Cortana or Siri. One step further, AI will soon free the 'driver' from controlling the vehicle and will instead detect and implement instantly, like a radar, by recognition and machine learning."

With nearly all new cars expected to offer voice recognition by 2022, future systems may evolve to interpret tone of voice and facial expressions, says Ford. Tomorrow's systems – equipped with clever microphones and in-car cameras – could learn to identify which songs we like to hear when we are stressed and, for example, when we prefer to simply enjoy silence. Interior lighting could also complement our mood.

"We're well on the road to developing the empathetic car which might tell you a joke to cheer you up, offer advice when you need it, remind you of birthdays and keep you alert on a long drive."

"We're well on the road to developing the empathetic car which might tell you a joke to cheer you up, offer advice when you need it, remind you of birthdays and keep you alert on a long drive," said Fatima Vital, senior director, Marketing Automotive, Nuance Communications, which helped Ford develop voice recognition of the SYNC in-car connectivity system.

"Voice commands like 'I'm hungry' to find a restaurant and 'I need coffee' have already brought SYNC 3 into personal assistant territory," said Mareike Sauer, voice control engineer, Connectivity Application Team, Ford of Europe. "For the next step, drivers will not only be able to use their native tongue, spoken in their own accent, but also use their own wording, for more natural speech."

Ford is working with a team at RWTH Aachen University to research the use of multiple microphones to improve speech processing and reduce the effect of external noise and potential disruptions. Nuance says that within the next two years, voice control systems could prompt us with: "Would you like to order flowers for your mum for Mother's Day?", "Shall I choose a less congested but slower route home?" and "You're running low on your favourite chocolate and your favourite store has some in stock. Want to stop by and pick some up?"
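To give a flavour of what multi-microphone speech capture involves, the NumPy sketch below implements a basic delay-and-sum beamformer: each channel is time-aligned for a chosen talker direction and the channels are averaged, reinforcing the speech and attenuating uncorrelated cabin noise. The geometry, whole-sample alignment and function name are assumptions for illustration, not the Ford/RWTH Aachen signal chain; a real system would add fractional-delay filtering and adaptive noise estimation.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # metres per second

def delay_and_sum(signals, mic_positions, azimuth_deg, fs):
    """Steer a microphone array toward azimuth_deg and average the channels.

    signals:       (num_mics, num_samples) array of time-domain samples
    mic_positions: (num_mics, 2) array of microphone x/y positions in metres
    azimuth_deg:   assumed direction of the talker (far field)
    fs:            sample rate in Hz
    """
    azimuth = np.deg2rad(azimuth_deg)
    direction = np.array([np.cos(azimuth), np.sin(azimuth)])
    # Relative arrival time at each mic: mics further along the look
    # direction hear the wavefront earlier.
    arrival = -(mic_positions @ direction) / SPEED_OF_SOUND
    arrival -= arrival.min()
    output = np.zeros(signals.shape[1])
    for channel, delay in zip(signals, arrival):
        shift = int(round(delay * fs))  # whole-sample alignment only
        output += np.roll(channel, -shift)
    # Averaging reinforces the aligned speech and suppresses uncorrelated noise.
    return output / len(signals)
```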

See also 2017 Automotive Megatrends and Markets Report