While giving instructions in our cars is nothing new, putting questions to the likes of Alexa and Cortana while on the road is. Carmakers are fast adopting virtual assistants, confirming that speech is becoming the preferred interface for tomorrow’s cockpit. Forget lucky car charms: Chinese manufacturer Nio has developed a digital assistant in the shape of a dashboard robot that turns to face you when you speak. Continuing just-auto/AIC’s series of interviews, we spoke with Ted Li, developer of the NOMI Mate, to understand more.

Could you tell us a little about what NOMI Mate is and can do?

Sure, NOMI is an artificially intelligent digital companion that can learn about the driver and their passengers. Available in the ES6, EC6 and ES8, it is the first in-car AI with a deep-learning function; it identifies individuals and their preferences, which we believe provides a more emotional engagement.

The interface is hugely different from other in-car systems you might have seen. Equipped with a digital ‘face’ and ‘eyes’, NOMI can address each occupant in the car. Over time, NOMI learns daily routines and travel patterns, suggesting routes and planning for calendar events. It can also be used to control functions in the car, such as remembering your seating and steering positions when it recognises you. It has a fun side too; it can take a photo or tell a joke. It is a digital companion, or mate; that’s why we called it NOMI Mate.
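To illustrate the kind of per-occupant memory Li describes, here is a minimal Python sketch of recognising an occupant and restoring their cabin settings. Nio has not published NOMI’s internals, so every name and structure below is a hypothetical stand-in.

```python
# Hypothetical sketch of per-occupant preference recall; not Nio's actual API.
from dataclasses import dataclass, field

@dataclass
class OccupantProfile:
    name: str
    seat_position: int          # memory-seat index (illustrative)
    steering_position: int      # steering-column memory index (illustrative)
    preferred_routes: list[str] = field(default_factory=list)

# In the real system these profiles would be learnt over time;
# here they are hard-coded for the example.
PROFILES = {
    "face_id_001": OccupantProfile("Wei", seat_position=2, steering_position=1,
                                   preferred_routes=["home -> office"]),
}

def apply_seat_position(index: int) -> None:
    print(f"Moving seat to memory position {index}")

def apply_steering_position(index: int) -> None:
    print(f"Moving steering column to memory position {index}")

def on_occupant_recognised(face_id: str) -> None:
    """Restore cabin settings when a known occupant is identified."""
    profile = PROFILES.get(face_id)
    if profile is None:
        return  # unknown occupant: leave the cabin as it is
    apply_seat_position(profile.seat_position)
    apply_steering_position(profile.steering_position)

on_occupant_recognised("face_id_001")
```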

What was the fundamental thinking behind the creation of NOMI?

For us, NOMI defines the relationship and companionship that we are trying to give users with their Nio vehicle. We believe this type of UX will shape the future of human-vehicle relationships, and that cars should build a much stronger emotional attachment with the user and, eventually, between the user and the brand.

But is it actually different to the systems we see in European OEMs?

Yes, it is. Our founder William Li spotted an issue with the standard connectivity in existing vehicles on the market. If the car is truly a companion to the user, why is the user just staring into the air with no focus for the conversation? Normally, when one human talks to another, they look at each other’s faces. You don’t have that in most cars. When a user talks to NOMI, they are in fact talking to the AI in the car. It looks at you and you look at it. We believe this is more engaging and personal.

You mentioned NOMI looks at you. Where did the thinking for having NOMI as a dynamic moving character in the car come from?

We really wanted the user to feel as if NOMI is alive, which means NOMI cannot exist as just a static 2D interface; that wouldn’t be coherent. So we set a clear target: this character should be alive, should represent the car, and should be the head and face of the vehicle.
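Nio hasn’t detailed how NOMI decides where to look, but a common approach is to estimate a direction of arrival (DOA) from the cabin microphone array and turn towards the nearest seat. The Python sketch below assumes that approach; the angles and function names are illustrative only, not Nio’s cabin geometry.

```python
# Hypothetical sketch: turn NOMI's head towards whoever is speaking,
# assuming the microphone array supplies a direction-of-arrival estimate.
# 0 deg = straight ahead; seat bearings below are purely illustrative.
SEAT_BEARINGS = {
    "driver": -30.0,
    "front passenger": 30.0,
    "rear left": -150.0,
    "rear right": 150.0,
}

def nearest_seat(doa_degrees: float) -> str:
    """Map a DOA estimate to the closest known seat."""
    return min(SEAT_BEARINGS, key=lambda seat: abs(SEAT_BEARINGS[seat] - doa_degrees))

def face_speaker(doa_degrees: float) -> None:
    seat = nearest_seat(doa_degrees)
    print(f"Rotating head to {SEAT_BEARINGS[seat]:+.0f} deg to face the {seat}")

face_speaker(-27.5)  # -> Rotating head to -30 deg to face the driver
```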

Is the movement distracting for drivers?

Many people, especially in the automotive industry, fear having moving things in the car. They fear it might be a distraction and make the car unsafe. We believe this issue can be solved by designing the movement to be natural and more human-like, making it easier and faster to process.

We looked at what the film and animation industries do to help us design the whole character, so that the motions aren’t simply robotic and mechanical but driven by emotion. Our choice of hardware was crucial to achieving the tiny movements we needed.
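One standard animation technique that fits Li’s description is easing: accelerating into a move and decelerating out of it so the motion reads as organic rather than mechanical. A minimal Python sketch, assuming a hypothetical servo interface:

```python
import time

def smoothstep(t: float) -> float:
    """Classic ease-in/ease-out curve: zero velocity at both ends of the move."""
    return t * t * (3.0 - 2.0 * t)

def set_servo_angle(angle: float) -> None:
    # Stand-in for the real actuator interface, which Nio hasn't published.
    print(f"servo -> {angle:6.2f} deg")

def ease_to(start_deg: float, end_deg: float,
            duration_s: float = 0.6, steps: int = 30) -> None:
    """Glide between two head angles instead of sweeping at constant speed."""
    for i in range(steps + 1):
        t = smoothstep(i / steps)
        set_servo_angle(start_deg + (end_deg - start_deg) * t)
        time.sleep(duration_s / steps)

ease_to(0.0, -30.0)  # turn from straight ahead towards the driver
```

Compared with a constant-speed sweep, the eased motion starts and stops gently, which is much of what makes a small robotic gesture feel less mechanical.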

With complex movements, is that hardware proven to have long-term durability?

This is something the automotive industry never dreamt of, but we’ve passed the most rigorous of crash tests. It has survived repeated 40G forces and is world-leading from a durability standpoint.

How has the NOMI system evolved over its lifetime?

We developed our own software stack, which includes ASR (automatic speech recognition) at the front end of the design. Then, in the middle, we have natural language understanding, part of which is located within the vehicle’s hardware, but most of it is in the cloud and is updated every day. I remember a case where NOMI was being used at a test-drive event and someone asked it a very specific question, but it gave an incorrect answer. We immediately figured out the issue and fixed the bug; it was a hot fix, almost instantaneous. The final element is the speech synthesis, or as we call it, the text-to-speech. That part is a very standard engine, but we had to customise how it sounds to perfect its voice, character and story. This alone took half a year to define.
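The stack Li outlines is the classic three-stage voice pipeline: ASR in front, NLU in the middle (split between the vehicle and a daily-updated cloud service), and TTS at the back. A minimal Python sketch of that flow follows, with every function a hypothetical stand-in rather than Nio’s actual code.

```python
def asr(audio: bytes) -> str:
    """Front end: automatic speech recognition (audio -> text)."""
    return "what's the weather like today"   # canned result for the sketch

def on_device_nlu(text: str) -> dict | None:
    """Small on-board model; returns None when the cloud must take over."""
    return None

def cloud_nlu(text: str) -> dict:
    """Cloud service: where most of the understanding lives, updated daily."""
    return {"intent": "get_weather", "slots": {"day": "today"}}

def nlu(text: str) -> dict:
    """Middle: try the vehicle first, fall back to the cloud."""
    return on_device_nlu(text) or cloud_nlu(text)

def tts(reply: str) -> None:
    """Back end: the customised text-to-speech voice."""
    print(f"NOMI says: {reply}")

def handle_utterance(audio: bytes) -> None:
    text = asr(audio)
    intent = nlu(text)
    tts(f"Handling intent '{intent['intent']}'")

handle_utterance(b"...")
```

Keeping most of the NLU server-side also explains the near-instant hot fix Li mentions: the correction ships in the cloud without touching the vehicle.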

With regards to the application abilities, it can link to the car functions: the doors, the screen, the sound, even assisted driving through Nio Pilot. When Nio Pilot is trying to warn the user of a danger, NOMI can physically speak to them and say, ‘please watch out and grab your steering wheel’. This is another part of the application that we are constantly updating in a separate loop. Along with this we also have other connected services, which are endless; for example, on the cloud we dock NOMI into many capabilities, such as music.
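A plausible shape for that application layer is a table mapping recognised intents onto vehicle subsystems, as sketched below. The intents and handlers are illustrative, drawn from the examples Li gives, not from Nio’s actual code.

```python
def open_door(slots: dict) -> None:
    print("Opening door:", slots.get("which", "driver"))

def play_music(slots: dict) -> None:
    print("Playing:", slots.get("track", "something you like"))

def pilot_warning(slots: dict) -> None:
    # Spoken alert tied into the driver-assistance system.
    print("Please watch out and grab your steering wheel")

INTENT_HANDLERS = {
    "open_door": open_door,
    "play_music": play_music,
    "pilot_warning": pilot_warning,
}

def dispatch(intent: dict) -> None:
    """Route a recognised intent to the matching vehicle function."""
    handler = INTENT_HANDLERS.get(intent["intent"])
    if handler is None:
        print("Sorry, I can't do that yet")
        return
    handler(intent.get("slots", {}))

dispatch({"intent": "pilot_warning", "slots": {}})
```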

What is the extent of the involvement of NOMI in the vehicle operating system as a whole?

From the beginning, we set a clear boundary: NOMI does not execute driving and safety-critical tasks. For example, NOMI doesn’t operate the drive mode. When you’re altering your drive mode, or if you have children in the car, you don’t want anyone to be able to change it with a simple voice command, because that would change the driving pattern and the steering-wheel force, which could interfere with or impact driving behaviour.

The other thing is, for example, we don’t allow NOMI to operate the tailgate, because someone outside the vehicle could potentially say ‘open the tailgate’ and it would just open. Those are the boundaries we set early on, but we have since realised this may not all be necessary. Hopefully, in the future, we might be able to say to NOMI, ‘drive for me’, and it will activate the AD function. At our discretion, and after testing and validating how users respond, we could make these changes in the future if we decide they serve a better user value.
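That boundary amounts to a denylist checked before any voice command reaches the dispatcher. A minimal Python sketch, using the two blocked commands Li names; the structure is an assumption, not Nio’s published design.

```python
# Voice commands that touch driving behaviour, or that a voice outside
# the car could trigger, are refused before they reach the vehicle.
BLOCKED_INTENTS = {
    "set_drive_mode",   # would alter driving pattern and steering force
    "open_tailgate",    # could be shouted by someone outside the vehicle
}

def dispatch(intent: dict) -> None:
    # Stand-in for the normal intent dispatcher.
    print(f"Executing '{intent['intent']}'")

def execute_voice_command(intent: dict) -> None:
    if intent["intent"] in BLOCKED_INTENTS:
        print(f"Refusing voice command '{intent['intent']}': outside NOMI's scope")
        return
    dispatch(intent)

execute_voice_command({"intent": "open_tailgate"})
# -> Refusing voice command 'open_tailgate': outside NOMI's scope
```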

Have you worked with any partners on any aspects of the system, or has the development for NOMI been all in-house at Nio?

The integration of the system and most parts are developed in-house, but some parts are developed externally. For example, for the automatic speech recognition we are collaborating with a partner, because that kind of software is primarily a general capability and requires massive amounts of data. We know there are players who are already strong in these areas, for example Nuance, a speech technology company we’ve worked with in the past. We selectively choose partners that provide this general capability and efficient development for us, and we will continue to do this going forward to make this the best UX possible.