Could you tell us a little about Smart Eye and the headline message that you are putting out at this year’s CES? 

Smart Eye is the global leader in Human Insight AI, technology that understands, supports and predicts human behaviour in complex environments. We are on a mission to bridge the gap between humans and machines for a safe and sustainable future. Today, our technology is embedded in next-generation vehicles, leading the way towards human-centric mobility through Driver Monitoring Systems (DMS) and multi-modal Interior Sensing solutions. In addition to automotive, we also have a Research division, which provides the world’s leading research organizations with high-fidelity eye-tracking systems and the iMotions bio-sensor software platform for human factors research, and Affectiva’s Emotion AI for Media Analytics.

This year at CES we are showcasing our latest and greatest automotive technologies. First is our state-of-the-art, multi-modal Interior Sensing AI, which combines the company’s industry-leading driver monitoring with cabin monitoring to provide deep, human-centric insight into what is happening inside a vehicle. The system understands the state of the driver, the cabin and the backseat passengers by detecting eye gaze, head movement, body posture, occupancy, activities, children, pets, objects and more. Powered by Affectiva’s Emotion AI, it also captures nuanced emotions, reactions and facial expressions. We also have our production-grade, low-cost Driver Monitoring System. This system is fully GSR and Euro NCAP compliant and provides intelligent safety features that detect driver states and behaviour. And lastly, we have AIS, our new end-to-end driver monitoring system for fleet and aftermarket applications. AIS uses Smart Eye’s proven automotive-grade DMS software for OEMs, integrated with purpose-built hardware to deliver high intelligence, great flexibility and superior performance.

The idea of a car watching the driver is not new. We have seen driver alert systems being offered by a number of OEMs using different guises. What is your USP? 

Smart Eye has over two decades of experience building AI-based eye-tracking technology and delivering automotive-grade Driver Monitoring Systems, as proven by 89 production contracts with 13 global OEMs. The idea of a car watching the driver is not new, but what is new is the need to also understand what is happening inside the entire cabin, beyond just the driver. This is evidenced by the rapidly evolving Interior Sensing market, where advanced AI, computer vision and other sensing modalities are applied to measure the state of the driver, the cabin and the occupants in it. Smart Eye Automotive Interior Sensing, fueled by Affectiva’s Emotion AI, detects not only nuanced emotions, reactions and facial expressions but also occupancy, activities, children, pets and objects within a vehicle. We are excited to be at the forefront of the evolution from DMS to Interior Sensing, bringing to market our unparalleled automotive-grade Interior Sensing solution, better and faster than the competition.

As the automotive industry shifts toward higher levels of driver autonomy, what are the opportunities for your business? 

As the industry shifts toward higher levels of driver autonomy, we’ll see automakers continue to add high-tech features to stand out from the competition. There will be an opportunity here to use Interior Sensing to provide differentiated mobility experiences that enhance wellness, comfort and entertainment.

For example, using insight gathered via Interior Sensing, vehicles could tailor in-cabin entertainment. In both owned and ride-sharing vehicles, automakers can leverage the technology to serve up content based on riders’ engagement, emotional reactions, and personal preferences.

We believe that next-generation vehicles will change the relationship between people and cars, with the in-cabin experience becoming the most important consideration to consumers. Therefore, there is a big opportunity for Smart Eye and our technology to be a part of that evolution, giving both drivers and occupants a more personalized, comfortable and enjoyable ride.

Driver monitoring for fatigue and distraction has become a major focus of automotive safety regulators and governments worldwide. This trend looks set to continue in SAE Level 2 (partial) and Level 3 (conditional) semi-autonomous vehicles. To what extent are regulations driving increasing demand for DMS (Euro NCAP, EU GSR, SAFE Act of 2020)?

We’ve definitely seen an increase in demand for DMS as a result of global safety and government regulations. In Europe, the European New Car Assessment Program (Euro NCAP) updated its protocols and began rating cars based on advanced driver monitoring. To get a five-star rating, carmakers will need to build in technologies, such as DMS, that check for driver fatigue and distraction.

And starting in 2022, Euro NCAP will award rating points for technologies that detect the presence of a child left alone in a car, potentially preventing tragic deaths by heatstroke by alerting the car owner or emergency services. To achieve this, we’ll see automakers moving the camera to the rearview mirror and deploying Interior Sensing so they can monitor more than just the driver. And, with this new perspective, OEMs and Tier 1s can develop systems that detect not only people’s emotions and cognitive states, but also their behaviours, activities, and interactions with one another and with objects in the car.

I guess that in-cabin sensing has more potential than driver monitoring. Consumer technologies such as fitness trackers have been popular for some time, monitoring our heart rates, performance and sleep, but how might the renewed focus on wellness translate to the automotive space? 

There is an opportunity for Interior Sensing to transform the in-cabin experience. When it comes to wellness specifically, think about drivers and passengers who have daily commutes. In the mornings they may feel groggy and worried about the day ahead, and in the evenings they may get frustrated by being stuck in rush-hour traffic. But what if they could step out of their vehicles feeling better than when they entered?

With Interior Sensing AI, vehicles could provide a customized environment based on occupants’ emotional and cognitive states. In the morning, they may prefer a ride that promotes alertness and productivity, whereas in the evening, they may want to relax. The system could learn an occupant’s preferences and cause the vehicle to adapt accordingly, helping to enhance overall wellness and comfort. 

ADAS is a fast-evolving landscape. While we are seeing an acceleration of level 1 and 2 driving automation, there are delays in higher levels due to the lack of an established regulatory framework and the technical challenge of providing safety in all driving situations. In terms of Level 2+ for cars in the near future, how do you see the market evolving? 

With automakers turning the cameras inward, I think that the biggest focus for Level 2+ vehicles in the near future will be the adoption of interior sensing systems, to gain a better understanding of what’s happening inside a vehicle. Many of the existing DMS on the market use a camera mounted on the steering wheel column, tracking the driver’s eye movements and blink rates to determine whether the person is impaired, perhaps distracted, drowsy, or angry. But telling whether a driver is truly impaired is a tricky task: you need more than their head position or blink rate, you need the full context. This is where the need for interior sensing, and not only driver monitoring, comes in.

With Level 2 and 3 vehicles, interior sensing plays a key role in addressing the handoff challenge. When it senses driver fatigue, distraction, or other impaired states, the system can determine whether the car must take over control from the driver. And when the driver is alert and engaged, the vehicle can hand control back.

How are you feeling about 2022 – both in terms of business prospects and more generally? 

This past year was one of growth for the Smart Eye team, with the acquisitions of both Affectiva and iMotions. There has always been a synergy between our three companies, and we have a long history of working together. By joining forces, we can build on these synergies, capitalize on our shared vision and values, and set a path for widening the possibilities in multi-sensor integration solutions across verticals. I am excited to head into 2022 with the Smart Eye team positioned as a true, integrated powerhouse in delivering unparalleled insights into human behaviour.