Sachin Lawande

According to Harman, the next generation of sat-nav software will be more user-friendly and use a head-up display (HUD). The conventional screen will be replaced by technology known as 'augmented reality', which projects a computerised image on to the base of the car windscreen directly in front of the driver, and which can react in real time to the road outside. To find out how, Matthew Beecham talked with Sachin Lawande, Harman's Executive Vice President and President, Infotainment Division.

Harman presented an advanced HUD concept at a recent event in Karlsbad, Germany. Can you describe the system?

Harman is focused on delivering rich content and driver assistance in a safe and intuitive way, whatever the vehicle platform. The HUD is one of our more advanced high-end technologies, seamlessly integrating navigation and car data onto the windscreen. In fact, we call this Augmented Navigation. Technologies such as the touch screen offer a good level of user interface, but there are drawbacks. Invariably, the screen is in the centre console or the cluster, meaning the driver has to take their eyes off the road. With driver workload set to increase, we want to avoid overwhelming them and make things more natural, more intuitive.

Touch screen is proven in the market. Why not develop that technology?

Touch screen technology has been around for a long time. It offers possibilities in certain applications, but it still requires the driver to look at a second screen to see where they are touching, which demands too much of the driver's attention whilst driving.

But Apple has announced a series of patents in the last few days with touch screen a key element ...

The Apple patent contains technologies such as gesture control, tactile feedback and infra-red sensing that have been used by Harman and fellow automotive tier ones for many years. These technologies have already been widely used and do not impact the activity of trusted suppliers in the automotive industry.

Consumer device migration into the vehicle has increased rapidly in the past five years, and consumer electronics developers such as Apple are naturally keen to explore the in-car arena. But this is a complex environment requiring well thought out solutions. The existing technologies and applications identified in the patent don't offer the solutions that car makers and legislators demand.

Coming back to your Augmented Navigation, how does it work?

Our system relies on a forward-facing camera that feeds the infotainment system with data collected from images of the route ahead. The infotainment system uses image processing to take the camera's feed and identify the road in a '3D space'. A map-matching algorithm in the head unit then detects the features of the road and combines them with the navigation data and positioning subsystem to construct a 3D virtual model of the route ahead. Route guidance markers, directional arrows, traffic-related data and notifications indicating environmental and road features are superimposed on the 3D image, which is then displayed on the HUD, placing the information within the driver's line of sight.
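In rough outline, the per-frame pipeline described here - camera feed, image processing, map matching against navigation and positioning data, then projection of markers on the HUD - could be sketched as follows. All names and data structures are illustrative assumptions made for this article, not Harman's implementation.

from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class RoadFeature:
    name: str                              # e.g. "junction", "lane_boundary"
    position: Tuple[float, float, float]   # point in the camera's '3D space'

@dataclass
class GuidanceMarker:
    kind: str                              # e.g. "turn_arrow_right", "lane_hint"
    position: Tuple[float, float, float]

def detect_road_features(camera_frame) -> List[RoadFeature]:
    # Stand-in for the image-processing step that identifies the road in 3D.
    # A real system would run lane and edge detection on the camera feed here.
    return [RoadFeature("junction", (0.0, 40.0, 0.0))]

def map_match(features: List[RoadFeature], route) -> List[GuidanceMarker]:
    # Combine detected road features with the navigation route to place
    # guidance markers on the 3D model of the road ahead.
    markers = []
    for feature in features:
        for manoeuvre in route:
            if manoeuvre["at"] == feature.name:
                markers.append(GuidanceMarker(manoeuvre["action"], feature.position))
    return markers

def render_to_hud(markers: List[GuidanceMarker]) -> None:
    # Stand-in for the HUD projection step: superimpose each marker within
    # the driver's line of sight.
    for marker in markers:
        print(f"HUD: draw {marker.kind} at {marker.position}")

# Per-frame loop: camera -> feature detection -> map matching -> HUD overlay.
route = [{"at": "junction", "action": "turn_arrow_right"}]
render_to_hud(map_match(detect_road_features(camera_frame=None), route))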

What benefits does this integrated system bring?

Augmented Navigation is primarily focused on increasing the safety of the driver by making the required information readily available without requiring additional input from the driver or passengers. Thanks to the HUD, the system can project guidance instructions on to the windscreen. For example, a lane guidance recommendation will tell the driver when it is safe to change lanes during motorway driving. With information displayed in real time and directly in front of the driver, Augmented Navigation negates the need for the driver to glance at a dashboard or external display that might be on the limit of peripheral vision. Presenting navigational information in this way enhances driving safety because the driver's attention stays on the route ahead.

Harman has conducted considerable research in this area, and in driving tests of our technology we have found that people have a significantly lower cognitive load when deciding when and how to manoeuvre - for instance, during lane changes or turning.

With information projected on to the HUD or a second screen, drivers do not need to mentally match a representation of the road - say a map - with the road ahead in order to understand where they are going. From a single viewpoint, drivers can simply see the road ahead with the superimposed navigation instruction and know precisely where to make a turn or change lanes. We have found the benefit is greatest during complicated manoeuvres such as negotiating a busy roundabout or junction.

What were the challenges in developing this system?

From a technical point of view, displaying detailed 3D information in real time on the HUD has been very challenging; precisely projecting 3D arrows that follow the geometry of the road requires careful calculation of the vehicle's position that continuously compensates for the car's movement in real time. This requires a fair amount of computing power, and if the processing cannot keep up then a delay is experienced in the shift of position of the directional arrows. However, processing tasks can be offloaded onto other systems in the car; for example, adaptive cruise control systems already feature sophisticated vehicle detection technology. The information from these systems can be used to prevent displayed guidance arrows from overlaying other vehicles on the road.
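One part of that idea - using vehicle detections from, say, the adaptive cruise control sensor to stop guidance arrows being drawn over cars ahead - could look something like the minimal sketch below. The coordinate convention and function names are assumptions made for illustration, not Harman's implementation.

from typing import List, Tuple

# (x_min, y_min, x_max, y_max) of a detected vehicle in HUD screen coordinates
Box = Tuple[float, float, float, float]

def overlaps(point: Tuple[float, float], box: Box) -> bool:
    # True if a projected arrow position falls inside a detected vehicle's box.
    x, y = point
    return box[0] <= x <= box[2] and box[1] <= y <= box[3]

def filter_arrows(arrow_positions: List[Tuple[float, float]],
                  vehicle_boxes: List[Box]) -> List[Tuple[float, float]]:
    # Keep only arrows that would not be drawn on top of a detected vehicle.
    return [p for p in arrow_positions
            if not any(overlaps(p, box) for box in vehicle_boxes)]

# Example: the second arrow falls inside a detected vehicle's box and is dropped.
print(filter_arrows([(100.0, 200.0), (320.0, 240.0)],
                    [(300.0, 220.0, 360.0, 280.0)]))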

As the HUD projects images on to the car's windscreen, there can be severe limitations on the amount of windscreen space available for projecting directional images. Depending on the car's design, this can hamper the functionality of the HUD. As the technology develops, we expect to be able to overcome such an issue in the near future.

In scenarios where a HUD may not be appropriate, a second screen in the driver's line-of-sight can be used to display a real-time image of the road interlaced with information, graphics and navigation instructions provided by the Augmented Navigation system.

What about the drivers who could feel that the Augmented Navigation system is overly intrusive and distracting in itself?

Harman has considered how to present information from the system to the driver in a fashion that is intuitive, avoids over-complicating what is displayed and does not overwhelm the driver with distracting data. For example, we have considered the priority of the warnings supplied to the driver; multiple warnings alerted at the same time need to be avoided so that the driver is not overloaded with information. It is worth noting that we can combine ADAS features, such as collision warning and lane departure warning, with this. In critical situations, we can warn the driver using the HUD and override other information displayed.
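The kind of priority arbitration described here - only the most critical warning reaching the HUD, with safety-critical ADAS alerts overriding routine navigation content - can be sketched as follows. The priority values and warning names are assumptions for illustration, not Harman's actual scheme.

WARNING_PRIORITY = {
    "collision_warning": 0,      # most critical: overrides everything else on the HUD
    "lane_departure": 1,
    "lane_change_advice": 2,
    "route_guidance": 3,         # routine navigation overlay
}

def select_hud_content(active_warnings):
    # Return the single highest-priority item to display, or None if nothing is active.
    if not active_warnings:
        return None
    return min(active_warnings, key=lambda w: WARNING_PRIORITY.get(w, 99))

# A collision warning and a routine guidance arrow fire together:
# only the collision warning reaches the HUD.
print(select_hud_content(["route_guidance", "collision_warning"]))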

To further prevent driver distraction, Harman is undertaking many tests on the HMI (human machine interface) design. We are looking at what type of 'markers' work well in all driving conditions, as well as selecting the most appropriate size, colour and timing for when they should appear on the windscreen.

Going back to the touch screen and Apple's suggestion of a tactile response, we still think our solution offers more flexibility and choice over what data to display and when.

When can we see augmented reality in passenger cars?

The remainder of this interview is available on just-auto's QUBE research service
