Although uptake of ADAS features has increased, some drivers still have reservations and anxiety about the underlying concepts. To help address those concerns, Harman has developed augmented-reality HUDs, giving drivers and passengers a virtual field of view that shows upcoming speed cameras, pedestrian crossings and more.

Alongside the augmented-reality HUD, Harman recently unveiled its latest line-up of automotive products, which introduce new in-cabin features to empower drivers and passengers, as well as to enhance their safety and overall experience.

We spoke to Rhita Boufelliga, senior director, Next Gen Platform at HARMAN International, to learn more about the new technology releases and what benefits they offer to the industry.

Rhita Boufelliga

Just Auto (JA): Could you tell me about Ready Vision?

Rhita Boufelliga (RB): At Harman we really are starting to think about development differently. We want to strike a balance between products and projects, and we have started to think about what consumers are looking for and what their concerns are. We are shifting our mindset to put the consumer first, figuring out how to make the in-vehicle experience safer and how to dispel anxiety.

I think this goes back to consumer experiences, automotive grade; part of that is vision. Ready Vision’s motto is to keep your eyes and mind forward. It’s a suite of products, all related to this concept of keeping your eyes and mind forward.

Part of that is our augmented reality HUD. We have the head-up display and we have also combined it with AR software.

At the last CES, we launched the newest product in our line-up, which is called QVUE. QVUE is about showing all of the most important information at the bottom of the windshield. It’s a reflective display projected onto the lower part of the windshield. That provides you with that information, whereas the AR HUD provides you with augmented reality information ahead of you at a virtual distance.

How does this technology work in the vehicle?

The QVUE is scalable and modular. What we’ve launched are 8.8-inch displays that are reflected onto the black ceramic paint on the windshield. It brings the black ceramic paint up a bit and reflects the image onto that, so essentially you’re reflecting onto a black surface.

It is a plane projection, meaning that even though the image is in front of you and looks straight, it sits fairly close to the windshield; it is not out on the street.

The way that we’ve looked at it is focusing on keeping your eyes forward, because now you’re not looking down into your cluster. It provides you with a curated set of information, and by having the different displays you can show different kinds of information on each one.

In our main use-case example, we showed the cluster information on the display closest to the driver. Then we showed navigation information with alerts that may come up. We also had a collaboration with third-party apps such as The Weather Channel. We can also show the battery charging status or upcoming charging stations for your EV.

The third display was for entertainment: media information, metadata, those kinds of information.

The very interesting point about QVUE is that it has a wide line of sight, in the sense that all the occupants of the vehicle can see the information being projected at the bottom of the windshield. It’s accessible from the passenger side, the driver side or even from the rear seats.

It’s very flexible and open to different options. For example, we can also show an animation across all of the displays that can even help with your focus as a driver. This is the beauty of our ‘Ready’ products; they can work independently, but they can also work together. In this sense, for example, the QVUE software runs on our ‘Ready Upgrade’ platform and can be complemented by ‘Ready Care’. If Ready Care detects that the driver is experiencing a high cognitive load, the system can immediately switch the HMI and UX, changing what information it presents in order to match the state of the driver.
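
To illustrate the kind of integration described here, below is a minimal sketch assuming a hypothetical driver-monitoring signal and display-layout model; the names (CognitiveLoad, HudLayout, select_layout) are illustrative and are not Harman’s Ready Care or QVUE APIs.

```python
from enum import Enum, auto
from dataclasses import dataclass

# Hypothetical illustration only: none of these names come from Harman software.

class CognitiveLoad(Enum):
    LOW = auto()
    MODERATE = auto()
    HIGH = auto()

@dataclass
class HudLayout:
    show_cluster: bool      # speed and warnings on the display nearest the driver
    show_navigation: bool   # route guidance and alerts
    show_media: bool        # entertainment metadata on the third display

def select_layout(load: CognitiveLoad) -> HudLayout:
    """Choose which panels to populate based on the driver's state.

    Mirrors the idea in the interview: when driver monitoring reports a high
    cognitive load, non-essential content is dropped so that only
    safety-critical information stays in the driver's line of sight.
    """
    if load is CognitiveLoad.HIGH:
        return HudLayout(show_cluster=True, show_navigation=False, show_media=False)
    if load is CognitiveLoad.MODERATE:
        return HudLayout(show_cluster=True, show_navigation=True, show_media=False)
    return HudLayout(show_cluster=True, show_navigation=True, show_media=True)

# Example: a driver-monitoring callback feeding the HMI
print(select_layout(CognitiveLoad.HIGH))
```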

QVUE shows information in the bottom of the windshield

What is the current development timeline?

These are not concepts or just demos. These are real products that are ready to be deployed. The real point here is working with our customers to focus on their branding and any human-machine interface (HMI) or other customisation needed for their vehicle, rather than on the heavy lifting of getting a basic product ready from the ground up. That work is done.

What are some key benefits this solution offers to OEMs?

With the AR HUD, the major benefit is its applicability as we see ADAS and autonomous driving really taking off in the automotive industry. What the AR HUD allows you to do is smartly visualise all the information that you’re getting from ADAS.

It’s a way to present information from the autonomous driving technology to the vehicle occupants. You can help to show the user what is going to happen, which really helps dispel the uncertainty and anxiety that some consumers might have.

With this advanced technology and from the visualisation of ADAS you can see, for example, your lane departure warning or any other features superimposed on the lane. It’s a lot easier and more comfortable to understand; there’s less of a learning curve there – you see it and you understand it.
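
As a rough sketch of what “superimposed on the lane” could mean in software terms, the example below maps a hypothetical ADAS event to an augmented-reality overlay descriptor; the types and field names are assumptions for illustration, not part of Harman’s Ready Vision software.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch only: the event and overlay types are illustrative,
# not part of Harman's Ready Vision software.

@dataclass
class AdasEvent:
    kind: str             # e.g. "lane_departure", "forward_collision"
    lane_offset_m: float  # lateral offset of the relevant lane edge, in metres

@dataclass
class ArOverlay:
    shape: str                   # the graphic to draw in the virtual scene
    anchor_lane_offset_m: float  # where to anchor it relative to the vehicle
    colour: str

def to_overlay(event: AdasEvent) -> Optional[ArOverlay]:
    """Translate an ADAS warning into a graphic superimposed on the road.

    The point from the interview: rather than a beep or a cluster icon, the
    warning is drawn on the lane itself, so there is almost no learning curve.
    """
    if event.kind == "lane_departure":
        return ArOverlay("lane_edge_highlight", event.lane_offset_m, "amber")
    if event.kind == "forward_collision":
        return ArOverlay("braking_carpet", 0.0, "red")
    return None  # events with no visual treatment stay off the HUD

print(to_overlay(AdasEvent("lane_departure", lane_offset_m=1.6)))
```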

What are your predictions for this technology?

I think part of the ‘Ready’ product line is the concept of being ready for anything. What we mean by that is anticipating the needs of our consumers and customers, preparing for them ahead of time, and providing products that could even exceed their expectations, continuing to deliver consumer experiences, automotive grade.

We have just launched QVUE; this is our latest product and we’re continuing to work on enhancements to it, based on feedback from the customers we’re working with.

In that sense, we feel that the current products we are offering are still very relevant and brand new. Of course, from a software standpoint we are also adding features, for the QVUE use cases as well as for the AR HUD.

ADAS visualisation is a key point for the AR HUD. As we see advances in ADAS and new sensors coming through, we always keep up with how to visualise that information in a way that makes sense for the user and is not too cumbersome.

We also announced at CES that the current version of QVUE has a brightness of 2,000 nits, and we are working on another version with higher brightness for specific use cases.

Is there anything else that you would like our readers to know?

Sometimes we’re so involved in our day-to-day work that taking a step back and really thinking about what we are doing, and why we are doing it, is very interesting. We really see the windshield bridging the gap between the physical and digital worlds.

When I think of Ready Vision, I think of it from this full windshield concept; what are the experiences we can bring to really keep your eyes and mind forward and keep up with all this upcoming technology?

I see the windshield as a sort of bridge. For me, all the technology that comes together to fuse the physical and digital worlds is really where my passion is, whether it’s augmented reality or directly showing you all the information, to really provide you with a cohesive experience in the vehicle.

In addition to the visual side, we are also doing some work with audio. For example, using directional audio to understand where an emergency vehicle is coming from, and using sound to assist with that in an intuitive manner. We’re really trying to make the driving experience enjoyable and safe.
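
As a minimal sketch of the directional-audio idea, the example below derives simple stereo gains from the estimated bearing of an emergency vehicle so that an alert sound appears to come from that direction; the function and its parameters are hypothetical and are not drawn from Harman’s audio products.

```python
import math

# Hypothetical sketch, not Harman code: derive stereo gains that steer an
# alert chime toward the direction an emergency vehicle is detected from.

def stereo_gains(bearing_deg: float) -> tuple[float, float]:
    """Constant-power pan for a bearing relative to the vehicle's heading.

    bearing_deg: 0 = straight ahead, -90 = fully left, +90 = fully right.
    Returns (left_gain, right_gain), each in [0, 1].
    """
    # Clamp to the frontal plane and map [-90, +90] degrees to a pan of [0, 1].
    bearing = max(-90.0, min(90.0, bearing_deg))
    pan = (bearing + 90.0) / 180.0
    left = math.cos(pan * math.pi / 2)
    right = math.sin(pan * math.pi / 2)
    return left, right

# Siren detected roughly 40 degrees to the driver's right:
# the right channel is louder, cueing the driver to look that way.
print(stereo_gains(40.0))
```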