On a freeway close to Magna’s North American headquarters in Troy, Michigan, we got the opportunity to observe the supplier’s advanced driver assistance technologies in action. When the moment was right – and the lanes clearly marked – our driver, Chris Van Dan Elzen, Global Product Director, Far Field Driver Assistance Systems, Magna Electronics, took his hands off the steering wheel and let the Cadillac ATS steer itself. With the demo car safely parked, Matthew Beecham learned more about these novelties and what else is in the pipeline.

By when do you think a truly self-driving, mass-produced car could be available?

I guess I need to ask your definition of that, because I see more autonomous functions coming. By 2020, you’re going to see cars that can pull out of a parking spot and execute parts of a drive under your command; you tell it to do it, but it’s still a car that you would drive.

A truly self-driving car would be one in which you can go on a commute and feel safe to have a nap, do a crossword or travel across Michigan in winter with your hands off the steering wheel all while feeling safe and comfortable.

I think we’re up to about 350 million people in the United States, so who feels comfortable behind the wheel is a broad question as well. There are people who are going to trust it where they shouldn’t, and there are people who won’t trust it when they could, so that’s going to exist through this whole time.

We’re going to have systems in the car that look for driver participation. We are going to have a hands-off capability; as soon as you have hands-off capability, we’ve got to start talking about what manoeuvres we can handle and how long we have to handle them. If I need the driver to come back, are we talking about 10 seconds or 10 minutes?
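
To make that takeover question concrete, here is a minimal sketch of the kind of escalation logic such a system might use. The timings, function names and fallback action are illustrative assumptions, not a description of Magna’s implementation.

    import time

    # Illustrative takeover-budget values; a real system would derive these
    # from the manoeuvre being handled and the operating conditions (assumption).
    TAKEOVER_BUDGET_S = 10.0   # time the driver has to come back
    WARN_INTERVAL_S = 2.0      # how often to re-prompt the driver

    def driver_hands_on() -> bool:
        """Placeholder for a steering-torque or camera-based driver monitor."""
        return False  # stubbed: pretend the driver never responds

    def request_takeover() -> bool:
        """Prompt the driver and wait up to the takeover budget.

        Returns True if the driver takes over, False if the budget expires
        and the vehicle must fall back to a minimal-risk manoeuvre.
        """
        deadline = time.monotonic() + TAKEOVER_BUDGET_S
        while time.monotonic() < deadline:
            if driver_hands_on():
                return True
            print("Please take the wheel")  # escalate: chime, seat vibration...
            time.sleep(WARN_INTERVAL_S)
        return False

    if __name__ == "__main__":
        if not request_takeover():
            print("No driver response: executing minimal-risk manoeuvre, e.g. pulling over")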

We don’t know what we don’t know. We’re going to test these cars in a lot of manoeuvres and we’re going to find some unique situation in some back corner somewhere; that’s always going to be the case. So we’re going to test in every condition we possibly can and make it as safe as we can. The easiest thing is to say we would like a driver to participate. The other side might be a Google, which is working to jump straight to a fully autonomous vehicle and have a car with no steering wheel. The timeline for those two approaches is, I think, quite different.

It sounds like you enjoy thinking through how people would react in those driving situations?

Absolutely, you have to play out all the scenarios and you have to think about all of it. Not only that, you have to think about the human factors in this. We’re not just talking about delivering groceries; we’re talking about people, and whether or not they are comfortable with this. Part of this will get into segmentation. When you look at it, there will be people who are comfortable, for whom autonomous functions work, as well as people who aren’t comfortable, and we need something else for them.

Could you give us a roadmap of what Far Field expects to see in the future?

Yes. We believe cameras are a foundational technology. With the camera we can see lane markings, speed-limit signs and stop signs; we can see vehicles, cyclists and pedestrians, headlights and taillights, and so on. We can squeeze the most out of cameras: surround cameras, back-up cameras and the one in the windshield. Those will be something of a minimum requirement, and from there we would love to add more, with consideration to the vehicle segment and costs. We would love to add ranging to the system, including front and corner radars, ultrasonic and lidar. Other technologies that would be great to add include vehicle-to-vehicle communication and local map information. Weather information and local surface conditions would be great as well.

We would love to bring all of that in, but if you boil it down to the minimum, we would put a camera on the car. We’re going to see entry-level college-kid cars and we’re going to see cars all the way up to ultra-luxury, six-figure cars, and each has a different price point it can hit for technology. We want something for all of these cars.
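
As a rough sketch of that scale-by-segment idea, the tiers might look something like the following; the segment names and sensor groupings are assumptions for illustration, not Magna’s actual product tiers.

    # Illustrative sensor tiers, from the camera-only baseline the interview
    # describes up to a full suite; segment names and contents are assumptions.
    SENSOR_TIERS = {
        "entry":   ["windshield_camera"],
        "mid":     ["windshield_camera", "backup_camera", "front_radar"],
        "premium": ["windshield_camera", "backup_camera", "surround_cameras",
                    "front_radar", "corner_radars", "ultrasonic"],
        "luxury":  ["windshield_camera", "backup_camera", "surround_cameras",
                    "front_radar", "corner_radars", "ultrasonic", "lidar",
                    "v2v", "local_map", "weather_feed"],
    }

    def suite_for(segment: str) -> list[str]:
        """Return the sensor suite for a segment, defaulting to the
        camera-only baseline when the segment is unknown."""
        return SENSOR_TIERS.get(segment, SENSOR_TIERS["entry"])

    print(suite_for("mid"))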

Could you update us on what stage of development and commercialisation EYERIS is at?

Today in the Far Field segment we are in production with our third-generation front camera. It is now up to megapixel resolution with a 52-degree field of view. This allows us to see vehicles, pedestrians, signs and other objects at much greater distances than any previous generation, as well as to see them sooner from both sides. The quality of the features has improved, as well as the number of features. Our fourth generation is in development and will push this even further, with higher resolution and a 100-degree field of view.
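
One way to see why the fourth generation needs higher resolution is pixels per degree, which governs how far away an object can be resolved. The imager width below is an assumed value purely for illustration; the interview only says "megapixel" for the third generation.

    # Back-of-envelope angular resolution: pixels per degree across the FOV.
    # Imager widths are assumptions; the article says only "megapixel" (gen 3)
    # and "higher resolution" (gen 4).
    GEN3_WIDTH_PX, GEN3_FOV_DEG = 1280, 52   # assumed ~1 MP imager
    GEN4_FOV_DEG = 100

    gen3_px_per_deg = GEN3_WIDTH_PX / GEN3_FOV_DEG
    print(f"Gen 3: {gen3_px_per_deg:.1f} px/deg")           # ~24.6 px/deg

    # Width a gen-4 imager would need just to match gen 3's angular
    # resolution over the wider 100-degree field of view.
    needed = gen3_px_per_deg * GEN4_FOV_DEG
    print(f"Gen 4 needs >= {needed:.0f} px wide to match")  # ~2462 px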

In the Near Field segment we are now in production with higher-resolution cameras in both rear-vision and surround-vision products. We have also recently launched trailer-backing capability with Ford on the F-Series trucks. This product line is also adding capability in the area of image processing, with object and pedestrian detection, automated parking and more.

If we look back over the last three or four years at what was predicted in terms of the application of camera-based technologies, how has that actually played out in North America?

The remainder of this interview is available in just-auto’s Global light vehicle safety systems market – forecasts to 2030.