One of the themes running through this year’s IAA was autonomous driving. While some OEMs believe such cars will appear on a road near you by 2020, others are not so convinced. Meanwhile, Continental is doing its part to help us get there.  The supplier used the show to highlight a number of innovations, including its latest advanced driver assistance systems (ADAS). To find out more, Matthew Beecham talked with Friedrich Angerbauer, Executive Vice President, ADAS.

As we understand it, multifunction cameras form a core part of advanced driver assistance systems. Some reckon that these cameras will be cheaper, more effective and easier to integrate than radar and infrared systems. Is that correct?


I would not say that they are easier to integrate, but they are more effective. Our fourth generation of multifunction cameras will be launched in 2015 and features a scalable concept. That means we can offer a multifunction camera that scales from simple functions like lane detection or light control all the way up to premium functions such as traffic sign recognition and pedestrian detection.

Do you see such cameras appearing in lower car segments?

Yes, we see these systems moving down to the lower segments. Our C-segment customers need functions that would give them a Euro NCAP five-star rating, so they are asking for pedestrian detection, which can be achieved using our mid-range camera.

You mentioned Euro NCAP. I guess the requirement for AEB will drive the ADAS market?

Yes, AEB is a driver through Euro NCAP. These requirements are the absolute drivers.

Some OEMs see an autonomous car on the road by 2020. What’s your view?

We believe there will be several steps of evolution with regard to automated and autonomous driving. We have already seen the first partly automated driving functions launched this year; for example, Mercedes launched functions like traffic jam assist with the new S-Class. We expect the car will perform more [autonomous] manoeuvres in future. But there is still a long, long way to go and many questions to be answered before we see a truly autonomous car on the road.

Do you see a Chinese appetite for these driver assistance systems?

Yes, there is an appetite. There is definitely interest in the market, but what we see is that currently there is no real business there. Everybody wants to equip vehicles with driver assistance systems, not in the cost-sensitive A and B segments in China but starting from the C segment. We see a trend towards ACC, which is seen as very valuable, but also towards blind spot detection and emergency braking. The only issue is that everybody wants to launch a pilot application; volumes are very low today, so there is no big business out there. We see demand for pilot applications because the local Chinese OEMs want to learn about these new technologies, but there is not really a market yet.

What about the Indian market?

I would say India is even further behind China; there is little interest in these systems there.

You mentioned traffic sign recognition earlier. How far has that technology advanced?

Yes, it’s a little bit tricky sometimes because you see a lot of signs on the road, and they are not really standardised. Just think about highway driving: you have an exit lane, some of these exit lanes are quite close to the highway, just one or two metres away, and there is a sign which says 60 or 80. How should a vehicle driving in the right lane know whether that sign applies to the exit road or to the highway? We still have to work on that situation, for example by merging the camera data with navigation systems so that we know there is an exit lane. So we are still doing the refinement, but the false positive and false negative rates are getting better and better with the newer systems now launched in the market. That’s number one.

Then, today we are quite limited by processing power. With the next generation we are increasing the calculation power by a factor of three to five, and with more processing power we can also detect more signs. We are working on US signs, which are rectangular and a little more difficult. We are also working on reading text on the fly, so that when the car is driving at 150 km/h we are still able to read these signs. That is what we are working on today, and of course our target is to be able to read all traffic signs all over the world within a few years.
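Angerbauer does not describe how Continental resolves the exit-lane ambiguity, but the idea of cross-checking a camera-detected speed-limit sign against navigation map data can be illustrated with a minimal sketch. Everything here is a hypothetical assumption for illustration only (the names SignDetection, MapContext and resolve_speed_limit are invented), not Continental's implementation or API.

```python
# Illustrative sketch only (not Continental's system): fuse a camera-detected
# speed-limit sign with navigation map context to decide whether the sign
# applies to the current carriageway or to a nearby exit ramp.
from dataclasses import dataclass
from typing import Optional


@dataclass
class SignDetection:            # hypothetical camera output
    speed_limit_kph: int
    lateral_offset_m: float     # sign position vs. vehicle centreline (+ = right)
    confidence: float           # classifier confidence, 0.0 .. 1.0


@dataclass
class MapContext:               # hypothetical navigation data
    on_highway: bool
    exit_ramp_nearby: bool      # an exit lane branches off within the next ~200 m
    ramp_side: str              # "right" or "left"
    map_speed_limit_kph: Optional[int]


def resolve_speed_limit(sign: SignDetection, ctx: MapContext,
                        current_limit_kph: int) -> int:
    """Return the speed limit the vehicle should adopt."""
    # Discard low-confidence detections and keep the current limit.
    if sign.confidence < 0.7:
        return current_limit_kph

    # If the map reports an exit ramp branching off on the same side as the
    # detected sign, and the sign's value is far below the current highway
    # limit, attribute the sign to the ramp and ignore it.
    sign_on_ramp_side = (ctx.ramp_side == "right") == (sign.lateral_offset_m > 0)
    if (ctx.on_highway and ctx.exit_ramp_nearby and sign_on_ramp_side
            and sign.speed_limit_kph <= current_limit_kph - 40):
        return current_limit_kph

    # Otherwise accept the camera reading; agreement with the map's own
    # stored limit (when available) would further raise confidence.
    return sign.speed_limit_kph


# Example: an "80" sign just to the right while cruising at 130 km/h on a
# highway with a nearby exit ramp is attributed to the ramp and ignored.
detection = SignDetection(speed_limit_kph=80, lateral_offset_m=1.5, confidence=0.9)
context = MapContext(on_highway=True, exit_ramp_nearby=True,
                     ramp_side="right", map_speed_limit_kph=130)
print(resolve_speed_limit(detection, context, current_limit_kph=130))  # -> 130
```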

Is there a danger of giving too much assistance to the driver?

The remainder of this interview is available on just-auto’s QUBE research service