Continuing just-auto’s series of interviews with tier-one suppliers at the 2015 IAA, Matthew Beecham met with Dr Christian Amsel, Member of the Executive Board, Business Division Electronics, Hella, to enquire about tomorrow’s advanced driver assistance system (ADAS) cameras and sensors.

Could you tell us about some of the ADAS technologies that Hella is highlighting this year here at the IAA and your message?

This year our IAA motto is “making sense(s)”. We think that the connected vehicle requires additional senses, such as a sense of touch, smell and sight, to allow its integration into a digital network of “mobility units”. In today’s vehicles this is already realized through lighting, camera and radar technology, for example from Hella. As vehicles become more deeply integrated into networks based on principles such as vehicle-to-vehicle or vehicle-to-infrastructure communication, the vehicle needs to extend its senses to derive sufficient information about its environment and draw up a “digital environmental map”. More specifically, this digital map should not only include objects like pedestrians, vehicles or road boundaries, but also environmental conditions such as air quality, light conditions or rain intensity.

Furthermore, we think that connected vehicles will interact more closely with their users through smartphone integration or smart lighting functions. This is where Hella’s vision of “making sense(s)” comes in: we provide connected vehicles with (human) senses that allow them to interpret and interact with the environment. To convey our vision at the IAA 2015, we have sketched five consumer-level use cases, which reflect how connected vehicles could be enhanced by innovative electronics and lighting technologies. In the use case “damage detection”, the consumer learns how a daily-life situation – like a car being scratched in the parking lot – can be handled independently by the connected vehicle: the car detects the damage, illuminates the environment, takes a picture of the incident, informs the driver via smartphone and sends the case to the insurer for further processing – this is how innovative technology can make our lives easier. This is Hella’s motto “Technology with Vision” in action.

To what extent are you seeing the proliferation of sensors around the car?

There will be some core technologies in the context of autonomous driving in the future, and those core technologies are definitely going to be camera and radar. But we also think that LIDAR technology will be needed. However, each technology has advantages and disadvantages. Radar, for example, is more or less weather independent – fog, rain and so on. Camera technology has advantages as well: you can detect traffic signs and lane markings with it, which are needed for autonomous driving. No other technology can do this, so you really need a camera for it. And laser sensors are needed in city and urban environments where you have to be aware of all the pedestrians – a very complex scenario. Here you need a laser scanner capable of detecting any object in front of or around the car.

Of those technologies, none will dominate? You need them all for different reasons?

Yes, that’s right. A camera can offer most of the functions but is limited in performance, such as detection range. It is also heavily dependent on, for example, weather conditions and daylight. If you take safety-relevant functions such as braking into account, camera technology as a single technology is no longer sufficient. You really need radar, either as a stand-alone sensor or fused with the camera.

In my opinion radar and camera are the two core technologies. We now expect LIDAR (laser scanner technology) as an additional technology, for example for pedestrian detection in cities and for other functions in the context of autonomous driving, where you still need some redundant technologies for safety reasons. We will have to see what the 79 GHz technology with its higher bandwidth can do and whether it will be able to replace, for example, the laser.

Before moving on, is there anything else you would like to highlight in terms of your main messages from IAA?

We also think that the car needs more sensors, for example crash sensors, in addition to camera and radar technology.

Imagine you are sitting in a restaurant and someone bumps into your car in the parking lot. Now it would be nice to have a technology which would, for example, detect this accident and send a message to your mobile phone. We have developed such a technology at Hella. With it the car will automatically inform you that somebody has hit your car and you can go outside to clarify the situation.

If you look at Volkswagen you will find a new application, “guided parking”, based on the front camera. The driver has to teach the car the route to the garage only once. The next time, after getting out of the car, you don’t have to drive it to the garage yourself; you just tap the roof of the car three times. This is enough to tell the car, ‘Please go into the garage’. This is just one example of a car with additional senses.

Let’s say that someone bumped into your car, left without stopping and the data has then been recorded on the owner’s mobile. Who owns that data and can police have access to it?

I can only talk about the technical side. From a purely technical point of view, in the case of an accident, pictures from the surround-view cameras can be taken and stored in the car. Who owns this data and for which purposes it can be used is a different topic that has to be answered by customers and legislation.

How does this feature interact with your lighting or electronic solutions?

We have dedicated use cases where we already use the information from sensor technologies and combine it with lighting applications. If an accident happens or somebody hits your car, this event will be recognised and you can be informed via your mobile phone. At the same time, the exterior lighting is activated immediately and the camera takes a photo. The photo is stored in the car and you can use the information in case the person who caused the accident has disappeared. This is of course an important use case.
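To make that sequence concrete, here is a minimal sketch of the reaction chain in Python. It is purely illustrative: the function names and interfaces are hypothetical and do not represent Hella’s actual software.

# Hypothetical sketch of the parked-car reaction chain described above:
# illuminate the scene, photograph it, store the evidence, notify the owner.
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Callable

@dataclass
class DamageEvent:
    timestamp: str
    photo: bytes

def handle_impact(activate_lights: Callable[[], None],
                  capture_photo: Callable[[], bytes],
                  store: Callable[[DamageEvent], None],
                  notify_owner: Callable[[str], None]) -> DamageEvent:
    activate_lights()                                   # exterior lighting on
    event = DamageEvent(
        timestamp=datetime.now(timezone.utc).isoformat(),
        photo=capture_photo(),                          # picture of the incident
    )
    store(event)                                        # keep the evidence in the car
    notify_owner("Your parked car was hit; a photo has been stored.")
    return event

The same chain could equally be triggered by any other crash-sensor event; the point is simply that the sensing, lighting and telematics steps are handled as one coordinated response.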

There are other use cases which will really improve comfort. Rental car companies are also interested in this. They would of course like to know if an accident has happened while the car was rented out; so far this cannot be easily verified. But now, when the car is returned, the company can easily check by means of a diagnostic device whether an accident occurred during the rental period. If this is the case, the car can be checked at the garage before the next driver takes it over. This will make car hiring a lot safer.

Lots of interesting possibilities using some clever technology, but I guess you need to direct the time and effort towards what the customer actually wants?

For us as a supplier it is important to find out what kind of functions a customer expects from a technology. Discussions with OEMs and end customers are therefore essential for us. We have noticed that the complexity is increasing more and more. We are offering something that can be called a single sensor. This sensor can be applied in so many different ways. It has so much potential for innovative applications, like the car rental example I just mentioned.

What is your vision of the connected car?

The vision of the connected car in 2020 is that cars and trucks will be far more autonomous. I clearly see benefits for the truck segment but also for the passenger vehicle segment. Driving will become much safer and more comfortable. It is hard to foresee how the penetration of vehicle-to-infrastructure technology will proceed in all the different countries. The question is not only one of standardising protocols and interfaces, but also of who will invest in the infrastructure.

In the context of the very fast 5G mobile phone standards, the benefits for the driver will increase rapidly. Relevant information, for example on traffic jams, weather conditions and so on, will be provided in good time. Each car will contribute information to the cloud.

Here in Hall 3.1 of the IAA we’ve just walked through the ‘Startup Zone’ made up of a collection of small and innovative companies presenting their business ideas. Has anything caught your eye?

Yes, I have seen an app which can be easily installed on your mobile phone, or on your children’s mobile phones. The GPS information collected by this app tells the parents where the child is at the moment. The child can even push a button to send a signal to the parents when he or she needs help. If you extend the functionality of this app, you could also use it in the car. You could then be informed in good time if there are children in or near the road. In this way, pedestrians could be classified as children or even as blind people. If you drive an electric car, blind people cannot hear it because an electric car doesn’t make any noise. If the app in the car detects blind people in the road, the noise generator in the electric car could be switched on to alert them that a car is approaching.

As we understand it, multi-function cameras are forming a core technology for advanced DAS. These cameras will be cheaper, more effective and easier to integrate than radar and infra-red systems. Is that correct?

Camera technology can easily be integrated behind the windshield. Yet today the opening angle of this camera technology is normally still limited to 50 to 60 degrees. So if you really would like to see whether there is traffic on the left or on the right at a corner or a crossing, the opening angle is not sufficient and you would need more cameras. But we have now realised that radar technology will take over this kind of crossing assistance by looking to the left and to the right. The new radar technology will be integrated behind the bumper. In this way the integration problem we had in the past can be solved. You no longer need these expensive devices in front of the radar system because the radar integration behind the bumper will be much easier and even invisible to the end user. This is quite handy, and I think will be a focus in the future. If you integrate four sensors at the corners you will even have a 360 degree surround view …

As effective as cameras are, they have a few drawbacks such as limited range and performance (due to rain, fog and varying light conditions). To what extent can sensor/data fusion help?

Every sensor technology has advantages and disadvantages. A camera can be disabled by snow or by dirt. A camera is also like the human eye: it cannot see through fog, whereas radar can. Therefore, if you combine these technologies you can eliminate the disadvantages of each, and the advantages of both are enhanced by the fusion.

Combining the two technologies can offer the end customer a higher availability of functionalities. If, for example, camera technology is available in 80-90 percent of road situations, adding radar technology can cover the remaining percentage, such as foggy situations or heavy rain.
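As a back-of-the-envelope illustration of that availability argument, the short sketch below assumes, purely for simplicity, that camera and radar outages occur independently; the 85 and 95 percent figures are illustrative, not Hella data.

# Rough fused-availability estimate: at least one sensor delivers a usable signal,
# assuming (for illustration only) independent camera and radar outages.
def fused_availability(p_camera: float, p_radar: float) -> float:
    return 1.0 - (1.0 - p_camera) * (1.0 - p_radar)

print(fused_availability(0.85, 0.95))  # 0.9925 -> most of the camera's gap is covered

In practice the two sensors fail in different conditions (fog and darkness versus snow-covered radomes), which is exactly why the fused availability ends up well above that of either sensor alone.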

With more and more sensors under development, what is the significance of 79 GHz UWB sensors, and how can 77 GHz MRR make a difference to the ADAS car?

Mid-range radar systems based on 24 GHz or 77 GHz do not show any significant difference in detection range; a range of about 100 m is typical. The difference between 24 GHz and 77 GHz radar sensors lies in the applicable bandwidth, which results in a difference in range resolution. For basic rear radar functions like blind spot detection (BSD) or rear cross traffic alert (RCTA), 24 GHz radar technology is currently the market mainstream. In the premium segment, 77 GHz is becoming popular due to add-on features that require higher range resolution and result, for example, in an active intervention in the lateral dynamics of the vehicle (e.g. active BSD intervention).

A significantly higher bandwidth is available when using 79 GHz radar. This leads to a higher resolution, which is usually required for near-field functions like parking. Therefore these types of sensors, or especially a smart combination of 77 GHz and 79 GHz sensors, will offer new kinds of radar applications in vehicles, such as parking, which is currently based on a different measurement principle.
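The resolution argument follows from the standard radar relation: range resolution is roughly the speed of light divided by twice the sweep bandwidth. The sketch below works this out with typical published bandwidth figures for the three bands; the exact numbers vary by regulation and sensor design and are not Hella specifications.

# Range resolution of an FMCW radar: delta_R = c / (2 * B).
# Bandwidth figures are typical industry values, not Hella data.
C = 299_792_458.0  # speed of light, m/s

def range_resolution_m(bandwidth_hz: float) -> float:
    return C / (2.0 * bandwidth_hz)

for band, bw_hz in [("24 GHz ISM", 200e6), ("77 GHz", 1e9), ("79 GHz UWB", 4e9)]:
    print(f"{band}: ~{range_resolution_m(bw_hz) * 100:.0f} cm")
# -> 24 GHz ISM: ~75 cm, 77 GHz: ~15 cm, 79 GHz UWB: ~4 cm

A few centimetres of range resolution is what makes the difference between merely detecting an obstacle and separating a kerb from a parked car, which is why the wideband 79 GHz sensors are the natural candidates for parking functions.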

Do you see trifocal cameras replacing mono cameras in the long term?

The remainder of this interview is available on just-auto’s QUBE light vehicle safety systems research service