Continental is evaluating the next generation of an environment model to deliver a true-to-life, 360-degree view of a vehicle's surroundings.

The supplier notes that, to allow automated vehicles to assume control from drivers, the vehicle must continuously acquire, process, and interpret data, while also building up contextual knowledge. This is the only way to achieve sophisticated levels of automated driving capable of mastering anything from straightforward freeway driving to the highly complex urban environment.

A reliable environment model requires a range of information: about other traffic participants, about static objects such as road boundaries, about the vehicle's own precise location, and about traffic control measures. "In order for the system to acquire this information step-by-step, a range of sensors such as radars, cameras, and Surround View systems are needed," said Karl Haupt, head of Continental's Advanced Driver Assistance Systems business unit.

"The aim is to achieve an understanding of the vehicle's surroundings which is as good as or better than a person's own understanding. More range, more sensors, and the combination of acquired data with powerful computer systems will help to sharpen the view and are the key to achieving a consistent view of our surroundings."

Since each of the different surroundings sensors – whether radar, camera, or surround view system – has its own physical strengths and weaknesses, some applications may reach the limit of an individual sensor's capabilities. In addition to extra information, for example from the backend, other sensors are therefore needed to enhance reliability and robustness and to provide further redundancy.

"This is why we are working on a High Resolution 3D Flash Lidar, which is ideal for fulfilling the strict requirements regarding vehicle surroundings monitoring. The sensor captures and processes real-time 3D machine vision and does not contain any mechanical components," added Haupt.

Data from the various sensors can be processed either in the individual sensor or on a central control unit, constructing a high-precision environment model of the vehicle's surroundings. The greater the volume of data to be processed and analysed, the more computing power is needed.

This, in turn, drives the need for control units more powerful than those in use today to construct and manage the environment model. The environment model represents an intermediate software layer between the individual sensors and the different applications.

This layer contains data fusion and planning algorithms designed to enhance accuracy and reliability and to expand the field of vision beyond that of any individual sensor, acting as an abstraction layer between the sensors and the different functions.
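As an illustrative sketch of that idea (the names, sensor set, and noise values below are assumptions, not Continental's implementation), such a layer might combine per-sensor estimates of the same object by inverse-variance weighting, so that downstream functions see one fused object list instead of raw sensor outputs:

```python
from dataclasses import dataclass

@dataclass
class SensorEstimate:
    """One sensor's estimate of an object's longitudinal distance (metres)."""
    source: str        # e.g. "radar", "camera", "surround_view"
    distance_m: float
    variance: float    # sensor-specific measurement uncertainty

def fuse_estimates(estimates):
    """Inverse-variance weighted fusion: lower-noise sensors dominate.

    Returns (fused_distance, fused_variance); the fused variance is
    smaller than any single sensor's, which is the point of fusion.
    """
    weights = [1.0 / e.variance for e in estimates]
    total = sum(weights)
    fused = sum(w * e.distance_m for w, e in zip(weights, estimates)) / total
    return fused, 1.0 / total

# Radar is precise in range, the camera less so; the fused value
# therefore lands closer to the radar reading.
readings = [
    SensorEstimate("radar", 42.1, 0.04),
    SensorEstimate("camera", 43.5, 0.25),
]
distance, variance = fuse_estimates(readings)
```

Production systems use full tracking filters (e.g. Kalman-filter variants) rather than this single-shot average, but the weighting principle is the same.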

The central point for evaluating and interpreting all the acquired information is Continental's Assisted & Automated Driving Control Unit, which regenerates the environment model more than 50 times per second.
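A regeneration rate of more than 50 times per second implies a budget of under 20 ms per update cycle. A minimal fixed-rate loop sketch (hypothetical names; the real control unit's scheduler is not described in the source) makes that budget explicit:

```python
import time

CYCLE_HZ = 50                      # environment model rebuilt >50x per second
CYCLE_BUDGET_S = 1.0 / CYCLE_HZ    # i.e. a 20 ms budget per update

def run_cycles(n_cycles, rebuild_model):
    """Fixed-rate loop: rebuild the model each cycle, then sleep
    away whatever remains of the cycle budget."""
    for _ in range(n_cycles):
        start = time.perf_counter()
        rebuild_model()            # fusion + model update must fit the budget
        elapsed = time.perf_counter() - start
        if elapsed < CYCLE_BUDGET_S:
            time.sleep(CYCLE_BUDGET_S - elapsed)

# Toy stand-in for the real fusion step: record each cycle's start time.
ticks = []
run_cycles(5, lambda: ticks.append(time.perf_counter()))
```

In a real automotive ECU this would be a hard-real-time task with overrun monitoring rather than a sleep-based loop, but the cycle-budget arithmetic is the same.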

"Continental is developing and producing the necessary components and systems for automated driving worldwide – in the US as well as in Japan, China and Europe," noted the supplier.

"The engineers involved are working on six important building blocks to make this a reality: sensor technology, cluster connectivity, human-machine dialogue, system architecture, reliability, and the acceptance of automated driving."