Continental is using the Consumer Electronics Show here in Las Vegas this week to present its vision of the mobility of the future along the megatrends of networking and automated driving. The supplier’s focus is on the connected car and “the entire living space of mobility in the intelligent city of tomorrow.” To learn more, we spoke to Werner Köstler, Head of Strategy & Business Development, Vehicle Networking and Information, Continental, and Christian Schumacher, Vice President Program Management Systems, ADAS, Autonomous Mobility and Safety, Continental.

What is the headline message that Continental is putting out here at the 2020 CES?

Werner Köstler: The slogan Continental is using here at CES 2020 is “Mobility Is the Heartbeat of Life”. Under this tagline, we are presenting numerous innovations that reflect the two major trends of the automotive industry: Connectivity and Automated Driving. Mobility is a basic human need and a healthy mobility ecosystem fuels our society, our economy and our world. As a technology leader, we play a key role in this ecosystem by engineering safe, secure and responsible technologies that move people, goods and information to sustain our work, our health and our future.

Looking around the CES this week, it seems like everything is connected to everything else. What is your vision of the connected car?

Werner Köstler: Vehicle connectivity is not only an add-on, but a key technology for the intelligent mobility of the future. The connected car is seamlessly networked in multiple ways: to the driver and passengers, other vehicles, mobile devices, the cloud and the infrastructure. This connected car system consists of networked hardware and software components. As part of the Internet of Everything, the connected car opens up completely new business models enabling mobility services. We provide technologies from products to services, protected by advanced cyber security solutions and over-the-air capabilities.

We’ve seen transparent A-pillars and now your transparent hood. What’s next?

Werner Köstler: The “virtual A-pillar” and the transparent hood are basically two independent innovative technologies.

The “transparent hood” function, awarded a CES 2020 Innovation Award, highlights what an intelligently integrated camera system is capable of. When driving slowly – such as when parking or off-roading – the ground beneath the engine compartment is displayed on the screen in the vehicle. With the help of the new optical information, a vehicle can be manoeuvred precisely and in a controlled manner in narrow parking spaces with high curbs, over speed bumps and potholes, or when off-roading over rocks and rough terrain. To the driver it looks as though the hood and the engine compartment beneath it are transparent.
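To illustrate the principle behind such a function (a minimal sketch only, not Continental’s implementation), the under-hood view can be approximated by buffering recent front-camera frames together with odometry and replaying the frame that was captured when the now-occluded ground was still ahead of the vehicle. All names and parameters below are illustrative assumptions.

```python
from collections import deque

class TransparentHoodBuffer:
    """Replays past front-camera frames so the ground now hidden
    under the hood can be shown on the in-cabin display.

    `hood_length_m` is the distance between the camera's lowest
    visible ground point and the front of the vehicle (assumed value).
    """

    def __init__(self, hood_length_m=2.0):
        self.hood_length_m = hood_length_m
        self.frames = deque()  # entries of (odometer_m_at_capture, frame)

    def add_frame(self, odometer_m, frame):
        """Store each new frame with the odometer reading at capture time."""
        self.frames.append((odometer_m, frame))
        # Drop frames that are too old to ever be replayed again.
        while self.frames and odometer_m - self.frames[0][0] > 2 * self.hood_length_m:
            self.frames.popleft()

    def under_hood_view(self, odometer_m):
        """Return the buffered frame captured roughly `hood_length_m` ago,
        i.e. when the ground now under the hood was still visible."""
        best = None
        for captured_at, frame in self.frames:
            if odometer_m - captured_at >= self.hood_length_m:
                best = frame          # newest frame that is old enough
            else:
                break
        return best

# Tiny usage example with synthetic "frames" (here just labels).
buf = TransparentHoodBuffer(hood_length_m=2.0)
for metres in range(6):
    buf.add_frame(float(metres), f"frame@{metres}m")
print(buf.under_hood_view(5.0))   # -> frame@3m, captured 2 m earlier
```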

For the “virtual A-pillar” we pair our interior camera and integrated OLED displays. The technology tracks the driver’s movement and displays an image of the vehicle’s exterior environment on those displays, enabling the driver to “see through” the A-pillar.
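The head-tracking geometry can likewise be sketched very simply (again an illustrative approximation, not the production system): from the driver’s head position, the angular sector hidden by the A-pillar is computed and mapped onto a slice of the exterior camera image for the pillar display. Coordinates and dimensions here are made-up assumptions.

```python
import math

def occluded_sector_deg(head_xy, pillar_near_xy, pillar_far_xy):
    """Angles (degrees) of the sight-line sector that the A-pillar blocks,
    as seen from the driver's head position. 2-D top-down geometry."""
    def bearing(p):
        return math.degrees(math.atan2(p[1] - head_xy[1], p[0] - head_xy[0]))
    a, b = bearing(pillar_near_xy), bearing(pillar_far_xy)
    return min(a, b), max(a, b)

def display_slice(image_width_px, camera_fov_deg, sector_deg):
    """Map the occluded angular sector onto a horizontal pixel range of the
    exterior camera image (camera assumed centred on the same view axis)."""
    lo, hi = sector_deg
    half = camera_fov_deg / 2.0
    def to_px(ang):
        return int(round((ang + half) / camera_fov_deg * image_width_px))
    return max(0, to_px(lo)), min(image_width_px, to_px(hi))

# Example with assumed positions (metres): head at origin, pillar ahead-left.
head = (0.0, 0.0)
pillar_near, pillar_far = (0.6, 0.35), (0.7, 0.55)
sector = occluded_sector_deg(head, pillar_near, pillar_far)
print(sector)                                # blocked viewing angles
print(display_slice(1280, 90.0, sector))     # pixel range to show on the OLED
```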

In the area of human-machine interface we are already working on various future trends. One example is the next generation of our Natural 3D Display, which we are now developing as a 3D Centerstack display. With this technology we generate an unprecedented 3D experience for all vehicle passengers without special eyewear. We will also be bringing our speakerless sound solution Ac2ated Sound to the next level, with exciting news that we will announce at CES 2020.

Could you tell us a little about your Ultra-wideband technology for keyless vehicle access?

Werner Köstler: Continental is using Ultra-wideband (UWB) to make keyless vehicle operation more secure and convenient. In future, UWB technology will complement the existing short-distance radio standards in keyless vehicle access via smartphones or conventional remote access keys. As the position of the key is determined with great accuracy, the car “knows” for certain that the driver is nearby. This prevents previous theft strategies from working, such as wirelessly extending the signal of a low-frequency remote access key (a relay or man-in-the-middle attack) to simulate the driver’s proximity.

Integrated into smartphones, the UWB technology enables new functions, such as secure hands-free access via smartphone or the option of parking vehicles remotely using a smartphone.  
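Why ranging defeats relay attacks can be shown with a simplified sketch (not Continental’s actual protocol): UWB measures the signal’s time of flight, so the physical distance to the key is known; a relay can only add delay, never remove it, so a distant key still measures as distant. The timing values and the 2 m threshold below are assumptions for illustration.

```python
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def uwb_distance_m(t_round_s, t_reply_s):
    """Single-sided two-way ranging: the car sends a poll, the key answers
    after a known processing delay `t_reply_s`; half of the remaining
    round-trip time is the one-way time of flight."""
    time_of_flight_s = (t_round_s - t_reply_s) / 2.0
    return time_of_flight_s * SPEED_OF_LIGHT_M_PER_S

def allow_passive_entry(t_round_s, t_reply_s, max_distance_m=2.0):
    """Unlock only if the key is physically within `max_distance_m`.
    A relay attack can forward the radio messages, but it can only add
    delay, so the measured distance never shrinks below the true one."""
    return uwb_distance_m(t_round_s, t_reply_s) <= max_distance_m

# Key genuinely ~1.5 m away: round trip = reply delay + 2 * (1.5 m / c).
t_reply = 200e-6
t_round_near = t_reply + 2 * 1.5 / SPEED_OF_LIGHT_M_PER_S
print(allow_passive_entry(t_round_near, t_reply))      # True

# Relayed key ~150 m away: the extra time of flight cannot be removed.
t_round_relayed = t_reply + 2 * 150.0 / SPEED_OF_LIGHT_M_PER_S
print(allow_passive_entry(t_round_relayed, t_reply))   # False
```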

We are hearing more about biometrics to open car doors and start engines. How do you see these applications unfolding?

Werner Köstler: Based on our systems integration know-how, we combine our keyless entry and start system PASE with biometric elements to increase comfort and security. This means, for example, that the presence of a valid key inside the vehicle is no longer enough to start the engine as the driver is also required to provide authentication using a fingerprint sensor. This two-fold authentication process significantly increases the vehicle’s anti-theft protection measures. Biometric elements also allow drivers to personalise their vehicles. The system is connected to an interior camera, which recognises the driver’s face and automatically personalises vehicle settings, such as seat and mirror position, music, temperature, and navigation, for the driver in question.
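The two-fold authentication logic itself is easy to sketch (purely illustrative, with made-up names and a stand-in for the secure fingerprint store): engine start is released only when a valid key is inside the vehicle and the fingerprint matches an enrolled driver, whose profile then drives the personalisation.

```python
from dataclasses import dataclass

@dataclass
class DriverProfile:
    name: str
    seat_position: int      # illustrative personalisation settings
    mirror_position: int
    temperature_c: float

# Enrolled fingerprints mapped to profiles (stand-in for a secure store).
ENROLLED = {
    "fp-template-anna": DriverProfile("Anna", 12, 7, 21.0),
}

def authorise_engine_start(valid_key_inside: bool, fingerprint_template: str):
    """Two-factor start authorisation: key presence alone is not enough;
    the fingerprint must also match an enrolled driver."""
    profile = ENROLLED.get(fingerprint_template)
    if valid_key_inside and profile is not None:
        return True, profile
    return False, None

ok, profile = authorise_engine_start(True, "fp-template-anna")
print(ok, profile)                                   # start allowed, profile applied
print(authorise_engine_start(True, "fp-unknown"))    # key alone is not enough
```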

Appropriate sensor topology for autonomous driving is a much debated topic. What is Continental’s stance?

Christian Schumacher: The number and design of the sensor setup depend strongly on the driving function and autonomy level to be implemented, so it is not possible to make a general statement about which sensor set is used. However, the higher the degree of automation, the greater the number of sensors installed and the diversity considered, especially with regard to redundancy and safety. For highly automated systems, a sensor set based on radar, camera and lidar will be necessary.

What are the challenges of LiDAR in its current form?

Christian Schumacher: Different sensors have different strengths. Systems for automated driving must be equipped with different sensor technologies, on the one hand to create redundancies and on the other to validate sensor data, so that a precise environment model can be created. The very computation-intensive processing and fusion of these complex sensor data is a challenge that needs to be solved.

A major advantage of the Hi-Res 3D Flash LIDAR sensor technology is that a 3D environment acquisition can be realised in real time utilising the global shutter principle. The technology enables a much more comprehensive and detailed picture of the entire vehicle environment, both day and night, and works reliably even in adverse weather conditions. Lidar sensors are therefore an important component for highly automated and driverless driving. We consider Lidar technology to be necessary, especially from the point of view of redundancy and ensuring functional availability under all conditions.
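In its very simplest form, fusing detections of the same object from several sensors can be sketched as an inverse-variance weighted average, so the more precise sensor dominates the fused estimate. This is a deliberately minimal illustration, not Continental’s environment model, and the variances below are assumed values.

```python
def fuse_measurements(measurements):
    """Inverse-variance weighted fusion of 1-D position estimates
    (e.g. longitudinal distance to the same object) from several sensors.

    `measurements` is a list of (value_m, variance_m2) tuples.
    Returns the fused value and its (smaller) fused variance.
    """
    weights = [1.0 / var for _, var in measurements]
    fused_var = 1.0 / sum(weights)
    fused_val = fused_var * sum(w * val for (val, _), w in zip(measurements, weights))
    return fused_val, fused_var

# Example: radar is precise in range, camera less so, lidar in between.
radar  = (42.3, 0.05)   # metres, variance in m^2 (assumed values)
camera = (43.1, 0.60)
lidar  = (42.5, 0.10)
print(fuse_measurements([radar, camera, lidar]))
```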

While we are seeing an acceleration of level 1 and 2 driving automation, there are delays in higher levels due to the lack of an established regulatory framework and the technical challenge of providing safety in all driving situations. In terms of Level 2+, could you summarise Continental’s position and product offering?

Christian Schumacher: So-called Level 2P systems offer the driver significant added value, more safety and comfort thanks to a larger functional scope and additional operational design domains. The large number of Level 2P systems in use on the road would give manufacturers and suppliers the opportunity to generate the critical amount of data needed for a more efficient validation and safeguarding of higher levels of automation. Level 2P systems can be set up in two ways. One is a premium variant (L2 Premium), which already takes Level 3 architectures into account and will realise the step to Level 3 in the future through software updates and, where necessary, additional sensors. The other is a performance variant (L2 Performance), with a stronger focus on competitiveness, based on traditional ADAS systems.

As a system supplier, we offer all relevant components for both assisted and automated driving from a single source. This includes sensors for environment detection based on radar, camera, lidar and ultrasonic technology. In addition, we have the necessary central control units (Assisted & Automated Driving Control Unit) with the computing power to implement the higher functional scopes, as well as end-to-end software solutions and system integration expertise.

In an autonomous car, is it all about processing power for sensor fusion or are there other aspects to make computing dependable?

Christian Schumacher: There are several aspects, but the main parts are sensor fusion, infotainment, and HMI functions. Depending on the architectural design of the automobile manufacturers, some functions will run on central servers and some in decentralised control units. However, it should be noted that particularly computationally intensive functions such as infotainment, HMI or driver assistance (environment recognition) will benefit from the computing power of high-performance computers. In addition, functions that have to be kept updateable, or even hybrid functions that run partly in the cloud, will take place on the central servers. Especially in infotainment, we will increasingly see such hybrid functions.

We hear a lot about the ‘last mile’ in terms of delivery to the customer. What’s your vision of how this will change and in what ways can Continental support it?

Christian Schumacher: With demonstrations like robot delivery, we want to present a vision of what a cascaded robot delivery approach could look like. In times of driver shortages and skyrocketing e-commerce, delivery companies could gain a lot of advantages from delivery robots. Moreover, by delivering parcels at night, empty roads and robocars that are not required for passenger transport at those hours could be utilised.

Industrialising the automation of goods delivery requires reliable, robust, high-performing, and best-cost technology – a mix perfectly reflected in the automotive equivalent of automation. It is this very profile of expertise that has made Continental one of the industry-leading suppliers of advanced driver assistance systems and vehicle automation.

Continental’s vision is to provide sustainable mobility solutions. In general, mobility means freedom, but the current infrastructure is reaching its limits. We lose time in traffic jams or use our multi-ton cars for small errands, like picking up groceries. There is a better way to get things done.

With last mile delivery robots, we add a fully autonomous robotic application to our mobility portfolio. Driverless vehicles like the CUbE and delivery robots share basically the same algorithms and might even share computing resources to perform their duties. We are therefore building up development platforms (CUbE, robots) to develop autonomous technologies for our customers and partners.