While ZF remains excited about automated driving, fully realizing it will require more time and effort. The supplier therefore sees two distinct paths: Level 2+ for passenger cars, which will take a longer, perhaps more winding road; and cargo and people movers in defined geo-fenced areas using Level 4 and 5 technologies. Continuing our series of interviews at CES this week, we caught up with Aaron Jefferson, VP of Strategy, Marketing and Business Development, ZF Electronics and ADAS Division, to learn more.
What is the headline message that ZF is putting out here at the 2020 CES?
At the 2020 CES, ZF is featuring its future plans for automated driving under the banner "Automating Next Generation Mobility". This highlights ZF's efforts for personal passenger vehicles, where the company is concentrating on Level 2+ systems, including the industry's most affordable solution, ZF coASSIST. ZF also offers additional scalable solutions, including the coDRIVE and coPILOT systems. [More about Level 2+ technology below.]
Our approach to autonomy is to invest in technologies that will have near term benefits for the public in terms of personal mobility and goods delivery.
In addition, ZF continues to develop Level 4 systems for applications like commercial vehicles, people movers and robo-taxis, where there is a nearer-term business case.
Appropriate sensor topology for autonomous driving is a much debated topic. What is ZF’s stance?
Again, this is a matter of delivering the amount of sensing and processing power required for the functions and application needed. A system like coASSIST can be made affordable by using a forward-looking camera, corner radars and a lower-cost safety domain control unit. That combination delivers Euro NCAP 2020+ compliant safety along with the most popular ADAS safety and convenience features: Full Speed ACC, Lane Centering Control, Traffic Jam Support and Highway Driving Support with driver-initiated lane change. For higher Level 2+ up to Level 4 systems, the sensor sets become increasingly complex, adding next-generation mid-range or full-range radar, remote camera heads, image processing modules and, in some cases, lidar, together with higher levels of processing such as ZF's ProAI.
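The scaling Jefferson describes, from a lean camera-plus-corner-radar set up to a full Level 4 suite, can be sketched as a simple configuration table. The groupings below are illustrative only, inferred from the interview; they are not ZF product specifications.

```python
# Hypothetical sketch: how a sensor set might grow with automation level,
# loosely following the progression described in the interview.
# All groupings are illustrative, not ZF product specifications.
SENSOR_SETS = {
    "coASSIST (L2+)": [
        "forward-looking camera",
        "corner radars",
        "safety domain control unit",
    ],
    "higher L2+": [
        "forward-looking camera",
        "corner radars",
        "mid-range radar",
        "remote camera heads",
        "image processing module",
    ],
    "L4": [
        "surround cameras",
        "full-range radar",
        "lidar",
        "high-performance central processor",
    ],
}

def sensors_for(system: str) -> list:
    """Return the illustrative sensor list for a given system tier."""
    return SENSOR_SETS[system]
```

The point of the table is the cost story: the entry tier needs only three components, which is what makes a Euro NCAP-compliant Level 2+ system affordable.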
What are the challenges of LiDAR in its current form?
Delivering higher levels of perception and a highly accurate depiction of the 3D environment at a reasonable cost with greater reliability are primary challenges.
ZF continues to work with its partner Ibeo, which is planning to introduce the industry's first solid-state lidar within the next one to two years. It will feature higher degrees of object detection and greater reliability than scanning lidar.
While we are seeing an acceleration of level 1 and 2 driving automation, there are delays in higher levels due to the lack of an established regulatory framework and the technical challenge of providing safety in all driving situations. In terms of Level 2+ for cars in the near future, could you summarise ZF’s position?
ZF believes that Level 2 up to Level 2+ systems are the near-term future for passenger cars, for several reasons. As stated, there is no consistent global regulatory framework overseeing Level 3 and higher automation, which makes implementation very challenging and causes potential confusion for vehicle operators. Cost is also highly significant: the average consumer could not afford a Level 3-5 vehicle, given the number of sensors, processing units, redundant actuators and so on. In addition, testing and validating such vehicles on the world's roadways is extremely difficult; the more Level 2/2+ vehicles we can bring onto the road, the more opportunity there is to capture unusual use cases and then test and validate for them.
We’re hearing that level 2+ systems can deliver greater levels of comfort and safety. Can you give some examples of ZF’s L2+ technologies? And when will they be launched?
ZF is the system developer and integrator for its full range of L2+ systems including:
- ZF coASSIST – the cost-effective Level 2+ solution that helps meet Euro NCAP performance requirements while delivering popular Level 2+ ADAS functions such as traffic jam and highway driving support with Mobileye EyeQ front camera technology and ZF control functions
- ZF coDRIVE – extends the functionality of traffic jam and highway driving support. 360° surround camera perception and the processing capability of Mobileye’s EyeQ technology enable feet-free and hands-free driving, including automated lane changes and automatic overtaking.
- ZF coPILOT is designed for maximum computing power and processing scalability from Level 2+ up to Level 4. It offers functions like feet-free and hands-free operation, automated lane change and overtaking, voice-controlled actuation, visualization of the surrounding sensing environment, automated garage parking and route learning and utilizes ZF’s ProAI controller jointly developed with NVIDIA.
The coASSIST system is set to launch before the end of 2020 – the coDRIVE and coPILOT systems will be available in the 2022-2023 timeframe.
In an autonomous car, is it all about processing power for sensor fusion or are there other aspects to make computing dependable?
The architecture through which data is absorbed and processed, and through which safety or vehicle-control decisions are arbitrated and implemented, is an important consideration. While processing power to handle multiple sensors is one primary factor, the design of the system architecture, the level at which sensor data is handled (raw, object level, etc.) and how sensor inputs are fused and validated are also key.
Could you give us an idea of the ways in which ZF is developing cargo and people movers in defined geo-fenced areas using level 4 and 5 technologies?
A prime example is ZF’s partnership with 2getthere, a company that has more than 25 years of experience with automated vehicles operating in a variety of challenging environments. They will deploy autonomous shuttles in dedicated areas in 2020, for instance at Rivium Park in Rotterdam as well as along a dedicated route in ZF’s home town of Friedrichshafen, Germany.
ZF is fully committed to advancing these types of Mobility-as-a-Service solutions, and we have recently joined the “Mobility as a Service Alliance” so that we can help to shape standards for legislation and technologies in this space.
We understand that ZF is collaborating with Israeli technology companies, Cognata and OptimalPlus for plant operations management and enhanced efficiency of ADAS validation operations. What stage have you reached with these partners?
We have been working closely with both partners for more than a year and are exploring further formalization of these relationships.
The volume of data we’re collecting and processing for our ADAS and AD systems is huge and growing. Cognata will help us manage this growing volume of raw data and, importantly, help transform the value of the output and how we are using it into better formats that will help us develop and improve ADAS and AD systems. OptimalPlus will help us with our Quality 360 program, a next-generation Big Data Quality and Predictive Maintenance program that works across all aspects of our business. OptimalPlus is one part of this much larger push for a next-generation solution and approach to Quality.
The idea of a car watching the driver is not new. Systems that look for signs of drowsiness have been around for a while, yet they are gaining importance as cars offer more ADAS features. How will tomorrow's car use driver monitoring systems?
Driver monitoring will, of course, be increasingly important in understanding not just driver state but whether the driver or the vehicle is in control as we move to higher levels of automated driving. From a safety perspective, monitoring the driver, and extending that to passengers, can help inform critical safety decisions. For example, understanding how close an occupant is to an airbag enclosure can help determine whether to fire the airbag, or at least adjust the force with which it deploys. Understanding driver state, such as drowsiness or distraction, will be another key input for the ADAS and semi-automated AD systems of the future, whether for activating warning systems or for helping the driver become aware of issues affecting their judgment.
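The airbag example above can be sketched as a simple decision function: occupant distance from the airbag enclosure selects suppression, a depowered deployment, or a full-force one. The thresholds below are invented for illustration and are not real calibration values.

```python
def airbag_deployment(distance_cm: float) -> str:
    """Illustrative occupant-monitoring logic for airbag deployment.
    Thresholds are hypothetical, not real calibration values."""
    if distance_cm < 10.0:
        return "suppress"        # occupant too close: firing could injure
    if distance_cm < 30.0:
        return "reduced-force"   # depowered deployment
    return "full-force"          # normal seating position
```

In a production system this decision would of course draw on far richer occupant classification (size, posture, belt status), but the sketch shows how an interior-monitoring input feeds directly into a restraint decision.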