Connected, Autonomous, Shared and Electric (CASE) mobility solutions need sensors. Lots of them. The rapidly changing landscape of advanced driver assistance systems (ADAS), as it evolves towards full autonomy, is putting LiDAR (Light Detection and Ranging) technology firmly in the spotlight. LiDAR is widely regarded as the most important sensor in the suite that enables the different levels of driving autonomy. Continuing just-auto/AIC’s series of research snapshots, Matthew Beecham examines the whys and wherefores.
In terms of sensors for autonomous vehicles, an industry debate has formed over which technology is best – radar, LiDAR or cameras. In reality, vehicles are likely to be fitted with a combination of all three, given the different strengths each brings to the party.
LiDAR is used in ADAS to measure distance and build a 3D representation of surrounding objects and their positions. Applications such as lane change assist, highway pilot and traffic jam assist rely on it. LiDAR works by emitting laser pulses at targets and analysing the reflected light to gauge distance. The same principle also supports autonomous driving.
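The time-of-flight principle behind that distance measurement can be sketched in a few lines of Python. This is a minimal illustration of the physics, not any particular sensor's firmware, and the pulse timing figure used below is purely illustrative:

```python
C = 299_792_458.0  # speed of light in a vacuum, metres per second

def tof_distance(round_trip_time_s: float) -> float:
    """Estimate range from a LiDAR pulse's round-trip time.

    The pulse travels to the target and back, so the one-way
    distance is half the total path length.
    """
    return C * round_trip_time_s / 2.0

# A return arriving ~667 nanoseconds after emission implies a
# target roughly 100 metres away.
print(round(tof_distance(667e-9), 1))
```

Real sensors refine this with pulse-shape analysis and filtering, but every LiDAR range reading ultimately rests on this halved round-trip calculation.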
LiDAR of the pack
The main manufacturers of LiDAR systems (hardware or software) include Bosch, Denso, Infineon, Aptiv, Continental, Hexagon, Waymo, Quanergy, Valeo, Luminar, Velodyne Lidar, LeddarTech, Innoviz and AEye.
“While publicly, dozens of LiDAR companies exist, in reality, there are less than a handful of legitimate contenders in the mobility market, and the same in ADAS,” Blair LaCorte, president of AEye, told just-auto. “Those that lacked first-mover advantage, tech differentiation, and tier 1 partnerships were already seeing product timing, market opportunity, and fundraising challenges pre-COVID. These issues will only be exacerbated by the economic downturn, causing the LiDAR market to quickly consolidate and materialise.” California-based LiDAR specialist AEye has developed iDAR (Intelligent Detection And Ranging), a perception system that acts as the eyes and visual cortex of autonomous vehicles (AVs).
Israel-based Innoviz also develops and manufactures solid-state LiDAR sensors and perception software that enable the mass production of AVs. The company demonstrated its InnovizOne for the first time at the Consumer Electronics Show (CES) earlier this year. During the event, we spoke to Omer Keilaf, Co-Founder & CEO of Innoviz Technologies, to learn more about the company’s LiDAR roadmap and his thoughts on how the autonomous driving market could evolve. He said: “InnovizOne will be seamlessly integrated into BMW‘s first AVs – BMW is the first OEM to select a solid-state LiDAR sensor for mass-production of its Level 3 to Level 5 AVs. … In addition to InnovizOne, InnovizPro is ready for deployment in shuttles, robotaxis, drones, robotics, security, tracking, and other industrial applications.”
Some challenges remain
In its current form, however, LiDAR still has a few challenges to overcome. The main ones are delivering higher levels of perception and a highly accurate depiction of the 3D environment, at a reasonable cost and with greater reliability.
Not so long ago, OEMs – most famously Tesla – shied away from incorporating LiDAR in their solutions because of the higher cost involved. However, extensive trials have since suggested that fully autonomous driving won’t be possible without it.
For its part, Bosch says it can make lower cost sensors production ready for automated drive vehicles and that a new long-range LiDAR sensor will be the first solution suitable for automotive use. The supplier says the laser-based distance measurement technology is indispensable for driving functions at Levels 3 to 5 and the sensors will cover both long and close ranges – on highways and in the city. By exploiting economies of scale, Bosch wants to reduce the price for the sophisticated technology and render it suitable for the mass market.
A further challenge for manufacturers of this type of sensor, in particular, is achieving reliability and robustness alongside economic viability. Velodyne Lidar claims it has addressed this area. During an interview with just-auto, Anand Gopalan, Chief Technology Officer at Velodyne Lidar, said: “Velodyne has achieved the balance between building Lidar that is both high-performance and is manufacturable at scale. We achieve this by simultaneously developing the Lidar sensor and its manufacturing process. This is what makes Velodyne different. We have our own manufacturing processes and factory. We believe the only way to deliver high-performance and cost-effective Lidar is to co-develop the product and the manufacturing processes.”
“Different sensors have different strengths,” points out Christian Schumacher, Vice President Program Management Systems, ADAS, Autonomous Mobility and Safety at Continental, during an interview with just-auto. “Systems for automated driving must be equipped with different sensor technologies in order to create redundancies on the one hand and to validate sensor data on the other, creating a precise environment model. The very computation-intensive processing and fusion of these complex sensor data is a challenge that needs to be solved.”
As effective as cameras are, they have a few drawbacks, such as limited range and degraded performance in rain, fog and varying light conditions. “All of the technologies [LiDAR, camera, radar] have their place, but when you add them together, the sum of their parts is greater than each alone,” says LaCorte. “Having said that, it’s all about how they are pieced together. Much of what’s being done today is sensor fusion – gathering disparate data from each component, then stitching it together.”
The automotive industry remains divided on the sensor configuration needed to support autonomous driving. Tesla is resolute that cameras and radar systems will be sufficient, yet both systems have their shortcomings. Some believe that safe automated and autonomous driving can only be achieved by combining LiDAR, radar and cameras. Technology partnering arrangements are also helping.
During an interview with just-auto, Robert Kempf, Harman’s vice president of ADAS/automated driving, explained how the supplier is partnering with Innoviz to overcome today’s LiDAR issues with solid-state technology, and why the merging of this data in the ECU architecture is key to its success. He believes that, given the complexity and the need for inter-company collaboration to enable autonomous driving, it is imperative for L3 and higher vehicles that hardware and software provided by different suppliers can be linked, along with other sensors and ADAS solutions. He said: “It must also allow scalability and flexibility to handle different sensor topologies, as cost will often determine the sensors and features possible. This fusion of information will help with the rollout of the technology and its affordability in the long term. Ultimately, sensor fusion is key to future success, and Harman is looking at this bigger picture of optimum sensor mix to fulfil the necessary performance, cost and durability requirements.”
On balance, LiDAR looks set to remain critical in the pursuit of the holy grail of autonomous driving. Whether the arrival of low-cost LiDAR will see Tesla adopt the technology remains to be seen.