Renault says the leap between autonomous driving levels is vast as the automaker wrestles with how to marry human activities with machine understanding.
“For Renault – like all car manufacturers – we offer cars at Level 2; that has really improved road safety,” said Renault autonomous driving chief engineer Laurent Taupin at the recent Forum on the European Automotive Industry (FEAL) in Lille, which brought together suppliers, OEMs and government officials in France’s primary manufacturing region for the sector.
“Nine out of ten accidents are caused by human error and machines don’t make mistakes. Level 4 is the first level where human beings become passengers. You can’t let a Level 2 evolve into a Level 4 – it is as if you take a candle and turn it into a light bulb.
“You have to accumulate millions of hours of autonomous driving. There are data lakes rather than centres. Obviously you have to believe in this – you can’t prove today there will be added value. We are all moving into virgin territory here. How much are people willing to pay?
“For 120 years we have been [producing] cars which rely on infrastructure in a human-like way. The human sees [and] interprets the information, makes decisions, steers and brakes and this has an effect on the path of the vehicle; it pretty much functions as a circle.
“When we started autonomous vehicles we started to teach a machine how to read a human language. We understand this language as filled with ambiguities. It is perfect for the human brain – it is absolutely awful for a machine. For a machine it is right or wrong, it can’t be maybe.”
Renault’s chief autonomous driving engineer used the example of a lift operating at Level 4, and also cited the OrlyVal automatic train linking the Parisian airport to Antony RER station, although he stressed the latter operates in an extremely confined area and runs on rails.
“More and more we have the feeling it would be a lot easier if we developed a different language,” added Taupin. “[A] kind of a digital transformation of the environment which would arrive to AI over the air – something which would have a lot fewer ambiguities.
“Something which would be a lot less easy to tamper with [via] a cyber attack.”
While working on an “eyes-off/hands-off” approach to autonomous driving, Renault has prototypes based on the Espace model using the following sensors:
- Three lidars, or long-range laser scanners (two front, one rear)
- One long-range front radar
- Four medium-range corner radars
- Three digital cameras with short, medium and long focal lengths, at the top of the windscreen
- Four 180° short-range digital cameras under the wing mirrors and at either side of the licence plate
- A belt of 20 short-range ultrasound sensors
Data from the sensors is processed by multiple onboard ‘brains’ in the form of embedded software that tells the car what to do, enabling the driver to safely hand over to the automatic driver in Eyes-off/Hands-off mode.
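The sensor suite described above can be sketched as a simple inventory in Python. This is purely illustrative: the detection ranges and mounting labels are assumptions for the sketch, not figures Renault has published.

```python
from dataclasses import dataclass
from enum import Enum


class SensorKind(Enum):
    LIDAR = "lidar"
    RADAR = "radar"
    CAMERA = "camera"
    ULTRASOUND = "ultrasound"


@dataclass
class Sensor:
    kind: SensorKind
    range_m: float   # nominal detection range in metres (illustrative values only)
    position: str    # mounting location on the vehicle (hypothetical labels)


def build_espace_sensor_suite() -> list[Sensor]:
    """Enumerate the sensor set described for the Espace prototypes."""
    suite: list[Sensor] = []
    # Three lidars: two front, one rear
    suite += [Sensor(SensorKind.LIDAR, 200.0, p)
              for p in ("front-left", "front-right", "rear")]
    # One long-range front radar
    suite.append(Sensor(SensorKind.RADAR, 250.0, "front"))
    # Four medium-range corner radars
    suite += [Sensor(SensorKind.RADAR, 100.0, f"corner-{i}") for i in range(4)]
    # Three windscreen cameras with short, medium and long focal lengths
    suite += [Sensor(SensorKind.CAMERA, 150.0, f"windscreen-{f}")
              for f in ("short", "medium", "long")]
    # Four 180-degree short-range cameras (wing mirrors and licence plates)
    suite += [Sensor(SensorKind.CAMERA, 30.0, p)
              for p in ("mirror-left", "mirror-right", "plate-front", "plate-rear")]
    # Belt of 20 short-range ultrasound sensors
    suite += [Sensor(SensorKind.ULTRASOUND, 5.0, f"belt-{i}") for i in range(20)]
    return suite
```

Tallying the list gives 35 sensors in total, all of whose outputs must be fused by the onboard software before a driving decision is made.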
To ensure the required safety of autonomous drive in Eyes-off/Hands-off mode, steering and braking control systems are doubled up, as are all related electrics.
Should one of the two parallel systems fail, the other takes over, ensuring the vehicle keeps full control of its trajectory and automatically maintains safe driving conditions if the driver does not resume control.
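The failover principle described above can be sketched in a few lines of Python. The function names, health flags and the minimal-risk fallback are hypothetical; the sketch only shows the routing logic, not Renault’s actual control software.

```python
def command_actuators(primary_ok: bool, secondary_ok: bool,
                      demand: float) -> tuple[str, float]:
    """Route a steering/braking demand to whichever redundant channel is healthy.

    Mirrors the doubled-up control systems: if the primary channel reports a
    fault, the parallel secondary channel carries the same demand.
    """
    if primary_ok:
        return ("primary", demand)
    if secondary_ok:
        return ("secondary", demand)
    # Both channels failed: command a minimal-risk stop (hypothetical fallback).
    return ("none", 0.0)
```

With both channels healthy the primary carries the demand; a primary fault reroutes the identical demand to the secondary, so the vehicle never loses authority over its trajectory.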
Test fleets of Renault Eyes-off/Hands-off autonomous vehicles will be on the road in 2018, with implementation on production models following after 2020, as permitted by legislation at that time.
