With ever increasing complexity comes the challenge of testing in the real world. Matthew Beecham spoke to Kia Cammaerts, founder of Ansible Motion, manufacturers of ‘engineering-class’ driver in the loop (DIL) simulators, to learn how such simulators can offer OEMs and suppliers a way to validate advanced driver assistance systems (ADAS).
Last time we spoke, you had just opened an R&D centre. How have things progressed?
Even in this small space of time [a year], awareness of simulation and, in particular, a deeper understanding of the varying classes of driver in the loop (DIL) technology has risen dramatically. We have been pleasantly surprised by the number of serious enquiries we are receiving from OEMs for DIL simulators – not from a generalist's perspective, but to serve specific use cases, such as the integration of ADAS technologies. There is now an acceptance that simulators such as those in our Delta series, which are full motion simulators, are useful tools for vehicle engineering work, rather than simply for human factors or generalised driver behaviour studies.
What do you think is driving this change of interest?
I think you have to look at the pressures OEMs now have to improve vehicle safety. This comes from a variety of sources.
First, regulatory bodies are now mandating ADAS systems in new cars. The US National Highway Traffic Safety Administration (NHTSA) has already mandated stability control, for example, and it is currently evaluating autonomous emergency braking and vehicle-to-vehicle (V2V) communications. I understand that by May 2018, all new vehicles under approximately 4.5t sold in the US must have a rear camera – just one example of on-board sensing technology coming of age.
Second, there is growing consumer awareness of such technologies; I saw a television advert for a new Nissan this month, and the only feature promoted was the emergency brake assist. Consumers are increasingly attracted to the idea of having better safety features. Advanced driver assist technologies are already a part of the star ratings in New Car Assessment Programmes (NCAP) in different markets around the world. We know the OEMs use high NCAP ratings to promote and differentiate their cars, which means that ADAS offerings themselves are now a direct way to increase the appeal of a car for potential buyers.
So technology and the ability to offer these systems must also play a part?
Yes, definitely, and this is where I think we are seeing our name crop up in engineers’ discussions. Software and hardware advances are the enablers for ADAS and ultimately the move to autonomous functionality. Artificial intelligence coupled with a suite of on-board sensors will enable cars to more safely navigate complex road scenarios, such as inside a city centre. But all of these interactions have to be validated.
I was at a recent symposium where it was stated that cars delivering Level 4 and Level 5 autonomous driving capability must meet ASIL D, the most stringent Automotive Safety Integrity Level defined in the ISO 26262 functional safety standard for road vehicles. Achieving ASIL D requires exceptional validation rigour. Would one observable incident per one billion hours of operation be acceptable? No one is sure at this point. What is clear is that part of the challenge lies in the actual process of validation. How do you go about validating complex systems such as semi-autonomous technologies, with handover back and forth between human and automated drivers? We think engineering-class DIL simulators can help.
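A rough back-of-envelope calculation shows why billion-hour exposure figures are so difficult to demonstrate through physical testing alone. The fleet size and duty cycle below are illustrative assumptions, not figures from the interview:

```python
# Sketch: how long would a prototype fleet need to run to accumulate
# one billion hours of operation? All figures are illustrative assumptions.

TARGET_HOURS = 1_000_000_000   # one billion operating hours
FLEET_SIZE = 100               # hypothetical prototype test fleet
HOURS_PER_DAY = 8              # driving hours per vehicle per day

days = TARGET_HOURS / (FLEET_SIZE * HOURS_PER_DAY)
years = days / 365

print(f"{years:,.0f} years")   # roughly 3,425 years for this fleet
```

Even with generous assumptions, physical accumulation of that much exposure is impossible within a development programme, which is why validation effort shifts towards simulation.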
Can you give a specific example?
Let’s say you wanted to fully validate a collision mitigation system against all credible situations. There would be hundreds of thousands of scenarios that you would have to test, and practically speaking, due to time constraints, most of these must take place virtually with the aid of computer simulations. Then, if you are able to reduce your vehicle design parameters down to a few thousand possibilities, how and where could you put real people into direct contact with these possibilities in a controlled and safe environment? A DIL simulator would be a logical possibility – but it would need to be a DIL simulator with enough performance capability to deliver reasonable man-machine interactions.
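To make the scenario-count point concrete, here is a minimal sketch of how a test matrix grows once scenario parameters are combined. The parameter axes and ranges are invented for illustration; a real collision-mitigation programme would use many more axes and finer discretisations:

```python
from itertools import product

# Hypothetical scenario parameters for a collision-mitigation test matrix.
speeds_kph = range(10, 131, 10)          # 13 ego-vehicle speeds
closing_speeds = range(0, 101, 10)       # 11 relative closing speeds
weather = ["dry", "wet", "fog", "snow"]  # 4 surface/visibility states
target = ["car", "pedestrian", "cyclist", "debris"]  # 4 target types
lighting = ["day", "dusk", "night"]      # 3 lighting conditions

scenarios = list(product(speeds_kph, closing_speeds, weather, target, lighting))
print(len(scenarios))  # 13 * 11 * 4 * 4 * 3 = 6864 combinations already
```

With only five coarse axes the matrix already runs to thousands of scenarios, which is why most must be screened virtually before a reduced set is put in front of a human driver in a simulator.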
Actually, I know of a real world example of how this can manifest itself if you don't consider key scenarios and human interactions. The A14 dual carriageway on the way to our office in Hethel is interspersed with gaps in the central reservation to allow cars to cross the oncoming traffic. An experienced driver that we know was driving his brand new saloon at 70mph when a car crossed safely in front of him. His new car, sensing – wrongly, in this case – that this was a potential crash, automatically braked and tried to bring the car to a stop. The driver did not consider the crossing car to be a risk, and he naturally assumed his car had suffered some sort of powertrain failure. Mindful of a coach not too far behind, he was about to purposefully drive his car into the verge before he realised that it was his collision mitigation braking system kicking in. Luckily, the driver in this case was skilled enough to quickly recognise and manage the situation, and carry on. But how would a different driver react?
We know that the toughest areas to validate are those where a human is involved. That's because the responses of real people are intertwined with the interventions of the on-board vehicle systems. As humans we think and act in complex ways based on stimuli, and a thorough understanding of those responses is required.
So you believe that simulators can be used to test such scenarios?
Absolutely. The best way to test a person’s responses to unusual situations is to put them in a representative environment where they can experience those situations. A DIL simulator provides just such an environment, but with three key advantages over real world testing with a real car: The simulated environment is inherently safe, it is inherently consistent in the sense that it is explicitly defined, and it is incredibly efficient in terms of the number of experiments that can be conducted in a given amount of time.
But why do you need a full motion simulator with a surrounding cockpit and large projection screen, rather than just a seat and a simple display? Doesn’t this add complexity and cost?
We as humans react depending on our surroundings, so we have to create an environment that is as immersive and compelling as possible. That means creating the impression of a cockpit with correct motion and powerful graphics. If we do not do this, we run the risk of having the simulator itself inadvertently consume part of the driver's cognitive bandwidth, which can, in turn, invalidate some experiments. This is one of the major shortcomings of legacy driving simulators – a part of the driver's cognitive workload is spent dealing with the non-realism of the overall experience. Ansible Motion's Delta series simulators are designed to 'trick' drivers into behaving as if they are driving a real car in what are actually controlled and repeatable laboratory conditions. This is the level that is required for validating dynamic vehicle responses to unpredictable situations, such as the ADAS development cases we are discussing. Our smaller, stationary Sigma and Theta series simulators are used more for validating real-time vehicle simulation models, and for certain types of human-machine interface (HMI) experiments.
Is having different types of simulators for different use cases a more cost-effective solution or just more efficient?
The remainder of this interview is available on just-auto's QUBE Global light vehicle safety systems market – forecasts to 2030.