Do we need assistance while driving?  While some people believe we can’t do without it, others disagree.  A recent internet blog entry read: “You can just forget the rest.  If you know how to drive then you don’t need that kind of stuff!  Didn’t need these kind of gimmicks in the ‘70s, don’t need them today.” Yet recent research has shown that driver error is one of the most common causes of traffic accidents.  Matthew Beecham reports on how these gimmicks can provide a helping hand in times of trouble.

According to Bosch, driver assistance technologies aim to make the vehicle capable of perceiving its surroundings, interpreting them, identifying critical situations, and assisting the driver in performing driving manoeuvres.  The objective is, at best, to prevent accidents completely and, at worst, to minimise the consequences of an accident for those concerned.

Consequently, these systems are increasingly being incorporated in cars across the board, from luxury vehicles to small city cars.  Indeed, many of these systems are being fitted as standard equipment.

Today’s driver assistance technologies

The most common suite of driver assistance technologies available today includes adaptive cruise control, lane departure warning systems, and parking assistance systems.

In all of these systems, sensors play a key role.  Lasers, radar and video cameras are among the sensors currently available for monitoring a vehicle’s immediate surroundings, many of them now in their second or third generation of development.  They allow tailor-made solutions for every vehicle category and purpose, and are significantly cheaper than complete system packages.

For instance, Delphi’s camera and image processing techniques provide a scalable architecture that can support a number of functions, including lane departure warning, advanced headlamp control, road sign recognition, pedestrian detection and rain sensing in the same package. Initial Delphi vision systems have been on the road since 2007, and the use of these systems is expected to expand as governments consider new car “star” ratings for active safety features in the future.

While camera optics are fairly similar, the additional functions require different computer processing power. Akira Kondo, general manager of Denso Corp’s driving assistance and safety engineering department, told us: “It can be said that applying a multi-functional camera with high computer processing power is more beneficial than adding a dedicated camera. We think it is important for system development to adopt advanced semiconductor technology, as well as evaluating applications’ specification and cost that meet markets’ needs.”

Andy Whydell, senior manager for electronics product planning at TRW Automotive, added that automotive cameras typically use standardised electronic components such as imager chips, digital signal processors and microprocessors, although imager resolutions and the optics (lens) matched with the imager do vary, producing differences in raw sensing capability between cameras.

“The major factor affecting camera functionality is the level of image processing power available to analyse each image,” said Whydell.  “Advanced functions such as vehicle and pedestrian detection require significantly more powerful video processors, due to the wide variety of potential targets to search for within a frame and the fact that these can be presented to the camera at any angle – for example a pedestrian may be wearing a hat and carrying a bag, which would affect the human silhouette but still need to be differentiated from a road sign. The additional processor requirements for advanced sensing significantly add to the material cost of the camera, while the cost of developing advanced vision recognition algorithms also needs to be shared across these cameras.”
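
To make Whydell’s point concrete, the rough Python sketch below compares the per-frame work of a simple lane-marking search with that of a whole-frame, multi-scale sliding-window search for pedestrians. The frame size, window size and search step are illustrative assumptions, not figures from TRW or any other supplier quoted here.

```python
# Back-of-envelope sketch only: it counts work rather than doing vision,
# to show why whole-frame object classification needs far more processing
# power than a line-based lane-marking search. All numbers are assumptions.

FRAME_W, FRAME_H = 1280, 720          # assumed imager resolution
WINDOW_W, WINDOW_H = 64, 128          # assumed pedestrian search window
STRIDE = 8                            # assumed search step in pixels
SCALES = [1.0, 1.25, 1.5, 2.0]        # assumed image-pyramid scales

def lane_search_ops() -> int:
    """Rough cost of scanning a handful of image rows for lane markings."""
    rows_inspected = 40               # assumption: only the lower rows matter
    return rows_inspected * FRAME_W   # one test per pixel on those rows

def pedestrian_search_ops() -> int:
    """Rough cost of a sliding-window search over the whole frame and scales."""
    total = 0
    for s in SCALES:
        w, h = int(FRAME_W / s), int(FRAME_H / s)
        cols = max(0, (w - WINDOW_W) // STRIDE + 1)
        rows = max(0, (h - WINDOW_H) // STRIDE + 1)
        # each candidate window still needs a full classifier evaluation
        total += cols * rows * (WINDOW_W * WINDOW_H)
    return total

if __name__ == "__main__":
    lane, ped = lane_search_ops(), pedestrian_search_ops()
    print(f"lane-marking search: ~{lane:,} pixel tests per frame")
    print(f"pedestrian search:   ~{ped:,} pixel tests per frame "
          f"(~{ped // lane}x more)")
```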

Traffic sign recognition emerges

Traffic sign recognition is another clear benefit of camera-based technologies.  “Traffic sign recognition is the shiny function of our multi-function camera platform,” Continental executives told us.  “We see a lot of advertising for this driver assistance function, which raises awareness in general. Speed limit monitoring, as the first derivative of traffic sign recognition, will be constantly improved and extended with new traffic sign interpretations, for instance overtaking assistance, speed limits in Asia and the US, no-entry sign recognition and so on.”

Kondo believes that, in addition to safety benefits, traffic sign recognition technology could also contribute to fuel saving. “We see the traffic sign recognition technology as a promising function in the automobile industry and expect it to grow in many regions including Europe.”

Whydell agrees that the market for traffic sign recognition is being driven primarily from Europe today, and current systems are designed to identify “STOP”, “GIVE WAY” and speed limit signs. He points out that, as with many driver assistance technologies, the first systems were available on luxury cars but have now migrated down to D-segment/large family cars such as the Opel Insignia.  “As the market evolves we anticipate that traffic sign recognition will become available on smaller C-segment/family hatchback vehicles, and that as video processing capacity becomes increasingly affordable over time, the number of different road-signs that can be recognised by cameras will increase.”

Alf Liesener, manager global marketing service, SMR Automotive Services GmbH, believes that if traffic sign recognition systems are integrated with OEMs’ future vision sensor-based lane change, cruise control or headlamp management systems, they can, of course, be offered at much lower prices. “We estimate that we will have to wait another four years for a significant global increase in application rates.”

Pedestrian detection on the horizon

While current driver assistance applications such as lane departure warning, speed limit monitoring, intelligent headlamp control and object detection are gradually expected to permeate down the car segments, pedestrian recognition is on the horizon as a logical evolution of camera technology. Continental executives told us: “We are developing systems which can recognise pedestrians. Our radar systems are able to detect pedestrians, some of them at distances of more than 150 metres. To classify an object as a pedestrian, we consider the camera the most adequate technology.”

Denso Corp’s engineers are also in the throes of developing pedestrian detection technologies.  Kondo told us: “To realise pedestrian detection technology, we are considering two options – using the camera alone, and using the combination of the radar, (lidar) and camera.”

TRW is developing an advanced object-recognition camera capable of detecting pedestrians.  Whydell added: “Due to the variety of clothing types that pedestrians may wear, a lidar system may not reliably detect dark clothing, and while ultra-wideband radars are capable of detecting pedestrians, there are limitations on their use in Europe after 2013 and these systems cannot differentiate between pedestrians and other moving objects.”
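
A minimal sketch of the radar-plus-camera combination Kondo and Whydell describe might look like the following: the radar proposes object tracks (it measures range well but cannot classify), and the camera classifier decides whether each track is a pedestrian. The data structures, labels and confidence threshold are assumptions for illustration only.

```python
# Illustrative fusion rule, not any supplier's actual system: only radar
# tracks that the camera classifies as a pedestrian are confirmed.

from dataclasses import dataclass

@dataclass
class RadarTrack:
    track_id: int
    range_m: float            # distance to the object
    closing_speed_mps: float

@dataclass
class CameraLabel:
    track_id: int
    label: str                # e.g. "pedestrian", "vehicle", "sign"
    confidence: float         # 0..1 classifier confidence

def confirmed_pedestrians(radar: list[RadarTrack],
                          camera: list[CameraLabel],
                          min_confidence: float = 0.7) -> list[RadarTrack]:
    """Keep only the radar tracks the camera labels as a pedestrian."""
    labels = {c.track_id: c for c in camera}
    confirmed = []
    for track in radar:
        label = labels.get(track.track_id)
        if label and label.label == "pedestrian" and label.confidence >= min_confidence:
            confirmed.append(track)
    return confirmed

# Example: two radar tracks, only the second is confirmed by the camera.
radar = [RadarTrack(1, 42.0, 3.5), RadarTrack(2, 18.0, 1.0)]
camera = [CameraLabel(1, "sign", 0.9), CameraLabel(2, "pedestrian", 0.85)]
print([t.track_id for t in confirmed_pedestrians(radar, camera)])  # -> [2]
```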

Automatic steering intervention still a delicate issue

Braking is one thing; steering, and allowing the system to select the right path for the driver, is quite another. Yoshihiko Teguri, chief engineer of Denso Corp’s information and safety systems research and development department, says automatic steering intervention is a “delicate issue”, so manufacturers must take a prudent approach. He said: “For example, collision avoidance by braking is essentially safer, because vehicle speed comes down. But collision avoidance by steering can lead to greater damage, because vehicle speed does not change; when changing course it could lead to a collision with another big obstacle. Also, if the vehicle is steered toward a pedestrian, it could lead to critical results. So there needs to be a complete understanding of the surrounding environment to plan a safe route to avoid collisions. The challenges are the accuracy and range of the surrounding sensors, such as radar and cameras.”

Whydell argues that escape route guidance (automatic emergency steering) is likely to be the “last resort” collision mitigation action, engaged only when braking alone cannot avoid a severe collision. “Due to the complexity of implementing this function (it requires accurate evaluation of forward obstacles to determine the direction to steer, along with lateral- and rear-facing sensors to determine the trajectory of vehicles in adjacent lanes to avoid additional collisions) this is expected to be one of the driver assistance functions furthest from production. For safety reasons the assistance is likely to take the form of a limited torque overlay through the vehicle’s steering system, to guide the driver on the best route to take but also allow the driver to override the system in case of a false alarm.”
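
As an illustration of the escalation logic Teguri and Whydell describe, the hedged sketch below brakes when braking alone can avoid the obstacle, and otherwise proposes a torque-limited steering overlay, but only towards a lane the lateral and rear sensors report as clear. The deceleration and torque figures are assumptions, not any supplier’s calibration.

```python
# Illustrative decision sketch only: brake if braking can avoid the obstacle,
# otherwise suggest a limited steering overlay towards a confirmed-clear lane.
# All thresholds below are assumptions.

MAX_DECEL_MPS2 = 8.0         # assumed achievable braking deceleration
MAX_OVERLAY_TORQUE_NM = 3.0  # assumed overlay limit so the driver can override

def braking_distance_m(speed_mps: float, decel: float = MAX_DECEL_MPS2) -> float:
    """Distance needed to stop from speed_mps at constant deceleration."""
    return speed_mps ** 2 / (2.0 * decel)

def choose_intervention(speed_mps: float,
                        gap_to_obstacle_m: float,
                        left_lane_clear: bool,
                        right_lane_clear: bool) -> str:
    if braking_distance_m(speed_mps) <= gap_to_obstacle_m:
        return "autonomous braking"
    if left_lane_clear:
        return f"steer-left overlay, limited to {MAX_OVERLAY_TORQUE_NM} Nm"
    if right_lane_clear:
        return f"steer-right overlay, limited to {MAX_OVERLAY_TORQUE_NM} Nm"
    return "full braking to mitigate the impact"

# At 25 m/s (90 km/h) with only 30 m of free space, braking alone is not enough:
print(braking_distance_m(25.0))                      # ~39 m to stop
print(choose_intervention(25.0, 30.0, False, True))  # steer-right overlay ...
```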

Map-based data can help

Map data and positioning information from navigation systems are being developed to improve driver assistance functions. “I see a great synergy between map data and camera-based DAS systems,” said Liesener.  “The problem of temporarily installed speed limits, for instance during road works, is unsolvable for a purely map- and GPS-based system. However, information derived from vehicles with optical sensor systems can help update map data on a centralised server for maximised reliability. This is again an example of the various advantages over radar sensors.”
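
The crowd-sourced update loop Liesener outlines could be sketched roughly as follows: camera-equipped vehicles report the speed-limit signs they read, and a central map server only overrides the stored limit once enough independent reports agree. The report format and confirmation threshold are illustrative assumptions.

```python
# Toy sketch of a central map server that blends static map data with
# camera sign observations. Segment names and thresholds are assumptions.

from collections import defaultdict

class MapServer:
    def __init__(self, confirmations_needed: int = 3):
        self.static_limits = {}            # road segment -> km/h from map data
        self.reports = defaultdict(list)   # road segment -> limits read by cameras
        self.confirmations_needed = confirmations_needed

    def report_observation(self, segment: str, observed_limit_kmh: int) -> None:
        """Called when a vehicle's camera reads a speed-limit sign on a segment."""
        self.reports[segment].append(observed_limit_kmh)

    def effective_limit(self, segment: str):
        """Return the limit in km/h, preferring a consistently reported
        (e.g. road-works) limit over the static map value."""
        reported = self.reports[segment]
        if reported:
            most_common = max(set(reported), key=reported.count)
            if reported.count(most_common) >= self.confirmations_needed:
                return most_common
        return self.static_limits.get(segment)

# Three vehicles pass a temporary 80 km/h limit at road works:
server = MapServer()
server.static_limits["A8-km42"] = 120
for _ in range(3):
    server.report_observation("A8-km42", 80)
print(server.effective_limit("A8-km42"))  # -> 80, overriding the stored 120
```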

Executives at Tele Atlas believe that navigation systems are becoming a more important component of the vehicle’s overall safety system. “Today, they primarily assist drivers in unfamiliar areas with clear instructions, minimising the distraction caused by searching for road signs and landmarks. Navigation systems also help drivers make their way through complex intersections, giving them adequate time to position themselves correctly to execute a turn, for example where multiple freeways intersect and lane choice is critical, so that the driver does not have to make a sudden, potentially dangerous move to reach the desired exit ramp in time.”

Frans van Dingenen, product marketing and business development manager for advanced driver assistance systems at Navteq Europe BV, points out that ADAS can analyse the road ahead and its surroundings, providing critical prediction of upcoming road features (such as roundabouts or junctions) which require manoeuvres.

“If drivers are more prepared for the road ahead,” said van Dingenen, “their driving will be more consistent, avoiding sharp braking or sudden gear changes etc and in turn will become more fuel efficient. In some cases advanced solutions will be able to automatically control acceleration and braking. Magneti Marelli, for example, is working on an ‘eco-driving’ system which focuses on driving behaviour in order to optimise fuel consumption by advising the driver when the best moment is to change gear or brake when approaching a sharp bend or steep hill etc.”
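
A toy version of the map look-ahead advice van Dingenen describes might look like the sketch below: given upcoming road features from the ADAS map, the system tells the driver to lift off early rather than brake late. The feature list, distances and advisory speeds are assumptions, not Magneti Marelli’s actual calibration.

```python
# Illustrative eco-driving advisory based on map look-ahead. The feature
# types, look-ahead distance and advisory speeds are assumptions.

from dataclasses import dataclass

@dataclass
class RoadFeature:
    kind: str            # "sharp_bend", "roundabout", "steep_hill", ...
    distance_m: float    # distance ahead of the vehicle
    advisory_speed_kmh: float

def eco_advice(current_speed_kmh: float, upcoming: list[RoadFeature]) -> str:
    """Suggest lifting off early when a slower feature is within 500 m."""
    for feature in sorted(upcoming, key=lambda f: f.distance_m):
        if current_speed_kmh > feature.advisory_speed_kmh and feature.distance_m < 500:
            return (f"lift off now: {feature.kind} in {feature.distance_m:.0f} m, "
                    f"advisory speed {feature.advisory_speed_kmh:.0f} km/h")
    return "maintain speed"

# Approaching a sharp bend at 100 km/h:
print(eco_advice(100, [RoadFeature("sharp_bend", 300, 60)]))
```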

Tele Atlas believes that map-based advanced driver assistance systems have knock-on benefits such as helping the motorist to drive more economically.  “The industry is on the verge of a new level of safety and economic performance. As the collection and distribution of map and related environmental data increase and the sophistication of on-board electronics grows, the convergence of the two will have a great impact on the behaviour of both driver and vehicle.”

At last year’s Tokyo motor show, Denso’s president and CEO, Nobuaki Katoh, outlined some of the company’s latest technologies to save fuel, including its so-called car navigation co-operative system.

“This system enables the air conditioning system, alternator and other devices to work together,” said Katoh.  “This is based on the information provided by the car navigation system, which ensures that fuel energy is used effectively.  For example, when the car navigation co-operative system detects a downhill slope ahead after the current climb, the system will limit the generation of electricity while the vehicle is using energy to go uphill.  Then, when the vehicle starts to go downhill, the system will produce electricity at the maximum rate.  Together and in cooperation, this saves fuel.”
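
The control idea Katoh describes can be sketched in a few lines: when the route ahead shows a descent after the current climb, the alternator is held back on the way up and run at full output on the way down. The duty values, gradient thresholds and battery check below are illustrative assumptions, not Denso’s implementation.

```python
# Hedged sketch of navigation-cooperative alternator control. Gradients,
# duty values and the state-of-charge check are assumptions for illustration.

def alternator_duty(current_grade: float, upcoming_grades: list[float],
                    battery_soc: float) -> float:
    """
    Return an alternator output command between 0.0 (idle) and 1.0 (maximum).
    current_grade / upcoming_grades: road gradient, positive = uphill.
    battery_soc: battery state of charge, 0.0 .. 1.0.
    """
    downhill_ahead = any(g < -0.02 for g in upcoming_grades)  # >2 % descent coming
    if current_grade > 0.02 and downhill_ahead and battery_soc > 0.5:
        return 0.1   # climbing with a descent ahead: hold back, generate later
    if current_grade < -0.02:
        return 1.0   # descending: generate at the maximum rate
    return 0.5       # otherwise: normal charging

# Climbing a 4 % grade with a 5 % descent ahead and a healthy battery:
print(alternator_duty(0.04, [0.03, -0.05, -0.05], battery_soc=0.7))  # -> 0.1
```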

Connectivity

In terms of blue-sky research, connectivity appears to be the next step for driver assistance, i.e. linking vehicles together wirelessly through sensor technology.

“While technically interesting and potentially beneficial,” says Whydell, “vehicle-to-vehicle and vehicle-to-infrastructure based driver assistance systems are perhaps some of the DAS technologies furthest from production, due to the need to provide standardised interfaces which have been accepted by a wide variety of stakeholders (governments, car manufacturers and suppliers) and the time it takes to bring these technologies into the field in sufficient numbers to be of benefit. To give an example of the latter, it is quite possible that half of the vehicles that will be on the roads of Europe and North America in the year 2020 have already been built!”

Whydell concluded: “In terms of where onboard DAS systems will be going next, we expect closer integration of the DAS sensors with the vehicle’s active and passive safety systems, with centralised safety and chassis control ECUs receiving environmental data from DAS sensors, driving braking and steering actuators to mitigate collisions and deploying the vehicle’s passive safety systems (airbags, seatbelt pre-tensioners etc) if a collision is unavoidable.”
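
To close, here is a hedged sketch of the kind of integration Whydell anticipates: a central safety controller takes a threat assessment derived from the DAS sensors and escalates from a warning, to active braking or steering, to arming the passive safety systems once a collision can no longer be avoided. The thresholds and interfaces are assumptions for illustration, not any manufacturer’s architecture.

```python
# Illustrative central safety-controller sketch. Time-to-collision threshold,
# action names and the ThreatAssessment interface are assumptions.

from dataclasses import dataclass

@dataclass
class ThreatAssessment:
    time_to_collision_s: float
    avoidable_by_braking: bool
    avoidable_by_steering: bool

def safety_controller(threat: ThreatAssessment) -> list[str]:
    """Escalate from warning, to active intervention, to passive-safety arming."""
    actions = []
    if threat.time_to_collision_s > 2.5:
        return ["no action"]
    actions.append("warn driver")
    if threat.avoidable_by_braking:
        actions.append("request autonomous braking")
    elif threat.avoidable_by_steering:
        actions.append("request steering assist")
    else:
        # collision unavoidable: mitigate severity and prepare the occupants
        actions += ["full braking",
                    "pre-tension seatbelts",
                    "arm airbags for early deployment"]
    return actions

# An unavoidable collision 1.2 seconds out:
print(safety_controller(ThreatAssessment(1.2, False, False)))
```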