Driver assistance technologies aim to make the vehicle capable of perceiving its surroundings, interpreting them, identifying critical situations, and assisting the driver in performing driving manoeuvres. The objective is, at best, to prevent accidents completely and, at worst, to minimise the consequences of an accident for those concerned. Matthew Beecham talked with Andy Whydell, senior manager for electronics product planning at TRW Automotive, about the company’s driver assistance technologies.

just-auto: Driver assistance technologies are evolving rapidly. Just looking back, say, three or four years at what was predicted in terms of the application of camera-based technologies, how has that now played out? I guess forward-facing cameras are nowadays finding new applications?

Andy Whydell: The camera market has grown surprisingly quickly; three to four years ago, active safety cameras were almost unheard of in the automotive marketplace, while today numerous manufacturers offer production lane departure warning or lane-keeping systems. This adoption rate has been faster than that experienced by radar for ACC applications when those systems were being introduced.

Initial camera development activities were primarily focused on safety applications, i.e. lane departure warning. However, in a non-regulated environment it can be difficult to create a convincing consumer value proposition for safety-only products. To increase consumer value there have been significant efforts to identify additional ‘comfort and convenience’ features that allow the driver to benefit from the camera on a daily basis; examples of this include auto headlight control for on/off and high-beam/low-beam switching, and traffic sign recognition (TSR). In addition, as image processing power becomes more affordable, cameras are gaining object recognition capability (vehicles and pedestrians) to assist in automatic emergency braking and pedestrian protection systems via data fusion with radars.

j-a: As I understand it, multi-function cameras are forming a core technology for advanced DAS. These cameras will be cheaper, more effective and easier to integrate than radar and infra-red systems. Is that correct?

AW: The relative costs of DAS sensors are dependent on the desired performance levels and features required.

The purpose of a DAS sensor is to collect environmental data using an imager or antenna, and to analyse that data stream to extract relevant information. Depending on the desired performance requirements, a more or less expensive imager/lens combination or antenna concept can be specified, which then needs to be matched to an appropriate level of data processing power (e.g. digital signal processors (DSPs), microprocessors, etc.) to extract significant data such as lane markings or vehicles within a limited period of time – typical sensor sample rates are 25-40Hz. By adding more data processing capacity, more useful information can be extracted within a given time period (e.g. more targets or objects identified); therefore the costs of radar, camera and lidar sensor systems can overlap, depending on the sensor capabilities.
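
As a rough illustration of the timing constraint Whydell describes, the sketch below models a per-frame processing budget for a sensor sampling at 30Hz (within the 25-40Hz range quoted). The function names and the frame rate are illustrative assumptions, not TRW code.

```python
import time

FRAME_RATE_HZ = 30                    # assumed rate within the 25-40Hz range quoted above
FRAME_BUDGET_S = 1.0 / FRAME_RATE_HZ  # time available to process each sample

def grab_frame():
    """Placeholder for reading one image or radar sweep from the sensor."""
    return object()

def extract_objects(frame):
    """Placeholder for the DSP/microprocessor work: lane markings, vehicles, etc."""
    return []

def processing_loop(num_frames=100):
    for _ in range(num_frames):
        start = time.perf_counter()
        objects = extract_objects(grab_frame())
        elapsed = time.perf_counter() - start
        # If extraction overruns the frame period, the sensor must drop data or
        # track fewer objects -- hence the link between processing power and the
        # amount of useful information extracted per sample.
        if elapsed > FRAME_BUDGET_S:
            print(f"budget overrun: {elapsed * 1000:.1f}ms for {len(objects)} objects")

if __name__ == "__main__":
    processing_loop()
```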

In terms of effectiveness, a (passive) camera sensor has a different range of capabilities than an (active) radar or lidar sensor. Cameras are best suited to lateral applications, estimating the size and lateral position of an object and identifying it, while lidar and radar sensors are best suited to longitudinal applications, directly measuring relative speeds and distances to surrounding objects. Depending on the specific goals of the safety system, a camera, lidar or radar sensor can be the most effective solution.

The ease of integration of DAS sensors is dependent on the level of actuator integration required. At the simplest level (warning-only systems) a camera fitted for lane departure warning or a radar sensor fitted for collision warning will have similar installation requirements: a mounting bracket for the sensor, an electrical interface to provide power to the sensor and a communication bus interface to allow the sensor to communicate with a human-machine interface (HMI), e.g. the instrument cluster for audible/visual warnings or the steering wheel for haptic warnings.

j-a: What functions can these multi-function cameras actually offer?

AW: Multi-function cameras can offer a wide range of features, depending primarily on the level of image processing capability available. These features range from automatic headlight on/off control and high-beam/low-beam switching at the basic level, to lane tracking for lane departure warning or active lane-keeping systems, traffic sign recognition and finally vehicle and pedestrian detection for automatic emergency braking (when combined with a radar sensor for data fusion) and pedestrian protection systems. In addition, other environmental sensors can be packaged within the camera housing to provide rain, temperature and humidity sensing to enable automatic wiper and demisting capabilities.

j-a: While I guess the camera optics are fairly similar, the additional functions require different levels of processing power, is that correct? What is involved in the development and engineering? Presumably, the extra cost of adding more functions is small compared to the camera itself?

AW: Automotive cameras typically use standardised electronic components such as imager chips, DSPs and microprocessors, although there is some variety in the imager resolutions and the optics (lens) design matched with the imager, which leads to variations in raw sensing capability between cameras.

The major factor affecting camera functionality is the level of image processing power available to analyse each image. Advanced functions such as vehicle and pedestrian detection require significantly more powerful video processors, due to the wide variety of potential ‘targets’ to search for within a frame and the fact that these can be presented to the camera at any angle – for example, a pedestrian may be wearing a hat and carrying a bag, which alters the human ‘silhouette’ but must still be differentiated from a road sign. The additional processor requirements for advanced sensing add significantly to the material cost of the camera, while the cost of developing advanced vision recognition algorithms also needs to be shared across these cameras.
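
To illustrate the kind of silhouette search described above, here is a minimal sketch using OpenCV’s stock HOG-based people detector; it is an off-the-shelf classifier chosen for illustration, not TRW’s vision algorithm, and ‘frame.png’ is a hypothetical test image.

```python
import cv2

# 'frame.png' is a hypothetical file name used for illustration.
frame = cv2.imread("frame.png")
if frame is None:
    raise SystemExit("Provide a test image named frame.png")

# OpenCV's stock HOG descriptor with its pre-trained pedestrian SVM.
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

# Sliding-window search across positions and scales: this is the costly step,
# since every window must be scored against the pedestrian 'silhouette' model.
boxes, _weights = hog.detectMultiScale(frame, winStride=(8, 8), scale=1.05)

for (x, y, w, h) in boxes:
    print(f"pedestrian candidate at ({x},{y}), size {w}x{h} pixels")
```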

j-a: Traffic sign recognition is a clear benefit of camera-based technologies. How do you see this market evolving in Europe?

AW: The market for traffic sign recognition is driven primarily from Europe today, and current systems are designed to identify ‘STOP’, ‘GIVE WAY’ and speed limit signs. As with many DAS technologies the first systems were available on luxury cars, but have now migrated down to D-segment/large family cars (e.g. Opel Insignia).

As the market evolves we anticipate that TSR will become available on smaller C-segment/family hatchback vehicles, and that as video processing capacity becomes increasingly affordable over time, the number of different road signs that cameras can recognise will increase.
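
By way of illustration, European speed-limit signs are circular, so a simple first stage of a TSR pipeline can look for round regions before a classifier reads the number. The sketch below uses OpenCV’s Hough circle transform on a hypothetical input image; the parameters are illustrative, not a production calibration.

```python
import cv2
import numpy as np

# 'road_scene.png' is a hypothetical input frame used for illustration.
frame = cv2.imread("road_scene.png")
if frame is None:
    raise SystemExit("Provide a test image named road_scene.png")

gray = cv2.medianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), 5)

# Circular regions are cheap first-stage candidates for speed-limit signs;
# a trained classifier would then read the digits inside each candidate.
circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.2, minDist=40,
                           param1=100, param2=60, minRadius=10, maxRadius=80)

if circles is not None:
    for x, y, r in np.round(circles[0]).astype(int):
        print(f"sign candidate: centre ({x},{y}), radius {r} pixels")
```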

j-a: Could you explain the technology being developed and/or used by TRW to detect pedestrians and what is involved in achieving that? Does it involve radar or lidar?

AW: TRW is developing an advanced object-recognition camera capable of detecting pedestrians, as this is considered the most reliable sensing method for people. Because of the variety of clothing that pedestrians may wear, a lidar system may not reliably detect dark clothing, and while ultra-wideband radars are capable of detecting pedestrians, there are limitations on their use in Europe after 2013 and these systems cannot differentiate between pedestrians and other moving objects.

j-a: While we can see such multi-function cameras on the high-end and medium segment cars, do you see this technology permeating down to the low-end at all?

AW: As the costs of high-tech automotive electronics products continue to fall over time, we expect to see continued migration of advanced active safety technologies from luxury cars down to the mass-market; TRW already has a lane-keeping camera in production in Europe in a C-segment vehicle. In the long term (eight to ten years), we anticipate that the fitment of active safety cameras for lane departure warning or lane-keeping purposes may be a requirement in the major automotive markets, bringing this technology to all vehicles.

Prior to this time, we expect that premium manufacturers will continue to introduce these technologies down to their smaller vehicles, and that other manufacturers with strong safety brands will also offer these systems. A key factor in successfully introducing multi-function cameras on small cars will be promoting the convenience benefits that a camera can provide (e.g. automatic light and wiper control) in addition to the safety aspects; small-car shoppers are typically more cost-sensitive, so the value proposition is more important – the cost of a camera represents a higher percentage of the vehicle price.

j-a: What about side- and rear-facing cameras? What do you see happening there?

AW: Side- and rear-facing passive cameras can provide a driver with valuable information about obstacles or potential dangers around the vehicle, but require the fitment of a relatively expensive display screen to provide this information to the driver. For vehicles fitted with OEM navigation systems or large multi-function displays (e.g. hybrids), this is not an issue, but it can be difficult to find space for this type of display on smaller cars. To be effective, passive cameras require the driver to adequately recognise dangers and respond appropriately to avoid potential collisions.

Object recognition cameras can also be used in side- and rear-facing applications for blind spot detection, but will not be as effective as radars because they can only estimate distances indirectly rather than measure them directly. They may also be less effective at night and in poor weather, as this area is not directly illuminated by the vehicle lighting systems, and they are prone to dirt build-up on the lens, which may affect performance.
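
The indirect nature of a camera’s range estimate can be seen in a simple pinhole-camera sketch: distance is inferred from the pixel height of a detection and an assumed real-world object height, so a few pixels of error shifts the estimate by metres. The focal length and object height below are illustrative assumptions.

```python
# Indirect range estimate from a single camera: with a pinhole model,
# distance ~ focal_length * real_height / pixel_height.
FOCAL_LENGTH_PX = 1200.0        # assumed focal length in pixels
ASSUMED_VEHICLE_HEIGHT_M = 1.5  # assumed real-world height of the detected vehicle

def estimate_range(bbox_height_px: float) -> float:
    """Estimate distance to an object from its bounding-box height in pixels."""
    return FOCAL_LENGTH_PX * ASSUMED_VEHICLE_HEIGHT_M / bbox_height_px

# A 60-pixel-tall detection comes out at ~30m; shrink it by five pixels
# (e.g. a poorly lit or dirty image) and the estimate moves by several metres,
# which is why a radar's direct measurement is preferred for blind spots.
for h in (120.0, 60.0, 55.0):
    print(f"{h:.0f}px tall -> {estimate_range(h):.1f}m")
```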

Object recognition cameras may also be used to assist drivers when reversing at low speed, to detect obstacles or pedestrians and provide a warning if something is detected close to the rear of the vehicle.

j-a: Could existing reverse parking sensors play a greater role in partial and full parking assistance systems?

AW: Existing reverse parking sensors (ultrasonic sensor systems and passive cameras) are used today to provide drivers with audible or visual information about the vehicle’s proximity to obstacles during low-speed manoeuvring. These sensors have sufficient performance to operate in automated parking systems, provided that vehicle speeds remain low (e.g. below 5mph).
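
For reference, an ultrasonic parking sensor works on time of flight: it measures how long an emitted pulse takes to echo back and converts that to a distance. The sketch below shows the calculation with an assumed speed of sound and illustrative warning thresholds.

```python
SPEED_OF_SOUND_M_S = 343.0  # assumed speed of sound at roughly 20 degrees C

def ultrasonic_distance(echo_time_s: float) -> float:
    """Distance from a round-trip echo time: the pulse travels out and back."""
    return SPEED_OF_SOUND_M_S * echo_time_s / 2.0

def proximity_warning(distance_m: float) -> str:
    """Map distance to a stepped audible warning (thresholds are illustrative)."""
    if distance_m < 0.3:
        return "continuous tone"
    if distance_m < 1.0:
        return "fast beeping"
    if distance_m < 1.5:
        return "slow beeping"
    return "no warning"

# An echo of about 5.8ms corresponds to roughly one metre to the obstacle.
for t in (0.0015, 0.0058, 0.010):
    d = ultrasonic_distance(t)
    print(f"echo {t * 1000:.1f}ms -> {d:.2f}m -> {proximity_warning(d)}")
```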

Automated parking assistance systems typically add sensors to provide more information about the vehicle’s immediate environment: side-facing ultrasonic sensors to identify potential parallel parking spaces, and ultrasonic sensors added to passive camera-based systems to give the driver additional guidance while the automated parking system is in operation. These systems typically automate the steering, but the driver remains responsible for longitudinal control and for braking the vehicle to avoid collisions.

With an eventual move to fully automated parking systems with accelerator and braking system control, additional ultrasonic sensors may be required at the front of the vehicle to prevent collisions when parking in smaller spaces.

j-a: To what extent is map data and positioning information of navigation systems being developed to improve driver assistance functions? What more needs to be done in this area of navigation-based DAS?

AW: Location-based data is already being used to enhance adaptive cruise control systems today by providing additional information about road geometry such as motorway exits and road junctions to improve tracking of other vehicles.

One future step in this area is expected to be the introduction of curve speed warning systems, which will use navigation system-provided road curve radius and speed limit data to provide warnings to a driver if they are approaching a corner too quickly. In more advanced systems this warning could be extended to include automatic braking before a corner, to reduce the vehicle speed to a safe level and prevent road departures.
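
The physics behind a curve speed warning is straightforward: for a curve of radius R, the speed at which lateral acceleration reaches a chosen limit is v = sqrt(a_lat * R). The sketch below uses an assumed comfort limit of 3m/s²; the threshold and example values are illustrative, not a production calibration.

```python
import math

MAX_LATERAL_ACCEL_M_S2 = 3.0  # assumed comfortable lateral-acceleration limit

def safe_curve_speed(radius_m: float) -> float:
    """Maximum speed for a curve of the given radius: v = sqrt(a_lat * R)."""
    return math.sqrt(MAX_LATERAL_ACCEL_M_S2 * radius_m)

def curve_speed_warning(current_speed_m_s: float, radius_m: float) -> bool:
    """Warn if the vehicle is approaching the corner faster than the safe speed."""
    return current_speed_m_s > safe_curve_speed(radius_m)

# A 100m-radius bend allows roughly 62km/h at this limit: approaching at
# 80km/h would trigger the warning (or, in a more advanced system, braking).
radius = 100.0
for kmh in (50.0, 80.0):
    warn = curve_speed_warning(kmh / 3.6, radius)
    print(f"{kmh:.0f}km/h into a {radius:.0f}m curve -> warn: {warn}")
```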

For these systems to operate effectively, the accuracy of navigation map databases needs to be increased beyond what is used in automotive applications today; current map databases were originally developed purely to assist navigation between locations and identify the vehicle’s position on a map. Sufficiently detailed road geometry information, including actual curve radius and gradient, must be available for automated warning or guidance systems to work correctly.

j-a: As you know, from 2013, all new trucks sold in Europe will be required to have emergency braking capability and lane departure warning. Could you explain the technologies TRW has been developing and/or has in place to manage these requirements?

AW: TRW has been providing radar-based adaptive cruise control systems to European truck manufacturers for more than five years; our customers are MAN, Scania and Volvo. In addition, TRW launched its first camera-based lane-keeping system in 2008 on the Lancia Delta.

In order to meet the impending requirements for heavy trucks and coaches from Q4 2013, TRW is developing a combined 77GHz radar and object recognition camera system capable of meeting the regulatory requirements for automatic emergency braking and lane departure warning. This system is being developed using TRW’s experience in commercial vehicle electronics products; in addition to radar sensors, TRW is also a major manufacturer of diesel powertrain control ECUs for commercial vehicles.
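
A common building block of automatic emergency braking, regardless of supplier, is a time-to-collision calculation: range to the obstacle divided by the closing speed. The sketch below shows the idea with illustrative warning and braking thresholds; these are not TRW’s calibrations.

```python
def time_to_collision(range_m: float, closing_speed_m_s: float) -> float:
    """Seconds until impact if the closing speed stays constant."""
    if closing_speed_m_s <= 0.0:
        return float("inf")  # not closing on the target
    return range_m / closing_speed_m_s

WARN_TTC_S = 2.5   # illustrative threshold for a collision warning
BRAKE_TTC_S = 1.2  # illustrative threshold for automatic braking

def aeb_decision(range_m: float, closing_speed_m_s: float) -> str:
    ttc = time_to_collision(range_m, closing_speed_m_s)
    if ttc < BRAKE_TTC_S:
        return "automatic emergency braking"
    if ttc < WARN_TTC_S:
        return "collision warning"
    return "no action"

# A truck closing at 15m/s (54km/h) on stopped traffic 30m ahead has a
# time-to-collision of two seconds: warn first, then brake as the gap shrinks.
for rng in (60.0, 30.0, 15.0):
    print(f"range {rng:.0f}m -> {aeb_decision(rng, 15.0)}")
```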

j-a: Braking is one thing yet I guess steering and allowing the system to select the right path for the driver is another. How do you see automatic steering intervention evolving?

AW: Escape route guidance (automatic emergency steering) is likely to be a ‘last resort’ collision mitigation action, engaged only when braking alone cannot avoid a severe collision. Due to the complexity of implementing this function (it requires accurate evaluation of forward obstacles to determine the direction to steer, along with lateral- and rear-facing sensors to determine the trajectory of vehicles in adjacent lanes to avoid additional collisions) this is expected to be one of the DAS functions furthest from production. For safety reasons the assistance is likely to take the form of a limited torque overlay through the vehicle’s steering system, to guide the driver on the best route to take but also allow the driver to overcome the system in case of a false alarm.
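
The ‘limited torque overlay’ idea can be sketched as a simple clamp on the requested assist torque, so the driver’s own steering input can always win. The torque ceiling below is an illustrative assumption, not a TRW specification.

```python
MAX_OVERLAY_TORQUE_NM = 3.0  # assumed ceiling the driver can comfortably overpower

def limited_torque_overlay(requested_torque_nm: float) -> float:
    """Clamp the emergency-steering request so the driver can always override it."""
    return max(-MAX_OVERLAY_TORQUE_NM, min(MAX_OVERLAY_TORQUE_NM, requested_torque_nm))

# A large evasive request is capped; a small correction passes through unchanged.
for request in (8.0, 1.5, -6.0):
    print(f"requested {request:+.1f}Nm -> applied {limited_torque_overlay(request):+.1f}Nm")
```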

j-a: In terms of blue-skies research, I guess connectivity is the next step, i.e. linking vehicles together wirelessly through sensor technology?

AW: While technically interesting and potentially beneficial, vehicle-to-vehicle and vehicle-to-infrastructure based driver assistance systems are perhaps some of the DAS technologies furthest from production, due to the need to provide standardised interfaces accepted by a wide variety of stakeholders (governments, car manufacturers and suppliers), and the time it takes to bring these technologies into the field in sufficient numbers to be of benefit. To give an example of the latter, it is quite possible that half of the vehicles that will be on the roads of Europe and North America in the year 2020 have already been built!

In terms of where onboard DAS systems will go next, we expect closer integration of the DAS sensors with the vehicle’s active and passive safety systems, with centralised safety and chassis control ECUs receiving environmental data from DAS sensors, driving braking and steering actuators to mitigate collisions, and deploying the vehicle’s passive safety systems (airbags, seatbelt pre-tensioners, etc.) if a collision is unavoidable.
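
As a closing illustration of that kind of centralised fusion, the sketch below pairs radar tracks (which measure range and closing speed directly) with camera objects (which classify what has been seen) by comparing bearings, producing fused objects a safety ECU could act on. The data structures and association gate are assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class RadarTrack:
    bearing_deg: float        # direction to the object
    range_m: float            # directly measured distance
    closing_speed_m_s: float  # directly measured relative speed

@dataclass
class CameraObject:
    bearing_deg: float        # estimated from the object's pixel position
    label: str                # e.g. "vehicle" or "pedestrian"

@dataclass
class FusedObject:
    label: str
    range_m: float
    closing_speed_m_s: float

BEARING_GATE_DEG = 2.0  # illustrative association gate

def fuse(radar_tracks, camera_objects):
    """Pair each radar track with a camera object that lies in the same direction."""
    fused = []
    for track in radar_tracks:
        for obj in camera_objects:
            if abs(track.bearing_deg - obj.bearing_deg) < BEARING_GATE_DEG:
                fused.append(FusedObject(obj.label, track.range_m, track.closing_speed_m_s))
                break
    return fused

# The camera says "pedestrian", the radar says 12m away and closing at 8m/s;
# the fused object carries both, which is what braking logic would act on.
radar = [RadarTrack(bearing_deg=1.0, range_m=12.0, closing_speed_m_s=8.0)]
camera = [CameraObject(bearing_deg=0.5, label="pedestrian")]
for fused_object in fuse(radar, camera):
    print(fused_object)
```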