For some time, Mobileye has been pushing back the technical boundaries of vision systems for intelligent transportation. The company is recognised as a world leader in vision-based DAS, and its technologies are sourced by a number of automakers, including BMW, GM and Volvo. Its AWS (Advance Warning System) is being distributed and sold worldwide. Mobileye NV is headquartered in the Netherlands, with a research and development centre in Jerusalem, Israel, and sales and marketing offices in Detroit, Michigan; Cyprus; and Tokyo, Japan. Matthew Beecham talked with Ido Amir, marketing manager, Mobileye Vision Technologies, about the company’s driver assistance systems.


just-auto: Could you provide a brief update on Mobileye’s fortunes with respect to its vision-based DAS, i.e. which new vehicles is it fitted to?


Ido Amir: Our technology is currently fitted to the BMW 5 Series, GM’s Cadillac DTS and STS, the Buick Lucerne, and the Volvo S80, V70 and XC70. In 2009 the list will expand.


just-auto: How is your aftermarket business shaping up?


Ido Amir: We currently have between 10 and 20 signed distribution agreements worldwide, including in the US, UK, Chile, Holland, Turkey, China, Japan, Denmark and Australia.


just-auto: What is in the pipeline?


Ido Amir: Mobileye’s future pedestrian detection technology, blind spot detection and lane change assistance, automatic high-beam control, and side-looking and rear-looking cameras are all in store as next-generation aftermarket products. Consolidating multiple features is planned to be achievable mainly thanks to the new Mobileye EyeQ2 chip, which represents another technological breakthrough: it is almost an order of magnitude more powerful than its predecessor, the EyeQ1, itself a leading chip of its kind.


just-auto: Conventionally, sensors such as LIDAR, millimetre-wave radar and vision sensors were used independently in driver assistance systems to recognise vehicle-surrounding conditions. I guess the emphasis nowadays is on combining several different sensors in order to achieve more accurate recognition. Would you agree?


Ido Amir: I would say that for functions that require a high level of confidence in detection, such as automatic braking, there is currently room for sensor integration. Consider a false detection that triggers emergency braking when there is actually no car in front of you, or when there was no real need to brake because the vehicle ahead was mistakenly judged to be stopping. In that case you would not prevent an accident; rather, you would create one. However, for the non-active, non-interfering functions, such as forward collision warning, which give some sort of feedback to the driver and leave it in their hands to prevent an impending accident, there is less damage in issuing a false warning, or even in missing one.
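To make that asymmetry concrete, here is a minimal, purely illustrative sketch of confidence-gated responses; the thresholds and function name are hypothetical and do not describe Mobileye’s implementation:

```python
# Hypothetical sketch: gate a passive warning at a lower confidence than an
# active braking intervention. All thresholds are invented for illustration.

WARN_THRESHOLD = 0.6    # a false warning is a nuisance, not a hazard
BRAKE_THRESHOLD = 0.99  # a false automatic braking event can itself cause a crash

def respond_to_detection(collision_confidence: float) -> str:
    """Map a detector's confidence that a collision is imminent to a response."""
    if collision_confidence >= BRAKE_THRESHOLD:
        return "apply automatic emergency braking"
    if collision_confidence >= WARN_THRESHOLD:
        return "issue forward collision warning"
    return "no action"

print(respond_to_detection(0.70))   # issue forward collision warning
print(respond_to_detection(0.995))  # apply automatic emergency braking
```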


just-auto: To what extent is there a move to offer greater functionality and integration of certain sensors for DAS applications?


Ido Amir: This is actually one of the main points of strength a camera has over its competing sensors. OEMs naturally want to spend as little as possible on hardware and receive as many functions as possible. That being said, it is understandable why OEMs, especially those at the leading edge of implementing technology in their vehicles, place feature consolidation very high up the ladder, considering both cost reduction and innovation, which is a main selling tool. A digital camera has a very strong position in this head-to-head sensor competition. Neither radar nor LIDAR has the capability to perform a combination such as the following: lane departure warning, forward collision warning, headway monitoring and warning, pedestrian protection (forward, sides and rear), intelligent high beam control, rain sensing, day/night sensing (for automatic headlamp control, also called automatic low-beam control), obstacle detection (rear looking), traffic sign recognition, and the list goes on.
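As a rough, hypothetical sketch of what that consolidation might look like in software, a single camera frame can feed every function at once; the detector functions below are empty placeholders, not Mobileye APIs:

```python
# Hypothetical sketch: several DAS functions consolidated onto one camera.
# The detector functions are placeholders standing in for real vision
# algorithms; only the structure of the pipeline is being illustrated.

def detect_lane_departure(frame): ...
def detect_forward_collision(frame): ...
def detect_pedestrians(frame): ...
def recognise_traffic_signs(frame): ...
def control_high_beams(frame): ...

def process_frame(frame) -> dict:
    """Run every consolidated function on the same image from the one sensor."""
    return {
        "lane_departure_warning": detect_lane_departure(frame),
        "forward_collision_warning": detect_forward_collision(frame),
        "pedestrian_protection": detect_pedestrians(frame),
        "traffic_sign_recognition": recognise_traffic_signs(frame),
        "intelligent_high_beam": control_high_beams(frame),
    }
```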


just-auto: Could existing reverse parking sensors play a greater role in partial and full parking assistance systems?


Ido Amir: Actually, the trend is clearly to assist the driver actively in as many ways as possible, and the first companies able to relieve drivers of the need to park their cars will capitalise on that. Predicting whether or not the existing ultrasonic parking sensors will be relevant for a ‘full parking assist’ function is not an easy task. Ultrasonic sensors have the advantage of sonar, i.e. the ability to tell there is an obstacle in a certain direction. However, it has been noted that these sensors also have ‘holes’ in their detection capabilities. Moreover, they are not intelligent and accurate enough to ‘tell the whole picture’. Some car OEMs are already equipping their high-end vehicles with an all-around-view set of cameras, which gives the driver something like a ‘bird’s eye view’ of their own car when attempting to park. This allows the driver to park without looking sideways or backwards, which is inching toward the next step: automatic parking. The technological step that needs to be surmounted is to add more artificial intelligence to those systems, and voila! I’m not suggesting that the required AI (software) is easy to create, of course.


just-auto: Could car navigation systems play an important role in detecting vehicle-surrounding conditions?


Ido Amir: Not only could they, they probably should, and the sooner the better. No in-vehicle sensor can see what is not in its line of sight! For example, driving around a bend straight into a line of cars that have rear-ended each other is not going to be prevented by anything that cannot see what is happening beyond the curve you are approaching. For this, intelligent traffic systems that have ‘eyes from above’ are a feasible and excellent solution. Naturally, a better-connected road scene (WiFi, Bluetooth or some other type of wireless communication) would make a huge difference in road safety. Terrible accidents that occur in tunnels, sometimes involving tens of vehicles and always deadly, can be prevented altogether if all cars ‘know’ that one car is now stuck in the tunnel ahead.
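As an illustration of that ‘all cars know’ idea, here is a small hypothetical sketch of the kind of hazard message a stopped, connected car might broadcast to traffic approaching the tunnel; the message schema is invented for illustration and does not follow any real V2X standard:

```python
# Hypothetical sketch: a stalled car encodes a stopped-vehicle hazard as a small
# JSON payload for wireless broadcast, so following traffic learns of it before
# line of sight would allow. The schema is invented for illustration only.

import json
import time

def build_hazard_message(vehicle_id: str, lat: float, lon: float, hazard: str) -> str:
    """Encode a hazard report as a compact JSON string ready for broadcast."""
    return json.dumps({
        "vehicle_id": vehicle_id,
        "timestamp": time.time(),
        "position": {"lat": lat, "lon": lon},
        "hazard": hazard,
    })

# A car stuck inside a tunnel announces itself to vehicles approaching the entrance.
print(build_hazard_message("car-42", 46.2520, 8.0410, "stopped_vehicle_in_tunnel"))
```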


just-auto: Flashing lights and video screens on the dashboard and audible bleeps from the car’s loudspeakers all risk distracting the driver too much. How will the driver respond? Will they use the information correctly? Will it elicit the correct response?


Ido Amir: In fact, many studies have been made on these exact questions, since an alert is futile if it does not elicit an effective driver response. The National Highway Traffic Safety Administration (NHTSA) has issued a long list of recommendations, based on such studies, which define the do’s and don’ts of driver feedback. Whoever adheres to those recommendations should expect to have an effective DAS that has a good chance, statistically, of preventing and mitigating road accidents. These NHTSA recommendations take into account disturbances in the cabin and indicate the correct measures that should be taken in, for instance, a noisy cabin; e.g. they define the type of sounds that should be used as an alert, as well as the volume range that makes an effective warning.
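As a toy illustration of that kind of adaptation, the sketch below scales an audible alert with measured cabin noise so it stays clearly audible without startling the driver; every figure is invented for the example and none are actual NHTSA-recommended values:

```python
# Toy sketch: keep an audible alert a fixed margin above cabin noise, within
# bounds. All figures are invented and do not reflect NHTSA-recommended values.

def alert_volume_db(cabin_noise_db: float,
                    margin_db: float = 15.0,
                    floor_db: float = 60.0,
                    ceiling_db: float = 85.0) -> float:
    """Return an alert level above the cabin noise, clamped to a safe range."""
    return max(floor_db, min(cabin_noise_db + margin_db, ceiling_db))

print(alert_volume_db(55.0))  # quiet cabin -> 70.0
print(alert_volume_db(78.0))  # noisy cabin -> 85.0 (capped at the ceiling)
```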


just-auto: Is there a danger that some drivers will actually drive faster or pay less attention to the possibility of a hazard ahead if they have faith that a gadget will alert them if needed?


Ido Amir: The danger is always there. Therefore companies like Mobileye make sure to state very clearly that having the device in your car in no way replaces the need for safe and alert driving at all times, according to the local laws. To put it less formally: you can’t prevent people from doing stupid things; you can only try your best and make sure all the reasonable warning signs are put up. My personal view is that, in the short term, people ‘test’ new systems (which is not advisable, but happens). In the long term, however, they get used to them and continue to drive according to their natural driving style; this is when DAS really become most effective. Also, as long as the DAS does not take real action, the driver is eventually aware of that, so this is another force that acts in the direction of more sensible driving.


Another point is that having a DAS in your car subconsciously raises the issue of driving safety and the fact that the road is a dangerous place. This, in turn, can and does raise the general sense of alertness – at least from some checking and surveys I’ve done. Just noting that LDW systems were able to reduce between 65 and 85 percent of all lane departure accidents in a few large trucking fleets in the US gives you a strong indication of the net potential of DAS.


To summarise, the overall expected result of widespread implementation of DAS is a considerable drop in crashes.


just-auto: Some people refer to driver assistance systems as the ‘second revolution’ after ESP. How and where do you see the technologies evolving?


Ido Amir: This is a long discussion. In short, I would say that there is room for technologies inside and outside the car that will make the roads of tomorrow safer, starting with ‘smart infrastructure’, through inter-vehicle wireless communication and sat-nav based systems, and ending with both active and passive in-vehicle DAS. The rule of thumb is always: the less drivers need to be responsible for the driving, the fewer accidents we will have.


just-auto: As the algorithms for image processing are continuously evolving and more and more computing power enters the vehicle, could you foresee that in the next decade every new car will have a video camera?


Ido Amir: You have outlined the issue correctly, and in fact Mobileye has just come out with its second-generation vision system-on-chip, the EyeQ2™, which is positioned as a global leader of its kind, having six times the computational capability of its predecessor, the EyeQ1. The EyeQ2 is a low-cost, extremely powerful video processing chip that Mobileye expects to be installed and used in a considerable percentage of the cars manufactured in the upcoming years. Quite a few OEMs and Tier 1 companies are targeting the EyeQ2 as their future chip for DAS applications, and that process is ongoing and gathering speed. That being said, and taking into account the value-to-cost and feature consolidation discussion above, it can well be expected that ten years from now most of the cars around the world will have a camera-based DAS in them.


See also: RESEARCH ANALYSIS: Review of driver assistance systems