Mobileye develops and provides the automotive industry with ADAS capabilities based on monocular vision. Its monocular vision technologies include object (pedestrian and vehicle) detection, lane detection, traffic sign recognition and high/low beam control. These technologies are powered by Mobileye’s system-on-chip (EyeQ) and support a multitude of applications running in parallel. To find out more, Matthew Beecham talked with Dr. Itay Gat, vice president of production programmes, Mobileye Vision Technologies.

To begin, could you give us an idea of the technologies Mobileye is developing?

For example, the lane detection technology supports lane-departure warning as well as lane-keeping and traffic-jam assist. Object detection is the enabling technology for forward-collision warning as well as collision mitigation by braking. ADAS based on Mobileye technology has been released to the market since 2007 by car manufacturers such as BMW, Volvo, GM, Ford, PSA and many others.

In 2013, new applications based on Mobileye technology will be launched in serial production – most notably the vision-only adaptive cruise control (VO-ACC) and automatic emergency braking (AEB). The AEB application released in 2013 will allow for limited braking force; in 2014, a full-braking vision-only AEB will be introduced. This will allow vehicles to receive full points in Euro-NCAP AEB testing and thus achieve a five-star safety rating. All the new systems are based on the existing monocular technologies and add further functionality without the need for more sensors.

Mobileye is at an advanced stage of developing new technologies such as general object detection, 3D world representation based on motion, animal detection, plane-plus-parallax and traffic light recognition – to name just a few. Among the applications that will be based on these technologies are construction zone assist, free space and road profile. These technologies are being developed for sourced business, so that applications based on them can be launched as early as 2014-2016 by various car manufacturers. Another important application of the existing technologies is self-driving autonomous technology, which is going to be unveiled in Q3 2013.

As image-processing algorithms continuously evolve and more and more computing power enters the vehicle, could you foresee that by the end of this decade every new car will have a video camera?

In order to reach standard fitment of ADAS systems, it is necessary to offer a complete solution that is affordable for the general public. The monocular vision solution is the most natural fit for this purpose, as it is the only sensor supporting all required ADAS functionalities. A vision sensor coupled with a powerful system-on-chip can provide a platform that supports all the functionalities (vehicles, pedestrians, lanes, traffic signs etc.) running simultaneously. Vision sensors represent the most affordable solution and allow for the greatest flexibility. This flexibility enables the implementation of some applications in the initial stages (e.g. lane-departure warning and forward-collision warning) and will allow for the inclusion of more sophisticated applications (e.g. VO-ACC) in the future without any additional sensing or computation capabilities. With an installed base of more than 1 million Mobileye EyeQ systems already in 2012, and with expected growth to 5 million Mobileye EyeQs by 2014, it is easy to see how ADAS based on monocular vision could reach every vehicle by the end of the decade.

Radar-based safety technologies such as advanced collision warning and blind-spot detection are becoming commonplace as optional equipment on new vehicles. While the possibilities to “assist the driver” seem endless, is there a risk of information overload?

The key to providing information to the driver without causing overload is the accuracy of that information. A warning has to be issued exactly on time, only when absolutely necessary, and there should be no false warnings. To achieve that, Mobileye’s research both checks true-positive performance (against ground truth and on a test track) and verifies on large clip databases that there are no false positives. Complete validation also necessitates large-scale databases that include many thousands of recorded driving hours. These databases include real-life driving scenarios and are properly balanced to represent a real-world user profile. Both of these efforts are also combined in large test campaigns with the automotive manufacturers, which enable testing of the technology together with the system’s interface to the driver. In such test campaigns it is verified that the system actually provides valuable information to the driver without creating a nuisance effect.

With the coming introduction of Mobileye’s active safety technologies (e.g. AEB), the contribution of ADAS will be even more pronounced. The system will be completely silent for the vast majority of the drive and will kick into action only in the very rare events where an accident is imminent. Here again the key element will be zero false actions. Testing is done on Mobileye databases covering millions of kilometres of driving, ensuring that no false actions are given.

Collision and lane departure warning systems have recently become part of NHTSA’s New Car Assessment Programme.  What are your predictions for market volume / fitment rates to passenger cars and light vehicles over the next few years in North America? 

The NHTSA assessment programme represents a major breakthrough in addressing ADAS. By embracing the need for forward-collision warning, NHTSA has set the standard and defined what is required for a safer car. Creating a standard enables end customers to be sure that their vehicle is actually safer and will reduce accidents. Forward-collision warning (FCW) based on monocular vision is fully compliant with the NHTSA requirements, and as a result we have already noticed growth in market volume since 2011, when such systems were introduced. Our prediction is that this trend will continue, with the combination of attractive pricing and high performance serving as a strong lure for the general public.

Following this initiative, we see that NHTSA is now also pushing towards active systems, in the form of the Collision Imminent Braking (CIB) systems it is currently examining. We anticipate that this initiative will help make vision-based ADAS a standard feature in the coming few years.

For some time, advanced driver assistance systems were the sole preserve of the luxury vehicle class, yet nowadays features such as adaptive cruise control and lane departure warning are being offered on the Ford Focus. How do you see the roll-out of such ADAS technologies across all vehicles in Europe and Asia?

Extensive testing is the critical element required for ADAS technology to be offered in every vehicle. This testing should be based on real-life driving scenarios, and should validate both the true activation of the function and the absence of false actions. Testing real-world scenarios is critical for the mass market. For example, when introducing pedestrian-protection AEB, performance must be demonstrated on a test track under conditions set by Euro-NCAP. In such testing, the common method is based on a ‘dummy’ (simulating a moving pedestrian) that is pushed in front of a vehicle; the detection of this ‘dummy’, including the braking command, is checked. Mobileye has taken this testing to the next level by introducing testing with actual human stunt men. Instead of trying to simulate a real scenario with a static object being moved in front of the car, Mobileye has taken the extra step of validating performance with real people who act in a more natural manner. A fully validated, thoroughly tested technology will ensure the successful roll-out of ADAS in all vehicle segments.

Traffic sign recognition is a clear benefit of camera-based technologies in the West.  How do you see this market evolving in Europe and North America?

Since its introduction in BMW vehicles in 2008, the technology has shown high penetration with rapid growth. Although the first application, speed-limit indication, was offered as an option, it has also become standard in many vehicles. The features being released to the market in 2013 not only provide information on speed limits and no-overtaking indications, but also recognise signs such as stop, no-entry and warning signs. Specifically, the no-entry detection yields a separate application that addresses the dangerous scenario of driving against oncoming traffic. The key elements we see enabling continued growth are extending the types of signs detected, together with worldwide adaptations. Growth in the range of signs offered will enable the development of scene understanding and increase the use of such systems. Given the expected increase in performance, we also see growth in the market share of such applications.

Could existing reverse parking sensors play a greater role in partial and full parking assistance systems?

It is possible to make use of vision technology to improve parking assistance systems. For example, the system could better detect obstacles around the vehicle, making the life-saving distinction between a static object and a child. Furthermore, scene interpretation and accurate measurements would allow for parking capabilities superior to those available today.

The remainder of this interview is available on just-auto’s QUBE research service