
Although autonomous drive technology is available in a handful of OEM models (often via advanced driver assistance systems – ADAS), robotaxi pilots and autonomous buses have been at the forefront of driverless vehicles on the road.
However, companies such as Nissan and Continental are working with Imagry, an autonomous driving software company that has created an ‘HD-mapless’ driving system which it says allows OEMs and Tier 1 automotive suppliers to enable L3 autonomous driving in the passenger vehicles they manufacture.
Imagry’s AI-based autonomous driving software enables a self-driving vehicle to understand the road as it goes. Described as an ‘HD-mapless’ solution, the system allows a vehicle to react to situations and environments in a similar way to a human driver.
We spoke with Eran Ofir, CEO of Imagry, to learn more about the company’s AI-based mapless solution.

Just Auto (JA): Could you provide some background on the company?
Eran Ofir (EO): We have been driving autonomously on public roads for the past five years in the US – in Arizona, California and Nevada – as well as in Germany and in Tokyo. We also have autonomous buses on the road.
We started that activity in 2023. Today we are working with Continental, Nissan, and some others. We are the first autonomous driving company in the world to drive on both sides of the road.
The company itself started in 2015, but it was not initially focused on autonomous driving; it was founded by a group of computer vision scientists who wanted to build a system that would behave like a human brain. The company pivoted to autonomous driving in 2018, and by late 2018 the AI technology was driving on the road.
How does the AI-based HD mapless driving system work?
Like Tesla, we use cameras only, so our entire perception system is based on vision. We are HD-mapless; we don’t need HD maps to be provided to the vehicle in order to drive. We don’t need external input.
We see what’s happening around the vehicle like a human being does. We realise that through an array of neural networks, trained over a long time to recognise traffic lights, traffic signs, lanes, parked vehicles, moving vehicles, pedestrians and everything that is moving, 360 degrees and up to 300 metres around the vehicle.
We create that map of everything on the go, on the fly, the same way that you and I drive, and that map feeds into motion planning, which is also built as an AI. We don’t write what’s called classic code – also known as rule-based code. Rather, we have the kind of AI you see in the movies, where everything is poured into a black box and that black box trains itself over time. In the same way that humans are trained to drive, our system gets better and better at the driving task.
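To make the pipeline described above a little more concrete, the sketch below shows what a camera-only, HD-mapless driving loop can look like in broad strokes: perception networks rebuild a local map from vision every cycle, and that map feeds a learned planner, with no pre-built HD map as input. All names, structures and the toy braking logic are illustrative assumptions for this article, not Imagry’s actual software.

```python
# Hypothetical sketch of a camera-only, HD-mapless driving loop.
# All class/function names and values are illustrative assumptions.
from dataclasses import dataclass, field
from typing import List


@dataclass
class DetectedObject:
    kind: str          # e.g. "pedestrian", "vehicle", "traffic_light"
    x: float           # longitudinal distance from the ego vehicle, metres
    y: float           # lateral offset, metres
    velocity: float    # m/s, 0.0 for static objects


@dataclass
class LocalMap:
    """Semantic map rebuilt every frame from vision alone (no prior HD map)."""
    objects: List[DetectedObject] = field(default_factory=list)
    lane_centre_offset: float = 0.0   # metres from the lane centre


def perceive(camera_frames: list) -> LocalMap:
    """Stand-in for the array of perception networks: turns surround-view
    camera frames into a fresh local map, 360 degrees around the vehicle."""
    # A real system would run several trained networks here; we fake one detection.
    return LocalMap(objects=[DetectedObject("pedestrian", x=25.0, y=1.5, velocity=1.2)])


def plan_motion(local_map: LocalMap, ego_speed: float) -> dict:
    """Stand-in for the learned motion planner: decides throttle/brake/steering
    from the on-the-fly map rather than from rule-based code."""
    nearest = min((o.x for o in local_map.objects), default=float("inf"))
    brake = 1.0 if nearest < 2.5 * ego_speed else 0.0   # toy 2.5-second headway rule
    return {
        "throttle": 0.0 if brake else 0.3,
        "brake": brake,
        "steer": -0.1 * local_map.lane_centre_offset,
    }


if __name__ == "__main__":
    frames = ["front", "rear", "left", "right"]          # placeholder surround cameras
    local_map = perceive(frames)                         # map built on the fly each cycle
    command = plan_motion(local_map, ego_speed=12.0)
    print(command)
```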
We are also hardware agnostic. We let the OEMs and Tier 1 companies decide which computing hardware they would like to use. This is very important for them because they are not willing to base their entire portfolio on a single chip, a single vendor – this is never going to happen.
There will be different hardware for an entry-level vehicle of, say, €30,000, a mid-range vehicle of €60,000, or a high-end vehicle. Each has different hardware, computing set-ups, cameras and capabilities; they won’t drive the same autonomously because of differences in computing power. We know how to run transparently on all these different platforms, so the OEM can get one software stack that runs on all its vehicles, from entry level all the way to premium.
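As a rough illustration of what “one software stack across hardware tiers” can mean in practice, the hypothetical configuration below scales camera count, input resolution and model size with each vehicle segment’s compute budget while the entry point stays the same. The tier names and numbers are assumptions for illustration, not Imagry specifications.

```python
# Hypothetical hardware tiers for one shared driving stack; values are illustrative only.
HARDWARE_TIERS = {
    "entry":   {"cameras": 6,  "input_resolution": (1280, 720),  "planner_model": "small"},
    "mid":     {"cameras": 8,  "input_resolution": (1920, 1080), "planner_model": "medium"},
    "premium": {"cameras": 11, "input_resolution": (3840, 2160), "planner_model": "large"},
}


def load_stack(tier: str) -> None:
    """Same software entry point for every vehicle; only the compute profile changes."""
    profile = HARDWARE_TIERS[tier]
    print(f"Loading {profile['planner_model']} models for {profile['cameras']} cameras "
          f"at {profile['input_resolution']}")


load_stack("entry")   # an OEM would pick the tier matching the vehicle's computing hardware
```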
What are some benefits of having a mapless driving system?
We hear from both the large Tier 1 companies and the OEMs that the way forward is increasingly seen as mapless.
If you need to continuously send maps to vehicles, you need high bandwidth and continuous communication. What about tunnels? Rural areas? Network congestion? It adds a great deal of complexity and cost, and there are latency issues – and latency is the most important parameter of any autonomous driving system.
Europe has regulated and put into practice UN R155, a cybersecurity framework that became mandatory from August 2024. If your vehicle drives by an external map, you inevitably have a risk in your system. Whereas if the vehicle is self-sufficient, it doesn’t need those HD maps to drive on the road and it can drive everywhere; there isn’t a cyber risk in doing that.
What are some use cases where this technology has been used?
The big deployment (and this is where Europe is leading) is autonomous buses. Around 20 countries in Europe are conducting pilots with autonomous buses.
There is a global shortage of bus drivers of around 16%. The younger generation just doesn’t want to drive commercial vehicles. There is a lot of pressure on public transport operators: they need to provide more services, longer hours and greater frequency of buses, but they don’t have the drivers. The solution to that in Europe is autonomous buses.
In the last two months we have submitted proposals for projects in Austria, Germany, the Netherlands and Sweden. We already have two bus projects running autonomously on the road. One of them is currently running in Israel at a very large medical centre and has 18 stops. The other is a commercial line on a public road with 21 stops.
The Japanese government has decided that by the end of 2025 it wants 50 locations with autonomous buses on the road, and by 2027 it wants 100 locations.
In order for the bus to drive autonomously on a public road we needed to pass the NCAP testing regulations for buses. We needed to take it for a test drive and pass with a score of 100, with no faults, over some 90 test scenarios in which the bus drives at speeds of 30-60kph. There are all kinds of tests – vehicles or pedestrians jumping out in front of the bus, for example – and we need to prove that our system is much better than human drivers.
The bus needs to be able to adapt itself to what’s happening. There are many scenarios, and only after you prove that your system can respond correctly and autonomously to all of them will they let you drive autonomously on public roads.
Our response time as humans to things that happen on the road is 300 milliseconds. The response time of our system is less than 100 milliseconds, so we can respond three times faster to events on the road. When people understand that autonomous vehicles are safer than a human driver, I think the change will happen.
What are some barriers facing autonomous driving now?
In the last decade we have all heard of big projects and associated promises about autonomous driving. The barrier isn’t regulation, because regulation adapts itself eventually; the barrier is actually computing.
Within the last two years, all new systems have become AI-based, because you need to process a huge amount of data in real time for the vehicle to drive safely on the road. To process all that data you need a very hefty computing platform that can carry the load and be installed within a vehicle. It’s not cloud computing, it’s edge computing: everything needs to be installed in the vehicle – but until recently that computing capability wasn’t there.
Most of the coverage that you see on the news and on platforms like LinkedIn is about robotaxis. Robotaxis are not the product that will take the mass market. For example, with Waymo [Alphabet/Google], the hardware they put on the vehicle costs around $100,000. You cannot take a hardware and software stack that costs over $100,000 and install it in a vehicle that costs €35,000; there simply isn’t a product-market fit for that. In a robotaxi you can, because it drives 14 hours a day on the road and it’s a different business model: when you pull out the driver, which is 95% of the cost, you can justify a return on investment within two years.
There are around 15 robotaxi companies, but none of that technology is going to make it into passenger vehicles because it doesn’t fit that market.
At the moment we have Tesla, the Mercedes-Benz S-Class and the BMW 7 Series; those are the three vehicles approved for Level 3 autonomy on the road. The BMW and Mercedes systems are highway-only, so really it’s only Tesla at the moment.
What do you see the future holding for commercial autonomous driving?
My prediction is that by 2030 many of us won’t be driving our own vehicles. I think it will start with the younger generations, and we see it with the buses: the young are jumping onto the bus and immediately taking out their phones to shoot videos, while the elderly are hesitant.
It’s always about the early adopters and what follows; by 2030 there will be a double-digit percentage of drivers using autonomous driving capabilities.