Harman says its new dynamic map layers solution identifies differences between on-board map data and real-world information being captured through a vehicle’s Advanced Driver Assistance Systems (ADAS), navigation system and on-board sensors.

The solution is already deployed in the market with an unnamed German automaker.

The map solution uses data collected from cameras and other on-board sensors to recognise road signs in the surrounding environment and compares them with the digital map information held by the onboard navigation system.

If a difference is detected, the information is anonymised (stripped of identifying data) and sent to the cloud, where Harman’s scalable cloud platform analyses it alongside data collected from other similarly equipped production vehicles.
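The on-board step described above can be sketched as a simple compare-and-report routine. This is an illustrative sketch only, not Harman's implementation: the data shapes, the `map_value` lookup, and the anonymisation choices (dropping the vehicle identifier, coarsening the location) are all assumptions.

```python
from dataclasses import dataclass, asdict
from typing import Optional

@dataclass
class SignObservation:
    sign_type: str   # e.g. "speed_limit"
    value: int       # e.g. 80 (km/h) as read by the camera
    lat: float
    lon: float
    vehicle_id: str  # identifying field, stripped before upload

def map_value(sign_type: str, lat: float, lon: float) -> int:
    """Stand-in for the onboard map lookup (hypothetical values)."""
    return 100  # pretend the stored map says 100 km/h here

def build_report(obs: SignObservation) -> Optional[dict]:
    """Return an anonymised difference report, or None if the observation
    matches the onboard map (nothing to upload)."""
    if obs.value == map_value(obs.sign_type, obs.lat, obs.lon):
        return None
    report = asdict(obs)
    del report["vehicle_id"]                  # strip identifying data
    report["lat"] = round(report["lat"], 3)   # coarsen location (~100 m)
    report["lon"] = round(report["lon"], 3)
    return report

obs = SignObservation("speed_limit", 80, 48.13512, 11.58198, "VIN123")
print(build_report(obs))
```

The key design point is that only *differences* leave the vehicle, and only after identifying fields have been removed.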

Using spatial machine learning techniques, the solution can then, in real time, deliver updates to its model of the road network. The map layers also push those details back to vehicles, keeping the connected car and its driver informed about road conditions ahead.
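The cloud-side aggregation can be illustrated with a minimal consensus rule: only accept a map change once several independent vehicles report the same new value at the same location. This is a toy sketch, not Harman's spatial machine-learning pipeline; the report tuple shape and the `MIN_REPORTS` threshold are assumptions.

```python
from collections import Counter, defaultdict

# Each report: (lat, lon, sign_type, observed_value) — hypothetical shape,
# already anonymised and location-coarsened on the vehicle.
reports = [
    (48.135, 11.582, "speed_limit", 80),
    (48.135, 11.582, "speed_limit", 80),
    (48.135, 11.582, "speed_limit", 100),  # outlier / stale observation
    (48.135, 11.582, "speed_limit", 80),
]

MIN_REPORTS = 3  # require agreement from several independent vehicles

def confirmed_changes(reports, min_reports=MIN_REPORTS):
    """Group reports by location and sign type; emit a map update only
    when enough vehicles agree on the same new value."""
    buckets = defaultdict(Counter)
    for lat, lon, sign_type, value in reports:
        buckets[(lat, lon, sign_type)][value] += 1
    updates = {}
    for key, counts in buckets.items():
        value, n = counts.most_common(1)[0]
        if n >= min_reports:
            updates[key] = value
    return updates

print(confirmed_changes(reports))
# → {(48.135, 11.582, 'speed_limit'): 80}
```

Requiring corroboration from multiple vehicles is what lets a crowd-sourced layer filter out misreads and out-of-date observations before anything is pushed back to the fleet.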

The solution will keep ADAS and navigation systems up to date with speed limit changes, warn drivers of upcoming construction zones and alert them to any other signs they may encounter on the road.

Harman’s solution is based on the Navigation Data Standard (NDS), meaning dynamic map layer information can be shared among different vehicle makes and models that also use NDS for navigation.

As the solution is deployed across automakers, it will deliver broader industry-wide benefits, as OEMs can collaborate on building comprehensive, real-time representations of the road network.

As the number of sensors in cars increases, the same standards-based solution can be used to update existing road maps and to support the high-definition map content required for safety and autonomy applications within the connected car.

“By harnessing Harman’s software and experience with ADAS and navigation, we have created an automotive grade solution that will keep drivers safe,” said Mike Tzamaloukas, vice president of the Autonomous Drive Business Unit in Harman’s Connected Car Division.

“As more cars on the road begin using this technology, our deep neural network algorithms will develop guardian driver skills that we will all grow accustomed to using in our everyday drive experience.”