Not so long ago, the Consumer Electronics Show (CES) was the place to exhibit the latest in computers, mobile devices and other electronic gadgets. It still is, although the emphasis is changing. Every year, this Las Vegas tech fest attracts a rising tide of automotive suppliers taking space both inside and outside the convention centre halls. This month’s management briefing turns the spotlight on some of those exhibitors operating in the brave new auto world of voice control, artificial intelligence (AI) and machine learning. Now we’re talking.
For our next trick
Those unveiling autonomous driving concepts at CES included Chrysler, Toyota, Honda and Faraday Future. Meanwhile, BMW and Hyundai used the show to demonstrate technologies alongside a number of suppliers, from Autoliv to ZF. Several supplier-OEM technology partnerships were also announced, notably between Autoliv and Volvo Cars, Mobileye and NextEV, and ZF and NVIDIA.
Although a number of mind-boggling automated driving and connectivity technologies were also revealed at the North American International Auto Show (NAIAS), the CES stole much of the limelight.
Connectivity is king
Both shows, however, further demonstrated how connectivity between the vehicle, passengers and outside world has become a key priority for new car buyers. While the typical considerations of fuel consumption, performance and cabin comfort are still uppermost, staying connected while driving is moving up their list of priorities.
In terms of how things could shape up, Hyundai Motor has drawn up a connected car roadmap, introducing four main service fields as part of its ‘hyper-connected intelligent cars’ concept. The mid- to long-term development focus includes smart remote maintenance services, autonomous driving, smart traffic and a connected mobility hub, all of which will benefit from continued R&D investment in in-vehicle networks, cloud and big data analytics, and connected car security technologies. Hyundai also used the CES to display a number of technologies and confirm its partnership with Cisco to produce the ‘ultimate connected car.’
Continuing the connectivity theme, Ford and Amazon have teamed up to offer drivers the ability to access their car from home, and call up other features from their vehicle via Alexa – Amazon’s cloud-based voice service. While Amazon has managed to get this voice-control system into pretty much everything, its move into cars marks another major milestone.
Yet Ford is not the only carmaker offering voice assistance; Nissan and BMW stole some of Ford’s media thunder by revealing they will introduce Microsoft’s Cortana into cars.
While gesture recognition is used in consumer electronics, it is now creeping into vehicles. Given that prodding and pinching at a touchscreen while driving can be distracting, using hand gestures and even eye movements to control climate and switch radio channels is seen as the next best thing.
Smudgy car touchscreens may soon be a thing of the past if BMW’s vision of tomorrow’s car interior becomes a reality. The luxury carmaker used the CES to dazzle us with an array of technologies, including its so-called HoloActive Touch system, itself a development of gesture-controlled technology that debuted in 2014. This new version acts like a virtual touchscreen; its free-floating display is operated using finger gestures and confirms the commands with what the driver perceives as tactile feedback.
Visteon is taking gesture control to a new level with a proprietary 3-D gesture recognition concept. The supplier used the most recent Beijing motor show to demonstrate the concept in a multi-purpose vehicle. The system reads defined hand movements to command certain features, using time-of-flight camera technology and high-performance image-processing algorithms. “The system recognises specific gestures such as holding up one, two or three fingers to perform different functions such as operating the windows, changing audio volume or opening the glove box,” said Matthew M. Cole, Visteon’s head of product development. “This provides quicker access, without the need to touch buttons or look for knobs. The system distinguishes between driver and passenger hand gestures, and also allows customisable gestures.”
Time-of-flight technology is based on the time it takes for light to travel from the source to the object and back to the camera’s sensor. By providing distance images in real time, the time-of-flight camera enables close-range gesture control in the cockpit.
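The arithmetic behind time-of-flight ranging is simple: since the measured interval covers the round trip from emitter to object and back, the distance is half the travel time multiplied by the speed of light. A minimal sketch (illustrative only; real ToF cameras measure phase shifts or pulse timing per pixel and correct for noise and calibration):

```python
# Time-of-flight ranging: distance from the round-trip travel time of light.
# Illustrative sketch only -- not any supplier's actual implementation.

C = 299_792_458.0  # speed of light in m/s

def tof_distance_m(round_trip_s: float) -> float:
    """Distance to the object, given the measured round-trip time."""
    return C * round_trip_s / 2.0

# A hand about 0.5 m from the camera returns light in roughly 3.3 nanoseconds.
t = 2 * 0.5 / C
print(f"{tof_distance_m(t):.3f} m")  # -> 0.500 m
```

The tiny intervals involved are why close-range in-cabin gesture sensing needs fast, specialised sensor hardware rather than an ordinary camera.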
Adient – seats for self-driving cars
Adient used the NAIAS in Detroit to unveil a futuristic seating system for the driverless car. Known as AI17, it sets out the supplier’s vision of how automated driving will impact the driver and passenger experience in tomorrow’s premium car. Adient was created when Johnson Controls’ automotive seating business was spun off as an independent company last year.
According to Richard Chung, Adient’s head of innovation, the AI17 showcases solutions for Level 3 and Level 4 autonomous vehicles. For example, it incorporates a Greeting/Conversation Mode that allows the front seat to swivel, enabling the occupant to get in and out of the car more easily or to face the rear passengers during automated driving scenarios.
Autoliv – advanced positioning and ADAS
Given that accurate and reliable positioning and localisation are critical for today’s vehicles incorporating ADAS, and even more essential for tomorrow’s self-driving cars, Autoliv used CES to take the wraps off its sixth-generation location and positioning module. The Swedish safety supplier also announced that it is forming a joint venture with Volvo Cars, known as Zenuity, to develop software for autonomous driving and driver assistance systems.
Continental – 360-degree view of a vehicle’s surroundings
Continental used CES to tell the world how it is evaluating the next generation of an environment model to deliver a true-to-life, 360-degree view of a vehicle’s surroundings. A reliable environment model requires a range of information, for example of other traffic participants, of static objects such as road boundaries, of the vehicle’s own precise location, and of traffic control measures. “In order for the system to acquire this information step-by-step, a range of sensors such as radars, cameras, and Surround View systems are needed,” said Continental’s ADAS business unit head Karl Haupt. “The aim is to achieve an understanding of the vehicle’s surroundings which is as good as or better than a person’s own understanding. More range, more sensors, and the combination of acquired data with powerful computer systems will help to sharpen the view and is the key to achieving a consistent view of our surroundings.”
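The step-by-step build-up Haupt describes, merging detections from radar, camera and surround view sensors into one consistent picture, can be sketched in miniature. All the names and the simple nearest-neighbour averaging rule below are our own illustration, not Continental’s software, which uses tracking, uncertainty models and far richer object descriptions:

```python
# Toy 360-degree environment model: merge object detections from several
# sensors into one fused object list. Illustration only.
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str          # e.g. "radar", "camera", "surround_view"
    bearing_deg: float   # direction of the object around the vehicle
    range_m: float       # distance from the vehicle

def fuse(detections, bearing_tol=5.0, range_tol=2.0):
    """Group detections that plausibly refer to the same object and
    average each group into a single fused object."""
    fused = []
    for d in detections:
        for obj in fused:
            if (abs(obj["bearing_deg"] - d.bearing_deg) < bearing_tol
                    and abs(obj["range_m"] - d.range_m) < range_tol):
                n = obj["hits"]
                obj["bearing_deg"] = (obj["bearing_deg"] * n + d.bearing_deg) / (n + 1)
                obj["range_m"] = (obj["range_m"] * n + d.range_m) / (n + 1)
                obj["hits"] = n + 1
                obj["sensors"].add(d.sensor)
                break
        else:
            fused.append({"bearing_deg": d.bearing_deg, "range_m": d.range_m,
                          "hits": 1, "sensors": {d.sensor}})
    return fused

# Radar and camera both see a car ahead; surround view sees one behind.
dets = [Detection("radar", 10.0, 30.0), Detection("camera", 11.0, 29.5),
        Detection("surround_view", 180.0, 5.0)]
objects = fuse(dets)
print(len(objects))  # -> 2 distinct objects: one ahead (two sensors agree), one behind
```

The point of the exercise is corroboration: an object reported by two independent sensors can be trusted far more than a single detection, which is why adding more sensors “sharpens the view”.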
Delphi – Vehicle-to-Anything communications
Among Delphi’s CES announcements was an update on its work with AT&T and Ford to develop a new capability for Vehicle-to-Anything (V2X) communications. The platform is designed to help vehicles ‘talk’ to each other and to smart city infrastructure in order to improve safety and vehicle security and reduce traffic congestion. The co-developed research monitors traffic conditions and alerts drivers, via the AT&T LTE network, to approaching vehicles. For its part, Delphi developed the on-board V2X module and AT&T the software for the analytics platform, while Ford integrated both into a proof of concept to demonstrate the capability.
With software and services playing an increasingly critical role in all areas of the automotive industry, Delphi used CES to reveal it has acquired Movimento, a provider of Over-the-Air (OTA) software lifecycle and data management for the automotive sector.
Gentex – rear vision technologies
Mirror maker Gentex used the CES to reveal a three-camera rear vision system that streams rear video to a rearview-mirror-integrated display. The cameras provide a view of the sides and rear of the vehicle. The side-view cameras are housed in downsized exterior mirrors; their video feeds are combined with that of a roof-mounted camera and stitched together into multiple composite views, which are streamed to the driver via the mirror-integrated display.
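In the simplest case, ‘stitching’ just means placing the three synchronised frames side by side into one panoramic image. A toy version is below; this is purely illustrative, as production systems like Gentex’s must also warp, align and blend the images to hide the seams:

```python
# Toy composite view: concatenate three synchronised camera frames
# (left mirror, roof-mounted rear, right mirror) into one panorama.
# Purely illustrative -- real systems warp, align and blend the images.

def stitch(left, rear, right):
    """Each frame is a list of rows, each row a list of pixel values.
    All frames must share the same height (row count)."""
    assert len(left) == len(rear) == len(right), "frame heights must match"
    return [l + c + r for l, c, r in zip(left, rear, right)]

# 2x2 greyscale frames, distinguished by brightness.
left  = [[10, 10], [10, 10]]
rear  = [[50, 50], [50, 50]]
right = [[90, 90], [90, 90]]

panorama = stitch(left, rear, right)
print(panorama[0])  # -> [10, 10, 50, 50, 90, 90]
```

Streaming the result to a mirror-integrated display rather than separate screens keeps the driver’s glance pattern close to that of a conventional rearview mirror.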
Harman – intelligent car cockpit
Harman manoeuvred a Chrysler Pacifica into the Hard Rock Hotel & Casino in order to showcase its intelligent car cockpit. The system supports an integrated digital assistant and car-to-home Internet of Things (IoT) functionality, as well as augmented reality navigation. The electronics specialist says it will be production ready within the next two to three years.
Harman also used the Las Vegas hi-tech show to reveal the fruits of its labours with WayRay, a Swiss-headquartered pioneer in holographic AR displays. The partners are developing a wide-view-angle, full-windscreen head-up display proof of concept for the automotive market. WayRay specialises in holographic optical elements that enable an augmented reality (AR) projection system that is, it claims, more compact than traditional mirror- and lens-based projection technology and allows for high-resolution projection in the driver’s direct line of sight.
Mobileye – developing Level 4 autonomy
Mobileye let it be known while the CES was in full swing that it has sealed a deal with NextEV to jointly develop differentiated Level 4 autonomy for NIO USA vehicles by 2019. NIO USA vehicles will be built upon the Car 3.0 stack, a set of integrated modular technologies and infrastructure conceived from the ground up for Level 4 autonomous electric vehicles. In its entirety, say the partners, the Car 3.0 stack will deliver transformational, digitally immersive user experiences for their customers. NIO USA plans to launch its first vehicles in 2019. See also our interview with NextEV’s US CEO.
Nexteer – tomorrow’s steering systems
Nexteer Automotive unveiled two new steering technologies at the NAIAS: Steering on Demand System and Quiet Wheel Steering. Steering on Demand enables safe, intuitive transitions between driver and automated control for vehicles capable of SAE Level 3–5 automated driving. Quiet Wheel Steering modifies the steering wheel’s rotation when the vehicle completes an automated directional change. Because the steering wheel remains still during automated driving, the potential hazards of a fast-rotating steering wheel are eliminated and the driver’s safety and sense of security are enhanced, claims the supplier.
Nexteer Automotive also used a press conference at the NAIAS to reveal a deal to form a 50/50 joint venture with Continental to develop motion control systems and actuator components for automated driving. The venture will combine Nexteer’s ADAS technologies with Continental’s portfolio of automated driving and advanced braking technologies to accelerate advancements in vehicle motion control systems. Frank Lubischer, Nexteer’s head of global engineering, said the partnership will pave the way for both companies to collaborate on integrating electronic brake and steering systems.
Panasonic – Android-based in-vehicle infotainment system
Panasonic Automotive was also among the suppliers taking up prominent space at CES this year. The supplier is working with Qualcomm Technologies, a subsidiary of Qualcomm Inc, to develop a next-generation Android-based in-vehicle infotainment (IVI) system. The showcase featured the system built upon Android’s automotive features, controlling in-vehicle functions such as HVAC and demonstrating the integration of Google’s services and popular Android applications. It is designed to address the challenges passengers face when entering navigation destinations or controlling music. Panasonic also showcased its vision of a four-seater autonomous cabin concept, incorporating swivelling front seats and connected interactive tables for all occupants, each featuring a 4K-resolution screen that can be used as a touch display, a foldable table, or both. This space-age cabin also features an array of touchscreens and augmented reality wizardry to keep the occupants occupied.
Visteon – transforming the future of mobility
While head-up displays (HUDs) have been in cars for more than 20 years, OE fitment rates have grown only slowly. But the HUD market is predicted to flourish as more cars are equipped with ADAS and drivers demand graphically rich data, visual alerts and greater levels of connectivity. As part of a wave of affordable smart windscreen technology, the Next Big Thing is augmented reality HUDs (AR-HUDs), designed for a more comfortable and safer driving experience.
For its part, Visteon Corp used the CES to show off its latest advancements in HUD technology, including an augmented reality driving experience.
We caught up with Visteon’s president and CEO, Sachin Lawande, on the eve of the show. In explaining the supplier’s technology highlights, he said: “We are introducing our next-generation Phoenix infotainment platform and showcasing advancements across our cockpit electronics portfolio, including the latest in HUD and all-digital instrument clusters. We have nearly 40 technology displays at CES, headlined by Phoenix, which is designed to unlock innovation from third-party app developers while incorporating cybersecurity at every stage of development. It is also fully upgradeable over-the-air.”
Another highlight from Visteon’s showcase is augmented reality windshield HUD technology which is shown integrated into a vehicle. Lawande added: “This demonstration vehicle features an enhanced driver information system that utilises augmented reality and multi-modal feedback using audio and lighting to provide enhanced safety.”
Other technologies on display at the supplier’s booth included SmartCore, an industry-first cockpit domain controller, and a 12.3-inch dual curved plastic OLED display, which epitomises the trend towards larger, higher-resolution displays. “We’re also showing an informational ADAS instrument cluster,” said Lawande. “It’s a digital cluster with an integrated 12.3-inch display and two cameras for facial recognition and driver monitoring.”
ZF – collaborating with Nvidia to develop a Level 4 automated driving solution
As we hear more about the opportunities for artificial intelligence (AI) in autonomous vehicles, one innovation that caught our eye from ZF is its so-called ProAI. It is the German supplier’s first system developed using Nvidia AI technology, designed to enable vehicles to better understand their environment by using deep learning to process sensor and camera data.
In explaining the unique features of ProAI, Dr Malgorzata Wiklinska, manager of advanced R&D at ZF’s Denkfabrik, told just-auto: “ZF and Nvidia are providing a computing platform for multiple applications for automated operations, including automated driving up to Level 4. The solution is based upon Nvidia’s PX2 processor. ZF made it automotive-ready with regard to temperature, NVH [noise, vibration and harshness] and durability, and implemented all necessary interfaces for automated operations. For higher levels of automation, ZF ProAI will be based upon the next generations of processors currently under development.”
In terms of the opportunities that ProAI will open, Wiklinska added: “With ProAI, artificial intelligence will enter the automotive sector and other applications, a level of intelligence never before seen in the auto industry. ZF will be the Tier One in this case, and the vehicle manufacturers will work with a well-known, reliable partner and gain multi-platform capability and, of course, access to the competencies of both ZF and Nvidia. The ZF ProAI will be offered to OEMs by 2018 in order to meet the requirements for higher levels of automated driving.”
Is this the future?
While giving instructions in our cars is nothing new, putting questions to the likes of Alexa and Cortana while on the road is a novelty. And although waving goodbye to knobs and buttons will make the cockpit feel less cluttered, time will tell whether it makes the driving experience easier. Driverless cars are being pitched to the public as desirable, but are they really? Many of us like driving, most of the time. Consumer awareness of ADAS features is certainly increasing: a number of car models now offer some form of parking assistance, either as standard or as an option. Yet not everyone likes the idea of driverless cars running on our roads, and a sudden malfunction could leave you powerless. Hey Siri, does the car’s algorithm cover the simple but unexpected football rolling into the road? Alexa, what will the autonomous car feel like to ride in? OK Google, who is to blame if something goes wrong? Don’t hold your breath for clear answers. We are moving into uncharted territory.