Among those traditional automakers taking space at this year’s Consumer Electronics Show (CES) – and stealing much of the limelight – were Audi, BMW, Ford, GM, Mercedes-Benz, Toyota, Hyundai-Kia and VW. Presentations from new names such as Faraday Future and potential disrupters Google and Apple gave the show an extra buzz. The accent was on personalisation of the driver experience via cloud connectivity, shared autonomous vehicles and even drone technology. In this month’s management briefing – the first of two parts – we draw on automakers’ plans for partially and fully autonomous cars to see what we can learn from it all. Our second part turns the spotlight on supplier ADAS innovations – from Autoliv to ZF TRW – that caught our eye in Vegas and Detroit over the past few weeks.
Automotive technologies have become part of the CES landscape
Over the past few years, the annual CES in Las Vegas has attracted increasing interest from the automotive industry.
Today’s advanced driver assistance systems (ADAS) use a combination of warnings and some degree of active intervention to help steer the driver away from trouble. Although the emphasis is on giving assistance to the driver rather than taking control away, motorists are still wary about cars that supposedly drive themselves. It is certainly true that a sudden malfunction in such a car could leave you frighteningly powerless. Aside from the question of who is to blame if something goes wrong, it will take some time for drivers and pedestrians alike to feel comfortable with and around such technology. While driverless cars are being pushed to the public as desirable, are they really? Many of us like driving cars, most of the time. Yet consumer awareness of these ADAS features is increasing. We are seeing more TV advertising from a number of automakers, and buyers are developing an expectation that such advanced safety features be “built in” to their vehicles as standard.
While self-driving cars are not yet available to the public, there are increasing numbers of models offering some form of advanced assistance to the driver. These include adaptive cruise control (ACC), forward collision warning (FCW), autonomous emergency braking (AEB), lane departure warning (LDW) and traffic sign recognition (TSR). Yet a theme running through the recent CES and Detroit motor show is that such technologies are just the tip of the iceberg among production cars. They are becoming more affordable, too. For instance, forward collision warning and lane keeping assistance have already made their way into the Honda Civic and Hyundai Elantra. On the basis of what we have just seen, autonomous driving looks set to be an even bigger theme in 2016.
Predictions as to when the first truly driverless car will arrive vary depending on who we talk to. Even if the more optimistic predictions come true, initial volumes will be small. For its part, Ford is tripling the number of autonomous test vehicles it has on the road this year, meaning it has more self-driving cars in the pipeline than anyone else. Before Kia’s fully autonomous car hits the road, the automaker believes that partially autonomous Drive Wise technology could be ready by 2020. Kia’s ambitions to be one of the early birds to sell such cars are not without foundation. The automaker was the second (after Ford) to be granted a licence to test self-driving cars in Nevada, albeit with a number of restrictions imposed. Kia is reported to be investing some US$2bn into fast-tracking self-driving car development between now and 2018. Other notable features coming out of Kia at CES included a fingerprint reader (to adjust the seating position, music and climate control to suit the driver’s preferences) and gesture control.
Indeed, gesture recognition is said to be the Next Big Thing. Rotating your finger clockwise at a screen could turn up the volume, or a finger gesture could answer or decline a call. Such novelties will rely on sensors and cameras in the cockpit.
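To make the idea concrete, here is a minimal sketch of how a cockpit system might map recognised gestures to infotainment actions. The gesture names, the 0-10 volume scale and the dispatch-table approach are our own illustrative assumptions rather than any automaker’s actual implementation.

```python
# Illustrative sketch only: a dispatch table mapping recognised cockpit
# gestures to infotainment actions. Gesture names and the 0-10 volume scale
# are assumptions for illustration.

def handle_gesture(gesture, state):
    """Apply the action associated with a recognised gesture, if any."""
    actions = {
        "rotate_clockwise": lambda s: s.update(volume=min(s["volume"] + 1, 10)),
        "rotate_anticlockwise": lambda s: s.update(volume=max(s["volume"] - 1, 0)),
        "swipe_right": lambda s: s.update(call="answered"),
        "swipe_left": lambda s: s.update(call="declined"),
    }
    action = actions.get(gesture)
    if action:
        action(state)
    return state

if __name__ == "__main__":
    state = {"volume": 5, "call": "incoming"}
    for gesture in ("rotate_clockwise", "rotate_clockwise", "swipe_right"):
        handle_gesture(gesture, state)
    print(state)  # {'volume': 7, 'call': 'answered'}
```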
It is, of course, reasonable to assume that by 2030 mirrorless cars will be commonplace. Back-up cameras have recently been mandated, while Tesla (and others) are calling for more flexibility on mirror requirements which date back to a rule made in 1968 (in the US, at least). But changing these rules will take some time and effort. Valeo used the 2015 IAA to present its so-called Sightstream, a new camera system that replaces conventional rearview mirrors. In a US first, Valeo used the 2016 CES to highlight this system as well. And one reason why BMW hogged some of the limelight at the CES in Vegas recently was its mirrorless concept of the hybrid i8 supercar. Swapping mirrors for cameras could be an emerging trend. We are told mirrorless technology is coming on the Cadillac CT6, while Audi has already toyed with the idea in its Le Mans racers.
Although much of the technology needed to operate self-driving cars has been developed, the laws that allow such vehicles on our roads are some way behind. While some automakers are predicting fully autonomous cars by 2030, there is still a lot of red tape to wade through before then. It will take some time to convince regulators – and gain the public’s trust – that it is fine to hand over control of a car travelling at 70mph to a computer.
Autonomous drive will also bring a paradigm shift in demands on mobile networks. Volvo and Ericsson believe that this shift will see an increased need for consistent, high-bandwidth coverage outside densely populated areas such as city centres and suburbs.
At the 2016 Society of Automotive Analysts outlook conference, we heard Brian Johnson, analyst at Barclays, set out a future in which ‘family autonomous vehicles’ and ‘shared autonomous vehicles’ replace much of the vehicle parc that we have today. In his model, the vehicles in the parc are used much more intensively (an average of 64,000 miles a year versus 11,000 today). According to his calculations, that would mean a net 40% reduction in the light vehicle market in the US (around 10m units a year versus the current 17m norm).
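As a quick sanity check, the headline figures quoted above can be restated in a few lines. The calculation below simply reproduces the numbers in the article; it makes no attempt to replicate the underlying Barclays model.

```python
# Back-of-envelope restatement of the figures quoted above. The numbers come
# from the article; no attempt is made to reproduce the underlying model.

miles_today, miles_future = 11_000, 64_000            # annual miles per vehicle
sales_today, sales_future = 17_000_000, 10_000_000    # US light vehicle sales

utilisation_ratio = miles_future / miles_today        # how much harder each car works
sales_reduction = 1 - sales_future / sales_today      # implied fall in annual sales

print(f"Utilisation ratio: {utilisation_ratio:.1f}x per vehicle")   # ~5.8x
print(f"Implied reduction in annual sales: {sales_reduction:.0%}")  # ~41%
```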
Automaker roundup
Audi
Audi used its exhibition space at CES to reveal the e-tron quattro concept, a conceptual study of an all-electrically powered sport SUV. The concept carries piloted driving technologies for use in traffic jams and when parking. The core component of future systems, says Audi, will be the central driver assistance controller, known as the zFAS. Information is continually acquired from all of the car’s sensors and processed in this compact module. The inputs include signals from the 3D cameras, the laser scanner and the radar and ultrasonic sensors. The high computing power of the zFAS gives it the ability to continually compare the data from the vehicle’s sensors against the environmental model of the road. In early 2015, an A7 piloted driving concept with a large number of series production and near-series production technologies on board drove from Stanford in Silicon Valley via Bakersfield to the CES in Las Vegas.
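As a rough illustration of what continually comparing sensor data against an environmental model can involve, the sketch below fuses distance readings for the same object from several sensor types into a single estimate. The sensor list, the data structures and the naive averaging are illustrative assumptions only, not a description of the zFAS itself.

```python
# Illustrative sketch of a central controller's fusion step: merge distance
# readings for the same object from several sensor types into one estimate.
# The sensor list and the naive averaging are assumptions for illustration.

from dataclasses import dataclass
from statistics import mean

@dataclass
class Detection:
    sensor: str        # e.g. "camera", "radar", "lidar", "ultrasonic"
    object_id: int     # identifier of the tracked object
    distance_m: float  # measured distance to that object

def fuse(detections):
    """Average each object's distance readings across all reporting sensors."""
    by_object = {}
    for d in detections:
        by_object.setdefault(d.object_id, []).append(d.distance_m)
    return {obj: mean(distances) for obj, distances in by_object.items()}

if __name__ == "__main__":
    frame = [
        Detection("camera", 1, 42.0),
        Detection("radar", 1, 41.2),
        Detection("lidar", 1, 41.5),
        Detection("ultrasonic", 2, 3.1),
    ]
    print(fuse(frame))  # object 1 fused to ~41.6 m, object 2 to 3.1 m
```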
Audi is also participating in the ‘Cooperative Highly-Automated Driving’ initiative (Ko-HAF). The aim of Ko-HAF is to develop standards and technologies that enable cooperative driving between highly-automated vehicles in everyday road traffic – for example merging with traffic on highways. The project work covers various subject areas from computer simulations to test drives on closed sites and later on public roads. In addition to Audi, other German car manufacturers, suppliers and universities are involved in the project. The initiative is funded by the German Federal Ministry of Economics and Energy.
BMW
BMW also attracted a lot of attention at the CES for its i Vision Future Interaction concept car. The model on display, based on an i8, gave a glimpse of future interface design and new interaction methods, dispensing with the automaker’s iDrive rotary selector. In the cockpit, driver information is delivered through a head-up display, an instrument cluster with a 3D display and a 21-inch panorama display controlled by a new gesture recognition system, dubbed AirTouch. These gesture controls build on the technologies seen in the 2016 BMW 7 Series. Three driving modes are selectable on the steering wheel, namely Pure Drive (that’s where you take the wheel), Assist (where a variety of ADAS technologies intervene as and when needed) and Auto Mode (ah, the golden button allowing the car to take over the task of driving). The mirrorless concept of the hybrid i8 supercar also attracted interest. Here, three wide-angled ‘smart’ cameras mounted on the doors and rear window replace the rearview mirrors. The image of the car’s immediate surroundings is fed to a display positioned in place of the interior rearview mirror.
Ford
Ford revealed at the CES that it is “exploring” linking smart devices like Amazon Echo and Wink to its vehicles to allow owners to control lights, thermostats, security systems and other features of their homes from their car, and to stop, start, lock, unlock and check their vehicle’s fuel range remotely. The automaker said it was working to link the home automation devices with its vehicles through its SYNC platform. Citing Icontrol Networks, it said half of consumers will buy at least one smart home product in the next year. Ford already has 15m SYNC-equipped vehicles worldwide and expects 43m by 2020.
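By way of illustration, the sketch below shows how a spoken home-assistant command might be routed to a simple vehicle action. Every class and method name here is hypothetical; this is not Ford’s SYNC API, merely an outline of the kind of bridging being described.

```python
# Hypothetical sketch of bridging a home-assistant command to a vehicle
# action. Class and method names are invented for illustration; this is not
# Ford's SYNC API.

class HypotheticalVehicle:
    def __init__(self):
        self.locked = True
        self.fuel_range_km = 420

    def lock(self):
        self.locked = True

    def unlock(self):
        self.locked = False

def handle_home_command(vehicle, command):
    """Route a spoken command from the house to a simple vehicle action."""
    if command == "lock my car":
        vehicle.lock()
        return "Car locked."
    if command == "unlock my car":
        vehicle.unlock()
        return "Car unlocked."
    if command == "what is my fuel range":
        return f"Estimated range: {vehicle.fuel_range_km} km."
    return "Sorry, I did not understand that."

if __name__ == "__main__":
    car = HypotheticalVehicle()
    print(handle_home_command(car, "what is my fuel range"))
    print(handle_home_command(car, "unlock my car"))
```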
Ford also points to significant ongoing advances in sensor technology. It is using Velodyne’s newest LiDAR sensors – named Solid-State Hybrid Ultra PUCK Auto for their hockey puck-like size and shape – on its third-generation autonomous vehicle platform. The Ultra PUCK sensors boast a longer range of 200 metres, making them the first auto-specific LiDAR sensors capable of handling different driving scenarios, Ford says. Ultra PUCK will accelerate the development and validation of Ford’s virtual driver software, which serves as the decision-making brain that directs vehicle systems.
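To give a flavour of how a decision-making layer might consume LiDAR returns, the sketch below keeps points within the sensor’s quoted 200-metre range, looks for the nearest return in the vehicle’s forward corridor and makes a crude braking decision. The corridor width and braking threshold are assumed values for illustration, not Ford or Velodyne parameters.

```python
# Illustrative sketch: filter LiDAR returns to the sensor's quoted range, find
# the nearest obstacle in the vehicle's forward corridor and make a crude
# braking decision. Corridor width and braking distance are assumed values.

MAX_RANGE_M = 200.0          # quoted sensor range
CORRIDOR_HALF_WIDTH_M = 1.5  # assumed half-width of the vehicle's path
BRAKE_DISTANCE_M = 30.0      # assumed threshold for a braking decision

def nearest_obstacle_ahead(points):
    """points: iterable of (x_forward_m, y_lateral_m) returns.
    Return the closest in-corridor point within range, or None."""
    in_corridor = [
        (x, y) for x, y in points
        if 0 < x <= MAX_RANGE_M and abs(y) <= CORRIDOR_HALF_WIDTH_M
    ]
    return min(in_corridor, default=None, key=lambda p: p[0])

def decide(points):
    obstacle = nearest_obstacle_ahead(points)
    if obstacle and obstacle[0] <= BRAKE_DISTANCE_M:
        return "brake"
    return "continue"

if __name__ == "__main__":
    cloud = [(120.0, 0.4), (25.0, -0.8), (60.0, 5.0)]
    print(decide(cloud))  # "brake" – the return at 25 m sits in the corridor
```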
Late last year, Google was reported to be in talks with Ford that could see the automaker emerging as a key manufacturing partner in the company’s plans to bring autonomous cars to market. Yet no further announcement from Ford was made at its press conference at this year’s CES to either confirm or deny this rumour. On 11 January 2016, the Wall Street Journal reported that the automaker is in talks with Google, to forge deeper ties with Silicon Valley, where Ford recently expanded its research lab. According to the WSJ, “people familiar with [Ford’s] plans” say the automaker is considering creating a separate business unit dedicated to developing autonomous cars for use in ride-sharing and fleets. Ford would develop software for components, including steering or braking, while Google would provide the autonomous-driving software that governs those functions. Earlier last year, Chris Urmson, director of self-driving cars at Google, said Google started talks with car companies and assembled a team of global suppliers to speed its push to bring self-driving cars to market. The suppliers named by Google included Bosch, which supplies power electronics and long-range radar to Google; ZF Lenksysteme, which supplies a new steering gear; LG Electronics, which supplies the batteries; plus Continental and Roush.
General Motors
Earlier this month, GM and Lyft – a rideshare start-up – partnered to create an integrated network of on-demand autonomous vehicles in the US. GM will invest US$500m in Lyft’s latest funding round to “help the company continue the rapid growth of its successful ridesharing service”. In addition, GM will hold a seat on the company’s board of directors. The announcement came in the week of the CES show, which put the spotlight on new mobility solutions as well as personal and vehicle connectivity. Auto firms are coming under increasing pressure to develop strategies that address the rapidly changing mobility landscape and associated advanced technologies. Automakers bring supply-chain and manufacturing expertise, but new entrants, such as high-tech companies Apple and Google, see themselves as more agile and better suited to rapidly changing market needs. Working together to blend these different attributes for new mobility solutions is one obvious approach, although who ultimately wields the power in such collaborations may be unclear for some time.
Karl Brauer, senior analyst at Kelley Blue Book’s KBB.com, noted that automakers are now having to look at partnerships with firms outside of the automotive space. “The GM-Lyft alliance follows news of a Ford-Google alliance, both of which are only the beginning of a series of automaker-tech tie-ups we’ll see in the coming months,” he said. “The rapidly-shifting nature of personal transportation has traditional car companies scrambling to position themselves for an uncertain future. It’s also creating an unprecedented opportunity for technology companies, assuming they can master autonomous technology while finding an effective partner to provide the hardware.”
Kia
Kia used CES to introduce a new sub-brand, Drive Wise, designed to encompass its future ADAS leading to autonomous drive vehicles. Kia recently announced plans to manufacture partially-autonomous cars by 2020, and aims to bring its first fully-autonomous vehicle to market by 2030. Kia’s future Drive Wise technologies on display included Highway Autonomous Driving, Urban Autonomous Driving, Preceding Vehicle Following, Emergency Stop System, Traffic Jam Assist and a new Autonomous Parking & Out function:
- Highway Autonomous Driving (HAD) employs a combination of radar and camera detection systems to interpret lane markings, allowing the car to stay in its lane or switch into others to overtake other vehicles or follow a different road; all without driver input.
- Urban Autonomous Driving (UAD) applies GPS and sensors to identify the car’s position on the road, allowing it to safely navigate through densely-congested city environments while responding to live traffic updates.
- Preceding Vehicle Following (PVF) is an enhanced lane-keeping system which monitors the vehicle in front and allows the car to calculate its own path relative to it, following at a safe distance if road markings are indecipherable due to poor conditions or road layout.
- Emergency Stop System (ESS) operates in conjunction with Kia’s Driver Status Monitoring (DSM) system, analysing the driver’s face to ensure their attention does not stray from the road. If it detects that the driver’s eyes have been off the road for too long, ESS can automatically direct the car into an appropriate side lane and bring it to a halt (a simple sketch of this timeout logic follows this list).
- Traffic Jam Assist (TJA) monitors the vehicle in front during congested traffic conditions, maintaining a safe distance and moving into appropriate spaces to gain ground.
- Autonomous Valet Parking allows drivers to exit the car and let the vehicle park itself remotely, activated using the smart key or a smartwatch.
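The sketch referenced in the ESS item above shows the kind of eyes-off-road timeout such a system implies: reset a timer whenever the driver-monitoring camera judges the eyes to be on the road, and trigger a pull-over once inattention exceeds a limit. The two-second threshold, the 10Hz frame rate and the action names are assumptions, not Kia’s published parameters.

```python
# Sketch of the eyes-off-road timeout implied by ESS. The two-second limit,
# the 10 Hz frame rate and the action names are assumptions for illustration.

EYES_OFF_LIMIT_S = 2.0  # assumed maximum tolerated inattention

def monitor(frames, frame_interval_s=0.1):
    """frames: sequence of booleans, True when the driver-monitoring camera
    judges the driver's eyes to be on the road."""
    eyes_off_time = 0.0
    for eyes_on_road in frames:
        eyes_off_time = 0.0 if eyes_on_road else eyes_off_time + frame_interval_s
        if eyes_off_time >= EYES_OFF_LIMIT_S:
            return "pull over and stop"
    return "keep driving"

if __name__ == "__main__":
    distracted = [True] * 5 + [False] * 25   # 2.5 seconds of inattention
    print(monitor(distracted))               # "pull over and stop"
```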
Key to Kia’s future technologies is the development of its vehicle-to-everything (V2X) communications system. For Kia to advance its partially-autonomous ADAS technologies far enough to bring the true ‘self-driving car’ to market by 2030, it says V2X must be fully integrated into real-life driving environments and be able to react as a human driver can.
V2X applies a series of sensors, radar, LiDAR (Light Detection And Ranging) and external cameras, to perceive the surrounding environment and all relevant obstacles, as a human driver does. The system incorporates vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I) technologies as well, allowing the car to recognize, judge and control every driving scenario, obstacle or potential threat.
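As an illustration of the vehicle-to-vehicle side of this, the sketch below assembles a small position-and-status payload one car could broadcast and shows how a following car might use it to warn its driver. The field names, units and two-second-gap rule are illustrative assumptions, not a published V2X message format.

```python
# Illustrative sketch of a vehicle-to-vehicle status exchange. Field names,
# units and the two-second-gap rule are assumptions, not a published standard.

import json
import time

def build_status_message(vehicle_id, lat, lon, speed_mps, heading_deg, hard_braking):
    """Assemble a small position-and-status payload another vehicle could read."""
    return json.dumps({
        "vehicle_id": vehicle_id,
        "timestamp": time.time(),
        "position": {"lat": lat, "lon": lon},
        "speed_mps": speed_mps,
        "heading_deg": heading_deg,
        "hard_braking": hard_braking,
    })

def should_warn(own_speed_mps, message, gap_m):
    """Warn if the car ahead reports hard braking and the gap is shorter than
    a simple two-second following rule allows."""
    data = json.loads(message)
    return data["hard_braking"] and gap_m < 2.0 * own_speed_mps

if __name__ == "__main__":
    msg = build_status_message("lead-car", 37.55, 126.99, 12.0, 90.0, True)
    print(should_warn(own_speed_mps=25.0, message=msg, gap_m=40.0))  # True
```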
Mercedes-Benz
Mercedes-Benz used the eve of this year’s Detroit motor show to unveil some novel autonomous driving technology with its new E-Class. Drive Pilot, part of an optional driver assistance package claimed to “make the E-Class the most intelligent saloon in its class”, includes an Active Lane-change Assistant, a radar- and camera-based assistance system for changing lanes on multi-lane roads which can steer the vehicle into the lane selected by the driver – when overtaking, for example. Once the driver has indicated for at least two seconds, the system assists with steering into the adjacent lane if it detects that the lane is unoccupied.
Another feature operates at speeds of up to 80mph, with the system continuing to intervene actively by taking account of surrounding vehicles and parallel structures, even if lane markings are unclear or non-existent, as at road works. The system therefore makes driving that little bit easier, especially in traffic jams or heavy congestion. If this isn’t enough to raise some interest amongst Mercedes drivers, the E-Class’ selectable Speed Limit Pilot sub-function might be. It can now autonomously adjust the vehicle’s speed in response to camera-detected speed limits or speed limits logged in the navigation system, say 30mph in built-up areas or 60mph on country roads.
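A minimal sketch of that speed-limit-following behaviour might look like the following: prefer a camera-detected limit, fall back to the navigation map, and cap the cruise set speed accordingly. The fallback order and the decision simply to cap rather than raise the set speed are our own simplifying assumptions.

```python
# Sketch of a speed-limit-following function: prefer the camera-detected
# limit, fall back to the map, and cap the cruise set speed. The fallback
# order and cap-only behaviour are our own simplifying assumptions.

def governing_limit(camera_limit_mph, map_limit_mph):
    """Prefer the camera-detected limit; otherwise use the navigation map."""
    return camera_limit_mph if camera_limit_mph is not None else map_limit_mph

def adjust_set_speed(current_set_mph, camera_limit_mph=None, map_limit_mph=None):
    limit = governing_limit(camera_limit_mph, map_limit_mph)
    if limit is None:
        return current_set_mph          # no limit information: leave speed alone
    return min(current_set_mph, limit)  # never exceed the detected limit

if __name__ == "__main__":
    # Entering a 30mph built-up area with cruise set to 60mph
    print(adjust_set_speed(60, camera_limit_mph=30, map_limit_mph=40))  # 30
```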
Other new features include Remote Parking Pilot, as introduced recently by BMW with its redesigned tech flagship 7 Series. This allows the vehicle to be moved into and out of garages and parking spaces remotely using a smartphone app, enabling the occupants to get into and out of the car easily, even if space is very tight.
Renault-Nissan
The Renault-Nissan Alliance announced at the Detroit auto show that it would launch more than ten vehicles with autonomous drive technology in the next four years. The global car group confirmed it would launch a range of vehicles with autonomous capabilities in the US, Europe, Japan and China by the end of 2020. The technology would be installed on “mainstream, mass-market cars at affordable prices”.
This year will mark the debut of vehicles with ‘single-lane control’, a feature which allows cars to drive autonomously on highways, including in heavy, stop-and-go traffic. In 2018, Renault-Nissan will launch vehicles with ‘multiple-lane control’, which can autonomously negotiate hazards and change lanes during highway driving. And 2020 will see the launch of ‘intersection autonomy’, which can navigate city intersections and heavy urban traffic without driver intervention. All of the autonomous drive technology will be available at the option of the driver.
Toyota
How would a self-driving vehicle change your daily commute? That was a question flashed up on a large screen above the automaker’s booth at this year’s CES. One of the messages Toyota wanted to get across at the show was that it is developing a system for generating high-precision maps that will help the safe introduction of automated driving. The system, which was on display, uses data from on-board cameras and GPS devices.
The new system uses production vehicles equipped with cameras to gather road images and vehicle position information. This information is sent to data centres where it is automatically pieced together, corrected and updated to create what it says are “highly accurate maps that cover a wide area”.
The company points out that it is essential to have an accurate understanding of road layouts and traffic restrictions and rules (including speed limits and signage) in order for automated driving technologies to be successfully introduced. Also, it says that “precise measurement of vehicle positional data requires the collection of information on dividing lines, kerbs and other road features”.
Until now, this kind of intelligence has been obtained using specially built vehicles equipped with 3D laser scanners. These are driven through towns and on main roads, and the data collected is manually edited to add information on highway features (kerbs, divides, signage etc). Because data collection is infrequent, the maps are not updated regularly, limiting their usefulness. This is also a cost-intensive process, Toyota maintains.
However, Toyota’s new system “uses automated, cloud-based spatial information generation technology, developed by Toyota Central R&D Labs, to generate high-precision road image data from the databanks and GPS devices used by the designated vehicles”.
It says that while there is a higher risk of error with a system that relies on cameras and GPS in this way, compared to one which uses 3D laser scanners, positional errors can be mitigated by using image matching technologies that integrate and correct the road image data from multiple vehicles, as well as high-precision trajectory estimation technologies. This restricts the margin of error on straight roads to a maximum 5cm. By using production vehicles and existing infrastructure to collect information, this data can be updated in real time, it says. It can also be implemented and scaled up at relatively low cost, Toyota maintains.
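The statistical intuition behind combining traces from many production vehicles can be shown in a few lines: averaging independent noisy position fixes of the same road feature shrinks the error roughly with the square root of the number of contributions. The noise level and vehicle counts below are arbitrary assumptions, not Toyota’s figures.

```python
# Illustration of why aggregating traces from many vehicles shrinks positional
# error: averaging independent noisy fixes of the same road feature reduces
# the error roughly with the square root of the number of contributions.
# The 50cm per-vehicle noise and the vehicle counts are arbitrary assumptions.

import random
import statistics

TRUE_POSITION_M = 0.0  # true lateral position of a lane marking
NOISE_SD_M = 0.50      # assumed per-vehicle camera/GPS error (std deviation)

def rms_error(n_vehicles, trials=200, seed=42):
    """Root-mean-square error of the averaged position over repeated trials."""
    rng = random.Random(seed)
    squared_errors = []
    for _ in range(trials):
        fixes = [rng.gauss(TRUE_POSITION_M, NOISE_SD_M) for _ in range(n_vehicles)]
        squared_errors.append((statistics.mean(fixes) - TRUE_POSITION_M) ** 2)
    return statistics.mean(squared_errors) ** 0.5

if __name__ == "__main__":
    for n in (1, 10, 100, 1000):
        print(f"{n:>5} vehicles: error ≈ {rms_error(n) * 100:.1f} cm")
```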
Toyota says it plans to include this system as a core element in the automated driving systems that will be available in production vehicles by around 2020. While initial use is expected to be limited to motorways, future development goals will include expanding functionality to cover ordinary routes and help with hazard avoidance. Toyota will also look to work with map-makers with the goal of encouraging the use of high-precision map data in services offered to both public and private sectors.
Volvo
For those of you who would like to give instructions to your car before you even get in, Volvo and Microsoft have the solution. Volvo let it be known at CES that it is working with Microsoft to launch a wearable-enabled voice-control system. Owners will be able to talk to their car using the Microsoft Band 2, allowing them to instruct their vehicle to perform tasks including setting the navigation, starting the heater, locking the doors, flashing the lights or sounding the horn via Volvo on Call and the connected wearable device. Voice control using the band will be available in Volvo on Call-enabled markets in spring 2016. Last November, the automaker and software company announced a collaboration on the first automotive application of HoloLens technology, claimed to be the world’s first fully untethered holographic computer, which could be used in future to redefine how customers first encounter, explore and even buy their car. Remote voice control using the Band 2 is another step in the companies’ ambition to jointly develop new automotive technology.
Volvo also used CES to let it be known that it is developing intelligent, high-bandwidth streaming capabilities with its technology partner, Ericsson, that will ensure drivers and passengers get the most out of their time travelling in an autonomous Volvo. “We recently unveiled our design vision for fully autonomous cars with Concept 26. Now we are actively working on future solutions to deliver the best user experience in fully autonomous mode. Imagine a highway full of autonomous cars with their occupants sitting back watching their favourite TV shows in high definition. This new way of commuting will demand new technology, and a much broader bandwidth to ensure a smooth and enjoyable experience,” said Anders Tylman, general manager of the Volvo Monitoring & Concept Centre at Volvo Car Group. By using Ericsson’s network and cloud expertise, the automaker is aiming to provide an interruption-free experience in its cars whilst on the move. It says that by predicting your route and looking ahead at network conditions, content can be tailored to the duration of each trip and intelligently buffered to deliver a high-quality, uninterrupted viewing experience.
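A minimal sketch of that route-aware buffering idea, under our own assumptions about segment data and stream bitrate, is given below: given predicted bandwidth along the remaining route, work out how much video to pre-buffer before a low-coverage stretch so playback does not stall.

```python
# Sketch of route-aware buffering: given predicted bandwidth per route
# segment, work out how much video to pre-buffer so low-coverage stretches do
# not stall playback. Segment data and stream bitrate are assumed values.

STREAM_BITRATE_MBPS = 5.0  # assumed bitrate of the video being watched

def extra_buffer_needed_s(route_segments):
    """route_segments: list of (duration_s, predicted_bandwidth_mbps).
    Return the seconds of video to pre-buffer before setting off."""
    shortfall_megabits = 0.0
    for duration_s, bandwidth_mbps in route_segments:
        deficit = STREAM_BITRATE_MBPS - bandwidth_mbps
        if deficit > 0:
            shortfall_megabits += deficit * duration_s
    return shortfall_megabits / STREAM_BITRATE_MBPS

if __name__ == "__main__":
    # Three minutes of good coverage, then two minutes at 1 Mbps on a rural stretch
    route = [(180, 20.0), (120, 1.0)]
    print(f"Pre-buffer about {extra_buffer_needed_s(route):.0f} s of video")  # ~96 s
```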