Games technology has been credited with “turning the automotive industry upside down”. To find out how such a big claim can be justified, Matthew Beecham talked to Epic Games director of industry management, Heiko Wenczel.
Heiko, that’s a big claim. What’s the justification?
It’s actually a statement from one of our vehicle manufacturer customers, referring to our technology platform Unreal Engine. If you look at the ways Unreal is used across the automotive industry, covering every stage of the vehicle product life from R&D and styling through development and manufacture to the vehicle HMI and the sales interface with the end customer, it’s clear why he formed that view. Nine of the top ten global vehicle manufacturers have built major applications in Unreal Engine with some, like BMW Group, selecting the technology as the cornerstone of their visualisation strategy.
Epic is best known for huge online games like Fortnite. How did you get into the automotive industry?
Without knowing it was happening! Our business model is that the full features of Unreal Engine, on which games like Fortnite are built, are all available for free. That means we often don’t know what people are building with it until they ask for our help, at which point we work with them as a technical consultancy. By 2015 there were enough B2B relationships for us to form an Enterprise division specifically to support business users, which includes identifying and developing the extra capabilities they would value in the Engine. From there it snowballed.
The earliest applications were car configurators, usually developed by the vehicle manufacturer’s digital agency. Old-style configurators work like flipbooks, which makes them vastly expensive, especially to update with new images, and the results are usually unrepresentative and unflattering. Early adopters of real-time visualisation found that a game engine could create a much more engaging representation of the car; perfect for selling special paints and interior trim options as well as providing a luxury sales experience. It wasn’t long before premium manufacturers were confirming that they were also selling more high-value options thanks to improved visualisation.
Next came styling and engineering. Stylists found it invaluable, especially when we introduced improved features for simulating the way light falls on different types of paint and textured materials. They can build their styling model in a virtual world, place it in any environment and collaborate around it with colleagues and leadership at any location, making changes quickly and cheaply. The same model can then be passed to engineering, precisely dimensioned, to evolve as the vehicle progresses towards production.
Has Covid accelerated the adoption of these techniques?
Covid is one of many industry pressure points that are accelerating the use of digital tools and cloud-based operations, especially those that enable greater remote collaboration. Daimler, for example, has developed a system that allows engineers from anywhere in the world to come together in a virtual reality environment where they can evaluate designs, annotate the models, adjust sizes and finishes, reposition elements and save files back to the central PDM (Product Data Management) system. All from their laptops.
The main challenges I come across are around time and resources, especially those related to the proliferation of powertrain options and expansion of the model portfolio. Engineering departments are having to deliver more, and deliver it much more quickly. Covid has increased the need for remote collaboration, but it’s also put even more priority on these two existing pain points as programme leaders search for ways to make up lost time.
The first application of real-time visualisation is often in response to a deep technical challenge that would munch through impossible amounts of resource if addressed using conventional approaches. An example is the rapid acceleration in the adoption of advanced driver-assistance systems. Each new system needs a new user interface, and because it is safety-critical it must be thoroughly validated. Yet there is limited prior knowledge, so there is no accepted starting point. On top of that proliferation, we are now working in a global market: is the right solution for Germany also the right solution for China? There may be 30 design options to test, with collaboration needed from specialists spread across the globe.
Mixed reality (MR) provides a fast, affordable way to find the answers. In BMW’s Mixed Reality Laboratory, engineers sit in a physical vehicle buck in which 3D printed components provide the variable tactile input while the non-tactile elements are viewed in virtual reality using commercially available headsets. All that is needed is a high-end computer games machine and some off-the-shelf VR hardware.
At the other end of the vehicle journey from sketch to the showroom, one of the drivers for the introduction of VR into manufacturing planning is the need to introduce electric and electrified powertrains, initially in relatively low volumes. Using the VR vehicle model, production processes can be optimised in a virtual assembly hall before being set up in the real world. The training of operators can begin even before the hardware is ready.
An aspect of my job that I particularly enjoy is hearing how our customers are becoming much more collaborative, and that certainly has been accelerated by Covid. I’ve watched specialists in packaging, thermal, body-in-white and NVH (Noise Vibration & Harshness) using a VR model to work interactively together as they strive to find space for additional systems, eliminating many of the traditional delays that slow down decision making. I think Covid has forced many engineering leaders to sample these systems, building their confidence much more quickly. Now they’ve seen the results, we’ll see continuing rapid growth in their development and application.
To what extent does BMW’s commitment to Unreal Engine represent a turning point in the adoption of real-time visualisation?
You’ve put your finger on a very important transition. We are coming from a heritage of real-time visualisation – the technology used for configurators and in BMW’s mixed reality research systems – to a more confident approach, where the quality of the visualisation is a given and the focus becomes a real-time model of the vehicle that includes all the known data, from the dimensions and specifications of every component to the reflectivity of the paint.
Engineers often refer to a digital twin, but so far these have focussed on specialist activities: a combustion model, a body-in-white model, a vehicle dynamics model, all of which need to call on the latest, most comprehensive vehicle data. This means there are big benefits from introducing a single, company-wide model of the complete vehicle that begins its life with the earliest styling concepts, develops through R&D and vehicle engineering before supporting manufacturing, vehicle personalisation and then marketing and sales.
A direct link to CAD means this 3D virtual reality model can be employed at each stage of the vehicle’s product life without conflict, duplication or any of the risks that can plague parallel engineering. We refer to it as the “Single Source of Truth.” It’s a great example of how improved tools can create a digital thread; that is, an integrated view of the vehicle’s data throughout its lifecycle, across traditionally siloed functions.
GM name-checked Epic in their electric Hummer press release. Is this another application area?
This is an interesting one as the strategy behind GM building the new Hummer’s HMI around games technology is a response to several longer-term industry trends. It would be easy to look at the HMI and see a snappily-designed, graphics-rich user interface that’s there to grab attention, and that’s certainly part of why Unreal Engine was chosen. The designers love it because increasingly large screens provide vast, seamless surfaces for their creativity. But the benefit that GM chose to highlight was more strategic: an ability to quickly develop ways for new systems – in this case, electrification – to communicate with the driver.
When you are deciding how to display information about the state of charge and energy regeneration, or the location of hazards and the urgency of returning control to the driver, there are no decades of experience to call on. Games engines provide new freedoms for designers searching for the most effective (and most engaging) ways to communicate this information and a fast, efficient way to build and test each option.
When one is chosen, the traditional development path is for engineers to code the software, leading to an iterative process in which designers and ergonomists review the system and engineers revise the code. Games developers work without these silos, so the best platforms are structured to allow designers to evolve the end product in real-time (design-driven development), quickly and efficiently while engineers get on with their day jobs.
Having a game engine in the car also opens up a new roadmap for adding value with new product features. It would be easy, for example, to download the exact specification of the vehicle so when the driver glances down to see how the electrical system is harvesting energy as they slow down, the graphic will show that actual vehicle, in the correct colour and trim, with the wheels chosen by the owner. Or they might select a fantasy vehicle, either from embedded models or via cloud-based services. Like a smartphone, the HMI is a gateway to additional services and personalisation, but in a vehicle, it can be much more sophisticated.
Could you give us a flavour of your technology roadmap?
To really transform the way that our industry works, we need to enable seamless implementation of the Single Source of Truth concept. Everything else we have talked about is pretty much here now or will be in imminent releases of Unreal Engine. Now we have to join up all the applications to fulfil the vision of enabling a digital twin of the vehicle that develops through the design and development process, supports flexible manufacturing, enables new levels of user experience, facilitates new revenue streams, and even simplifies end-of-life recycling.
The key to much of this is the ability to transfer data between the visualisation engine and specialist applications in real-time, without pre-processing. We are not going to compete with the developers of specialist applications for specific activities; our role is to complement those systems by offering them seamless, two-way access to the central model and, should they choose to adopt it, real-time visualisation. In terms of our technology roadmap, that’s one of my top priorities and it’s why we are working closely with industry groups such as the Institute for Digital Engineering in the UK.
The final piece of the jigsaw is the creativity of the vehicle manufacturers and their development partners. Areas like cloud-based services are so new that our industry is still learning how to develop them commercially. The smartphone model is a starting point, but it is far too restrictive. Then there is the growing need for training AI, for example by generating the edge cases needed for calibrating Advanced Driver Assistance Systems (ADAS). All this capability is available now: the roadmap is the creativity and experience that will be brought to how it is applied.