AUTOMOTIVE
FIELD GUIDE
BUILDING AN OPEN AUTOMOTIVE PLATFORM AND
DATA MODEL WITH UNREAL ENGINE
Image credit: BMW AG
Acknowledgments
We wish to thank all of the people we interviewed for this guide for sharing their
time and wonderful insights about the automotive production process.
“
Our vision for where automotive
goes next is one holistic open
platform that covers the entire
pipeline from end to end.
”
Heiko Wenczel
Head of Detroit Lab, Epic Games
Contents
CHAPTER 1
A bird’s-eye view of real-time for automotive 6
Who is this guide for? 9
History 10
Where the automotive industry is today 12
Spotlight: key real-time innovations 14
Research 14
Manufacturing 16
Marketing 20
Autonomous driving 23
VR/AR 26
Challenges and opportunities 28
Process 28
Interaction options 32
Ancillary IP licensing and metaverse-like activations 34
The connected platform: processes and departments 35
CHAPTER 2
Bringing your data into the engine 36
What is Datasmith? 38
The three key Datasmith workflows 39
Key data enhancement tools in Unreal Engine 40
Editor tools 41
How to automate data setup 42
Swim lane 44
CHAPTER 3
Real-time workflows in detail 46
Real-time technology across the product life cycle 48
Concepting 48
Design 50
Engineering 55
Manufacturing 56
Sales and marketing 59
Operations 63
Autonomous driving research 64
Human-machine interfaces 66
CHAPTER 4
Real-time technology in action 70
Burrows 72
Mackevision 75
CHAPTER 5
Case studies 78
BMW 80
Audi Business Innovation 82
MHP | Pagani 84
Daimler Protics 88
Geely 90
Warwick University 92
Scania | GEISTT AB 94
Toyota 96
Ike 98
CARLA 100
CarSim 102
CHAPTER 6
The future 102
Glossary 104
CHAPTER 1
A bird’s-eye view
of real-time
for automotive
Introduction
Interactive technology is already driving efficiency throughout today’s
automotive industry. Engineers are collaborating using real-time technology
like VR to create and review new designs, R&D teams are conducting
autonomous driving research in game engine-driven simulators, and
customers are using real-time configurators to make purchasing decisions.
In the data model of the future, everything stays within a game engine, on a
fully open platform where the features and tools you need are already
available without your having to switch between different software packages.
Real-time technology is providing the bedrock on which to build an entire automotive pipeline.
We’ll look at how real-time workflows can drive efficiencies across each
of the different automotive departments, then touch on key entry points
for companies looking to develop a fully open-platform approach to
automotive data.
Whatever your motivation, the following pages will provide plenty of food for
thought. Let’s get started.
History
The automotive world has changed dramatically since CAD pioneer Dr. Patrick
Hanratty designed the first computer-aided drafting system in the 1960s. Mass
production was possible thanks to the Industrial Revolution, but the act of
designing vehicles was still stuck in the past, requiring a designer to draw each
facet by hand. Life-sized visual prototypes mostly started and ended with a
block of clay. Each method required countless hours of manual labor to achieve
the end result. Making changes or variations in design meant a literal trip back
to the drawing board and untold numbers of production delays.
It’s no wonder that by the 1970s, the auto industry was embracing CAD
systems. The window between design and production shrank. Designers could
suddenly make digital templates instead of starting every design from scratch.
Minute drafting errors no longer meant fatal flaws. It became possible to
simulate stress tests with a computer model instead of a physical prototype.
By virtue of CAD being so much more efficient, designers could experiment
with new ideas faster than ever before. By the 1980s, CAD had hit critical mass.
Over the past 40 years, CAD systems have evolved from producing just 3D
wireframes to generating 3D solid models complete with mathematically
accurate physics and material properties, unlocking a new era of virtual
analysis. This brings us up to date, where we now live in the era of the digital
twin—a mathematically perfect representation of a physical object and all
its variants in a digital space—and cloud-based collaborative virtual reality
design sessions.
What if there was a way to sidestep all this, and produce an asset once
that could be used for every stage from design and validation through to
photorealistic beauty shots for dealership literature? There is—it's called real-time visualization.
[Diagram: the automotive product life cycle as a continuous loop: concept, design, engineering, manufacturing, marketing, sales, support, and operations.]
Computers will be key to this evolution. Cars are no longer purely mechanical
entities; increasingly, complex computer systems are playing a central
role in how they operate. As vehicles become more and more autonomous,
their “digital brains” will become as important as the frames from which
they’re constructed.
Beyond the software and programs that will play an increasingly central role
in next-generation vehicles, real-time technology can transform traditional
automotive processes, opening up new avenues for creative concepting and
design, streamlined manufacturing workflows, and previously inconceivable
marketing opportunities.
This evolution is already playing out across the automotive pipeline at forward-
thinking companies. Here’s a snapshot of how:
Concepting: Concept artists are using interactive tools to explore and iterate
on ideas instantly.
In the next section, we’ll highlight some of the key areas where real-time
technology has been driving innovation.
Spotlight: key real-time innovations
“
Unreal Engine is the highest-quality real-time software currently out there. It renders faster than any of the competitors, and it's open source, which is often overlooked.
”
Real-time technology is driving innovation across the automotive pipeline in many different ways.
Some of these changes provide new opportunities for specific automotive departments. In the sales and marketing phase, for example, customers can walk away from a car dealership with a brochure showing the car they just personalized with a real-time configurator, rather than one filled with generic shots of the vehicle in various standard colors.
In this section, we’ll take a look at some of the key touch points across
the automotive pipeline where real-time technology is providing new and
exciting opportunities.
Research
Game engines are perfect for research because they’re able to provide
abstractions for a wealth of complex functionalities. If you only need to
generate static 2D images or plots or don’t need real-time rendering, you’ll
be best served by your existing tools and workflows designed specifically for
those purposes. However, should you need an online rendering solution for 3D
data visualization, or need to record a simulation in real time, game engines
will provide the most powerful and flexible solutions. Their combination of
powerful GPU-accelerated 3D rendering, real-time physics simulations, and
native support for AR and VR makes game engines a valuable asset in the field
of automotive research.
Visualizing data points, and then creating an interactive demo based on these
points, opens the floor for non-engineers to collaborate on complex studies
and experiments. An engineer may be able to look at a mathematical model
and know that if they adjust the angle on a front bumper by five degrees, the
drag coefficient will be affected in a certain way, but no one else will be
able to parse it.
When the data is in an interactive demo, anyone can interact and see how the
changes they make will affect drag. In a game engine, you can easily adjust
variables like wind direction and speed in real time.
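The aerodynamic relationship alluded to here can be sketched with the standard drag equation. This is a simplified illustration in plain Python, not Unreal Engine code, and the coefficient values are invented for the example:

```python
def drag_force(cd: float, frontal_area_m2: float, speed_ms: float,
               air_density: float = 1.225) -> float:
    """Classic drag equation: F = 0.5 * rho * Cd * A * v^2, in newtons."""
    return 0.5 * air_density * cd * frontal_area_m2 * speed_ms ** 2

def sweep(cd_by_angle: dict, area: float, speed: float) -> dict:
    """Map a hypothetical bumper-angle -> Cd table to drag forces."""
    return {angle: round(drag_force(cd, area, speed), 1)
            for angle, cd in cd_by_angle.items()}

# Hypothetical: tilting the bumper 5 degrees lowers Cd from 0.30 to 0.28.
forces = sweep({0: 0.30, 5: 0.28}, area=2.2, speed=30.0)
```

An interactive demo would recompute this continuously as someone drags a wind-speed slider; the point is that the underlying model stays the same while the presentation becomes explorable.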
These same principles can be used to test fluid dynamics like oil flow in an
engine or water tightness around a windshield.
Game engines are also excellent tools for geometry exploration. For example,
you can create a simulation with force feedback to test whether removing a
seat from a fully assembled passenger cabin is relatively easy.
What if you could find out how hard it is—or if it’s possible in the first place—
to remove that seat, and which designs may need to change before a single
physical model is ever produced? A game engine enables you to adjust
variables like the height of the technician, various spatial tolerances, and the
mechanical parts of the seat assembly, all in real time.
Materials testing
At its core, research is all about testing to see if concepts work in the real
world, and pushing the limits of these concepts. In a research environment,
real-time engines provide methods for proving whether a concept is possible,
or if there are ways to improve upon existing processes. Their high-quality
physics and collision simulations, along with highly optimized network
replication functionality, make it relatively simple to conduct experiments that
are both accurate and accessible by multiple users simultaneously. You can
feed live sensor data into a real-time engine and see how manipulation of key
variables can augment your existing products.
Connecting sensors to a game engine enables you to test all the various
scenarios and devices you want to research, and in a novel and relatively
simple way. Game engines can serve as a consolidated environment where
any source of data can be viewed, interacted with, and adjusted in real time.
While purpose-built software already exists for doing these tasks, most
solutions don’t allow for real-time data manipulation, nor do they produce
photorealistic imagery.
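The "consolidated environment" idea—many live data sources feeding one interactive view—can be sketched as a small hub that collects the latest reading from each registered source. The sensor names and callback style are illustrative assumptions, not a real telemetry API:

```python
from typing import Callable, Dict

class SensorHub:
    """Collects the latest reading from each registered sensor source."""
    def __init__(self) -> None:
        self._sources: Dict[str, Callable[[], float]] = {}

    def register(self, name: str, read: Callable[[], float]) -> None:
        self._sources[name] = read

    def snapshot(self) -> Dict[str, float]:
        # One frame's worth of data, ready to drive a real-time visualization.
        return {name: read() for name, read in self._sources.items()}

hub = SensorHub()
hub.register("cabin_temp_c", lambda: 21.5)   # stand-ins for live feeds
hub.register("wheel_rpm", lambda: 830.0)
frame = hub.snapshot()
```

In a game engine, `snapshot()` would be called every tick and the resulting values bound to visual elements, so any source of data can be viewed and adjusted in one place.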
If you want to test claims that a new sensor has the best facial recognition
capabilities, you can connect the sensor to a game engine to validate the claim.
If it isn’t actually better, you can examine the sensor and see if it’s possible
to build better facial recognition software and find out how far you can push
its capabilities. Real-time engines excel at creating photorealistic imagery, so
generating ground-truth training images for machine vision applications is an
extremely common use for researchers.
When sharing research data around the world, it’s incredibly important to
have a shared virtual environment that’s closely synchronized from one user
to another to ensure the validity of any collected data. The Pixel Streaming
system in Unreal Engine, for example, enables multiple people in different
locations to interact with, manipulate, and collect accurate data regardless of
their individual local hardware configurations. Similarly, you can have one build
in place and give various stakeholders access to the visual data in AR or VR,
helping to eliminate information silos within your organization.
Manufacturing
Manufacturing can benefit from real-time engines in myriad ways. Game
engines play a key part in Industry 4.0, serving as open-data platforms. When
your organization’s data is in a game engine, that data can be used in unlimited
ways, including workflow visualization, tracking of production cycles, robotics
programming with machine learning, and implementation of AR and VR training
for complex tasks. The game engine becomes the hub for all your digital
assets, be they models of vehicles or complete digital twins of the facilities in
which your vehicles are manufactured. It becomes the source of truth across
your entire organization, ensuring consistency and accuracy regardless of
where the data is used.
AR and VR in manufacturing
If AR content remains relevant to your production cycle and provides training
benefits, and the task is relatively low-risk for physical harm, AR makes the
most sense for training. Should the task at hand be dangerous, a VR training
simulator would be more practical.
These visualizations are easy to share, and because everyone at the review
session is looking at the same model and the same data, even non-technical
colleagues can easily provide input and suggestions.
Robotics simulation
“
Using Blueprints, our technical artists were good to go, and with less effort.
”
Stephan Baier
Head of Immersive Experience, MHP
Image credit: BMW AG
Robots excel at repetitive tasks. But for AI and robotics to be effective on your production line, they must first be trained for both conventional behavior and edge cases.
To make a robot efficient for its environment, you have two options for programming it: train the robot in a real-world scenario, or virtualize it by connecting all its sensors to a real-time virtual environment and running hundreds of thousands of test scenarios overnight.
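At its core, the overnight workflow is a parameter sweep: enumerate scenario combinations, run each one, and flag the failures as edge cases to train against. A toy version follows; the scenario parameters and the pass criterion are invented for illustration:

```python
import itertools

def run_scenario(speed: float, payload_kg: float, light_level: float) -> bool:
    """Stand-in for one simulated robot run; True means the task succeeded."""
    # Invented failure rule: heavy payloads under poor lighting cause a miss.
    return not (payload_kg > 8.0 and light_level < 0.3)

def batch(speeds, payloads, light_levels):
    """Run every combination of parameters and record pass/fail."""
    results = {}
    for combo in itertools.product(speeds, payloads, light_levels):
        results[combo] = run_scenario(*combo)
    return results

# Each failing combination is a candidate edge case for further training.
failures = [combo for combo, ok
            in batch([0.5, 1.0], [5.0, 10.0], [0.2, 0.8]).items() if not ok]
```

A real pipeline would swap `run_scenario` for a headless engine run and distribute the batch across machines, but the shape of the loop is the same.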
Workflow visualization
You can zoom in to any level of detail and see every screw you want to
simulate, or zoom out to a high level and watch a box as it moves down its
path through the facility. By implementing AI here, it’s feasible to find the
most efficient paths through a plant for any given just-in-time delivery model,
and visually demonstrate that to key stakeholders. In Unreal Engine, you
can use the Blueprint visual scripting system to automate and demonstrate
different test scenarios as well, making it easier to convince others that these
efficiencies are a better way to work.
Unreal Engine’s Blueprint visual scripting system makes it easy for non-
programmers to perform tasks and create objects without writing a single line
of code. Scripting is done using a visual system of connecting nodes. Blueprint-
specific markup, available as part of Unreal Engine’s C++ implementation,
also enables programmers to create baseline systems that can be extended
by designers.
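The division of labor described here—programmers ship a baseline system, designers extend it without touching its internals—follows a familiar pattern: a base class exposes a designated hook, and extensions override only that hook. A minimal Python sketch of the pattern (class and method names are invented for illustration; this is not Unreal Engine C++ markup):

```python
class ConveyorActor:
    """Baseline behavior a programmer ships; extensions override one hook."""
    def __init__(self, speed: float = 1.0) -> None:
        self.speed = speed
        self.position = 0.0

    def tick(self, dt: float) -> None:
        self.position += self.speed * dt
        self.on_moved()            # extension point, akin to a Blueprint event

    def on_moved(self) -> None:    # default: do nothing
        pass

class AlarmedConveyor(ConveyorActor):
    """A 'designer-level' extension: reacts to movement, base left untouched."""
    def __init__(self) -> None:
        super().__init__(speed=2.0)
        self.alerts = 0

    def on_moved(self) -> None:
        if self.position > 5.0:
            self.alerts += 1
```

In Unreal Engine, the equivalent hook would be a C++ function marked as Blueprint-extendable, letting non-programmers attach behavior in the visual scripting graph.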
Game engines also make it easier for someone working in production to use a
real-time tool to communicate and solve a problem. Production efficiencies can
be developed by anyone in the field, and implemented sooner. Inefficiencies
can be spotted much faster, and by anyone, not just a high-level engineer.
Physical robots interact with the virtual world via sensors, and to train a robot
you need a tool that can take physical sensors and interact with them in the
simulation. Game engines have a certain amount of intelligence to handle
simulations, but for more complex scenarios, offloading those calculations
into a bespoke hardware-in-the-loop (HIL) simulation would likely be a
better solution.
Connecting a game engine to HIL means you have a conduit between the data
and the game engine visualization, enabling you to see high-quality visual
representations of simulation results. Because the data is visualized in a game
engine, it’s possible to turn the visualization into a game that you can interact
with in real time.
“
The biggest benefit for us has been the ability to use Unreal Engine as a flexible platform that we can integrate into several different closed systems.
”
Jon Friström
UX Researcher & Cognitive Design Engineer, GEISTT AB
Digital twin as a communications platform
As you can see, when you start thinking about your data in terms of a unified, open platform, it becomes as valuable as it is malleable. By producing a digital twin of your manufacturing facility and your entire plant and its machines and production processes, you can freely explore ways of creating new efficiencies. Whether it's by changing the timing of part delivery or moving your process to a different part of the facility, with an open real-time platform you can see the ripple effects that one suggestion made during a collaborative review can have across multiple scenarios.
By creating a digital twin of your vehicle in a game engine, you can simulate all
aspects of testing; you can even develop an app that runs on a human-machine
interface (HMI). You can create the HMI as a game, connect the real
software it's running to a real-time engine, and simulate situations where it
truly feels like you're interacting with the touchscreen. Interacting with the
radio and screen in the simulation is a one-to-one experience of the real-world
model. Similarly, it's possible to connect the simulation to a real car and test
that the wiring works, the back-up camera activates when in reverse, and HMI
controls for power windows function as expected.
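Checks like "the back-up camera activates when in reverse" reduce to verifying state transitions in the simulated HMI. A toy model in Python, not connected to any real vehicle software—the signal names and gear codes are assumptions:

```python
class HmiTwin:
    """Tiny digital twin of the cockpit state a simulation would exercise."""
    def __init__(self) -> None:
        self.gear = "P"
        self.backup_camera_on = False
        self.window_positions = {"FL": 1.0, "FR": 1.0}  # 1.0 = fully closed

    def set_gear(self, gear: str) -> None:
        self.gear = gear
        # Expected wiring behavior: camera feed follows reverse gear.
        self.backup_camera_on = (gear == "R")

    def window_down(self, which: str, amount: float) -> None:
        pos = self.window_positions[which]
        self.window_positions[which] = max(0.0, pos - amount)

twin = HmiTwin()
twin.set_gear("R")          # camera feed should switch on
twin.window_down("FL", 0.4)
```

Driving this model from the same software that will run in the real car is what makes the simulated touchscreen a one-to-one stand-in for the production HMI.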
Marketing
Automotive marketing in Unreal Engine began in 2016 with a cinematic trailer
and product configurator for the McLaren 570S. In 2017, Epic Games teamed
up with Chevrolet and visual effects production house The Mill to produce an
interactive short film. But rather than users choosing between preset camera
angles or making binary decisions about plot points, they were picking the
2017 Camaro ZL1 hero car’s visual options, and adjusting the appearance
from any angle using a massive touchscreen display. Whether they chose the
stock Red Hot paint job, or opted for something a little more on the classic side—a skin of a 1967 Camaro SS, for example—or anything in between, it was all done in real time, and their choices were instantly reflected on-screen in photorealistic detail.
“
When you start playing around in a real-time engine, you quickly see the benefits of it as a platform for directors and storytellers.
”
Alex Hammond
Head of 3D, The Mill
The Human Race, the world's first configurable live-action film, was built entirely in Unreal Engine. Lighting, tracking, and reflections are all convincing
entirely in Unreal Engine. Lighting, tracking, and reflections are all convincing
enough that you’d be forgiven for thinking these were production models
carving out turns on cliffside coastal highways. The Human Race tells the story
of a human race-car driver dueling an artificial intelligence (AI) driver, but the
underlying narrative is about how real-time technology can propel automotive
marketing into the future.
Marketing innovations like these extend into the dealership. The BMW Group’s
Emotional Vehicle Experience (EVE) VR system, powered by Unreal Engine,
runs in all BMW dealerships worldwide. Vehicles can be configured, animated,
and swapped into different virtual environments, with the resulting images or
videos emailed to the potential buyer.
“
We believe that in the future, all product-related content will be generated on demand in a personalized way.
”
Stephan Baier
Head of Immersive Experience, MHP
Picture a customer who spent their dealership visit imagining a new car in their driveway. Or maybe the majority of their time was spent poring over different upholstery and interior options, figuring out if they could fit two kids and a dog in the backseat. The customer leaves the dealership, and as they're driving home, they get an automated email from the dealer's customer relationship management system thanking them for coming in. Except instead of generic, placeholder beauty shots of a hunter green convertible taken from the rear end, the customer sees the exact bright blue coupé they just configured, and from the angle they spent the most time considering. This could be the push needed to turn the customer's dream car into a reality.
The old days of picking a lone hero car for each market and going into
production, only to find that the color or appearance options are wrong for
that region—those days are over. With millions of dollars spent by the industry
each year on producing assets for different regional markets and dealer
groups, minimizing these inefficiencies has the potential to save significant
amounts of money. Now you can take a 3D model and configure it on the fly
for online and printed marketing materials, even for linear TV spots. The time
a customer spends configuring their dream car doesn’t have to be wasted:
that experience can be portable, and follow them through their entire purchase
journey. Every decision made along that journey is captured, and can be
converted into tailored marketing materials to convert might-be buyers into
lifelong customers.
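Capturing every decision on that journey amounts to logging configuration events and turning them into a render request for personalized materials. A schematic example; the field names and the "longest dwell time picks the hero shot" heuristic are invented for illustration:

```python
from collections import Counter

def build_render_request(events):
    """events: list of (option, value, seconds_viewed) from a configurator."""
    config = {}
    dwell = Counter()
    for option, value, seconds in events:
        config[option] = value            # last choice wins
        dwell[(option, value)] += seconds
    # Render the shot the customer lingered on longest.
    (top_option, top_value), _ = dwell.most_common(1)[0]
    return {"config": config, "hero_shot": f"{top_option}={top_value}"}

request = build_render_request([
    ("paint", "bright_blue", 40),
    ("wheels", "19in_sport", 12),
    ("paint", "bright_blue", 25),
])
```

The resulting request would then be handed to a render farm or real-time engine to produce the exact car, from the exact angle, the customer kept returning to.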
When you use a real-time engine, it's possible to create marketing materials
from approved design data much earlier in the production process. That goes
for both online and offline content: 4K broadcast-ready materials, interactive
content for touchscreen video walls, and static content like print-ready
still images.
These can all be created on demand and accessed by anyone in your supply
chain and organization. Should the design change downstream, those changes
automatically apply to the real-time model and the data file in the connected
source application. With an open platform, everyone in the automotive value
chain always has the most up-to-date data. In short, your entire organization
has easy access to the source of truth, ensuring consistent, accurate content.
When a partner makes a request for marketing beauty shots, rather than
waiting hours, days, or longer for a response, they can have them instantly.
To create further efficiencies and reduce wait times, you can create easily
accessible asset libraries full of approved still images, interactive activations,
and broadcast materials.
Autonomous driving
Autonomous vehicles rely on physics-based sensors to detect the world
around them. Their physical cameras, radar, LiDAR, and AI require thousands
of hours and millions of miles of training in their collective effort to replace
human drivers. Success is measured by the sensor array’s ability to process
the enormous amount of data, and interpret the vehicle’s proximity to other
vehicles, pedestrians, cyclists, and road debris. An autonomous suite also has
to read the road in all weather and lighting conditions—lane markings, signal
lights, traffic signs—and react as a human driver would.
That testing can be done in two ways, but regardless of whether you’re doing
physical or virtual testing—or both—you need a tool for processing the volume
of data generated. When paired with the proper plugins, game engines can
not only visualize the reams of physical sensor data, but can also be used to
build complex scenes and test scenarios for visualization. If you opt for virtual
testing exclusively, game engines provide greater levels of adaptability, given
the sheer number of testing scenarios you can run and visualize overnight.
These traits also mean game engines are highly capable machine learning
tools, making them ideal solutions for training AI to perform quality-control
tasks in a factory’s paint shop, or any other visual tasks.
Ultimately, these factors mean game engines are highly cost-effective for
autonomous driving testing, enabling you to run simulations and validate the
efficacy of your software faster, thus reducing both development time and time
to market.
Once a sensor is connected to a game engine, what you can do with the
output data is limited only by your ability. Jump into VR and assume the
perspective of a pedestrian for a ground-level view of a test scenario, adjust
sensor parameters in real time, render the changes overnight, and analyze the
next day.
There are a variety of ways to build environments for testing sensors. You
can start from scratch if you’re so inclined, but why do that when there are a
number of templates and pre-made test environments already available for
game engines?
Testing autonomous vehicles in the real world has inherent limitations, but a
dynamic, photorealistic virtual testing environment does not. These testing
environments give researchers and engineers granular control in monitoring
how their AI responds and adapts to situations that might be impossible to test
for in the physical world. You don’t have to be in snowy Michigan anymore to
test your computer vision system’s acuity in a blizzard. Instead, you can load
a point cloud of the American Center for Mobility’s test track (via a third-party
plugin) and control snow density and wind speed yourself—from anywhere.
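Controlling "snow density and wind speed yourself" is, at bottom, a matter of exposing environment parameters to a scenario definition and enumerating the combinations you want to test. A minimal sketch; the parameter names and ranges are illustrative, not a real plugin API:

```python
from dataclasses import dataclass
from itertools import product

@dataclass(frozen=True)
class WeatherScenario:
    snow_density: float    # 0.0 (clear) .. 1.0 (whiteout)
    wind_speed_ms: float

def blizzard_matrix(densities, winds):
    """Enumerate every weather combination to run against the vision stack."""
    return [WeatherScenario(d, w) for d, w in product(densities, winds)]

# Six scenarios covering light snow through whiteout, calm through windy.
scenarios = blizzard_matrix([0.3, 0.7, 1.0], [0.0, 10.0])
```

Each scenario would configure one simulation run, letting the same computer vision system be graded against conditions that might take years to encounter on a physical test track.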
Beyond AI training, game engines can be used to build virtual obstacle courses
for robotic race cars as well—after all, game engines were designed for making
entertainment. Imagine a Saturday night in the not-too-distant future where
you load the family into your autonomous car for a trip to the race track, except
instead of physical hazards on the asphalt, there are QR codes.
At this new type of race track, there are two video feeds running to the
jumbotron: a conventional camera and another showing what the cars “see.”
Each QR code represents a different hazard picked by audience members
before the race, giving fans a chance to have an investment in the action. Or,
fans could even customize their own barriers at home and upload them to the
stadium so everyone watching can see the piece that took out the first-place
driver on the final lap.
The open nature of game engines means you don’t have to reinvent the wheel
to start simulating high-fidelity test scenarios in a real-time environment, either.
Instead, you can spend more time authoring these scenarios in the source
applications best suited for the task.
Working within a game engine means that if a sensor isn’t placed correctly
and isn’t gathering enough data, the error is noticed within hours or days—not
months. You can change sensor geometry and test new fail-safe mechanisms
in a fraction of the time it would take to test in the real world, leading to greater
efficiencies in autonomous testing.
Virtual crash tests are commonplace, but real-time technology brings a great
deal of additional value to such tests. Now engineers can make a simulated
crash look exactly like a real crash, and pause the test at any time to visually
inspect how metal and plastic deform during an impact. Not only that, it’s
possible to peel away layers of the vehicle and see what’s happening to each
component in real time, from sheet metal to the wiring harness. A virtual crash
can be viewed from any angle—you can even watch the entire accident unfold
from inside the cabin. Virtual tests most likely won’t ever replace physical
crash tests, but the virtual environment means testing can happen earlier
and knowledge can be applied and iterated upon before physical test vehicles
are built.
Before vehicles get to Level 5 autonomy, they’re going to get smarter in other
ways first. Automatic reverse braking, lane detection systems, automatic
stop—among others—are the features that provide bridges to full autonomy.
The problem then becomes acclimating drivers to these features that change
the traditional driving experience quite substantially. If they’re accustomed to
setting the cruise speed to 85 MPH and weaving around slower-moving traffic,
then it’s going to feel foreign to use radar-assisted cruise control to maintain a
steady pace at a safe distance. By training drivers in a game, simulation, or VR
during the purchase process, the shock can dissipate well before they get their
new vehicle home, leaving them delighted rather than annoyed.
VR/AR
Without game engines, modern virtual reality (VR) is a static affair. Before
game engines, you could look all the way around you but the experiences were
limited to linear two-dimensional events like concerts filmed with 360-degree
cameras. The advent of real-time technology changed all that and made
interactive room-scale experiences possible. Suddenly, you were able to
control your own view of the digital world, walk around an environment, and
use your hands and natural motion to interact with it like you would the real
world. VR presence—when your brain is convinced you’re in a real space—
quickly went from being an elusive target to an industry standard. But VR has
so much more to offer than fun and games; its applications in enterprise are
equally limitless.
Game engines excel at handling data and acting as open platforms. Using
them to set up your data means it won’t take a lot of extra labor to bring
photorealistic interactive environments or augmented reality holograms into
your work. Import existing CAD and design files into a real-time engine once
and the associated data preparation software will remove the myriad parts
not needed, producing high-quality assets that can be used across the entire
automotive value chain.
Game engines and the ecosystem of third-party tools around them enable
other efficiencies too, like real-time collaborative design sessions. Even if a
team is spread across the globe, one person could be sculpting a life-size
model with VR wand controllers while another is drawing on a Wacom tablet
and a third is reviewing designs as they’re finished. Beyond design, VR is the
perfect tool for retail and previsualization, too.
Image credit: BMW AG
The model doesn’t have to be finished with pretty pixels and surfaces to be
useful, either. Simply having the line drawing in a real-time environment means
it can be exported to CAD without the artist needing any CAD training. So long
as the main dimensions exist, the next person down the production chain can
save time by not having to recreate the line drawing from scratch within their
CAD toolset, and can spend that time on refinements instead.
Augmented reality is the next area ripe for real-time innovation. Rather than
isolating someone from the real world, AR overlays critical information on top
of what they’re looking at. It quickly becomes much easier to train employees
when they can see exactly where fasteners and wiring need to go, without
them having to refer to a printed manual time and again throughout the day.
This can have a profound impact in the area of safety training, where trainees
can practice in simulated scenarios that would be unacceptably dangerous in
the real world.
Enabling the workforce in such a way will only lead to even greater productivity
as we move forward. These efficiencies at the heart of game engines empower
employees to maximize their time, and focus on what they do best.
“
We can change or tweak whatever we want in the source code. This is unheard of in previous automotive software.
”
John Bulmer
Visualization Manager, Geely Design UK
Challenges and opportunities
Real-time technology can help solve many problems across the automotive pipeline, and provide new opportunities to capitalize on. It can drive greater efficiencies in automotive processes, and gives us new ways to experience these processes via different types of display technology. It also opens up a world of new marketing possibilities.
Process
Building a product with thousands of moving parts is no small feat. As vehicles
begin layering technological complexity on top of mechanical complexity,
it’s imperative that you devise a solution for comprehensively managing the
production life cycle from ideation through post-launch marketing materials.
Keeping vendor output and the various assets aligned—whether code, design
files, or creative media—is paramount to maintaining your production schedule
and budget throughout the automotive value chain. How, then, do you manage
this, when vendors themselves are all working remotely?
It makes a lot of sense to choose a game engine as your single platform, given
how customizable they are. With Unreal Engine’s open C++ codebase and
access to features like Datasmith, Blueprint, and Visual Dataprep, defining a
process for data input becomes less arduous.
Unreal Engine also offers access to pre-made asset libraries via the
Marketplace and Quixel Megascans, reducing time spent on redundant tasks.
Rather than having teams spend time creating every asset from scratch for a
marketing photo shoot on a virtual set, you can create a common pool of pre-
made photorealistic assets, and instead devote that time to fine-tuning the
vehicle and lighting models.
To optimize your workflow and complex scene operations, the engine comes
with Unreal Insights, a standalone profiling system that integrates with Unreal
Engine to collect, analyze, and visualize data coming out of the game engine.
Insights will tell you what’s running well in your scene, what isn’t, and what
takes the most time to load, among other things. Adding your profile data is
easy, and you can even record data remotely to minimize its impact on your
project’s execution.
In a model of your facility, you can define where people will walk and work,
enabling you to easily visualize and simulate what happens when you start
tweaking aspects of the manufacturing process for efficiency. Connecting
leading simulation tools and telematics suites to game engines makes it
relatively simple to jump into a photorealistic simulation via augmented reality
or virtual reality.
Copyrights: BMW AG
Ergonomics testing
The real-time nature of game engines means you can track workflow and
simulate the process needed for seat installation. Using a game engine, you
could design the first part of the process, having the assembled seat delivered
to the seat installation workstation while avoiding other delivery carts along
the way. Then you can define the process for how the installer will have to twist
and turn to fit the seat through the vehicle’s door opening. Through simulation
and process design, discovering and implementing production efficiencies
becomes much easier.
Game engines have the ability to fundamentally shift automotive digital twins
to the next level and serve as the platform where everything comes together—
predictive analytics for data analysis, 3D simulation tools, the Internet of
Things, digital collaboration, and data system monitoring.
Everyone wants to have a digital twin for their products and processes, so it
then becomes a question of how to make the reams of data relevant to the
widest group of people. By representing the abstract concepts visually, you
can take all the data and make it easier to understand, which can lead to better
decisions, faster.
Chapter 1: A bird’s-eye view of real-time for automotive
”
Jürgen Riegel
Principal Software Architect, Daimler Protics

game engine to easily connect people in disparate environments and ensure
they’re all commenting on the same thing at the same time. This not only
increases cross-department understanding, but also saves a substantial
amount of time—in addition to the money that would otherwise be spent on
travel costs.
Interaction options
Different automotive processes require different types of display technology.
Engineering teams might evaluate ergonomics using VR headsets, while
salespeople might take customers through configuration options on a
touchscreen. Unreal Engine can be set up and optimized to work with different
display technologies depending on the use case.
LED screens were, until recently, always flat, and any curved shape was
created by placing flat screens at a slight angle to one another. Nowadays,
LED screens can be designed to almost any form, shape, resolution, or pixel
pitch. Flat LED screens can also be arranged in complex patterns to form
three-dimensional displays. Because the image data for an LED screen
comes via a cable rather than a projector, the seams between image portions
can be precisely lined up, and thus overlap/blending between portions is
not necessary.
Collab Viewer
The Collab Viewer template joins multiple people together in a shared
experience of the same 3D content. It’s intended to make it easier and quicker
for your team to review and communicate about designs in real time, so you
can identify problems and iterate on the content more efficiently.
HoloLens Viewer
The HoloLens Viewer template is an adaptation of the Collab Viewer template
that works on the Microsoft HoloLens 2. You can use it to see your 3D content
overlaid on your actual surroundings in the HoloLens viewer. You can also
interact with your models, moving them around and annotating them in real-
world space.
nDisplay opportunities
The nDisplay plugin for Unreal Engine distributes the rendering of real-time
content across a network of computers and generates images for as many
displays as required. It features accurate frame/time synchronization, correct
viewing frustum based on the topology of the screens in world space, and
deterministic content that is identical across the visualization system.
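A viewing frustum derived from screen topology boils down to an off-axis projection: each display's frustum follows from the viewer's eye position and the screen's corner positions in world space. The sketch below illustrates the underlying math, in the spirit of the generalized perspective projection technique; it is not nDisplay's actual implementation, and the function name is our own:

```python
import math

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])
def norm(a):
    n = math.sqrt(dot(a, a))
    return tuple(x / n for x in a)

def offaxis_frustum(eye, pa, pb, pc, near):
    """Asymmetric frustum extents (left, right, bottom, top) at the near
    plane, for a flat screen given its lower-left (pa), lower-right (pb),
    and upper-left (pc) corners in world space."""
    vr = norm(sub(pb, pa))        # screen right axis
    vu = norm(sub(pc, pa))        # screen up axis
    vn = norm(cross(vr, vu))      # screen normal, pointing toward the eye
    va, vb, vc = sub(pa, eye), sub(pb, eye), sub(pc, eye)
    d = -dot(va, vn)              # perpendicular eye-to-screen distance
    scale = near / d
    return (dot(vr, va) * scale,  # left
            dot(vr, vb) * scale,  # right
            dot(vu, va) * scale,  # bottom
            dot(vu, vc) * scale)  # top
```

Feeding the returned extents into a standard perspective projection matrix, per screen and per frame, is what keeps every display in a powerwall or CAVE geometrically consistent with the tracked viewer.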
The same technology Disney used for the series The Mandalorian is
available to use for car commercials and marketing photo shoots. Real-
time photorealistic imagery is fed to an array of LED screens serving as
virtual backdrops while physical set dressings fill out the foreground.
Because game engines are capable of producing ray-traced imagery
and realistic lighting in real time, it becomes possible to light a scene
with virtual softboxes to achieve the exact look you envisioned.
Vehicles have glossy surfaces, which makes using an LED volume an easy
choice. Your vehicle model will reflect the LED walls naturally in camera, so
there’s no need to add artificial reflections in post. It then becomes possible
to do multiple creative shoots for different markets in dramatically different
environments, all from one location and in a fraction of the time. Other
efficiencies include the elimination of travel costs for the vehicle and film crew,
and a reduced chance of spy photos spoiling the surprise of your debut.
The integration of the OpenXR standard in Unreal Engine can futureproof your
applications for new devices.
Finally, through the nDisplay plugin, Unreal Engine can render to multi-screen
setups with projection warping and blending, such as powerwall LED displays
and CAVEs.
You can put your vehicle practically anywhere for consumers to interact with—
regardless of how far away you are from a rolling prototype.
Marketers and designers can work on a project in parallel, leading to both time
and cost efficiencies. If you use an engine that runs online and you introduce
a new steering wheel option, you can generate all the imagery you want just
once, and all the videos, VR experiences, and interactive experiences—like
car configurators—will update behind the scenes. This asset reusability
facilitates iteration across multiple teams that isn’t possible with typical
offline workflows.
[Diagram: the connected platform. A digital thread links the virtual product (simulation, design evaluation, research and collaboration, conception, creation, development, connection to hardware and software systems, presentation) through a central data platform (digital twin, IoT, libraries, materials, PDM, locations) to the real product (virtual product interaction, training, analytics-driven process validation, production, AI support, facility management, experience, automated production, and collateral and video).]

The Internet of Things (IoT) refers to the connection of devices (other than
conventional computers and smartphones) via the internet. It implies a
continuous connection between a large number of devices that perform
automated data processing without human participation.

Product data management (PDM) gathers product information into a
single, central system. This information includes computer-aided design
(CAD) data, models, parts information, manufacturing instructions, requirements,
notes, and documents.
CHAPTER 2
Bringing your data into the engine
Taking a vehicle from initial
concept to the showroom floor
creates huge amounts of data
in many different formats.
If you want to build a single platform
that connects siloed automotive
departments together, you need
a way to bring in and work with
all these different data types.
In this chapter, we’ll look at how
automotive manufacturers can
quickly and easily export diverse
content and file types from across
the automotive pipeline for easy
import into Unreal Engine.
What is Datasmith?

“With Datasmith, I can literally do the same thing I did in four weeks in one day and that is magic.”
Carlos Cristerna
Visualization Director, Neoscape

Digital platforms live and die by their data input and output capabilities. With
Unreal Engine, the key entry point is Datasmith, a collection of tools and
plugins that bring entire scenes and assets constructed in other industry-
standard design toolsets into a real-time editor, no matter how large, dense, or
heavy they may be.

You can save hours—if not days—when you import your fully assembled scene
data into Unreal Engine’s open platform with Datasmith.
From there, every aspect of the product life cycle can be managed: data linking,
data conversion, data updates, and life cycle connection. Even if you need to
reimport the complete Datasmith scene after a round of stakeholder feedback
and source scene adjustments, Unreal Engine has tools and workflows for
avoiding arduous and expensive rework.
[Diagram: material export pipeline, showing materials (.MDL, .AxF) and textures exported from a material design app and Deltagen.]
Chapter 2: Bringing your data into the engine
The Unreal Editor can perform tasks like tessellation of NURBS data,
manipulation of triangle mesh geometry, decimation, UV generation, and many
other operations that are used in traditional pipelines. With the Visual Dataprep
system, Python scripting, and Blueprint visual scripting, Unreal Engine offers
automation capabilities to minimize manual work.
The Unreal Engine toolset can be integrated into your existing data preparation
processes in a number of ways.
One approach is to fully prepare data in legacy applications, then import it into
Unreal Engine for consumption. In this instance, users continue to use the
tools and processes they are familiar with, although the legacy software may
not offer the same array of operators to prepare and adapt the model for final
use and rendering in a game engine. For example, the data may not be fully
optimized for the target hardware platform.
A second approach is to perform part of the preparation in Unreal Engine itself.
This setup makes sense when matching specific platform requirements and
optimizations that cannot be achieved in other software, or when performing
some preparation tasks related to the final look development and rendering.
The final case is to have Unreal Engine as the central data platform,
orchestrating the whole data preparation process. This does not preclude
Unreal Engine from round-tripping data to other third-party applications for
some specific operations. Unreal Engine can also be used from end to end,
from CAD to final pixels. This paradigm reduces the number of tools involved in
the process and minimizes friction induced by interfaces and interexchanges.
Datasmith supports a wide range of design applications and file formats, including:
• 3ds Max
• Revit
• Solidworks
• SketchUp Pro
• Cinema 4D
• IFC2x3
Editor tools
Unreal Engine comes preloaded with tools for fine-tuning your data after
import, ensuring you don’t need to return to the source application for
minor tweaks.
UV mapping: Some CAD programs don’t offer the ability to create high-
quality UV maps. Once you’ve imported CAD data into Unreal Engine, you can
easily unwrap mesh geometry and set up parameters to control the results
of the unwrapping. UVs can also be edited via Blueprints or Python scripts.
By default, the engine will automatically unwrap some UVs to ensure you can
utilize advanced scene performance efficiencies like baked lighting.
Levels of detail (LODs): LODs are an effective way to optimize your meshes
and scenes for performance and frame rate goals. The LOD management
system in Unreal Engine chooses the most appropriate mesh to show at
runtime. LOD creation can be automated with Blueprints or Python scripts;
LODs are reusable from one mesh to another.
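LOD switching of this kind is usually driven by an object's projected screen size rather than raw distance. As a rough, engine-agnostic illustration, with function names and thresholds invented for the example rather than taken from Unreal Engine's API:

```python
import math

def screen_size(bounds_radius, distance, fov_deg):
    """Approximate fraction of the viewport height covered by a bounding
    sphere of the given radius at the given camera distance."""
    return bounds_radius / (distance * math.tan(math.radians(fov_deg) / 2))

def select_lod(bounds_radius, distance, fov_deg, thresholds):
    """Pick an LOD index given per-LOD minimum screen sizes, finest first,
    e.g. [1.0, 0.5, 0.25, 0.0] maps to LOD0..LOD3."""
    size = screen_size(bounds_radius, distance, fov_deg)
    for lod, min_size in enumerate(thresholds):
        if size >= min_size:
            return lod
    return len(thresholds) - 1  # fall back to the coarsest LOD
```

A nearby object covers most of the screen and gets the full-detail mesh; the same object across the factory floor drops to a mesh a fraction of the size, which is how a scene full of vehicles can hold its frame rate.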
Mesh editing and defeaturing: You can easily fill holes and remove
protrusions from your meshes with simple drop-down menus. Any meshes
that are imported with Datasmith or FBX can be defeatured in a few clicks.
Defeaturing can also be applied with Blueprints and Python scripts.
Sculpting: Geometry Brushes are available for rapid prototyping of levels and
objects. These brushes are ideal for creating simple geometry for filling a gap
or space. You can also use the Subtractive brush to remove solid spaces.
Direct Link plugins: When you import data using Unreal Engine’s Direct
Link plugins, you can make changes in your source application and they will
automatically reimport to the scene you’re working on in the engine without
affecting the objects and assets you haven’t tweaked. For example, if you’ve
changed the materials and geometry in your source application for some
objects but not all, you can place the scene in a folder and import it, and Unreal
Engine will only alter the assets and objects that you’ve changed.
Datasmith does its best to maintain geometry, materials, and scene hierarchy
during import. CAD tools don’t always prepare raw data in a way that makes
sense for a real-time rendering engine like Unreal Engine, which is where
creating automation “recipes” with Visual Dataprep comes in.
With Visual Dataprep, you can create Actions by simply dragging and dropping
blocks of Filters and Operators into the Blueprint-like Dataprep Graph. From
there, you can connect multiple nodes in sequential order to make your own
custom import workflow recipe.
You can expose the parameters of those Filters and Operator blocks at
any time, making it easy to identify what each recipe does, and making it
efficient to reuse these workflows for other scenes and projects. The goal is
to automate a vast majority of the tedious dataprep work and leave the fine-
tuning to a designer.
Visual Dataprep also enables you to automate LODs, set up lightmap UVs,
substitute materials, and delete or merge objects based on class, name, size,
and metadata tags—all while previewing the processing results from your
current scene within the Visual Dataprep viewport. Dataprep automation can
be handled with Blueprints or Python scripts.
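Conceptually, a recipe chains Filters (which select assets by class, name, size, or metadata) into Operators (which delete, tag, merge, or otherwise transform what was selected). As a plain-Python analogy, with hypothetical asset records standing in for an imported scene, and not the actual Dataprep API:

```python
# Hypothetical asset records standing in for an imported Datasmith scene.
assets = [
    {"name": "Bolt_001", "tris": 1200, "class": "StaticMesh", "tags": ["fastener"]},
    {"name": "BodyPanel", "tris": 250000, "class": "StaticMesh", "tags": []},
    {"name": "CameraRig", "tris": 0, "class": "Camera", "tags": []},
]

# Filters: predicates that select assets.
def by_class(cls):
    return lambda a: a["class"] == cls

def by_min_tris(n):
    return lambda a: a["tris"] >= n

# Operators: transformations applied to whatever the filters select.
def delete_matching(scene, *filters):
    """Drop every asset that all filters select."""
    return [a for a in scene if not all(f(a) for f in filters)]

def tag_matching(scene, tag, *filters):
    """Add a metadata tag to every asset that all filters select."""
    for a in scene:
        if all(f(a) for f in filters):
            a["tags"].append(tag)
    return scene

# A "recipe": remove non-mesh helpers, then flag heavy meshes for decimation.
scene = delete_matching(assets, by_class("Camera"))
scene = tag_matching(scene, "needs_decimation", by_min_tris(100000))
```

Because the filters and operators are composable, the same recipe can be replayed against every new delivery of source data, which is exactly the reuse Visual Dataprep is built around.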
Swim lane
In the diagram on the next page, you’ll see a traditional swim lane covering the
dozens of data sources typical in the automotive production workflow, with the
red line representing a drop-dead cut-off.
When you use Unreal Engine, the window for making additional changes
expands dramatically, enabling you to make changes up until the month before
production starts. With a typical offline process, you need around six months’
notice to make the same changes.
Copyrights: BMW AG
[Swim-lane diagram: the traditional data-preparation workflow across departments and partners. Process steps include ordering and shooting reference material, then creating, updating, and delivering the reference catalog and library; setting up the COBA/PLM standard structure and control data, with structure adaptations for visualization and the reference catalog, yielding the 155% data set; exporting data locally from CAD systems (CATIA, Pro/E, PLM) via intermediate formats such as JT, and transferring raw 155% NURBS data, PLM/configuration info, library catalogs, and documentation of missing geometry and work files to the visualization data-prep team; and checking PLM/BOM information, part lists, and material lists in the weeks before approval.]
CHAPTER 3
Real-time workflows in detail
By now, you should have a good
sense of the many verticals in the
automotive pipeline that are ripe
for transformation by real-time
technology. Let’s take a deep dive
into different automotive departments
to find out how teams can leverage
Unreal Engine for specific workflows
and processes.
Concepting
Real-time rendering’s impact on concepting
Game engines make the process of visually representing the idea in your head
much faster. Because you can get up and running quicker in real time, you can
spend more time iterating and creating better concepts than you would with
offline processes or drawing by hand within the same timeframe. Collaborative
design sessions enable cross-platform 3D design in virtual reality, with
multiple users around the world commenting and participating in real time
much as if they would when playing an online video game.
Depending on how your lighting model and shaders are defined, you can go
from complete abstractions to finely detailed pencil sketches relatively easily.
It’s up to you, depending on what you’re trying to achieve.
One need only look to the video games created with Unreal Engine for examples
of what’s possible. The open nature of game engines, and the sheer number
of game projects that have gone to market, mean there’s a robust worldwide
network of available support and premade assets so you can execute your own
vision faster.
Chapter 3: Real-time workflows in detail
Premade assets still leave plenty of room for vision and originality. The games
Rocket League and Assetto Corsa Competizione couldn’t look more different
despite both involving cars. Even though they’re both shooters, Fortnite looks
nothing like Microsoft’s gritty Gears 5, which is itself miles apart from the
cel-shaded chaos of Dragon Ball FighterZ, a style that keeps the game faithful
to its source material.
Libraries
Libraries are a way for automotive manufacturers to create an asset once and
then reuse it across their entire organization, be that a Blueprint script, object,
AI behavior, texture, or anything else within a game engine. Libraries serve as
a centralized source of truth for the digital assets in your organization, and can
eliminate the need to start from scratch for every new process or design item.
“We no longer need to wait to understand the impact of a design decision or change, it’s just a smoother process all round.”
John Bulmer
Visualization Manager, Geely Design UK

Data as a platform across the automotive value chain

What we’ve witnessed in the automotive space is that the tools used to create
data and communicate the data visually are all vastly different from the tools
used for engineering. Because there’s no connection point for all the data
sources, you have to recreate everything from scratch time and again with
different tools. This means you lose most, if not all, of the metadata and smart
features authored in purpose-built applications.

If you use a game engine as the platform where all this data comes together,
you can ensure those features and metadata are readily available across your
organization, unlocking many efficiencies along the way. And because everyone
in the automotive value chain can easily understand the concept you’re trying
to convey from the outset, you can gain a lot of trust from the people who are
working downstream.
Design
Using game engines for creative design
If you use game engines for illustrating a concept, the design process can go
much faster because the rough ideas have already been sketched out. From
there, designers can easily begin working from an approved concept without
having to recreate it from scratch.
With Datasmith and Visual Dataprep, each designer can author in the
application they’re most comfortable with and then import the data to Unreal
Engine so design reviews can take place all in the same platform. In a studio
where each designer is using the same game engine, it becomes possible to
have an apples-to-apples comparison of one-off iterations and apply unified
feedback across the board.
What’s more, software is now being developed that provides a live link between
applications to reduce the friction of data push and pull. Third-party plugins
such as the Mindesk Rhinoceros Live Link create a live connection between
Unreal Engine and 3D CAD software, ensuring that all changes made in
Unreal Engine will be reflected in their source applications—automatically—for
further efficiencies.
Not all design tools are set up for making stunning final-pixel visualizations,
or offer access to the helpful asset libraries during design. Thanks to
third-party live link plugins and Datasmith, you don’t have to pick a new
tool over the one you’re intimately familiar with just to create downstream
production efficiencies.
Because data is feeding back and forth between the 3D rendering tool and
CAD—with a game engine connecting the two—an artist can easily generate a
rough concept for designers to work into a prototype, all without ever touching
CAD themselves. These designs will have perfect Class-A surfaces, ready
to hand off to engineering, and the different departments can use them to
communicate intent.
Virtual courtyard
What if you could eliminate the challenges of waiting for the weather and
lighting to be just right, shipping pre-production models to faraway places, and
sidestepping prying eyes? These challenges can become a thing of the past
if you use the right tools. The virtual production principles used in Disney’s
The Mandalorian can be applied to the automotive world just as easily—except
instead of dressing a set to look like a forest encampment that’s mid-battle
with an Imperial walker, you can use panoramic LED screens to recreate the
typical outdoor courtyard evaluation area.
Evaluating a physical design in all light conditions from every angle can
happen in a matter of keystrokes. For marketing shoots, these walls create
photorealistic real-time reflections, enabling VFX to be captured in camera
and offering far greater flexibility in shot composition. With Quixel’s Megascan
library, it’s easy to change scenes so your press release photos look different
from the marketing campaign—it simply becomes a matter of picking from a
different set of premade assets.
The stage setups can also be used for car configurators and training clinics.
Because everything is happening indoors, it becomes less of an issue to make
evaluations based on a non-drivable concept model—and you won’t have to
camouflage the car to keep it disguised, either. By using a virtual courtyard,
you can have everything you want whenever you want it.
In a perfect world, everyone who has a stake in a vehicle will be able to review
and provide feedback at the same design review session.
The reality is that cars aren’t made entirely in one location, or even one country.
As digital design tools have taken hold in the past 20 years, there’s been less
reliance on clay models, but the process of designing a vehicle and gathering
feedback requires the same amount of collaboration. Instead of sending the
design team across the globe to evaluate a rolling prototype or clay model, you
can save on budget by collaborating on a design review in virtual reality (VR).
Cabin review
When you use a game engine as a platform for your visual data, you can get
away from the traditional linear path—and it becomes possible to perform
tasks in parallel and share assets as they’re created. This unified process
gives downstream departments a way to see and understand data during
the design stage, as opposed to just before it gets to them. Because the
downstream portions of production have an earlier starting point, they’ll gain
the advantage of having a better idea of how the data works before they
commence their phases.
Every day, we get closer to merging the digital and physical worlds. Once
vehicles reach Level 5 autonomy, HMIs will offer a whole new world of
opportunity for in-car experiences. We predict that screens will increase in size
and number and will eventually become an important part of interior design at
every level, and may even replace traditional surfaces altogether.
With a digital twin, your design files for advanced HMI applications are stored
in a central place and can be applied to working virtual prototypes to test
user experience. The virtual product you create will translate to the virtual
display, and communicating complex ideas should become easier when your
stakeholders can directly interact.
For the present, using game engines in the HMI space allows for a more
personalized in-car experience. Instead of seeing a generic representation
of your personal vehicle on your in-dash HMI, you could see the exact vehicle
you’re driving: the same exterior color, the same wheel option, the same
trim level.
Industry 4.0 has changed certain aspects, but the impact of immersive real-
time technology has yet to be fully realized. A new breed of automakers has
disrupted the field and has altered consumer expectations about how fast
changes should be applied. Not only that, but these automakers are smaller
and more nimble, and are able to incorporate change at a much faster pace.
Game engines are updated with new features, often requested by their
users, at a regular cadence. Unreal Engine 4.25, for example, added support
for Siemens PLM XML via the Datasmith data importer for large-scale CAD
projects. That’s in addition to the new Mixed Reality UX Tools plugin, which
provides mixed reality designers with a set of UX tools to speed up XR
development, and improvements to the material system such as the addition
of a proper anisotropy input, a new physically based translucency shading
model, and updates to the clear coat model.
These features rolled out to all users at the same time, and for free.
Engineering
Digital mockup exploration
When you use a real-time engine as a data platform, you can translate your
data into an interactive 3D model accessible across multiple departments.
Engineering data can travel back and forth between concept and design within
the same platform, unlocking efficiencies in time and budget because data
doesn’t need to be constantly replicated.
When you have earlier access to data, the window for experimentation
expands, enabling you to test scenarios you might otherwise not have time for.
Or, the extra time could be used to test new engineering scenarios to see how
changes to frame geometry can affect crush zones and component protection.
Building an environment for testing trunk size is another example. It’s relatively
simple to use design data to create an interactive digital version where you can
test if a trunk can fit golf clubs, suitcases, groceries, dog food, or patio bricks.
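As a first approximation, such a check can treat the trunk and each item as rigid boxes and try the item's axis-aligned orientations; a real interactive test would of course use the actual trunk geometry and collision physics. A minimal sketch:

```python
from itertools import permutations

def fits_in_trunk(trunk_dims, item_dims):
    """Naive check: does a rigid box-shaped item fit inside a box-shaped
    trunk in at least one of its six axis-aligned orientations?"""
    return any(all(i <= t for i, t in zip(orient, trunk_dims))
               for orient in permutations(item_dims))
```

For example, a 95 x 40 x 40 cm golf bag passes for a 100 x 90 x 50 cm trunk, while a 120 cm item fails in every orientation, and a loading list of such items can be screened long before a physical prototype exists.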
Not everyone has an engineer’s ability to look at a dataset and easily discern
what the math means. Not everyone has access to, or familiarity with, Rhino
or CATIA, which is why it becomes important to have a platform that can
connect to multiple data sources and represent them in photorealistic visual
detail. However, loading a full simulated model to do a quick design check isn’t
feasible due to the time needed to load all that data.
Real-time engines can easily create reduced versions of those CAD files and
visualize them, enabling more people to access the models in far less time.
The data isn’t as malleable as it is in the source file, but you can work with
it in a collaborative AR/VR experience to check for visual references and
version status.
With a real-time engine, you can simulate not only vehicle assembly but also
the robots that are assembling your vehicle. From there, it becomes possible
to swap alternate parts and accessories to see which configuration works best
for which task, or which is faster at assembling a given part.
Manufacturing
Firms know that digital assets are important—it’s just that they’re so
entrenched in the mindset of producing physical goods that the digital
transformation is very hard. Legacy processes rule the day because they’ve
been battle-tested for decades, countless times over. That’s in stark contrast
to agile, digital-native firms using processes that may be only a few years old.
Problem areas
The process of making highly complex physical goods is rife with opportunity
for failure, and automakers have been perfecting their processes for getting
vehicles out the door for over a century. Unfortunately, these processes
haven’t been set up in a way that makes the business digitally savvy. Unlike a
CG car in a video game, physical car designs are not created to be used in a
game engine.
• Data weight
• Holes in meshes
• Missing data
• Presence of manufacturing elements that are superfluous in a
game engine
• Data in different formats
Moving forward, automakers and suppliers need to change the way they think
of digital assets. Instead of being viewed as a necessary evil for getting the
physical good, OEMs need to consider digital data as a key part of distributing
their goods across the entire automotive value chain.
AR and VR in manufacturing
The game engine can be configured to measure each aspect of each training
session, feeding data back for analysis. It’s easy to turn the training into
a game to make technicians faster as well. Your fastest, highest-scoring
technician’s technique can be digitized and used to train an AI algorithm,
which in turn can test countless edge-case scenarios overnight and develop
a different method that could shave off one minute of production time
per harness.
Training a forklift driver in a simulator or VR makes the most sense, given how
dangerous mistakes can be. You can easily create training simulators in a real-
time engine, then feed the scale-accurate digital version of your manufacturing
facility to screens mounted around a physical forklift. With the forklift on the
treadmill synced to photorealistic images coming from the game engine, you
have a highly accurate representation of what it feels like to drive through a
manufacturing facility, but in an incredibly safe way—one without the potential
for injuries or damage to the facility.
Those use cases can be anything—computer vision applications for quality
control, for example, or even a movement pattern that takes its surroundings
into account and treats humans as obstacles that must be avoided. These
tests can use millimeter-accurate models of your manufacturing facility to test
hundreds of thousands of edge-case scenarios overnight, and create an AI
algorithm for that particular task.
Once a vehicle exits the concept phase, you can start creating digital assets
for PR and marketing materials, mobile applications, configurators, POS
experiences, and more—all based on existing and approved design data. The
data and images are all easily shareable with clients and third-party vendors.
The ability to create marketing materials this early translates to greater
efficiencies downstream. Should any of the design change over the course of
production, the changes are reflected in the universal data model, ensuring
every department remains in lockstep, and that marketing materials are
reliably accurate.
[Diagram: the customer journey, with streaming high-quality applications at the center connecting presentation, configuration (interactive web configurator), PR, education (guided experience with human interaction), emotion, usage (immersive interaction with the virtual and real product), purchase, and connection and loyalty touch points in and around ownership.]

This diagram illustrates the user journey of a customer purchasing a vehicle.
It shows how a streaming application can sit at the heart of that journey,
connecting all the customer touch points by serving up the same content on a
consolidated platform.
“Real-time ray tracing is a quantum leap for the whole realm of 3D graphics.”

Manuel Moser
Head of Real-Time Development, Mackevision

When it comes time to debut your vehicle to the press, with real-time
technology you can create print-quality photorealistic collateral images up
to the minute the news release publishes. Game engines make the process
of creating PR materials smoother and more cost-efficient. Different assets
can be created for different territories—or different colors and configurations
for different campaigns—without the need to have several production models
in various configurations on hand. This flexibility extends to creating highly
customizable, 4K broadcast video assets as well. If you include video walls at
your activation, you can use the same assets from the app to create one-off
video content.
At your activation, you can show off the rolling concept and use the same
assets for apps that demonstrate new features and explain how they work. For
example, you could show how adaptive headlamps work in a simple tablet game
or experience that puts attendees behind the wheel of your new model. Setting
up a simple game with a branded racing wheel and cabin diorama becomes
an easy way to demonstrate autonomous features like assisted cruise control
and lane-keeping technology, using the same code that will power the features
in production.
Showrooms
Currently, BMW, Toyota, Audi, and many other companies use Unreal Engine-
powered point-of-sale and virtual showroom technology. Connecting a
configurator to your point of sale (POS) system means that the car your
customer configures is the exact one that’s built and delivered, increasing
customer satisfaction.
Configurators
Game engines are a perfect match for car configurators. Unreal Engine’s
best-in-class real-time rendering and ray tracing make it a good choice
when you need photorealism on the fly. Getting started is simple: either create
your own configurator with Blueprint, or download a template from the Unreal
Marketplace. It’s of the utmost importance to ensure parity between the digital
configurator and the real vehicle.
Case in point: wheel options. Not only do different wheel options affect a
vehicle’s aesthetics, they affect ride height and, depending on the application,
tire options as well. Rather than have a wall of wheels mounted in your
dealership, you can replace that with an interactive video wall that shows life-
size versions of the car as your customer configures it, and shows how each
wheel option affects the car’s appearance.
Copyrights: BMW AG
Knowing which wheel option shows more of the brake caliper—and how the
now-exposed brake caliper’s color coordinates with the body color and other
trim options—is key to earning customer confidence when they place an
order. The same can be said for accurately conveying changes in ride height
by going from a 17-inch alloy wheel to a 20-inch chrome option. Making that
fully complete real-time model available in a dealership setting can lead to
increased sales and provide insights about your customers. You can easily see
which options are the most popular, eliminate the ones that aren’t, and learn
how you can better predict your customers’ needs.
Game engines are built for this type of configuration and interaction; when
connected to a POS system, the potential to drive both sales and customer
satisfaction is baked in.
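The constraint logic described above, where a wheel choice drives ride height and the set of valid tires, is at heart a small data model. Here is a minimal sketch of that idea in plain Python; every option name and value is invented for illustration, and a production configurator would read this data from the manufacturer's product data model and hand the resolved state to the engine's render layer.

```python
from dataclasses import dataclass

# Hypothetical option data; a real configurator would pull this
# from the manufacturer's product data model.
@dataclass(frozen=True)
class WheelOption:
    name: str
    diameter_in: int      # wheel diameter in inches
    ride_height_mm: int   # resulting ride height
    tire_options: tuple   # tire choices valid for this wheel

WHEELS = {
    "alloy_17": WheelOption("17-inch alloy", 17, 165, ("all-season", "winter")),
    "chrome_20": WheelOption("20-inch chrome", 20, 150, ("performance",)),
}

def configure(wheel_key: str, tire: str) -> dict:
    """Validate a wheel/tire combination and return the resolved state
    that a render layer (e.g. a game engine scene) would display."""
    wheel = WHEELS[wheel_key]
    if tire not in wheel.tire_options:
        raise ValueError(f"{tire} tires are not offered with {wheel.name} wheels")
    return {"wheel": wheel.name, "tire": tire, "ride_height_mm": wheel.ride_height_mm}

state = configure("alloy_17", "winter")
print(state["ride_height_mm"])  # 165
```

The same validation is what keeps the digital configurator and the orderable vehicle in parity: any combination the renderer can display is, by construction, a combination the factory can build.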
Trade shows
Your attendees will be able to interact with your exhibit on whatever platform is
most convenient for them, be it mobile, VR, browser, or game console. It could
be a virtual test drive, interactive exploded views of the vehicle, or images of
the car in otherwise impossible environments. With Unreal Engine’s virtual
production techniques, you can blend the physical and virtual. The only limit is
your creativity.
Operations
Virtual garage, digital twins, and the 150% model
A digital twin is the complete digital replica of a vehicle and all its variants.
From this model, you have all the data necessary to create any variation of the
vehicle, with any combination of parts and options. If your pickup truck has
three different bed options, the model is one variant, with the data from the
other bed options readily available.
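One way to picture this "150% model" is as a superset of parts from which each buildable variant selects exactly one entry per option slot. The sketch below illustrates the idea in plain Python; the slots and part names are invented and stand in for real product structure data.

```python
# Hypothetical 150% model: every part any variant could use,
# grouped by the option slot it fills.
SUPERSET = {
    "bed": ["short_bed", "standard_bed", "long_bed"],
    "cab": ["regular_cab", "crew_cab"],
    "engine": ["v6", "v8"],
}

def resolve_variant(choices: dict) -> dict:
    """Reduce the 150% superset to one buildable (100%) variant by
    picking exactly one part per slot; reject unknown parts."""
    variant = {}
    for slot, parts in SUPERSET.items():
        pick = choices[slot]
        if pick not in parts:
            raise ValueError(f"{pick!r} is not a valid {slot} option")
        variant[slot] = pick
    return variant

print(resolve_variant({"bed": "long_bed", "cab": "crew_cab", "engine": "v8"}))
```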
Armed with a set of new AI-driven tools, a handful of key automotive partners
will be the pioneers of Industry 4.0: data-led manufacturing where information
from every stage of the product life cycle can be leveraged to build higher-
quality, more cost-efficient products at a faster pace.
Game engine technology will drive the automotive digital twin to the next
level, bringing together the key technologies of IoT, 3D simulation tools, and
predictive analytics to provide analysis of data and monitoring of systems that
solve problems even before they occur.
Game engines are good choices for creating test scenarios for AI because they
are infinitely customizable tools that can create photorealistic images in real
time. Using synthetic data, it’s possible to achieve the billions of miles of
driving needed to validate autonomous driving systems. The simulations can
run on cloud-based hardware, and within days you can have a rudimentary
AI model, achieving in hours or days what would take years of real-world
perception training and testing.
Unreal Engine has been used by clients for testing edge cases, as well as AI
training. Unreal Engine’s open nature and C++ environment ensure you can
customize your experience to your company’s individual needs and demands.
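The scale of synthetic testing comes from parameterizing scenarios and sampling them in bulk, a technique often called domain randomization. Below is a toy sketch of the sampling side, with invented parameters and ranges; a real pipeline would sample far richer distributions and render each sample in the engine.

```python
import random

# Hypothetical scenario parameters; a real pipeline would also cover
# weather, sensor noise, actor trajectories, and much more.
def sample_scenario(rng: random.Random) -> dict:
    return {
        "time_of_day_h": rng.uniform(0, 24),
        "rain_intensity": rng.uniform(0, 1),
        "pedestrian_count": rng.randint(0, 30),
        "lead_vehicle_speed_kph": rng.uniform(0, 130),
    }

rng = random.Random(42)  # fixed seed so test batches are reproducible
batch = [sample_scenario(rng) for _ in range(100_000)]
print(len(batch))  # 100000
```

Because the generator is seeded, a failing edge case can be reproduced exactly, which is what makes overnight bulk testing practical.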
CARLA
Powered by Unreal Engine and developed with support from Intel and the
Toyota Research Institute, CARLA is a free, open-source simulator designed
to support the development, training, and validation of autonomous driving
systems. The simulator enables you to visualize all your test scenarios and
their results in real time. Its open-source nature democratizes autonomous
testing, and ensures that even the smallest startup has access to world-class
testing tools and environments.
Because Unreal Engine is an open platform, the options for engaging with
passengers in this configuration are limited only by creativity.
Human-machine interfaces
Once vehicles reach Level 5 autonomy, entertainment experiences can take
over the car, because occupants will no longer have to pay attention to driving.
The dashboard of the future is where your in-vehicle entertainment experience
will live. With origins in the world of entertainment, game engines are
particularly well suited to this role.
The work lies in ensuring quick start-up times and reducing the hardware
resources needed to produce the results expected of Unreal Engine projects.
Using Unreal Engine for HMI applications enables you to quickly catch up to
new automotive startups that have had to invest heavily in their own
proprietary technology.
Android is a secure, reliable mobile operating system that has been optimized
to work on modest hardware. With Android as the base layer of your HMI OS,
you can be sure that the vehicle’s main drive OS and essential features will
remain stable.
HMI is a key element of the automotive data platform that manufacturers must
consider. By integrating HMI experiences with the rest of your automotive data,
you can save time and create efficiencies with easily shareable data across
multiple departments. Unreal Engine’s capabilities as a data platform enable
you to begin work on visual design systems during the concept phase that
remain consistent across the entire workflow.
When every surface in a vehicle can be a screen, the options for customizing
a vehicle are limitless. One potential future application would be things like
user-selectable surfaces and textures; pick carbon fiber for the dashboard,
woodgrain for door panels, or just go for solid red. It could all be changed as
easily as the color of ambient lighting is today.
Every screen turns into an opportunity for entertainment and education. Play a
video game, watch a movie, or access educational material. You could also call
up an AR experience that overlays facts and information about landmarks as
you pass them.
There are a number of HMI workflows in Unreal Engine that you can leverage
right now.
HMI startup processes in Unreal Engine have been optimized to boot extremely
quickly. Content that is not needed at startup can be loaded after the initial
boot, reducing boot time even further.
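The pattern behind fast HMI boot is straightforward: load only what the first frame needs, then stream in everything else. Here is a minimal, engine-agnostic sketch in Python; the asset names are invented, and in Unreal Engine this role is played by asynchronous asset loading.

```python
class DeferredLoader:
    """Load only what the first frame needs; queue the rest for after boot."""

    def __init__(self):
        self.loaded = []    # assets available to the running HMI
        self.deferred = []  # assets queued until after the first frame

    def require_at_boot(self, asset: str):
        # Loaded synchronously before the first frame is drawn.
        self.loaded.append(asset)

    def defer(self, asset: str):
        # Queued; loading happens once the HMI is already on screen.
        self.deferred.append(asset)

    def after_first_frame(self):
        # Drain the queue in order once boot-critical work is done.
        while self.deferred:
            self.loaded.append(self.deferred.pop(0))

hmi = DeferredLoader()
hmi.require_at_boot("speedometer")  # safety-critical: must be up instantly
hmi.defer("navigation_map")         # can arrive a moment later
hmi.defer("media_browser")
print(hmi.loaded)                   # ['speedometer']
hmi.after_first_frame()
print(hmi.loaded)                   # ['speedometer', 'navigation_map', 'media_browser']
```

The design choice is simply to make the boot-critical set explicit, so adding a new feature cannot silently inflate start-up time.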
Best-in-class visuals for production HMI include car paint materials and
reflections that bring the highest-quality real-time graphics to the vehicle.
Designers also have the opportunity to work with various automotive materials
and shaders that translate well to mobile HMI.
The engine’s suite of profiling tools, such as Unreal Insights, helps designers
keep applications running smoothly, delivering high-performance, fluid
user interactions.
CHAPTER 4
Real-time
technology in action
While automotive companies are
increasingly bringing visualization
technology in house, many rely
on external agencies to provide
this expertise. Mackevision and
Burrows are two of the specialists
that firms like Ford and Mercedes-
Benz turn to when they need
to integrate real-time solutions.
These studios provide insights
into why automotive companies
should see real-time technology
as a ripe opportunity to streamline
processes, improve ROI, and provide
better customer experiences.
Burrows
Robin Lowry is Head of Product Visualization at Burrows, a real-time visualization
and CGI studio that has worked with global automotive companies including Ford,
Volvo, and Mazda.
What types of projects do you work on for automotive clients?

We produce some of the world’s first glimpses of products using CGI, from
simple images to major shots in a TV commercial to online and in-dealership
configurators, for some of the most prestigious automotive manufacturers
in the world.

At the beginning, we ingest the manufacturing CAD data, so we have a fully
configurable model to use to produce any content the client needs. Most of our
content is used to launch new vehicles or what is known as pre-launch, and
then we produce content for all markets.

Specifically for automotive clients, we create a range of still imagery and
animations of the car itself in multiple paints, finishes, and variants, and also
accessories and the finer details such as trims, wheels, and optional extras.

Burrows also produces immersive experiences and product configurators,
again visualizing the car in full detail.

There’s no need to take up valuable time and resources jumping in and out
of different software packages—once the data is in Unreal Engine, we can just
stay in the engine and do everything there. The most important thing for me is
seeing the results instantly, so decisions on things like look and style are made
there and then, and can be dealt with straight away. The old way of running
things back through a multi-software pipeline can get fantastic results, but
also takes a lot longer to get to the finish line.

How useful is it to have the ability to extend Unreal Engine’s functionality?

We’ve found it very useful to create and modify our own features when
required. Most of the features we’ve developed are driven by client needs;
for example, increasing the resolution of scene captures to produce 8K HDR
stills before this feature became available as part of UE 4.25, or extending the
functionality of Pixel Streaming to enhance security. Because of Unreal
Engine’s flexibility, we’re able to treat it as a sandbox for ideas to achieve
higher-quality results.
The automotive industry has long been hesitant to change, traditionally
requiring customers to rely on dealerships and salespeople, offline marketing,
and physical viewings to learn about and purchase a car. Online purchase
pathways have increased significantly for automotive consumers, and they are
becoming more accustomed to technology and how to use it. With this shift,
real-time technology can play a big role in a customer’s buying decision, as
well as how they consume information. Given the flexibility and speed of
real-time technology, automotive brands have a lot to look forward to.

“We’re able to use Unreal Engine to get live feedback, which would typically
take hours or days.”
Mackevision
Mackevision (part of Accenture Interactive) is an award-winning studio that
develops, implements, and runs data-driven content-creation solutions. Manuel
Moser is Head of Real-Time Development.
What sort of automotive projects do you work on at Mackevision?

The real-time department at Mackevision started out developing the proof
of concept for a real-time/VR car configurator application. Since then, we’ve
developed several globally deployed car configurator solutions for different
automotive manufacturers. As a service provider of those solutions, we’re
involved from initial consulting and solutioning to final delivery and ongoing
maintenance and support.

We are mainly involved in the later stages of the automotive pipeline, shortly
before the start of communication with customers. But we’ve also created
some applications for visualizing design studies of prototype cars as well.

What are the benefits for Mackevision of working on an open platform such
as a game engine?

The benefits include access to source code to inspect functionality and
implementation details, the community support and broader knowledge base,
and third-party integrations of different kinds, including hardware such as
VR headsets.

How useful is it to have the ability to extend Unreal Engine’s functionality?

Adding functionality to the existing source code is key for developing more
complex solutions. The ability to reuse modules really adds business value,
whether these are Unreal Engine plugins or even modifications to the engine
code itself. Automation tasks at the asset level have become easier with the
Python API for the Unreal Editor, and it’s easy to integrate third-party libraries
or data pipeline tools.
“Real-time ray tracing is a quantum leap for the whole realm of 3D graphics.”

Image courtesy of BMW AG with Mackevision

Why did you choose Unreal Engine as your real-time platform?

A huge benefit of using Unreal Engine is its artist-friendliness compared to
other engines. The Unreal Editor has extensive functionality, and offers the
possibility to extend and expose custom-built (C++) features via the Blueprint
system and integrate proprietary functionality into the editor UI.

What’s important for Mackevision about using tools that are
production-proven?

As a company working with automakers, the production-readiness of tools is
of huge importance because the clients’ expectations and requirements are
very high. The ability to provide globally deployed systems that factor in the
client’s requirements for reliability, security, and data protection comes
into play.

Why do you think the automotive industry should be excited about real-time
technology?

Real-time technology provides high-fidelity visualization of products such as
cars, with the added benefit of interactivity and the opportunity to “experience”
a car—in VR, for example. The impression of a spacious interior is conveyed to
the customer so much better in VR than in a set of plain 2D images.

And it’s not only exciting for product visualization—short-cycle design reviews
and the ability to create real-time simulations are also benefits.

Real-time ray tracing is a quantum leap for the whole realm of 3D graphics.
There are still some technological challenges that need to be solved—such as
the ability to work in 4K resolution with all ray-tracing effects—but the
development of new hardware and technology will surely make this available in
the near future.
CHAPTER 5
Case studies
Forward-thinking automotive
companies are already harnessing
the power of real-time technology
across the automotive production
pipeline. In this section, we’ll take
a look at how real-time workflows
are transforming processes
and driving efficiencies among
some of the biggest vehicle
manufacturers in the world.
BMW
Copyrights: BMW AG
The BMW Group has embraced real-time technology to an extent that few other
major automotive makers have, leveraging Unreal Engine across departments
from design to production planning to sales.
With Unreal Engine’s powerful real-time workflows at their fingertips,
designers and engineers at the BMW Group can now assess vehicle designs
together in minute detail in a virtual environment. This collaboration can take
place from sites around the world, with the participants interacting and
working together as they would in a multiplayer game.

Real-time technology has similarly transformed manufacturing at the
company. Using Unreal Engine, production processes can be tested out in a
virtual assembly hall before being set up in the real world, to ensure they are
safe and efficient.

For BMW’s customers, real-time technology completely changes the car
purchasing experience. Now, they can personally configure photorealistic
vehicles in an Unreal Engine-powered configurator and assess them from any
angle in a range of different virtual environments. They can even sit in the
driver’s seat of the virtual vehicle to try it out before they buy.

The VR and AR capability that real-time technology like Unreal Engine
provides is effecting sweeping change across the BMW Group. In the
dealership, it’s about the customer seeing their future car. In the design
engineering phase, it’s about seeing the car that will be on the street three to
five years later.

When it comes to production planning, VR is a powerful tool for harvesting
the knowledge of employees who work on the factory floor day in and day out.
Show them what their future workspace will look like, and they’ll spot problems
that might otherwise be missed. For example, they might point out that placing
a box in a slightly different spot will make screws easier to reach, shaving
seconds off their work process each time. This not only frees up the
experienced worker, but also ensures that all new workers are getting the
same training.

Similarly, salespeople on the dealership floor now have more tools at their
fingertips when it comes to selling a car. Using the BMW Group’s Unreal
Engine-powered Emotional Virtual Experience (EVE) VR system, vehicles can
be configured, animated, and swapped into different virtual environments—and
then an image or video is emailed to the potential buyer so they can show their
friends and family.

Cars are no longer purely mechanical entities; they are increasingly a hybrid
of software and hardware. In the future, computers will play an even greater
role. And it won’t be possible to design and test these elements purely in a
hardware mock-up—technologies like game engines will be required.

Viewed through this lens, the BMW Group’s wholesale adoption of real-time
technology like Unreal Engine is clearly a strategic initiative. It places them in
an enviable position to conceive, design, and manufacture the vehicles we’ll be
driving tomorrow.

Find out exactly how the different departments at BMW leverage Unreal Engine
to enhance creativity, save time, and improve ROI in this article.
Audi Business Innovation

With multiple brands providing dozens of car lines, an industrialized approach
is required. For the Volkswagen Group, the solution came with an innovation
that began at Audi—the Automotive Visualization Platform (AVP).

Before the AVP, Audi and other brands in the Volkswagen Group had a high
dependency on services and technology supplied by external providers. With
all car models updated twice a year, a more cost-effective way was needed to
convert all the new data and create new assets for all the cars.

At Audi, they began experimenting with a new approach: taking the core
components in-house, and building an internal team of visualization experts
proficient in Unreal Engine. The AVP is the culmination of this drive for
efficiency: a complete, automated digital pipeline—one in which it’s possible to
update vehicles and content at the push of a button.

The technology has become the company’s central platform for the
system-driven, industrial-scale production and provisioning of product
visualizations for all digital sales channels. Its success at Audi has seen the
platform rolled out across the other brands that make up the Volkswagen
Group, too.

The AVP produces high-quality renderings of cars in real time for different
touch points along the customer journey. Officially starting in spring 2017, a
core element of the platform was the development and distribution of 3D retail
systems. Switching to an in-house approach with their own internal team of
game engine developers was a game changer, enabling them to more easily
visualize data within their own walls, rather than constantly sending
confidential product data to external companies and providers.

This new method also allowed for far greater customization of tools and
pipelines, since there was no appropriate solution on the market for a car
manufacturer with huge product complexity and twice-yearly product
lifecycle updates.

To deliver visualization on an industrial scale, you need a reliable, powerful
foundation. Unreal Engine provides that foundation for the AVP. “It offers
best-in-class visualization. It’s highly performant, offers great customization
possibilities, and is open to our ideas,” says Thomas Zuchtriegel, Head of AVP
at Audi Business Innovation.

The AVP has most recently been leveraged by the Volkswagen Group for the
release of the Audi e-tron GT in February and the Audi Q4 e-tron models in
April. Many images and videos of the car have been produced using the
platform, including a realistic visualization of the Audi e-tron GT driving
through the desert outside of Dubai. “We increased our overall content
portfolio extensively in 2020 and we’re very proud to provide our images,
videos, and animations for the upcoming model launches in 2021 and beyond,”
says Benjamin Berger, Head of Content (Digital Content) at Audi
Business Innovation.

The Volkswagen Group has seen significant ROI benefits from developing its
new visualization platform. “From a business value point of view, the AVP
ensures cost savings of about 30%, and the time to market has been reduced
from weeks to hours,” says Lorenz Schweiger, Head of Business Development
and Strategy for the AVP.

What’s more, the adoption of this real-time solution has put them in a
strategically advantageous position in the market. “We can proudly say we are
running the fastest visualization pipeline in the automotive industry, from CAD
to end user,” says Zuchtriegel. “And we have an all-star team of experts to
react to fast-changing customer expectations.”

Read the full article to learn more about the Automotive Visualization Platform.
MHP | Pagani
Chapter 5: Case studies
The journey to customization is where MHP comes in. MHP is a Porsche-owned
international management and IT consultancy that focuses predominantly on
the automotive industry.

The consultancy’s best-in-class Unreal Engine-powered configurator enables
Pagani clients to play with different options and custom features—including a
wide color palette, different finishes, and a range of materials—then visualize
their favorite layouts in real time at an unprecedented level of detail.

Leveraging the NVIDIA RTX platform on Google Cloud, the interactive live 3D
experiences created in this digital showroom can be streamed to mobile and
connected devices via Pixel Streaming.

Once the client has customized their vehicle, Pagani can create a video
showing the car racing in different environments, along with a digital brochure
showcasing their exact vehicle with the different options they’ve chosen.

This highly personalized experience is the perfect match for a brand that
prides itself on delivering unique products to clients.

Stephan Baier is Head of Immersive Experience at MHP. User experience was
at the forefront of his mind when it came to developing the digital showroom
for Pagani. “A dealership configurator should be as easy to use as your
personal Instagram account,” he says.

MHP used Unreal Engine to build a configurator that has usability at its heart.
“This is the core benefit of our solution—it’s really easy to use, even the
installation process,” says Baier. “It’s basically the same as if you download a
game from the Epic Games launcher.”

Of equal importance to Baier was empowering his whole team to help develop
the configurator, whether they were programmers or artists. “Initially, our
thoughts concerning Unreal were that it is really user-friendly for technical
artists,” says Baier. “At that time, we didn’t have many developers on our team.
Using Blueprint, our technical artists were good to go, and with less effort
compared to other solutions.”

The Blueprint visual scripting system enables non-programmers like artists
and designers to harness the power of programming without writing a single
line of code.

Beyond its dealership configurator, MHP is picking up the baton from The Mill
by creating a configurable automotive film for Pagani.

The MHP team is shooting the film at the Imola race track, and will integrate
live footage of the real environment with a digital Pagani hypercar. The project
is funded by an Epic MegaGrant, and Pagani will use the film for the launch of
a new project from its atelier. To push the visuals as far as they can go, the
rendering will accurately reproduce the real lighting from the filmed
environment using real-time ray tracing.

At a glance, the film might look like a regular TV ad, but there is one big
difference. Like its photorealistic configurator, the cars in the film are instantly
customizable. “You can show each car as the film runs,” explains Baier. “This is
not possible with a conventional TV ad, because you’d have to invest a lot of
effort in creating tens of thousands of different variations of the car on
the racetrack.”

MHP believes that soon, all marketing content will be created this way. “We
don’t see product-related content being pre-produced anymore,” says Baier.
“We believe that in the future, all product-related content will be generated on
demand in a personalized way.”

It’s not hard to see brands buying into this vision, taking into account the cost
savings of not having to pre-produce content, and the compelling proposition
of having viewers tailor-make their own ads. “This is only possible with
real-time technology like Unreal Engine,” says Baier.

Read the full story to find out more about how MHP created Pagani’s real-time
dealership configurator and customizable TV ads.
The Mill
The initial brief for the project was for a montage of driving shots of a single
F1 car. Once the team had committed to creating the ads via a fully CG route,
this concept quickly evolved into the narrative of racing drivers Charles
Leclerc and Lewis Hamilton going head-to-head down the straight of the Abu
Dhabi Grand Prix.

Unreal Engine is one of the core components of The Mill’s real-time technology
stack. Creative Director Russell Tickner played a key role in helping develop
the story for the ads, principally using Sequencer to create the edit, lay out
the car and camera animation, and handle versioning and revisions. “The real
benefit of real-time tech is how nimbly you can work without the usual file
creation and exchange between applications,” says Tickner.

Alex Hammond is Head of 3D at The Mill. For him, real-time workflows provide
many advantages compared to traditional methods of creating ads. “Unlike
traditional, single, linear-narrative commercials, we develop 3D assets that
take advantage of a real-time approach for the purpose of generating multiple
versions of content that can be re-used with changes to details within the
designed environment or character,” he explains.

Using real-time technology completely changes the way you “tell the story”
for automotive television advertising. The exploratory nature of real-time
enables The Mill to quickly try animation and editorial choices that were not
previously so fast to conceive.

The team can take previs to a whole new level, showing high-quality visuals
and a much better representation of the final film or content it is creating.
Because of the real-time feedback, they can instantly change camera angles
or lighting conditions. These will then automatically populate the edit, allowing
for faster iterations and ultimately more outcomes to the film.

“Unreal Engine is like a Pandora’s box of tools that are all designed to emulate
real-world and camera effects,” says Hammond. “When you start playing
around in a real-time engine, you quickly see the benefits of it as a platform for
directors and storytellers.”

When it comes to the impact real-time technology is going to have in the
future, Hammond is emphatic. “Simply put, real-time workflows will
revolutionize the automotive industry,” he says. “At the Mill, we have already
used technologies to help car manufacturers visualize their products for rapid
prototyping and high-end commercial films.”

The ability to quickly reskin cars, alter paintwork, and adjust details has all
been made possible by advances in real-time technology. Having the capacity
to show off these quick alterations in automotive conferences and car
showrooms is incredibly appealing, as it becomes a powerful tool for selling
vehicles customized exactly to the consumer’s specifications. “Unreal Engine
really shines—quite literally—when it comes to rendering cars,” says
Hammond. “Seeing super-slick vehicles rendered through a VR headset is
really quite impressive.

“The automotive sector for real-time is very exciting… and who knows, perhaps
you won’t need to visit a Formula One track in order to get up close to the race
action in the near future.”

Read the full story to find out more about The Mill’s configurable
advertising workflows.
Daimler Protics
“
There are massive time-saving and ROI benefits from taking this approach
”

Daimler AG is one such manufacturer. The German car giant’s 3D and digital data arm, Daimler Protics, has developed a multi-user, online environment for engineers, all built on Unreal Engine. It’s used across the company to provide virtual reality walkthroughs and real-time 3D visualization, slashing development time and costs as well as helping to deliver higher-quality products.

Jürgen Riegel, Project Lead and Principal Software Architect, describes this Engineering Hub as a multiplayer online game for engineers. “We needed a solution that would enable an engineer to load CAD data directly into a game session,” he says. “We used a tool developed by NetAllied to inject a CAD render into the Unreal Engine render pipeline. That lets us have a fully Unreal network game and load 3D data directly from the PDM system at runtime—no data prep necessary.”

Daimler had been using UberEngine, a technical 3D CAD renderer created by NetAllied Systems, to give engineers a quick look at large CAD datasets and make inspections of vehicle designs. The plugin developed by NetAllied integrates UberEngine into Unreal Engine, enabling Daimler to open up the engineering visualization process by leveraging the core functionality of a game engine.

Engineers no longer rely solely on 2D screens to communicate and visualize designs—they also have the option to put on a VR headset and interact with the design at full scale in a fully immersive environment.

This innovation has enabled engineers to catch errors and has even improved output. “It’s much easier to judge sizes and assess problems if you see them with your own eyes,” says Riegel. “Also, the ability to do an ad-hoc collaborative session with your data increases productivity.”

Daimler’s design and engineering teams are located all over the world. Having a means for multi-user collaboration in a shared immersive environment, one that’s accessible from anywhere, has been a huge win. It’s no longer necessary to wait for feedback on designs via email or phone calls. Reviewing work, sharing ideas, and fixing issues can now happen on the spot, collaboratively in real time.

This boosts creativity and efficiency—and also has a significant effect on the bottom line. “There are massive time-saving and ROI benefits from taking this approach to visualization,” says Riegel. “We have such a diverse design process around the world. Reducing travel costs and removing barriers to fast problem-solving with 3D data is a big time and cost saver.”

Daimler has leveraged real-time technology to remove the silos and guesswork that can hinder productivity, driving it towards a more agile and collaborative engineering process.

Find out more about this story.
Geely
For the past two years, Geely Design UK has been using Unreal Engine to
showcase the latest designs from the studio in the highest fidelity. Today, roughly
60% of the CGI content the visualization team produces is rendered out of
Unreal Engine.
With designs undergoing continuous iteration, it’s vital that the design and visualization teams keep track of updates and changes. “The design is constantly changing and evolving, so having version control is essential,” says John Bulmer, Visualization Manager at Geely Design UK. “Custom tools in Unreal allow us to control new data, specs, and variants.”

To ensure everyone is on the same page during this back-and-forth process, it’s essential to clearly communicate design intent. To that end, executable files are created in Unreal Engine and sent to the whole design team, giving everyone access to the latest design and specifications. “With this, everyone has a point of reference for the latest design,” says Bulmer.

Because the designs are rendered using high-fidelity real-time ray tracing powered by NVIDIA RTX technology, they are much less likely to be misunderstood or misinterpreted. “With the quality of RTX, there’s no confusion around the materials. We’re able to render materials as design intended,” says Bulmer. “People gain confidence in our visualization and this means fewer feedback loops, which in turn saves time.”

The team can also now create instantly customizable virtual sets to showcase designs. These let them change the time of day or location at the click of a button and understand how these changes impact the materials on the vehicle, both in an interior and exterior setting. “It’s extremely useful and sounds simple, but it’s not been done to this level before,” says Bulmer. “We don’t have to wait a few seconds for the screen to update or to sort out the anti-aliasing—this is at a high frame rate on a single GPU, and it’s smooth.”

While the instantaneous nature of real-time tools naturally enables faster iteration and greater creativity, Unreal Engine extends this broadening of horizons to those looking to do more with the technology itself. “We can change or tweak whatever we want in the source code,” says Bulmer. “This is unheard of in previous automotive software—every OEM has different processes and we need software that we can customize to our needs.”

Bulmer feels now is the time for the automotive industry to wake up to real-time technology. “There are so many parts of the development process that can benefit from real-time tech,” he says. “We no longer need to wait to understand the impact of a design decision or change. It’s just a smoother process all around.”

For his team specifically, adopting Unreal Engine has opened up a world of possibilities. “Unreal goes against the grain compared to previous automotive software. With each new development and update, it keeps pushing the status quo of visualization in the automotive industry,” he says. “It’s the highest-quality real-time software currently out there, it renders faster than any of the competitors, and it’s open source, which is often overlooked.”

This last point is what makes the engine such a good choice for Geely Design UK’s design visualization pipeline. “If we don’t like how something works, we change it,” Bulmer says. “Everyone’s used to using a piece of software as a tool. We see Unreal Engine as a platform we can build on for the future.”
Warwick University
Self-driving vehicles must be road tested for several billion miles before they
can be considered safe. Researchers have different options when it comes to
performing these tests: they can do them in the real world, or they can simulate
them using real-time technology.
Real-world testing achieves the most accurate results—but can be extremely dangerous—while simulation is repeatable, safer, and more cost-effective. One team at the University of Warwick has built a system that combines the best of both approaches.

The Intelligent Vehicles Group is part of WMG, a multi-disciplinary department of the University of Warwick that works with industry to undertake applied research in engineering, management, manufacturing, and technology. It works on the testing and development of autonomous vehicles, including sensors, human factors, and communication.

The group’s 3xD Simulator enables researchers to drive in real vehicles and link them up to a virtual environment, which is displayed on a full 360-degree screen. “We can drive any vehicle inside, connect it to the system, and do road testing and development,” explains Juan Pablo Espineira, Project Engineer at WMG.

If you want to test out an autonomous emergency system, for example, you can put a person inside a physical car that they operate as if driving. The car thinks it’s in the real world, and as an emergency scenario or accident occurs and the brakes are applied, you can test the reactions of both the human driver and the vehicle.

Unreal Engine’s Blueprint visual scripting system enables more team members to take a hands-on role in the project. “A lot of the time we have researchers who are very good at their specific field—like sensors or electronics or mechanics—but who have limited experience with coding and visual systems,” says Espineira. Blueprint solves this by enabling non-programmers to script in a more visual way, by connecting nodes rather than writing lines of code.

To build the simulator, Espineira started with Unreal Engine’s vehicle template and created a road network and environment around it. He used nDisplay to project the visualization onto the 360-degree screen via eight projectors.

He used the ObjectDeliverer plugin to flow data between Unreal Engine and the car, and CAN bus hardware to connect to the vehicle’s physical components. Finally, he used Blueprint to create sensor models in Unreal Engine that could interpret data from the car’s non-camera sensors, such as LiDAR and radar.

Read the full story on how WMG built its state-of-the-art real-time simulator.
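WMG’s Blueprint sensor models themselves aren’t published, but the general shape of a simplified simulated LiDAR beam is easy to sketch. In the illustrative Python below (all names and noise parameters are our assumptions, not WMG’s implementation), a ground-truth distance from a ray cast is clipped to the sensor’s maximum range, perturbed with Gaussian noise, and occasionally dropped before being handed to the autonomy software:

```python
import random

def simulate_lidar_return(true_distance_m, max_range_m=120.0,
                          noise_std_m=0.02, dropout_prob=0.01,
                          rng=None):
    """Return one simulated LiDAR range reading, or None for no hit.

    true_distance_m would come from a ray cast against the virtual
    environment; the rest models typical sensor imperfections.
    """
    rng = rng or random.Random()
    # Beams beyond the sensor's range, or randomly dropped, return no hit.
    if true_distance_m > max_range_m or rng.random() < dropout_prob:
        return None
    # Zero-mean Gaussian noise models measurement error.
    return max(0.0, true_distance_m + rng.gauss(0.0, noise_std_m))
```

The same pattern—ground truth in, imperfect measurement out—extends to radar returns or any other non-camera sensor.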
Scania | GEISTT AB
The firm has been working with the interaction design team at heavy-vehicle manufacturer Scania since 2013, providing R&D services, concept development, and simulation-based testing. Scania uses these simulations to analyze the effects of human-machine interface (HMI) concepts for everything from new UI features to procedures for interactions with autonomous vehicles.

Prior to Scania’s use of modern real-time technologies, a typical development project would consist of a few early user tests with wireframes on paper, perhaps an interactive prototype on a computer, and then, at the end, a study conducted in a resource-consuming purpose-built simulator for concept validation.

This meant that a novel concept was tested only once in a simulator environment, and the designers could not test their concept again after receiving feedback from the simulator study. Now, using Unreal Engine, the team can test concepts in immersive simulations earlier and more often in the development process. The engine’s strong connection to C++ and its open-source nature enables the team to easily adapt the technology for specific use cases.

The Scania interaction design team evaluates HMI concepts in areas such as driver interaction and human-automation interaction. This might include testing prototypes of advanced driver-assistance systems (ADAS); evaluating advanced HMI features for specific user groups such as timber drivers; exploring the relationship between the AI vehicle and a driver inside it; and researching how to remotely control automated vehicles.

Jon Friström, UX Researcher & Cognitive Design Engineer at GEISTT, consulting as Product Owner for Scania’s HMI simulation team, notes that research projects like these are steadily on the rise. “We see a general increase in the number of projects that research and test the relationship between humans and highly automated systems in various industries,” he says. “Due to the increasing complexity of the systems that interact in different scenarios, often with higher and higher degrees of artificial intelligence, we expect to see an increased use of human-in-the-loop real-time simulations in the future.”

Working on an open platform such as a game engine has considerable advantages for those testing different HMI concepts. “The biggest benefit for us has been the ability to use Unreal Engine as a flexible platform that we can integrate into several different closed systems,” explains Friström.

His team’s simulator makes use of Unreal Engine’s built-in vehicle models combined with third-party software such as TruckSim by Mechanical Simulation and Wwise by Audiokinetic. “The flexibility to shape the real-time engine to accommodate and use separate modules is a great asset of Unreal Engine,” says Friström.

The team also leverages nDisplay—the technology that renders Unreal Engine scenes on multiple synchronized display devices—to offload work to many computer nodes in a network instead of relying on only one computer. “By doing so, we can render the environment to an infinite number of projection screens,” explains Friström.

Friström and his team have fully embraced the switch to Unreal Engine for their HMI concept testing. “Compared to the simulator platform we used before,” he says, “Unreal Engine has a broader, more supportive community.”

As for the future, Friström predicts simulations will play an increasingly important role in optimizing user experience for vehicles. “What we can see is that all automotive companies are using different forms of UX simulations to an increasing degree, and this trend is clear in other industries as well,” he says.

Using Unreal Engine as a platform for concepting and testing, GEISTT and Scania are well-placed to meet the HMI challenges of the future.

Find out more about GEISTT AB’s workflow in the full story.
Toyota
Fast-forward 50-plus years, and not a whole lot has changed. Many automotive ergonomics studies today are still carried out on physical mock-ups of vehicles. These are costly and time-consuming to build as well as inefficient—the design has often changed by the time the physical mock-up is available.

At Toyota, however, one innovative team is using virtual ergonomics technology rather than physical prototypes. By testing the reactions of real people in a virtual environment, they can simulate human interaction with a vehicle far more realistically. What’s more, the team can validate designs faster and at a far lower cost while capitalizing on the open nature of Unreal Engine to connect industry-leading software and technology.

Mikiya Matsumoto is the general manager of the Prototype Division, Digital Engineering Department at Toyota. His team leverages real-time technology to assess the user-friendliness of vehicle designs.

Testing starts with the import of a 3D vehicle model into a virtual environment. A person sits in a real car seat wearing a VR headset and experiences different simulated scenarios designed to test the usability of the vehicle. Scenarios involve common driving tasks like appraising the visibility of other road users out of the rear quarter window of a car. The team also employs the setup to perform accessibility checks, using tracking gloves to evaluate how easy it is to reach various buttons and controls.

One critical aspect of the system is the open nature of the technology that powers it. Hardware and plugins that can be integrated into the setup include HTC Vive headsets, CarSim for vehicle dynamics, Leap Motion controllers for hand tracking, and a combination of physical prototype parts and VR simulation.

Many software providers connect their tools and systems to Unreal Engine via plugins—like the Mechanical Simulation CarSim plugin the team uses. That means Matsumoto’s team doesn’t have to jump through hoops to work with industry-leading third-party tools.

The team imports car model data into the engine via Datasmith, enabling the team to go straight from CAD to Unreal Engine in a couple of clicks without using any third-party software in between.

Game engines make it easy to create complex scenarios that include virtual vehicles and human characters. The Toyota team makes use of the Blueprint visual scripting system in Unreal Engine to create these virtual scenarios for each test.

Similarly, while other ergonomics tools used in the automotive industry generally do not offer VR functionality, VR is ingrained in game engine DNA. Instead of validating ergonomic tasks from a third-person perspective, users can perform the task themselves in the immersive realism of a VR experience generated by a game engine.

The workflow developed by Matsumoto’s team saves time and money over traditional methods of ergonomic assessment, and provides a more flexible development path. “Real-time technology allows us to perform virtual user experience testing,” explains Matsumoto. “This reduces the cost of and time taken for proof-of-concepting, leading to a more agile way of development.”

Read the full story to find out more about Toyota’s human factors engineering.
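At its core, an accessibility check like the one described above compares a tracked hand position against a control’s position each frame. A minimal Python sketch (the function name and the 3 cm tolerance are illustrative assumptions, not Toyota’s implementation):

```python
import math

def within_reach(hand_xyz, control_xyz, tolerance_m=0.03):
    """True if a tracked hand is close enough to a virtual control.

    hand_xyz would come from the tracking gloves each frame;
    control_xyz is the button's position on the virtual dashboard.
    All coordinates are meters in the same reference frame.
    """
    return math.dist(hand_xyz, control_xyz) <= tolerance_m
```

In an engine, the same comparison would typically run per tick, logging which controls each test subject could reach from the seated position.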
Ike
“
Game engines are tools for building living, breathing worlds.
”
The key challenge is proving that safety aspect. In order to prove with statistical confidence that a truck can deal with myriad rare and unexpected events, it would have to be driven tens of millions of miles.

Rather than spend all that time on the road, the team elected to use simulation as their primary validation tool. The team uses two types of simulation for autonomous driving: log simulation, which involves feeding data from real driving into the automation system, and virtual simulation, which uses fabricated scenarios and responsive Actors (objects), like a video game.

To develop its virtual simulation tool, the team turned to Unreal Engine, spending a year extending the many relevant out-of-the-box features for its specific needs. As a first step in that process, Ike customized the Unreal Engine Level Editor to be its scenario design tool.

Each of Ike’s trucks calculates its position in the world using high-definition maps consisting of LiDAR intensity and elevation data, which is collected and processed into lane centers and boundaries. That same map data is streamed into Unreal Engine using the Landscape API so the team can design and run their scenarios on it.

The automation system requires higher-resolution map data than is easily found in open-source data formats; to capture the necessary data, Ike uses a special mapping vehicle fitted out with two LiDAR scanners and physically drives it down the highway. This makes the company completely self-sufficient, giving it the power to simulate anywhere it can drive its mapping vehicle.

Once the maps are imported, most of the building blocks for scenario design are available out of the box: triggers based on time or distance, splines for scene objects to follow, an efficient environmental query system, a fully featured and customizable GUI, and a scripting language for designing arbitrarily complex choreographies.

“Game engines are tools for building living, breathing worlds—levels in which the player makes decisions and the world reacts realistically,” says Simulation Lead Pete Melick. “That’s all a scenario is, except the player is a robot.”

Ike also uses the Unreal Engine AI Perception system to add intelligent behaviors to its simulated agents. To enable designers to extend the range of scenarios, the team exposes functionality to Blueprint, Unreal Engine’s visual scripting system.

Using Blueprint, designers create new behaviors and choreography. For example, they can make another car in the environment weave left and right about its lane with a parameterized period and amplitude. Or they can add simulated noise to the detections fed to the autonomy software to test its sensitivity to imperfect inputs.

Blueprint is also the key to Ike’s variations system. A designer adds a variable parameter by adding a Component to an Actor, such as another car, and implementing a Blueprint function that defines the effect of varying that parameter.

This workflow enables the team to tweak scenarios in an infinite number of ways, including the position, orientation, speed, or size of objects—and even behavioral elements like how aggressively vehicles drive. “If it can be expressed in Blueprint, it can be varied,” says Melick.

While we may still be some time away from seeing driverless trucks safely navigating our highways, every scenario that Ike creates in Unreal Engine brings that goal one step closer.

Read the full article to find out more about Ike’s virtual simulation tool.
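Ike’s actual Blueprint graphs aren’t public, but the weaving behavior described above reduces to a small amount of math. In this illustrative Python sketch (the function names and the parameter sweep are hypothetical, not Ike’s API), a sinusoidal lateral offset exposes exactly the two knobs a scenario designer would vary—amplitude and period—and the same idea extends to any scalar a variations system can sweep:

```python
import math

def weave_offset(t_s, amplitude_m, period_s, phase=0.0):
    """Lateral offset from the lane center for a weaving vehicle.

    A sinusoid parameterized by amplitude (meters) and period
    (seconds)—the two knobs exposed to the scenario designer.
    """
    return amplitude_m * math.sin(2.0 * math.pi * t_s / period_s + phase)

def vary(base_value, delta):
    """Trivial variation hook: offset any scalar scenario parameter."""
    return base_value + delta

# Sweep one scenario across several amplitudes to stress the planner.
variants = [vary(0.5, d) for d in (-0.25, 0.0, 0.25)]
```

Evaluating `weave_offset` each simulation tick and feeding the result into the agent’s steering target produces the back-and-forth motion; sweeping `variants` re-runs the scenario under each amplitude.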
CARLA
The autonomous vehicles industry is already big business, with billions of dollars
invested to date. The notable lack of fully self-driving cars on our roads today,
however, illustrates that far more research and validation are needed before the
technology is deemed wholly safe.
Enter CARLA, a free, open-source simulator powered by Unreal Engine, designed to support the development, training, and validation of autonomous driving systems.

The desire to freely distribute CARLA was a key factor in the development team’s choice of Unreal Engine, which is also free and comes with full source-code access. “We really wanted to share it with people,” says CARLA Team Lead Germán Ros. “We wanted people to be able to modify everything. So the idea was to have something that was completely open, and have access to all the source code of the engine and the source code of our platform. We believe that having something that is totally open is helpful for the community, since it enables them to adapt it for different use cases and scenarios.”

Ros claims that virtually every university that is currently researching autonomous driving is using CARLA in some way, as are virtually all corporations large enough to have an R&D unit. From the beginning of the simulator’s development, the team understood the importance of the open-source model in helping it democratize autonomous vehicle travel.

“Having the progress of autonomous driving be dependent on just the huge corporations with big pockets is not good enough,” says Ros. “We also need academics and small companies to participate in this if we really want to expedite and accelerate autonomous driving. We want autonomous driving—we think it’s important, it’s going to save lives, it’s going to make our lives better—so why don’t we try to make it happen as soon as possible? For that, we need the collaboration of the community.”

Ros also sees an open simulator as a common language for the industry. “Okay, you can have your own internal simulator, you can have very sophisticated tools, and that’s fine,” says Ros. “At some point you’re going to need to share with your competitors, you’re going to need to share results with legislators. Why don’t you use an open-source tool for that—one that speaks all the standards that need to be spoken and allows you to send your data to the community in a transparent way so that they can understand its current status?

“That’s why we decided to deliver CARLA in an open-source format where you can take everything from the assets, to the code, to everything, and do anything you want with it, including commercial use. Whatever you want, you’re free to take CARLA and do the best you can with it.”

The CARLA team strives to offer the best possible simulator to anyone who wishes to use it, modify it, or build on top of it. It’s become a powerful tool for those in the autonomous driving simulation community, helping prepare the way for fully autonomous vehicles.

Read the full article to find out more about how CARLA democratizes autonomous vehicle R&D.
CarSim
Mechanical Simulation has been making software for accurate ground vehicle
simulation since 1996. Their CarSim, TruckSim, and BikeSim product portfolio
includes modules for different types of passenger and commercial vehicles.
These products are focused on aggregating everything about a vehicle, its environment, and its motion to visualize or predict its behavior in a wide range of driving conditions.

CarSim, TruckSim, and BikeSim use vehicle data that describes suspension behavior, powertrain properties, active controller behaviors, tire properties, and also road slope, obstacles, weather conditions, and asphalt type. At the core of the software is a simulation solver that can predict how the vehicle will react—for example, whether it will tip or skid under specific conditions, or whether it will brake quickly enough on a wet surface. The software also produces a visual representation of the vehicle’s motion from the solved data.

On the flip side, the software can also import real-world vehicle and map data and analyze it for speed, response time, and other aspects of the driving experience. While this application has obvious uses for accident reconstruction and training simulators, a new use has emerged in recent years—data-gathering and machine learning for autonomous vehicles.

“As autonomous driving came on and people wanted to incorporate physics-based sensors, we started presenting our technology as a general-purpose vehicle simulation tool for vehicle dynamics and autonomous driving engineers,” says Robert McGinnis, Senior Account Manager at Mechanical Simulation.

At the same time, for the software’s visual representations, Mechanical Simulation was aware that it needed to keep pace with advances in computer graphics. The company found that, to gain more options for visualization, many customers were starting to port the CarSim and TruckSim solvers’ results to Unreal Engine, along with their own car models and environments. Unreal Engine’s readily available source code and C++ support, coupled with its Blueprint visual scripting system, made it an attractive choice for processing the volume of data that driving tests generate.

That’s when Mechanical Simulation decided to integrate more of its product into Unreal Engine. “It was pretty obvious that we could get the information from the road and the sensors and interface tools like MATLAB/Simulink, and let people integrate their own active controllers,” says McGinnis.

This gave Mechanical Simulation a clear path to upgrading its offerings using Unreal Engine, leaving the company more room to focus on its core technology: the solver inside its products. “Early on, our software did not have a good way to build complex scenes for visualization,” says McGinnis. “One approach we took was to add an Unreal Marketplace plugin that allows a CarSim vehicle solver to be loaded into the Unreal Editor. It allows people to create scenes and scenarios using that tool all by themselves.”

The VehicleSim Dynamics plugin gives CarSim and TruckSim users a powerful tool for generating visual representations with all the advantages Unreal Engine has to offer, such as physically based rendering (PBR) materials, realistic lighting, landscape and foliage packs, and cityscape items. The plugin works by converting the solver data to Blueprints, which can then be easily queried to produce data about both the terrain and the vehicle.

Mechanical Simulation sees the simplicity of the Unreal Engine plugin as a huge plus for their customers. “They don’t want to be running $200,000 worth of software on a single machine that requires another engineer just to help the prime engineer get his job done,” says Jeremy M. Miller, Lead Developer.

The plugin has also proven to be useful for training, testing, and previsualization of newly designed vehicles. The team is constantly looking to improve it to better serve their customers, for example recently adding an FBX converter to bring in physical terrain models that will work with the plugin.

Read the full article to find out more about how automotive companies are leveraging CarSim to perform autonomous vehicle testing.
CHAPTER 6
The future
We hope you’ve found this guide useful. By now, you should have a good
grasp of the different opportunities real-time technology provides across the
automotive pipeline.
But more than this, we hope you can see why the future of the automotive
pipeline lies in an open-platform approach.
This is where Unreal Engine can play a key role in the automotive industry:
as the standard data simulation and visualization platform across digital
transformation, extended reality, and digital collaboration.
Glossary
Advanced driver-assistance systems (ADAS)
Electronic systems in vehicles that assist drivers when driving and parking.

Augmented reality (AR)
A technology that integrates CG elements into a physical environment.

Autonomous vehicle
A vehicle that is capable of sensing its environment and moving safely with little or no human input.

Autonomy levels
A measure of the extent to which a self-driving vehicle can be said to be truly autonomous. Levels of driving automation range from 0 (fully manual) to 5 (fully autonomous).

Building Information Modeling (BIM)
An intelligent 3D model-based process used extensively in the architecture, engineering, and construction (AEC) industry.

Blueprint
A script created from the Blueprint visual scripting language in Unreal Engine which defines how an asset interacts.

C++
A popular general-purpose programming language used to create computer programs. One of the methods of programming in Unreal Engine, the others being Blueprint visual scripting and Python scripting.

Computer-aided design (CAD)
The use of computers to create, modify, analyze, or optimize a design. Used by designers and engineers to create 2D and 3D models of physical components.

Cave Automatic Virtual Environment (CAVE)
An immersive virtual reality environment where imagery is projected onto three to six of the walls of a room-sized cube.

Collab Viewer
An Unreal Engine project template that joins multiple people together in a shared experience of the same 3D content.

Configurator
A program that enables the user to personalize a virtual model of a car, swapping colors, materials, trims, and custom features instantly. Configurators are found both on car maker websites and on the dealership floor.

Datasmith
A collection of tools and plugins that help you bring content into Unreal Engine, significantly reducing data import times.

Decimation
The process of reducing the polygon/face count of a 3D model for better performance.

Digital twin
A mathematically perfect representation of a physical object and all its variants in a digital space. Digital twins are used in the automotive sector for creating the virtual model of a connected vehicle. They can capture the behavioral and operational data of the vehicle and analyze the overall vehicle performance.

Extended reality (XR)
An umbrella term for VR, AR, and MR, and all future realities such technology might bring.

FBX
A file format (.fbx) used to provide interoperability between digital content creation applications.

Final pixels
Images of high enough quality to be the final output for film or TV.

Game engine
A software development environment designed for the creation of real-time interactive content, initially intended for video games but now used in many other applications.

Graphics processing unit (GPU)
A specialized type of microprocessor optimized to display graphics and do very specific computational tasks. Modern real-time engines rely heavily on GPUs for performance.

Hardware-in-the-loop (HIL)
A type of real-time simulation used in the development and testing of complex real-time embedded systems.

Head-mounted display (HMD)
A device used to display CG content for VR, AR, or MR.

High dynamic range (HDR)
Reproduction of a greater dynamic range of luminosity than is possible with standard digital imaging techniques. HDR images retain detail in a fuller range of lights and darks than standard images.

HoloLens Viewer
An adaptation of the Collab Viewer Template that works on the Microsoft HoloLens 2. You can use it to see your 3D content overlaid on your actual surroundings.
Additional resources
Here are some additional resources about the use of real-time technology
across the automotive industry.
The Pulse
© 2020 Epic Games/All Rights Reserved.