Introduction to the Universal Render Pipeline for Advanced Unity Creators 2024
Contents

Introduction
Lighting in URP
    Lighting overview
        Light Inspector
    Shadows
        Shadow Cascades
    Light Modes
    Rendering Layers
    Light Probes
    Reflection Probes
        Box Projection
    Lens Flare
    Light Halos
Decals
Shaders
    Custom shaders
        Includes
        Preprocessor macros
    Render Objects
    Renderer Feature
Post-processing
Camera Stacking
    Coding a screengrab
Performance
    Scalability
Conclusion
Introduction
The limitations of the Built-in Render Pipeline have become apparent as the
number of platforms supported by Unity continues to grow. Each additional
platform and graphics API adds complexity to modifying and maintaining the
Built-in Render Pipeline architecture.
In 2018, Unity released two new Scriptable Render Pipelines (SRPs): the
High Definition Render Pipeline (HDRP) and URP. These SRPs enable you to
customize the culling of objects, their drawing, and the post-processing of
the frame without having to use low-level programming languages like C++.
You can also create your own fully customized SRP.
Nik Lever, the author of this e-book, has been creating real-time 3D content
since the mid-nineties and using Unity since 2006. For over 30 years he’s led
the small development company, Catalyst Pictures, and has provided courses
since 2018 with the aim of helping game developers expand their knowledge
in a rapidly evolving industry.
Felipe Lira is a senior manager for graphics and URP at Unity. With over 13 years
of experience as a software engineer in the games industry, he specializes
in graphics programming and multiplatform game development.
Ali Mohebali is the graphics product management lead for Unity Runtime and
Editor. Ali has 20 years of experience working in the games industry, and has
contributed to hit titles such as Fruit Ninja and Jetpack Joyride, both by Halfbrick
Studios.
Important contributions were also made by Unity URP engineering and sample
project teams.
One of Unity’s biggest strengths is its platform reach. The ideal for all game
studios is to create once and efficiently deploy their game to their desired range
of platforms, from high-end PCs to low-end mobile.
The Built-in Render Pipeline was developed to be a turnkey solution for all
platforms supported by Unity. It supports a mix of graphics features and is
convenient to use with Forward and Deferred pipelines. However, its
architecture has limitations:
— It exposes callbacks in the rendering code that trigger sync points in the
pipeline. These callbacks let you inject state changes dynamically at any
point in the frame from C#, but they prevent multithreaded rendering
optimizations.
— Caching data to manage persistent state for user injection is difficult.
The image below illustrates how SRPs work. Use C# to control and customize
render passes and rendering control, as well as HLSL shaders that can be
created using artist-friendly tools such as Shader Graph. You also gain
access to lower-level API and engine-layer abstractions.
The new graphics programmable model for the Scriptable Render Pipelines
An advanced user can create a new SRP from scratch or modify the HDRP
or URP. The graphics stack is open source and available for use on GitHub.
— Compatible with the latest tools: URP supports the latest artist-friendly
tools, such as Shader Graph, VFX Graph, and the Rendering Debugger.
Most Unity projects are now being built on URP or HDRP; however, the Built-in
Render Pipeline will remain an available option in Unity 2022 LTS and Unity 6.
Follow this link for a comprehensive comparison of the Built-in Render Pipeline
and URP capabilities.
This section covers the steps for starting a new project with URP or converting
an existing project to URP.
Open a new project using URP via the Unity Hub. Click on New and verify that
the Unity version selected at the top of the window is 2022.2 or newer. Choose
a name and location for the project, select the 3D (URP) template or 3D Sample
Scene (URP), and click Create.
Creating a new project with the URP template, which might require you to download the template for the first time
One of the four environments included in the URP 3D Sample, available in the Unity Hub
You can create new scenes via File > New Scene, with essential GameObjects
such as Camera and Directional light, and even create your own scene
template with prepopulated objects. Read more in the URP Scene Templates
documentation.
Go to Edit > Project Settings and open the Graphics panel. To use URP in-
Editor, you must select a URP Asset from the Scriptable Render Pipeline
Settings. The URP Asset controls the global rendering and Quality settings
of a project and creates the rendering pipeline instance. Meanwhile, the
rendering pipeline instance contains intermediate resources and the render
pipeline implementation.
UniversalRP-HighFidelity is the default URP Asset selected, but you can switch
to UniversalRP-Balanced or UniversalRP-Performant.
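If you need to confirm from a script which URP Asset is currently active (taking any Quality-level override into account), a minimal sketch like the following can help; the class name is illustrative:

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

public class ActivePipelineLogger : MonoBehaviour
{
    void Start()
    {
        // currentRenderPipeline returns the asset in use, preferring any
        // Quality-level override over the Graphics panel default.
        var urpAsset = GraphicsSettings.currentRenderPipeline as UniversalRenderPipelineAsset;

        if (urpAsset != null)
            Debug.Log($"URP is active, using asset: {urpAsset.name}");
        else
            Debug.Log("URP is not active (Built-in Render Pipeline or another SRP).");
    }
}
```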
© 2024 Unity Technologies 13 of 140 | unity.com
The Graphics panel in Project Settings
A later section of this guide details how to adjust the settings of a URP Asset.
Important: Be sure to backup your project using source control before following
the steps in this section. This process will convert assets, and Unity does not
provide an undo option. If you use source control, you will be able to revert to
previous versions of the assets if necessary.
If you upgrade an existing Built-in Render Pipeline project, you’ll need to add
the URP package to your project as it’s not included in Unity 2022.2 or 2022 LTS.
To create a URP Asset, right-click in the Project window and choose Create >
Rendering > URP Asset (with Universal Renderer). Name the asset.
Remember: If you create a new project using the Universal Render Pipeline
or 3D (URP) templates, these URP Assets and the URP package are already
available in the project.
Rather than creating a single file, URP creates two files, each with the .asset
extension.
Two Assets in URP, one for URP settings and the other for Renderer Data
The Inspector for the UniversalRP_Renderer Data Asset
The other URP Asset serves to control settings for Quality, Lighting, Shadows, and
Post-processing. You can use different URP Assets to control the Quality settings,
a process outlined further down in this section. This Settings Asset is linked to the
Renderer Data Asset via the Renderer List. When you create a new URP Asset, the
Settings Asset will have a Renderer List containing a single item – the Renderer
Data Asset created at the same time, set as the default. You can add alternative
Renderer Data Assets to this list.
The default renderer is used for all Cameras, including the Scene view.
A Camera can override the default renderer by selecting another one from
the Renderer List. This can be done through the use of a script, as needed.
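The scripted approach can be sketched as follows; the renderer index refers to the position of a Renderer Data Asset in the URP Asset's Renderer List, and index 1 here is an assumption for an alternative renderer you have added:

```csharp
using UnityEngine;
using UnityEngine.Rendering.Universal;

public class RendererSwitcher : MonoBehaviour
{
    // Index into the URP Asset's Renderer List (0 is the default renderer).
    [SerializeField] int rendererIndex = 1;

    void Start()
    {
        // GetUniversalAdditionalCameraData is an extension method that returns
        // the URP-specific settings attached to this Camera.
        var cameraData = GetComponent<Camera>().GetUniversalAdditionalCameraData();
        cameraData.SetRenderer(rendererIndex);
    }
}
```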
Even after you follow these steps to create a URP Asset, an open scene in the
Scene or Game view will still use the Built-in Render Pipeline. You must
complete one last step to make the switch to URP: Go to Edit > Project
Settings and open the Graphics panel. Click the small dot next to None
(Render Pipeline Asset). In the open panel, select UniversalRP.
A warning message will pop up regarding the switch, but just press Continue.
As there is no content in your project yet, changing the render pipeline will be
almost instantaneous. You’re now ready to use URP.
After you complete the above steps, you’ll find that your beautiful scenes are
suddenly colored magenta. This is because the shaders used by the materials
in a Built-in Render Pipeline project are not supported in URP. Fortunately,
there is a method for restoring your scenes to their original quality.
Materials in a scene appear magenta because their Built-in Render Pipeline-based shaders must be converted for use in URP.
Go to Window > Rendering > Render Pipeline Converter. Choose Convert Built-
In to 2D (URP) for a 2D project, or Built-In to URP for a 3D project. Assuming that
your project is 3D, you’ll need to select the appropriate converters:
— Material Upgrade: Use this to convert materials from the Built-in Render
Pipeline to URP.
— Animation Clip Converter: This converts animation clips. It runs once the
Material Upgrade converter finishes.
Custom shaders are not converted using the Material Upgrade converter.
The Shaders and New tools sections outline the steps for converting custom
Built-in Render Pipeline shaders to URP. Using Shader Graph is often the
quickest way to update a custom shader to URP.
Refer to this table in our URP documentation to see how each URP shader maps
to its Built-in Render Pipeline equivalent.
Once you select one or more of the above converters, either click Initialize
Converters or Initialize And Convert. Whichever option you choose, the project
will be scanned and those assets that need converting will be added to each
of the converter panels. If you choose Initialize Converters you can limit the
conversions by deselecting the items using the checkbox provided for each one.
At this stage, click Convert Assets to start the conversion process. If you choose
Initialize And Convert, the conversion starts automatically after the converters
are initialized. Once it’s complete you might be asked to reopen the scene that is
active in the Editor.
There are several default Quality options available in the Built-in Render Pipeline,
from Very low to Ultra. The Quality settings impact the fidelity of the scene,
including Texture resolution, lighting, shadow rendering, and so on.
Go to Edit > Project Settings and select the Quality panel. Here, you can switch
between these Quality options by picking the current quality. This will change
the render settings used by the Scene and Game views. You can also edit each
of the Quality options from this panel.
If you select the Rendering Settings option while using the Render Pipeline
Converter and switching from the Built-in Render Pipeline to URP, a set of URP
Assets that attempt to match the Built-in Render Pipeline Quality options will
be created. The first table below shows how the Built-in Render Pipeline maps
to URP for Low settings, while the second table displays a comparison for High
settings. In both the Built-in Render Pipeline and URP, settings are chosen via
the Quality panel. The URP Asset settings are available via the Inspector when
selecting a URP Asset. Refer to the URP documentation for more details.
* In URP, Pixel Light Count is handled using Additional Lights > Per Pixel >
Per Object Limit.
Quality settings were previously handled in the Quality panel of the Project
Settings dialog box. When using URP, settings are divided between the Quality
panel and those for each URP Asset. The following table shows where each
setting can be found.
Setting                         Quality panel    URP Asset
VSync Count                     √
Depth Texture                                    √
Opaque Texture                                   √
Opaque Downsampling                              √
Terrain Holes                                    √
HDR                                              √
Textures
  Texture Quality               √
  Anisotropic Textures          √
  Texture Streaming             √
Particles
  Particle Raycast Budget       √
Shadows
  Shadowmask Mode               √
  Shadow Resolution                              √
  Shadow Distance                                √
  Shadow Cascades                                √
  Cascade splits                                 √
  Working unit                                   √
  Depth Bias                                     √
  Normal Bias                                    √
  Soft Shadows                                   √
Async Asset Upload
  Time Slice                    √
  Buffer Size                   √
  Persistent Buffer             √
Level of Detail
  LOD Bias                      √
  Maximum LOD level             √
Meshes
  Skin Weights                  √
Lighting
  Main Light                                     √
    • Cast Shadows                               √
    • Shadow Resolution                          √
  Additional Lights                              √
    • Per Object Limit                           √
    • Cast Shadows                               √
    • Shadow Atlas Resolution                    √
    • Shadow Resolution tiers                    √
    • Cookie Atlas Resolution                    √
    • Cookie Atlas Format                        √
  Reflection Probes                              √
    • Probe Blending                             √
    • Box Projection                             √
If you switch between Quality options, choose a Quality Level for the Render
Pipeline Asset in the Quality panel via Project Settings. Note that if the Quality
Level is not set, the Render Pipeline Asset will default to the one set as the
Scriptable Render Pipeline Asset in the Graphics panel. This can cause some
confusion as you attempt to adjust the Quality settings of a URP Asset.
For instance, you might mistakenly assume that the URP Asset you are
inspecting is the one currently used by the Scene and Game views.
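To switch Quality Levels from code and check which URP Asset a given level uses, something along these lines works as a sketch; note that `QualitySettings.renderPipeline` returns null for levels that fall back to the Graphics panel asset:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

public class QualitySwitcher : MonoBehaviour
{
    public void SetLevel(int level)
    {
        // The second argument applies expensive changes immediately.
        QualitySettings.SetQualityLevel(level, true);

        // renderPipeline is the per-level override; when it is null, Unity
        // falls back to the Scriptable Render Pipeline Settings asset
        // assigned in the Graphics panel.
        var perLevelAsset = QualitySettings.renderPipeline;
        Debug.Log(perLevelAsset != null
            ? $"Quality level {level} uses {perLevelAsset.name}"
            : $"Quality level {level} falls back to the Graphics panel asset");
    }
}
```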
The Quality panel for a URP Asset allows you to set the HDR format to 64-bit
for better fidelity. However, be aware that this results in a performance hit and
requires additional memory, so avoid this setting on low-end hardware.
Another feature of the Quality panel is the option to enable LOD Cross Fade.
LOD is a technique to reduce the GPU cost needed to render distant meshes.
As the Camera moves, different LODs will be swapped to provide the right level
of quality. LOD Cross Fade allows for smoother transitions of different LOD
geometries and avoids the harsh snapping and popping that occurs during a
swap.
The Unity demo project Viking Village URP shows off URP capabilities with
Light Probes, Reflection Probes, water special effects that use a custom
ScriptableRenderPass, shaders converted via Shader Graph, and URP
post-processing. The project is available for free on the Unity Asset Store.
Open Viking Village URP in the Editor to follow along with the steps in this
section. Start by clicking Add to My Assets to add this demo to the Packages
list available in-Editor.
Then create a new 3D project from the Unity Hub (you don’t need to use the
URP template). Go to Window > Package Manager, select My Assets > Viking
Village URP from the Packages drop-down, and click Import.
A couple of warning messages will appear (see below). The first one warns
you that importing a complete project will affect your current Project Settings,
but since you’ve created an empty project it’s safe to proceed. The second
warning alerts you about installing or upgrading certain packages. Click the
default blue button. This step is required to avoid an incorrect lighting setup,
as URP defaults to a linear color space, unlike the Built-in Render Pipeline,
which defaults to a gamma color space.
Once the download is complete, the panel shown below will open.
Make sure to leave everything selected and click Import.
Wait for all the assets to finish importing, then go to the demo located in Viking
Village > Scenes > The_Viking_Village. Click Window > Package Manager, and
in the drop-down select Unity Registry, followed by Universal RP. Update the
URP package to 14.x.
The URP Asset set in the Graphics panel, via the Scriptable Render Pipeline
Asset, is named Viking Village > Rendering > VikingVillageUniversal. It is
configured for high-end hardware, and therefore, might play at a low frame rate
on older hardware.
1. Generate a set of assets via Window > Rendering > Render Pipeline
Converter.
2. Choose the Built-in Render Pipeline to URP option, then select Rendering
Settings.
5. The URP Assets will be assigned to the available Quality levels via the
Project Settings > Quality panel.
7. To restore these, add the above renderer to the Renderer List and set it
as the default for each URP Asset used in Quality Levels. Now you can
quickly switch Quality Levels in the Quality panel.
This section shows how lighting in URP works and describes the differences
between the workflows of the two rendering pipelines.
— Lighting documentation
If you convert a project from the Built-in Render Pipeline to URP, you might
notice differences in the lighting. This is because the Built-in Render Pipeline
uses a gamma lighting model by default and URP uses a linear model. As such,
any light with an intensity value differing from 1.0 will need to be adjusted.
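One rough way to compensate is to remap each light's intensity from gamma to linear space. The 2.2 exponent below is only an approximation and results should always be checked visually; the menu path and class name are illustrative. A minimal editor sketch:

```csharp
using UnityEditor;
using UnityEngine;

public static class LightIntensityRemap
{
    [MenuItem("Tools/Remap Light Intensities (Gamma to Linear)")]
    static void Remap()
    {
        foreach (var light in Object.FindObjectsOfType<Light>())
        {
            Undo.RecordObject(light, "Remap light intensity");
            // Approximate gamma-to-linear remap; fine-tune each light afterwards.
            light.intensity = Mathf.Pow(light.intensity, 2.2f);
        }
    }
}
```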
There are also differences in where to find the Settings controls in-Editor,
as well as how to handle the challenge of widely differing hardware specs.
The rest of this section covers some tricks you can use to achieve balance
between graphic fidelity and performance.
As before, you’ll set properties in the three places listed here. A and B are
essentially the same for both render pipelines, while C applies to URP only:
C. URP Asset Inspector: This is the principal place where you will set shadows.
Lighting in URP relies heavily on the settings chosen in this panel.
Quality settings are handled via Edit > Project Settings > Quality in the Built-in
Render Pipeline. In URP, this depends on the URP Asset settings which can be
swapped using the Quality panel (see the Quality settings section).
As the focus here is on lighting, the methods apply to materials that use the
shaders in the following table.
Complex Lit: This shader has all the features of the Lit Shader. Select it
when using the Clear Coat option to give a metallic sheen to a car, for
example. The specular reflection is calculated twice: once for the base layer,
and again to simulate a transparent thin layer on top of the base layer.

Lit: The Lit Shader lets you render real-world surfaces, such as stone, wood,
glass, plastic, and metals with photorealistic quality. The light levels and
reflections look lifelike and react across various lighting conditions, from
bright sunlight to a dark cave. This is the default choice for most materials
that use lighting. It supports baked, mixed, and real-time lighting, and works
with Forward or Deferred rendering.
Another difference between the Built-in Render Pipeline and URP is how
they compute light falloff/attenuation that applies to Spot and Point lights.
The choice between a Lit Shader and Simple Lit Shader is largely an artistic
decision. It is easier for artists to get a realistic render using the Lit Shader,
but if a more stylized render is desired, Simple Lit provides stellar results.
Comparing scenes rendered using different shaders: The top-left image uses the Lit Shader, the top-right, the Simple Lit
Shader, and the bottom image, the Baked Lit Shader.
Lighting overview
Lights are divided into Main Light and Additional Lights in URP. The Main Light
is the most significant Directional light. This is either the brightest light or the
one set via Window > Rendering > Lighting > Environment > Sun Source.
Later in the guide, you’ll learn how to use the URP Asset settings to set the
number of dynamic lights that affect an object via the Per Object Limit,
which is capped at eight for the URP Forward Renderer. However, the number
of dynamic lights that can be used per Camera is also limited by different
hardware, as shown in the following table.
Projects with a small number of dynamic lights might not encounter any issues.
However, as you add more lights, you might experience light popping as different
lights are dynamically culled. There is also a performance cost to having more
dynamic lights in a scene: each dynamic light must be culled against the
Camera and then sorted by priority, and there is the cost of rendering each light
per object. As always, try to maintain a balance between fidelity and performance.
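The URP Asset exposes some of these limits to scripts, so a sketch like the following can scale the per-object light budget down on weaker hardware; the memory threshold is an arbitrary assumption:

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

public class LightBudget : MonoBehaviour
{
    void Start()
    {
        var asset = GraphicsSettings.currentRenderPipeline as UniversalRenderPipelineAsset;
        if (asset == null) return;

        // Per Object Limit: capped at 8 for the Forward Renderer.
        // Use a smaller budget when little graphics memory is available.
        asset.maxAdditionalLightsCount = SystemInfo.graphicsMemorySize < 2048 ? 2 : 8;
    }
}
```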
Real-time and Mixed Mode lights are first culled against the Camera
frustum. If occlusion culling is enabled, lights hidden by other objects in
the scene are also culled.
The list of visible lights that survive the culling process is then sorted
by each light’s distance from the Camera. If there are too many visible
lights, the limits in the table above come into play. For example, if you have
1,000 lights in the scene but only 200 are visible to the Camera, all of those
would fit the limit for non-mobile platforms.
Now the list of visible lights is culled per object. Lights are sorted
by intensity at the pivot of the object – this way, brighter lights are
prioritized first. If an object is affected by more than the maximum
number of lights allowed per object, the excessive lights are discarded.
— If the light’s position and intensity are static, bake it and use Light Probes
to add the light to the rendering of dynamic geometry. See the section on
Light Probes for more information.
— Limit the range of the light. This option does not apply to Directional lights,
as they’re global.
The light limits discussed here are those that apply with the Forward Renderer –
the default renderer when using URP.
Unity 2022 LTS provides three options for rendering: Forward, Forward+, and
Deferred.
— Deferred uses octahedron encoding for normals, which is accurate but
might have a significant performance impact on mobile GPUs.
Use the Universal Renderer Data asset to switch between the rendering paths.
Note that when the Deferred Rendering Path is selected, Unity handles some
URP Asset settings differently:
— Main Light: The value of this property is Per Pixel, regardless of the value
you select.
— Additional Lights: The value of this property is Per Pixel, regardless of the
value you select.
— Additional Lights > Per Object Limit: Unity ignores this property.
— Reflection Probes > Probe Blending: Reflection Probe blending is always on.
Light Inspector
The Light Inspector is one of three places where you can set up lighting.
Just as with the Built-in Render Pipeline, URP supports Directional, Spot, Point,
and Area lights, though Area lights only work in Baked Indirect Mode. See the
Light Mode section for more details.
A side-by-side comparison of the Light Inspector panel in URP (left) and the Built-in Render Pipeline (right)
The image above shows how light properties are presented in the two versions
of the Light Inspector. The URP version has five groupings of controls, based
on whether the light is Directional or Point, and an additional Shape grouping
for Spot and Area lights.
This table lists the differences between the URP and Built-in Render Pipeline
versions of the Light Inspector.
The first step to lighting a new scene for URP is to create a new Lighting
Settings Asset (see image above). Open Window > Rendering > Lighting, and
once you’re on the Scene tab, click New Lighting Settings, and give the new
asset a name. The settings that you apply in Lighting panels are now saved to it.
Switch between settings by switching the Lighting Settings Asset.
You can set Environment Lighting to use the scene’s Skybox, with an option to
adjust the Intensity, Gradient, or Color.
The biggest change from working with the Built-in Render Pipeline to URP
lies in how you set up shadows.
Shadow settings are no longer available via Project Settings > Quality.
As discussed earlier, you need a Renderer Data object and a Render Pipeline
Asset when using URP. The section on setting up a project for URP covers how
to view your scene via Render Pipeline Asset, which you can use to define the
fidelity of your shadows.
The Lighting and Shadow groups in the URP Asset are key to setting up shadows
in your scene. First, set the Main Light Shadow to Disabled or Per Pixel, then go
to the checkbox to enable Cast Shadows. The last setting is the resolution of the
shadow map.
If you’ve worked with shadows in Unity before, you know that real-time shadows
require rendering a shadow map that contains the depth of objects from the
perspective of the light. The higher the resolution of this shadow map, the
higher the visual fidelity – though both more processing power and increased
memory are required. Factors that increase shadow processing include:
— Shadow Receivers that are visible (you have to encompass them all)
The highest resolution isn’t always ideal. For example, the Soft Shadows option
has the effect of blurring the map. In the following image of a cartoon-like
haunted room, you can see that the chair in the foreground casts a shadow on the
desk drawers, which appears too crisp when the resolution is greater than 1024.
Setting the Shadow Resolution for the Main Light: The resolution is set to 256 in the top-left image, 512 in the top-right image, 1024 in the
middle-left image, 2048 in the middle-right image, and 4096 in the bottom image.
Varying Max Distance for the Main Light Shadow: Top-left image – 10, top-right image – 30, bottom-left image – 60,
bottom-right image – 400
The Max Distance property needs to relate directly to what the user can see,
as well as the units used in the scene. Aim for the minimum distance that gives
acceptable shadows (see note below). If the player only sees shadows from
dynamic objects 60 units from the Camera, then set Max Distance to 60. When
the Lighting Mode for Mixed Lights is set to Shadowmask, the shadows of objects
beyond Shadow Distance are baked. If this was a static scene then you would
see shadows on all objects, but only dynamic shadows would be drawn up to the
Shadow Distance.
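Shadow distance can also be tuned from a script via the active URP Asset. A minimal sketch, assuming the 60-unit figure from the example above:

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

public class ShadowDistanceTuner : MonoBehaviour
{
    void Start()
    {
        var asset = GraphicsSettings.currentRenderPipeline as UniversalRenderPipelineAsset;
        if (asset != null)
        {
            // Match Max Distance to the farthest shadow the player can see.
            asset.shadowDistance = 60f;
        }
    }
}
```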
Note: URP only supports Stable Fit Shadow Projection, which relies
on the user to set up the Max Distance. The Built-in Render Pipeline
supports both Stable Fit and Close Fit for the Shadow Projection
property. In the latter mode, the bottom-left and -right images would
have the same quality, as Close Fit reduces the shadow distance plane
to fit the last caster. The disadvantage is that, since Close Fit changes
the shadow frustum “dynamically,” it can cause a shimmer/dancing
effect in the shadows.
Shadow Cascades
The images below show the shadow map of the scene with the chair and desk in
the haunted room. The cascade count is 1 in the image to the left. The map takes
up the whole area. In the image to the right, the cascade count is 4. Notice that the
map includes four different maps, with each area receiving a lower resolution map.
A cascade count of 1 is likely to give the best result for small scenes like
this. But if your Max Distance is a large value, then a cascade count of 2 or
3 will give better shadows for foreground objects, as these receive a larger
proportion of the shadow map. Notice that the chair in the left image is much
bigger, resulting in a sharper shadow.
You can adjust the start and end ranges for each section of the cascade
using the draggable pointers, or by setting the units in the relevant fields (see
following image). Always adjust Max Distance to a value that is a close fit for
your scene and choose the slider positions carefully. If you use metric as the
working unit, always choose the last cascade to be, at most, the distance of the
last Shadow Caster.
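The cascade count is likewise scriptable on the URP Asset (values 1 to 4). A sketch under the assumption that larger Max Distance values warrant more cascades, per the guidance above:

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

public class CascadeSetup : MonoBehaviour
{
    void Start()
    {
        var asset = GraphicsSettings.currentRenderPipeline as UniversalRenderPipelineAsset;
        if (asset == null) return;

        // One cascade suits small scenes; more cascades give foreground
        // objects a larger share of the shadow map at large Max Distance.
        asset.shadowCascadeCount = asset.shadowDistance > 50f ? 4 : 1;
    }
}
```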
Having sorted the shadows for the Main Light, it’s time to move on to Additional
Lights Mode. Enable additional lights to cast shadows by setting the Additional
Lights Mode for the URP Asset to Per Pixel. While the mode can be set to Disabled,
Per Vertex, or Per Pixel (see above image), only the latter works with shadows.
Check the Cast Shadows box. Then, select the resolution of the Shadow Atlas.
This is the map that will be used to combine all the maps for every light casting
shadows. Bear in mind that a Point light casts six shadow maps, creating a
cubemap, since it casts light in all directions. This makes a Point light the most
demanding performance-wise. The individual resolution of an additional light
shadow map is set using a combination of the three Shadow Resolution tiers,
plus the resolution chosen via the Light Inspector when selecting the light in the
Hierarchy window.
In the haunted room, there is a Spot light over the mirror and a Point light over
the desk, for a total of seven shadow maps (six for the Point light’s cubemap,
plus one for the Spot light). To fit these seven maps onto a 1024px square
atlas, each map needs to be 256px or smaller. If you exceed this size, the
resolution of the shadow maps will shrink to fit the atlas, resulting in a
warning message in the console.
Number of maps    Atlas tiling    Atlas size (multiply shadow tier size by)
1                 1x1             1
2–4               2x2             2
5–16              4x4             4
Setting the Shadow Atlas size based on the number of Additional Lights shadow maps and the tier size chosen per map
The image above shows the six maps used by the Point light where the
resolution is set to medium and the tier value to 256px. The Spot light has a
resolution set to high, with a tier value of 512px.
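The table above can be expressed as a small helper. This is an illustrative sketch, not part of the Unity API:

```csharp
// Illustrative helper, not a Unity API: estimate the square atlas size needed
// for a number of additional-light shadow maps at a given tier size.
static int EstimateAtlasSize(int mapCount, int tierSize)
{
    // Tiles per side: 1, 2, or 4 (supports up to 16 maps).
    int tilesPerSide = mapCount <= 1 ? 1 : (mapCount <= 4 ? 2 : 4);
    return tilesPerSide * tierSize; // e.g. 7 maps at 256px need a 1024px atlas
}
```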
This is a low-polygon version of the haunted room, lit with a Main Directional light, a Point light over the desk, and a Spot
light over the mirror. All lights are real-time and casting shadows.
The workflow for lightmapping is unchanged between the Built-in Render Pipeline
and URP. Let’s go through the steps using an FPS Sample project by Unity.
Note: Low frequency refers to the fact that lightmaps are updated at
a much lower rate than screen updates. Specular Lobes can only be
computed by real-time lights. You can apply Global Illumination (GI) to
dynamic objects by using Light Probes, but those also only capture low
frequency diffuse light. The Built-in Render Pipeline supports Light Probe
Proxy Volume (LPPV), which provides the same level of quality for Light
Probes as lightmaps do for dynamic objects. However, in URP, LPPV is
not supported due to it being a relatively slow system that doesn’t scale.
Instead, URP plans to support Adaptive Probe Volumes, which could
replace lightmaps and work for both static and dynamic objects.
The following screenshots are from the Unity project FPS Sample: The Inspection,
which you can download here. The scene demonstrates how to use real-time and
baked lighting in URP.
1. The scene from the FPS sample project contains largely static geometry. To
include the geometry in lightmapping, click the Static box to the right side
of the Inspector.
5. Set Light Mode to Baked or Mixed. Select the light in the Hierarchy window
and use the Inspector. Mixed Lights will illuminate dynamic objects as well as
static ones.
a. Baked Indirect: Only the indirect light contribution will be baked into
the lightmaps and Light Probes (the bounces of the lights only). Direct
lighting and shadows will be real-time. This is an expensive option
and not ideal for mobile platforms. However, it does mean that you get
correct shadows and direct light for both static and dynamic geometry.
7. Adjust the Lightmap Scale via Asset > Inspector > Mesh Renderer >
Lightmapping > Scale In Lightmap, so that distant objects take up less
space on the lightmap. The following image shows the texel size of the
background rock lightmap with a setting varying from 0.05 to 0.5.
More resources:
— Lightmapping documentation
Rendering Layers
The Rendering Layers feature lets you configure certain lights to affect only
specific GameObjects so you can emphasize and draw attention to them in a
scene. In the image below, the syringe, a key collectable, appears in a shaded
part of the scene. With a Rendering Layer, it becomes visible and helps ensure
that the player doesn’t miss picking it up.
1. Select the URP Asset. In the Lighting section, click the vertical ellipsis icon
(⋮) and select Show Additional Properties.
2. A new setting, Use Rendering Layers, will appear under the Lighting section.
3. Rename a Rendering Layer via Project Settings > Graphics > URP Global
Settings.
5. With Rendering Layers enabled, you need to set up a custom shadow layer.
The new light can cast shadows from the scene’s Main Light or from its own
frustum.
6. Lastly, select the object this applies to in the Hierarchy window and then
set the Rendering Layer Mask.
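Besides the Inspector, the Rendering Layer Mask of a Renderer can also be set from a script. Here is a minimal sketch; the class name and the layer index are illustrative, so adjust them to match the Rendering Layer you named in URP Global Settings:

```csharp
using UnityEngine;

public class AssignRenderingLayer : MonoBehaviour
{
    // Index of the Rendering Layer named in URP Global Settings
    // (assumed here to be layer 1; adjust for your project).
    [SerializeField] int layerIndex = 1;

    void Start()
    {
        // renderingLayerMask is a bitmask, so shift 1 into the chosen slot.
        GetComponent<Renderer>().renderingLayerMask = 1u << layerIndex;
    }
}
```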
Light Probes
As covered in an earlier section, you can combine baked and dynamic objects
in the Light Mode section using Mixed Lighting Mode. It’s recommended to also
add Light Probes to your scene when using this mode. Light Probes save the light
data at a particular position within an environment when you bake the lighting
by clicking Generate Lighting via Window > Rendering > Lighting panel. This
ensures that the illumination of a dynamic object moving through an environment
reflects the lighting levels used by the baked objects. In a dark area it will be dark,
and in a lighter area it will be brighter. Below, you can see the robot character
inside and outside of the hangar in the FPS Sample: The Inspection.
The robot inside and outside of the cave, with lighting level affected by Light Probes
Initially, there will be a cube of Light Probes, eight in total. To view and edit the
positioning of the Light Probes and add additional ones, select the Light Probe
Group in the Hierarchy window, and in the Inspector click Light Probe Group >
Edit Light Probes.
Add or remove Light Probes and modify their position from the Inspector.
The Scene view will now be in an editing mode where only Light Probes can be
selected. Use the Move tool to move them around.
Further details on how a Mesh Renderer works with Light Probes and how to
adjust the configuration can be found in this documentation.
Reflection Probes
A ray-tracing tool, such as Maya or Blender, can take the time to accurately
calculate the reflections for every pixel of a reflective surface. That process
takes far too long for a real-time renderer, which is why shortcuts such as
Reflection Probes are used: each probe captures its surroundings into a
cubemap that reflective materials can then sample.
To create a Reflection Probe, right-click the Hierarchy window and select Light
> Reflection Probe.
Then position the probe and adjust its settings. Once the probe is placed
correctly and the settings are adjusted, click Bake to generate a cubemap.
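Reflection Probes can also be configured from a script, for example when spawning them procedurally. The sketch below is illustrative; it uses the Realtime mode so the cubemap can be refreshed on demand via RenderProbe, whereas the editor workflow above bakes the cubemap instead:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

public class ProbeSetup : MonoBehaviour
{
    void Start()
    {
        var probe = gameObject.AddComponent<ReflectionProbe>();
        probe.mode = ReflectionProbeMode.Realtime;
        probe.refreshMode = ReflectionProbeRefreshMode.ViaScripting;
        probe.size = new Vector3(10f, 10f, 10f);
        // Render the surroundings into the probe's cubemap once.
        probe.RenderProbe();
    }
}
```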
The following image shows the two Reflection Probes used in FPS Sample:
The Inspection, one inside the hangar and one outside.
Blending is a great feature of Reflection Probes. You can enable blending via the
Renderer Asset Settings panel. Blending is always on when the Forward+ path
is chosen, regardless of the Renderer Asset setting.
Blending gradually fades out one probe’s cubemap, while fading in the other as
the reflective object passes from one zone to the other. This gradual transition
avoids the situation where a distinctive object suddenly “pops” into the
reflection as an object crosses the zone boundary.
Box Projection
Lens Flare
The workflow for creating a Lens Flare has been updated for URP. The first step in
configuring it is to create a Lens Flare (SRP) Data asset. Right-click in the Project
window, in a suitable Assets folder, and select Create > Lens Flare (SRP).
Use this asset to configure the shape of your flare by setting Type as Circle,
Polygon, or Image assets and adjusting their Tint and Intensity.
In the Settings panel for this component (see following image), assign the Lens
Flare Data asset you created to the Lens Flare Data property.
Light Halos
The Draw Halo option is not available for lights in URP, but it’s easily mimicked
with a billboard. Another option is to set the alpha transparency of a sphere.
The first image below shows the Shader Graph for such a shader, and the
second image depicts the result. For more information on using Shader Graph
to create this shader, see the Additional tools chapter.
Light Halo using a sphere with a material using the Shader Graph shader from above
Since ambient light does not consider geometry by default, high levels of
ambient light can lead to unconvincing renders. In the real world, a narrow gap
between two objects is likely to be darker than a much wider gap. Ambient
Occlusion can help deal with this issue in your Unity project. To use it with URP,
select the Renderer that the URP Asset is using. Go to Add Renderer Feature
and choose Screen Space Ambient Occlusion (SSAO).
— Radius: When Unity calculates the Ambient Occlusion value, the SSAO
effect takes samples of the normal texture within this radius from the
current pixel. A lower Radius value improves performance because the
SSAO Renderer Feature samples pixels closer to the source pixel.
— Falloff Distance: SSAO does not apply to objects farther than this distance
from the Camera. A lower value increases performance in scenes that
contain many distant objects.
— Direct Lighting Strength: This property defines how visible the effect is in
areas exposed to direct lighting.
A scene with only an Ambient Occlusion texture demonstrating a varying falloff distance
SSAO adds shading to narrow gaps. Let’s look at the following three images.
The haunted room screenshot: no SSAO at the top, only the Ambient Occlusion contribution in the middle, and rendered with SSAO at the bottom
Decals
Decal Projectors are a great way of adding detail to a mesh. Use them for elements
such as bullet holes, footsteps, signage, cracks, and more. Because they use a
projection framework, they conform to an uneven or curved surface. To use a Decal
Projector with URP, you need to locate your Renderer Data asset and add the Decal
Renderer Feature.
Now your scene is ready for Decals. Create a Decal by right-clicking in the
Hierarchy view and selecting Rendering > URP Decal Projector. By default,
the projector uses the material Decal, which will project a white square onto a
surface. Use the usual tools to position and orientate the projector. Adjust the
Width, Height, and Projection Depth in the Inspector.
To customize the Decal, create a material using the Shader Graph > Decal
shader. This shader has three inputs: Base Map, Normal Map, and Normal Blend.
Once the material is prepared, assign it to the Decal Projector.
The Inspector for a Decal Projector includes three Editing Mode buttons: Scale,
Crop, and Pivot/UV, which you can read about here.
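Decal Projectors can also be created from code, which is handy for runtime effects such as bullet holes. A minimal sketch, assuming a material (here called decalMaterial) that uses the Decal shader:

```csharp
using UnityEngine;
using UnityEngine.Rendering.Universal;

public class SpawnDecal : MonoBehaviour
{
    // A material created from the Shader Graph > Decal shader (assumption).
    public Material decalMaterial;

    void Start()
    {
        var go = new GameObject("RuntimeDecal");
        go.transform.SetPositionAndRotation(transform.position, transform.rotation);

        var projector = go.AddComponent<DecalProjector>();
        projector.material = decalMaterial;
        // Width, height, and projection depth, matching the Inspector fields.
        projector.size = new Vector3(0.5f, 0.5f, 1f);
    }
}
```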
Refer back to the Rendering Layers section to learn about setting up and using
this rendering option. Here are the steps to set up a Decal:
1. Use Edit > Project Settings … > Graphics > URP Global Settings to name a
Rendering Layer.
2. Select the mesh/meshes that you want to receive the projector. In the
Inspector, find Mesh Renderer > Additional Settings > Rendering Layer
Mask, and add the named Rendering Layer to the mask.
The image below shows the scene with and without a Decal, and with a wall
projection limited by using Rendering Layers.
From left to right: No decal in the image, the decal hitting all objects, and the decal applied to the wall only using Rendering Layers
Shaders
This section is for users who want to convert an existing custom shader to work
with URP and/or want to write a custom shader in code without using Shader
Graph. It provides the information required to port both basic and advanced
shaders to URP from the Built-in Render Pipeline. The tables included show helpful
samples of available HLSL shader functions, macros, and so on. In each case, a link
is provided to the relevant include containing many other useful functions.
For those who already have experience coding shaders, the includes provide
you with a clear idea of what’s available in HLSL to write compact and efficient
shaders. After considering the information here, hopefully porting your shaders
to URP won’t seem so daunting.
Another approach is to use Shader Graph to create versions of your custom shaders.
An introduction to Shader Graph is provided in the Additional tools section.
URP shaders use the ShaderLab structure, as seen in the code snippet below. As
such, Properties, SubShader, Tags, and Pass will all be familiar to shader coders.
SubShader
{
    Tags { "RenderPipeline" = "UniversalPipeline" }
    Pass
    {
        HLSLPROGRAM
        ...
        ENDHLSL
    }
}
The first thing to notice when comparing a URP shader with a Built-in Render
Pipeline shader is the use of the key-value pair "RenderPipeline" =
"UniversalPipeline" in the SubShader tag.
© 2024 Unity Technologies 69 of 140 | unity.com
A SubShader tag with the name RenderPipeline tells Unity which render
pipelines to use this SubShader with. The value of UniversalPipeline
indicates that Unity should use this SubShader with URP.
Looking at the render Pass code, you’ll see the shader code contained between
the HLSLPROGRAM / ENDHLSL macros. This indicates the former CG (C for
Graphics) shader programming language has been replaced by HLSL (High
Level Shading Language) although the shader syntax and functionality are
near-identical. Unity switched to HLSL a long time ago, so this shouldn’t come
as a surprise, but now the CGPROGRAM / ENDCG macros are not recommended.
Using these macros implies using UnityCG.cginc. Mixing the SRP and Built-in
Render Pipeline shader libraries in this way can cause several problems.
For URP, the shader code inside those passes is written in HLSL. Although most of
the ShaderLab hasn’t changed compared to the Built-in Render Pipeline, shaders
written for the Built-in Render Pipeline are automatically disabled by URP.
The reason for this is the change in the internal lighting process. While the Built-
in Render Pipeline performs separate shader passes for every light that reaches
an object (multipass), the URP Forward Renderer evaluates all lighting in a light
loop in a single pass. This change leads to different data structures that store
light data and new shading libraries with new conventions.
Unity will use the first SubShader block that is supported on the GPU. If the first
SubShader block doesn’t have a "RenderPipeline" = "UniversalPipeline"
tag, it won’t run in URP. Instead, Unity will try to run the next SubShader, if
any. If none of the SubShaders are supported, Unity will render the well-known
magenta error shader.
A SubShader can contain multiple Pass blocks, but each of them should be
tagged with a specific LightMode. As URP uses a single-pass Forward Renderer,
only the first “UniversalForward” Pass supported by the GPU will be used to
render objects in Forward rendering.
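To make these conventions concrete, here is a minimal sketch of a URP-compatible unlit shader (the shader name is illustrative). Note the RenderPipeline SubShader tag, the UniversalForward LightMode, and the HLSLPROGRAM block including the URP shader library:

```hlsl
Shader "Custom/MinimalURPUnlit"
{
    SubShader
    {
        Tags { "RenderPipeline" = "UniversalPipeline" }

        Pass
        {
            Tags { "LightMode" = "UniversalForward" }

            HLSLPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"

            struct Attributes { float4 positionOS : POSITION; };
            struct Varyings  { float4 positionHCS : SV_POSITION; };

            Varyings vert(Attributes IN)
            {
                Varyings OUT;
                // TransformObjectToHClip comes from the URP shader library.
                OUT.positionHCS = TransformObjectToHClip(IN.positionOS.xyz);
                return OUT;
            }

            half4 frag() : SV_Target
            {
                return half4(1, 1, 1, 1); // plain white
            }
            ENDHLSL
        }
    }
}
```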
As covered earlier, using Window > Rendering > Render Pipeline Converter
converts Built-in Render Pipeline shaders to URP shaders for all materials
automatically. But what about custom shaders?
Custom shaders
Custom shaders require some work when upgrading to URP. Listed below are the
actions needed to perform on legacy shaders as part of the upgrading process.
Github link
Notation for the space type at the end of the variable name:
Other shader functions, including fog and UV, can be found in this include, which is
added by default when you use Core.hlsl. The following table lists a few examples.
— Platform-specific functions
— Texture utilities
— Depth encoding/decoding
— Space transformations
Some of them are listed in the table below. The type real is set in the file;
depending on various flags, it could be a half or a float.
Preprocessor macros
Preprocessor macros are handy and regularly used. When porting the
Built-in Render Pipeline shaders to new URP shaders, you’ll need to replace
the Built-in Render Pipeline macros with their URP equivalents.
** The _PARAM macros can be used to declare functions with texture
and sampler arguments. Check out this document for more information.
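As an example of the kind of replacement involved, the Built-in Render Pipeline’s sampler2D/tex2D pair maps to the URP texture macros. The _BaseMap name and the helper function below are illustrative:

```hlsl
// Built-in Render Pipeline style:
//   sampler2D _BaseMap;
//   half4 c = tex2D(_BaseMap, uv);

// URP style, using the shader library macros:
TEXTURE2D(_BaseMap);
SAMPLER(sampler_BaseMap);

half4 SampleBase(float2 uv)
{
    return SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, uv);
}
```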
LightMode tags
The LightMode tag defines the role of a Pass in the lighting pipeline. In the Built-in
Render Pipeline, most shaders that need to interact with lighting are written as
Surface Shaders with all the necessary details taken care of. However, custom
shaders in the Built-in Render Pipeline need to use the LightMode tag to
specify how the Pass is considered in the lighting pipeline.
The table below indicates the correspondence between the LightMode tags
used in the Built-in Render Pipeline and the tags that URP expects. Several
legacy Built-in Render Pipeline tags are not supported in URP: PrepassBase,
PrepassFinal, Vertex, VertexLMRGBM, and VertexLM. At the same time,
there are other tags in URP with no equivalent in the Built-in Render Pipeline.
When writing a shader for URP, it’s a good idea to look at the provided shaders
and the Passes they use. The following code example shows some of the code
from the Lit Shader. The complete shader is here.
Blend[_SrcBlend][_DstBlend], [_SrcBlendAlpha][_DstBlendAlpha]
ZWrite[_ZWrite]
Cull[_Cull]
AlphaToMask[_AlphaToMask]
HLSLPROGRAM
#pragma exclude_renderers gles gles3 glcore
#pragma target 4.5
…
#include "Packages/com.unity.render-pipelines.universal/Shaders/LitInput.hlsl"
#include "Packages/com.unity.render-pipelines.universal/Shaders/LitForwardPass.hlsl"
ENDHLSL
}
Pass
{
Name "ShadowCaster"
Tags{"LightMode" = "ShadowCaster"}
ZWrite On
ZTest LEqual
ColorMask 0
Cull[_Cull]
HLSLPROGRAM
#pragma exclude_renderers gles gles3 glcore
…
#include "Packages/com.unity.render-pipelines.universal/Shaders/LitInput.hlsl"
#include "Packages/com.unity.render-pipelines.universal/Shaders/ShadowCasterPass.hlsl"
ENDHLSL
}
Note: A great resource for users planning to write shaders for URP
is this tutorial by Cyanilux.
A great feature of SRPs is that you can add code at just about any stage of the
rendering process using a C# script. Scripts can be injected at stages such as:
— Rendering shadows
— Rendering prepasses
— Rendering G-buffer
— Rendering opaques
— Rendering Skybox
— Rendering transparents
— Rendering post-processing
You can inject scripts in the rendering process via the Add Renderer Feature
option in the Inspector for the Universal Renderer Data Asset. Remember, when
using URP, there is a Universal Renderer Data object and a URP Asset. The
URP Asset has a Renderer List with at least one Universal Renderer Data object
assigned. It is the asset you assign in Project Settings > Graphics > Scriptable
Render Pipeline Settings.
If you are experimenting with multiple setting assets for different scenes, then
attaching the following script to your Main Camera can be useful. Set the
Pipeline Asset in the Inspector. Then it will switch the asset when the new
scene is loaded.
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

[ExecuteAlways]
public class AutoLoadPipelineAsset : MonoBehaviour
{
    public UniversalRenderPipelineAsset pipelineAsset;

    // Apply the assigned asset whenever this component is enabled,
    // for example when the scene containing it is loaded.
    void OnEnable()
    {
        if (pipelineAsset != null)
            GraphicsSettings.renderPipelineAsset = pipelineAsset;
    }
}
The next section covers two different types of Renderer Features, one for artists
and the other for experienced programmers.
Render Objects
1. First, you need a material to use when the character is masked. Create a
material and set the shader to Universal Render Pipeline > Lit or Unlit (the
previous image shows the Lit option). Set the Surface Inputs > Base Map
color. In this example, the material is called Character.
3. Select the Renderer Data object used by the URP Asset. Go to the Opaque
Layer Mask and exclude the SeeBehind layer. The character will then
disappear.
Set the Layer Mask to the SeeBehind layer, which was the layer chosen for the
character. Expand the Overrides and set the Override Mode to Material. Select
the material created in step 1. You’ll want the Pass to test against the depth
buffer without writing to it, so enable the Depth override and disable Write
Depth. Set the Depth Test to Greater so that this Pass only renders when the
distance to the rendered pixel is further from the Camera than the distance
currently stored in the depth buffer.
6. At this stage, you only see the silhouette of the character when it’s behind
another object. You don’t see the character at all when it’s in full view. To
fix this, add another Render Objects feature. This time you don’t need to
update the Overrides panel. This Pass will draw the character when not
masked by another object.
Renderer Feature
A Renderer Feature can be used at any stage in URP to affect the final render.
Let’s go through a simple example of adding a post-processing effect. In a
project using the Built-in Render Pipeline, you would have to add a
Graphics.Blit call in the OnRenderImage callback. This example uses the
overload of the function that takes a material to process each pixel in the image.
Material material;
RTHandle cameraColorTarget;
Color color;
4. Add a constructor to the TintPass to initialize the material, and set the
position of this pass in the render pipeline.
6. Create a new shader, name it TintBlit, and copy the code below. Notice
the RenderPipeline tag. ZWrite and Cull are both off. Core.hlsl is
imported from com.unity.render-pipelines.universal and Blit.hlsl from
com.unity.render-pipelines.core. If you select the Opaque Texture in
the URP Asset Inspector, then the pipeline creates a Render Texture,
_CameraOpaqueTexture.
Shader "Custom/TintBlit"
{
    SubShader
    {
        Tags { "RenderType"="Opaque" "RenderPipeline" = "UniversalPipeline" }
        LOD 100
        ZWrite Off Cull Off

        Pass
        {
            Name "TintBlitPass"

            HLSLPROGRAM
            #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
            // The Blit.hlsl file provides the vertex shader (Vert),
            // input structure (Attributes) and output structure (Varyings)
            #include "Packages/com.unity.render-pipelines.core/Runtime/Utilities/Blit.hlsl"

            #pragma vertex Vert
            #pragma fragment frag

            TEXTURE2D(_CameraOpaqueTexture);
            SAMPLER(sampler_CameraOpaqueTexture);
            float4 _Color;

            // Sample the opaque texture and tint it.
            half4 frag(Varyings input) : SV_Target
            {
                half4 color = SAMPLE_TEXTURE2D(_CameraOpaqueTexture,
                    sampler_CameraOpaqueTexture, input.texcoord);
                return color * _Color;
            }
            ENDHLSL
        }
    }
}
Material material;
material = CoreUtils.CreateEngineMaterial(shader);
renderPass = new TintPass(material);
10. Now that you have created and initialized an instance of the TintPass, add
it to the render queue. Add the next code snippet in the AddRenderPasses
method, and, once again, wrap the code inside an if statement that checks
that the current camera type is Game.
if (renderingData.cameraData.cameraType == CameraType.Game)
renderer.EnqueuePass(renderPass);
ConfigureTarget(cameraColorTarget);
13. Now that everything is initialized, you can do the actual work of copying the
current Render Texture using a material to process the result. Add the code
below to the Execute method.
if (material == null)
return;
material.SetColor("_Color", color);
Blit(cmd, cameraColorTarget, cameraColorTarget, material, 0);
context.ExecuteCommandBuffer(cmd);
cmd.Clear();
CommandBufferPool.Release(cmd);
14. To see the effect in action, select the Renderer Data object and click Add
Renderer Feature. TintFeature will appear in the list.
15. Here is the complete TintFeature code, with the final result shown below.
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;
if (material == null)
return;
material.SetColor("_Color", color);
Blit(cmd, cameraColorTarget, cameraColorTarget, material, 0);
context.ExecuteCommandBuffer(cmd);
cmd.Clear();
CommandBufferPool.Release(cmd);
}
}
Material material;
If you want to allow a user to choose where in the render pipeline this effect
runs, you could add an additional property, as outlined below.
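For instance, the feature could expose a RenderPassEvent field and forward it to the pass in Create. This is a sketch; the field name injectionPoint is illustrative, and TintPass is the render pass class built earlier in this section:

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

public class TintFeature : ScriptableRendererFeature
{
    public Shader shader;
    public Color color;

    // Let the user choose where in the pipeline the pass runs.
    public RenderPassEvent injectionPoint = RenderPassEvent.BeforeRenderingPostProcessing;

    Material material;
    TintPass renderPass;

    public override void Create()
    {
        material = CoreUtils.CreateEngineMaterial(shader);
        renderPass = new TintPass(material);
        // Forward the chosen event to the pass instead of hard-coding it.
        renderPass.renderPassEvent = injectionPoint;
    }

    public override void AddRenderPasses(ScriptableRenderer renderer,
        ref RenderingData renderingData)
    {
        if (renderingData.cameraData.cameraType == CameraType.Game)
            renderer.EnqueuePass(renderPass);
    }
}
```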
Applying post-processing effects: The top-left image has no effects applied, the top-right image has Bloom applied, the
bottom-left has Vignette applied, and the bottom-right has Color Adjustment added.
1. The first step is to make sure your Main Camera has post-processing enabled.
Select the Main Camera in the Hierarchy window, go to the Inspector, and
expand the Rendering panel. Check the Post Processing option.
2. Right-click the Hierarchy window and select Create > Volume > Global
Volume to create a Global Volume.
3. With the Global Volume selected in the Hierarchy window, find the Volume
panel in the Inspector and create a new Profile by clicking on New.
4. Start adding post-processing effects. See the table further down that lists
available effects. Click Add Override and select Post-processing. In this
example, the Bloom effect is chosen.
5. Each effect has a dedicated Settings panel. The image here shows the
settings for Bloom.
With the Volume framework, you can configure the scene so that as a Camera
moves around it, different post-processing profiles are triggered. This is achieved
by adding a Local Volume. Let’s go through the steps for setting this up.
1. In the Hierarchy window, right-click and choose Create > Volume > Box
Volume. Alternatively, choose Sphere Volume if this shape is more suited
to your purpose, or Convex Mesh Volume for a tighter control over the
shape of the Collider that defines the Volume region.
a. Blend Distance: The furthest distance from the Volume’s Collider at
which URP starts blending in this profile. As the Camera moves from
that distance toward the Collider, the post-processing effects gradually
fade in, reaching full effect at the edge of the Collider.
c. Priority: Use this value to determine which Volume URP uses when
multiple Volumes have an equal amount of influence on the scene. The
higher the number, the higher the priority. If you are mixing Global
and Local Volumes, keep Global at the default 0 setting and set the Local
Volume(s) to 1 or more.
3. Position the Volume and control its dimensions using the Box Collider
component, as shown in the image below.
Positioning and sizing a Box Volume using the attached Box Collider component
— Bloom: Adds a glow around pixels above a defined brightness level.
— Channel Mixer: Modifies the influence of each input color channel on the overall mix.
— Chromatic Aberration: Creates fringes of color along boundaries that separate dark and light parts of the image.
— Color Adjustments: Use this effect to tweak the overall tone, brightness, and contrast of the final rendered image.
— Color Curves: Grading curves are an advanced way to adjust specific ranges in hue, saturation, or luminosity.
— Color Lookup: Maps the colors of each pixel to a new value using a Lookup Texture.
You can also dynamically adjust your post-processing profile using a C# script.
The following code example shows how to adjust the intensity of the Bloom
effect. If a Vignette is applied, you can control the vignetting color via code.
For example, if the player character takes damage, you can temporarily tint it red.
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;
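Building on those usings, a minimal sketch of such a script might look like the following. The class name, the public volume field, and the PingPong animation are illustrative; TryGet fetches the Bloom override from the Volume Profile:

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

public class BloomController : MonoBehaviour
{
    // Assign the Global Volume from the scene in the Inspector.
    public Volume volume;
    Bloom bloom;

    void Start()
    {
        // Fetch the Bloom override from the Volume Profile, if present.
        volume.profile.TryGet(out bloom);
    }

    void Update()
    {
        // Animate intensity between 0 and 1 as a simple demonstration.
        if (bloom != null)
            bloom.intensity.value = Mathf.PingPong(Time.time, 1f);
    }
}
```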
2. Use the Inspector > Camera Settings panel to set this Camera as Render
Type Overlay.
3. Create a new Layer for the Camera and the GameObjects it renders.
4. Update the Rendering > Culling Mask for the Camera using the Inspector.
6. Make sure the Main Camera does not render the Overlay layer by updating its
Rendering > Culling Mask.
7. In the Stack panel, use the “+” button to add the Overlay Camera.
As with post-processing, you can control the stack from code, and add
or remove cameras dynamically during runtime. See this code example:
using UnityEngine;
using UnityEngine.Rendering.Universal;
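A possible sketch of such stack control (class and method names are illustrative) adds or removes an Overlay Camera from the Main Camera’s stack at runtime via the Universal Additional Camera Data component:

```csharp
using UnityEngine;
using UnityEngine.Rendering.Universal;

public class CameraStackController : MonoBehaviour
{
    // The Overlay Camera to add to or remove from the stack.
    public Camera overlayCamera;

    public void AddOverlay()
    {
        var stack = Camera.main.GetUniversalAdditionalCameraData().cameraStack;
        if (!stack.Contains(overlayCamera))
            stack.Add(overlayCamera);
    }

    public void RemoveOverlay()
    {
        Camera.main.GetUniversalAdditionalCameraData().cameraStack.Remove(overlayCamera);
    }
}
```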
Sometimes you might want to render your game to a different destination than
the user’s screen. The SubmitRenderRequest API is designed with this purpose
in mind. Let’s look at a possible use case.
Coding a screengrab
The script below will render the game to an off-screen RenderTexture when
the user presses an on-screen GUI button. The script should be attached to the
Main Camera. A RenderTexture is created in the Start callback. It is 1920 x 1080
pixels with a bit depth of 24. When the user presses the “Render Request”
button, the RenderRequest method is called.
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Rendering;
[RequireComponent(typeof(Camera))]
public class StandardRenderRequest : MonoBehaviour
{
[SerializeField]
RenderTexture texture2D;
void RenderRequest()
{
    Camera cam = GetComponent<Camera>();

    // Create a standard render request for this camera.
    RenderPipeline.StandardRequest request = new RenderPipeline.StandardRequest();

    if (RenderPipeline.SupportsRenderRequest(cam, request))
    {
        // 2D Texture
        request.destination = texture2D;
        RenderPipeline.SubmitRenderRequest(cam, request);
        SaveTexture(ToTexture2D(texture2D));
    }
}
Another benefit of using URP is its compatibility with Unity’s latest authoring
tools that bring complex creation tasks into the reach of technical artists.
This chapter unpacks how to create shaders using Shader Graph, and how
to create particle effects using the Visual Effects (VFX) Graph.
Shader Graph
Shader Graph brings custom shaders to an artist’s workflow. The Shader Graph
tool is included when you start a project using the URP template or import the
URP package.
Covering Shader Graph warrants a separate guide, but let's go over some basic
yet crucial steps by creating the Light Halo shader from the Lighting chapter.
1. Right-click in the Project window, find a suitable folder, and choose Create
> Shader Graph > URP > Unlit Shader Graph. For this example, choose
Unlit. Name the new asset FresnelAlpha.
If you’re familiar with shaders, then you’ll recognize the Vertex and Fragment
nodes. By default, this shader ensures that any model with a material using it
is correctly placed in the Camera view using the Vertex node, and that each
pixel is set to a grey color using the Fragment node.
3. This shader is going to set the alpha transparency of the object. It therefore
needs to apply to the Transparent queue. Change the Graph Inspector
> Graph Settings > Surface Type to Transparent. You’ll see that the
Fragment node now has an Alpha input as well as Base Color.
6. Shader Graph functions by joining nodes together. A node will have one
or more inputs and an output. To add a node, right-click and choose
Create Node in the Search panel at the top, then enter Fre. The results
will show a Fresnel Effect node.
The alpha value should be lowest at the edge. You can flip the result using
a One Minus node. To do this, click Create Node and enter One. Select the
One Minus node. Now drag from Out(1) on the Fresnel Effect node to In(1)
on the One Minus node. The 1 means that the value type is a single float.
If it was 3, then it would be a vector with three components.
8. Let’s look at how to control the size of the gradient and the overall
transparency. Use a Power node for sizing the gradient. Create a Power
node and connect One Minus Out(1) to Power A(1). Drag the Power property
to the graph and join it to Power B(1). The graph should now look like this:
10. Save the asset and create a new material. Assign the shader, found under
Shader Graphs/FresnelAlpha, to the new material.
The Fullscreen Shader Graph is new to Unity URP 2022 LTS. It allows you to
create custom post-processing passes. Right-click in the Project pane and
select Create > Shader Graph > URP > Fullscreen Shader Graph.
You can access a pixel’s color for the fragment shader using a URP Sample
Buffer node that itself uses the BlitSource option. The graph below shows a
simple tint example. The URP Sample Buffer also gives access to world normals
and motion vectors that are useful for edge detection and motion trails.
With the active Renderer Data asset selected, use the Inspector to add a
Renderer Feature. Select Full Screen Pass Renderer Feature.
All that remains is to update the settings for this Renderer Feature: assign the
material you created that uses the Fullscreen Shader Graph, then select its
position in the render pipeline.
The image below shows the tint effect on the left. The Fullscreen Shader Graph
is a useful way to create custom post-processing effects.
— This blog post goes through the Shader Graph process with an example
project and some advanced suggestions.
VFX Graph
The Visual Effect (VFX) Graph enables you to create myriad particle effects
with an artist-friendly, node-based graph. Use a VFX Graph to add fire, smoke,
mist, sparks, magic orbs, and many other effects to your project.
The target devices for any games containing effects created with VFX Graph
must be compute-capable because VFX Graph uses compute shaders running on
the GPU to ensure the best possible performance. Test your code and include a
non-compute fallback, and use VFX Graph sparingly for games targeting low-end
mobile devices.
To get better acquainted with VFX Graph, let’s go through the steps for creating
a smoke effect:
2. Once VFX Graph is installed, there will be a new option when you right-click
in the Project window > Assets folder. Choose Create > Visual Effects >
Visual Effect Graph, and name the new asset Smoke.
4. Select the Smoke VFX Graph as the Asset Template using the Component
Settings panel.
5. Now you can edit the VFX Graph. Double-click to launch the Visual Effect
Graph window. There you’ll find Spawn, Initialize, Update, and Output
Context nodes already prepopulated.
You’ll use a Texture in the form of an Atlas that contains an animated smoke
sprite. A series of 64 images in an 8x8 grid will act as the source for an
individual particle. At any single frame, a single particle will display just one
image from the grid. It will cycle through the images at a predefined rate
as each frame is rendered. Here is the Smoke Sprite Atlas:
7. Let’s look at the Spawn block. The default Spawn block comes with a
Constant Spawn Rate node. Set this to 20.
8. The next block, Initialize, defines how to handle a particle when it’s first created.
Remove the Set Lifetime Random node. Then add a Set Tex Index, and set it to
a random value from 0 to 63, so that each smoke particle has a different look.
This is important because the particle displays an image from the Smoke Sprite
sheet shown earlier and you’ll want the first index used to be 0.
Recall that you’re using a Sprite sheet for the image of each particle. In VFX
Graph, this means you’re using a Flipbook. Add a Flipbook Player node,
set its Mode to Constant, and the Frame Rate to 16. It will cycle through
consecutive frames in the Flipbook at 16 frame changes per second.
If you’re upgrading an existing project, then you need to find a suitable folder in
your project’s Assets folder. Right-click and select Create > Rendering > URP
Asset (with 2D Renderer). Give it a name, and select it using Project Settings >
Graphics > Scriptable Render Pipeline Settings. In the Scene view, be sure to
select the 2D button when editing.
Updating an existing project with URP 2D Renderer can result in rendering errors in your scene.
Fortunately, the Window > Rendering > Render Pipeline Converter has got you
covered. Select Built-in to 2D (URP) and click the Material and Material Reference
Upgrade panel. Then click Initialize Converters, followed by Convert Assets, if
you want to be able to deselect some items, or click Initialize And Convert to
handle the process with one click. If you still see magenta-colored sprites, you
might need to manually
replace the shader in some of your materials. Choose one of the shaders in the
following table.
Add a light using the Hierarchy window. Right-click and choose Light > Global
Light 2D.
— Spot: Provides great control over the angle and direction of the selected
light. By default, the inner and outer cones span 360 degrees, so it behaves
like a Point light. You can also adjust the inner and outer radius and decide
whether the light casts shadows, as well as the strength of those shadows.
The URP 2D Renderer provides all the tools necessary to create first-class 2D
games that will perform well on even low-end hardware.
An image from the Unity 2D demo Dragon Crashers; Unity’s 2D development e-book, 2D game art, animation, and
lighting for artists, was authored by the creative director of Dragon Crashers.
Related links:
— The Unity 2D demo Dragon Crashers is available on the Unity Asset Store.
— The free e-book 2D game art, animation, and lighting for artists is an
advanced development guide created for Unity developers and artists
planning to make a commercial 2D game.
This section looks at seven ways to improve the performance of your games:
— Baked lighting
— Light Probes
— Reflection Probes
— Camera settings
— Occlusion culling
— Pipeline settings
— Profiling tools: the Frame Debugger and the Unity Profiler
URP is built with optimized real-time lighting in mind. The URP Forward Renderer
supports up to eight real-time lights per object, with up to 256 real-time lights
per camera for desktop games and 32 real-time lights per camera for mobile
and other handheld platforms. URP also allows for configurable per-object Light
settings inside the Pipeline Asset for refined control over lighting.
As explained in the Lighting chapter, baked lighting is one of the best ways to
improve the performance of your scene. Real-time lighting can be expensive,
whereas baking lights can help you gain back performance, assuming the lights
in your scene are static. The baked lighting textures are batched into a single
draw call, without needing to be continuously calculated. This is especially
useful if your scene uses multiple lights. Another great reason to bake your
lighting is that it allows you to render bounced or indirect lighting in your
scene and improve the visual quality of the render.
When baked, areas of shadow in a scene receive the bounced light and are
illuminated. It can be subtle, but this technique spreads the light around a
scene more realistically and improves its overall appearance.
In the previous image, you can see that the specular highlights on the ground
are lost when baking, since baked lights only contain diffuse lighting. Whenever
possible, compute the direct lighting contribution in real-time, and have
Global Illumination come from Image-Based Lighting (IBL), lightmaps, and Probes.
The effect of light baking on shadows: before baking on the left, and after baking on the right
Light Probes
As explained in the Lighting section, Light Probes sample the lighting data in
the scene during baking and allow the bounced light information to be used by
dynamic objects as they move or change. This helps them blend into and feel
more natural in the baked lighting environment.
Light Probes add naturalism to a render without increasing the processing time
for a rendered frame. This makes them suitable for all hardware, even low-end
mobile devices.
The effect of using Light Probes when rendering a dynamic object: with Light Probes on the left, and without on the right
Reflection Probes
You can also use Reflection Probes to optimize your scene. Reflection Probes
project parts of the environment onto nearby geometry to create more realistic
reflections. By default, Unity uses the Skybox as the reflection map. But by using
one or more Reflection Probes, the reflections will match their surroundings more
closely.
The effect of using Reflection Probes on smooth surfaces: with Reflection Probes on the left and without on the right
Camera settings
URP enables you to disable unwanted rendering processes on your cameras
for performance optimization. This is useful if you’re targeting both high- and
low-end devices in your project. Disabling expensive processes, such as post-
processing, shadow rendering, or the depth texture, can reduce visual fidelity
but improve performance on low-end devices.
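These per-camera toggles can also be set from a script, for example when you detect a low-end device at startup. The sketch below assumes URP’s UniversalAdditionalCameraData component, accessed through the GetUniversalAdditionalCameraData extension method; the class name is illustrative:

```csharp
using UnityEngine;
using UnityEngine.Rendering.Universal;

public class LowEndCameraSetup : MonoBehaviour
{
    void Start()
    {
        // Fetch URP's per-camera settings component.
        var cameraData = GetComponent<Camera>().GetUniversalAdditionalCameraData();

        // Disable expensive per-camera features for low-end hardware.
        cameraData.renderPostProcessing = false;
        cameraData.renderShadows = false;
        cameraData.requiresDepthTexture = false;
        cameraData.requiresColorTexture = false;
    }
}
```

Attach the script to each Camera you want to downgrade; the same properties appear in the Camera Inspector under Rendering.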
Occlusion culling
Another great way to optimize your Camera is with occlusion culling. By default,
the Camera in Unity will always draw everything in the Camera’s frustum,
including geometry that might be hidden behind walls or other objects.
There’s no point in drawing geometry that the player can’t see, and that takes
up precious milliseconds. This is where occlusion culling comes in.
Frustum culling in the image on left, and occlusion culling in the image on right
Open Window > Rendering > Occlusion Culling, and select the Bake tab.
In the bottom-right corner of the Inspector, press Bake. Unity generates
occlusion data, saving the data as an asset in your project and linking the
asset to the current scene.
As you move the Camera, you should see objects popping on and off.
The effect of occlusion culling off in the left image, and on in the right image
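Occlusion culling only considers GameObjects marked as Occluder Static or Occludee Static before baking. You can set these flags in the Inspector, or in bulk with a small Editor script such as this sketch (the menu path is a hypothetical example):

```csharp
using UnityEditor;
using UnityEngine;

public static class OcclusionFlagsTool
{
    // Marks the selected GameObjects as occluders/occludees so the
    // occlusion bake takes them into account.
    [MenuItem("Tools/Mark Selection Occlusion Static")]
    static void MarkSelection()
    {
        foreach (var go in Selection.gameObjects)
        {
            var flags = GameObjectUtility.GetStaticEditorFlags(go);
            flags |= StaticEditorFlags.OccluderStatic | StaticEditorFlags.OccludeeStatic;
            GameObjectUtility.SetStaticEditorFlags(go, flags);
        }
    }
}
```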
Pipeline settings
While the effects of changing the settings for the URP Asset and using different
Quality tiers were previously covered, here are some additional tips
for experimenting with Quality tiers to get the best results for your project:
— Disable features that your project does not require, such as depth texture
and opaque texture.
— Enable the SRP Batcher to use the new batching method. The SRP Batcher
will automatically batch together meshes that use the same shader
variant, thereby reducing draw calls. If you have numerous dynamic
objects in your scene, this can be a useful way to gain performance.
If the SRP Batcher checkbox is not visible, then click the three vertical
dots icon ( ) and select Show Additional Properties.
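Beyond the checkbox in the URP Asset, the SRP Batcher can also be toggled from a script, which is handy for A/B-testing its impact in the Profiler. A minimal sketch (the class name and key binding are illustrative):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

public class SrpBatcherToggle : MonoBehaviour
{
    // Press B in Play Mode to toggle the SRP Batcher and compare
    // draw call counts in the Frame Debugger or Profiler.
    void Update()
    {
        if (Input.GetKeyDown(KeyCode.B))
        {
            GraphicsSettings.useScriptableRenderPipelineBatching =
                !GraphicsSettings.useScriptableRenderPipelineBatching;
        }
    }
}
```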
Frame Debugger
The Frame Debugger shows a list of all the draw calls made before rendering
the final image and can help you pinpoint why certain frames are taking a long
time to render. It can also identify why your scene’s draw call count is so high.
Note that adjusting the Debug Level can affect performance, so always turn it
off when the Frame Debugger is not in use.
Open the Frame Debugger by going to Window > Analysis > Frame Debugger.
When your game is playing, select the Enable button. This will pause the game
and let you examine the draw calls.
Clicking a stage in the render pipeline (left pane) will show a preview of this
stage in Game view.
The Frame Debugger shows every step of the rendering process in the Game View – in this case, the SSAO generation step.
Unity Profiler
Like the Frame Debugger, the Profiler is a great way to determine how long
it takes to complete a frame cycle in your project. It provides an overview of
rendering, memory, and scripting. You can identify scripts that take a long time
to complete, helping you to pinpoint potential bottlenecks in your code.
Open the Profiler via Window > Analysis > Profiler. When in Play Mode, the
window provides an overview of the overall performance of your game. You can
also pause the live view and use the Hierarchy Mode to get a breakdown of the
time taken to complete a single frame. The Profiler will show you each call Unity
has made during the frame.
The Profiler window using the low-level native plug-in Profiler API
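To make your own code show up as a named entry in the Profiler’s CPU view, you can wrap hot paths in a ProfilerMarker from the Unity.Profiling API. A short sketch, with a hypothetical EnemyAI class standing in for your own script:

```csharp
using Unity.Profiling;
using UnityEngine;

public class EnemyAI : MonoBehaviour
{
    // The marker appears under its own name in the Profiler's CPU view.
    static readonly ProfilerMarker s_UpdateMarker = new ProfilerMarker("EnemyAI.Update");

    void Update()
    {
        using (s_UpdateMarker.Auto())
        {
            // ... AI logic to be measured ...
        }
    }
}
```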
A new URP 3D Sample is available via the Unity Hub. This sample project
replaces the construction scene that will be familiar to many developers who
have been using URP for a few years. The URP 3D Sample contains four distinct
environments that illustrate the capabilities of URP in Unity 2022 LTS.
The URP 3D Sample is available in the Unity Hub when you start a new project in Unity 2022 LTS. You can learn more
about the sample on the Unity website.
The garden
This scene illustrates how you can efficiently scale your content with URP to
suit multiple platforms, from mobile and console to high-end gaming desktops.
It features stylized PBR rendering, customizable vegetation, and rendering
numerous lights with the new Forward+ renderer that surpasses previous light
count limits.
The oasis
This is a photorealistic scene with highly detailed textures, VFX Graph effects,
SpeedTree, and a custom water solution. It targets devices that support
compute shaders.
The cockpit
This scene uses custom lighting code with Shader Graph. It’s designed for
untethered VR devices such as Meta Quest 2.
The terminal
This scene is the link between the other sample scenes, providing a transition
effect to move from one scene to the next. It also offers the perfect setting
for you to drop in assets for look dev.
The sample project uses a transition effect to move between scenes. The
transition effect uses an off-screen render target to render the incoming scene
before the transition is complete. The incoming scene is then rendered to large
monitors placed in the outgoing scene using a custom shader created with
Shader Graph, and the full-screen swap is handled using a stencil via a Render
Objects Renderer Feature.
All scene assets are loaded at load time, but only a single scene is enabled. The
cameras used at runtime, when starting from The Terminal scene, are those
found in the FPS_Controller GameObject: MainCamera renders the active scene,
and ScreenCamera renders the scene displayed on the monitors.
During a transition, the incoming scene camera is rendered to the render target.
This creates a potential problem since URP only supports one main directional
light. A script called Scripts > SceneManagement > SceneTransitionManager.cs
runs before rendering, enabling the active scene’s main light and disabling the
other to keep to this restriction.
The same script handles switching the fog, reflection, and skybox to suit the
scene being rendered by adjusting the settings of the RenderSettings object.
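The light and render-settings switching described above can be sketched as follows. This is a simplified illustration, not the actual SceneTransitionManager.cs; the field names are assumptions:

```csharp
using UnityEngine;

public class SceneLightSwitcher : MonoBehaviour
{
    // Hypothetical references: one main directional light per scene.
    public Light outgoingMainLight;
    public Light incomingMainLight;
    public Material incomingSkybox;

    // Called before rendering the incoming scene to the off-screen target.
    public void ActivateIncomingScene()
    {
        // URP supports a single main directional light,
        // so swap which one is enabled.
        outgoingMainLight.enabled = false;
        incomingMainLight.enabled = true;

        // Match the global render settings to the incoming scene.
        RenderSettings.skybox = incomingSkybox;
        RenderSettings.fog = true;
    }
}
```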
The transition between the incoming and outgoing scenes is handled using a
Render Objects Renderer Feature. By writing a value to the stencil buffer, this
can be checked in a subsequent pass. If the pixel being rendered has a certain
stencil value, then you keep what is already in the color buffer; otherwise, you
can freely overwrite it. Renderer Features are a highly flexible way to build a
final render using combinations of passes.
Quality levels
Each option uses a different Render Pipeline Asset. As explained in the Quality
section, URP handles Quality using a combination of this panel together with the
settings of the Render Pipeline Asset.
As usual for a Toon shader, it combines the normal vector and the Main light
direction using a dot product to determine the lighting level. It then uses a
ramp to set staged levels of light rather than smoothly changing values. The
lighting model used in The Cockpit scene also uses Baked Global Illumination
in the calculation and does some edge detection to add a subtle outline effect.
Custom lighting is handled using Shader Graph.
See The Universal Render Pipeline cookbook for a tutorial about creating Toon
Shaders.
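The dot-product-plus-ramp idea can be sketched in a few lines. This is illustrative C# only; in the sample the logic is built from Shader Graph nodes:

```csharp
using UnityEngine;

public static class ToonRamp
{
    // Quantize a smooth N·L value into a fixed number of lighting bands,
    // instead of letting the light level change smoothly.
    public static float Ramp(float nDotL, int levels)
    {
        float clamped = Mathf.Clamp01(nDotL);
        // e.g. nDotL = 0.7 with 3 levels: floor(2.1) / 3 = 2/3
        return Mathf.Floor(clamped * levels) / levels;
    }
}
```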
The screengrab on the next page shows the settings for the Mobile Forward+
asset.
— Reduce the number of pixels rendered. Most modern mobile devices have
a high DPI (dots per inch). For most games, a DPI of 96 is sufficient. If
Screen.dpi is 300, for example, then a render scale of 96/300 on a 2400
x 1200 screen would mean rendering 768 x 384 pixels, roughly a tenth of
the pixels, which is a massive performance boost. You can set the render
scale in the URP Asset or adjust the value at runtime.
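Adjusting the render scale at runtime can be done through the active URP Asset, as in this sketch (the 96 DPI target and clamp range are illustrative):

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

public class RenderScaleByDpi : MonoBehaviour
{
    // Target an effective 96 DPI, as suggested above.
    const float TargetDpi = 96f;

    void Start()
    {
        // Screen.dpi can report 0 on some platforms; keep the default scale then.
        if (Screen.dpi <= 0f) return;

        var urpAsset = GraphicsSettings.currentRenderPipeline as UniversalRenderPipelineAsset;
        if (urpAsset == null) return;

        // E.g. 96 / 300 = 0.32, so a 2400 x 1200 screen renders at 768 x 384.
        urpAsset.renderScale = Mathf.Clamp(TargetDpi / Screen.dpi, 0.1f, 1f);
    }
}
```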
A careful study of these four scenes alongside their URP Asset settings and
documentation will help you learn how to use the techniques on display in your
own projects.
Neon White by Angel Matrix and Ben Esposito, published by Annapurna Interactive
Pixel Ripped 1978 by ARVORE Immersive Experiences
Death in the Water 2 by Lighthouse Games Studio