
UNITY 2022 LTS EDITION → E-BOOK

Introduction to the Universal Render Pipeline for advanced Unity creators
Contents

Introduction
    Author and contributors
Evolution of rendering: From Built-in Render Pipeline to SRP
    Why choose URP
The conversion process
    How to open a new URP project
    How to add URP to an existing Built-in Render Pipeline project
    Converting the scenes of an existing project
    Converting custom shaders
    Comparing Quality options between the Built-in Render Pipeline and URP
        Built-in Render Pipeline to URP: Low settings
        Built-in Render Pipeline to URP: High settings
    How to work with Quality settings
        Quality settings when using URP
    Modifying a URP Asset
    Converting an example project from the Built-in Render Pipeline to URP
Lighting in URP
    URP shaders for lit scenes
    Built-in Render Pipeline vs URP lighting falloff and attenuation
    Lighting overview
        Camera light limits when using the URP Forward Renderer
        Rendering path comparison
        Light Inspector
        Lighting a new scene
        Ambient or Environment lighting
    Shadows
        Main Light: Shadow Resolution
        Main Light: Shadow Max Distance
        Shadow Cascades
        Additional Light Shadows
    Light Modes
    Rendering Layers
    Light Probes
    Reflection Probes
        Reflection Probe blending
        Box Projection
    Lens Flare
    Light Halos
    Screen Space Ambient Occlusion
Decals
Shaders
    Comparing URP and Built-in Render Pipeline shaders
    Custom shaders
        Includes
        Other useful HLSL includes
        Preprocessor macros
        Light Mode tags
Pipeline callbacks
    Render Objects
    Renderer Feature
Post-processing
    Using the URP post-processing framework
    Adding a Local Volume
    Controlling post-processing with code
Camera Stacking
    Controlling a stack with code
The SubmitRenderRequest API
    Coding a screengrab
Additional tools compatible with URP
    Shader Graph
        Fullscreen Shader Graph
    VFX Graph
    2D Renderer and 2D lights
Performance
    Optimizing lighting and rendering in URP
    Light Probes
    Reflection Probes
    Camera settings
    Occlusion culling
    Pipeline settings
    Frame Debugger
    Unity Profiler
URP 3D Sample
    The garden
    The oasis
    The cockpit
    The terminal
    Moving between the environments
    Scalability
    Running the sample project on a mobile device
Conclusion
Introduction

This guide is intended to help experienced Unity developers and technical artists migrate their projects from the Built-in Render Pipeline to the Universal Render Pipeline (URP). Topics covered include how to:

— Set up URP for a new project, or convert an existing Built-in Render Pipeline-based project to URP

— Update Built-in Render Pipeline-based lights, shaders, and special effects for URP

— Understand callback differences between the two rendering pipelines, performance optimization in URP, and more

The limitations of the Built-in Render Pipeline have become apparent as the
number of platforms supported by Unity continues to grow. Each additional
platform and API adds complexity to modifying and maintaining Built-in Render
Pipeline architecture.

In 2018, Unity released two new Scriptable Render Pipelines (SRPs): the
High Definition Render Pipeline (HDRP) and URP. These SRPs enable you to
customize the culling of objects, their drawing, and the post-processing of
the frame without having to use low-level programming languages like C++.
You can also create your own fully customized SRP.
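To make this concrete, here is a minimal sketch of a fully custom SRP in C#. The class names MinimalPipelineAsset and MinimalPipeline are hypothetical, and the sketch omits error handling and per-platform concerns; it simply culls, clears, draws opaque geometry, and draws the skybox.

    using UnityEngine;
    using UnityEngine.Rendering;

    // Hypothetical asset type; assigning an instance of this in
    // Project Settings > Graphics activates the custom pipeline.
    [CreateAssetMenu(menuName = "Rendering/Minimal Pipeline")]
    public class MinimalPipelineAsset : RenderPipelineAsset
    {
        protected override RenderPipeline CreatePipeline() => new MinimalPipeline();
    }

    // A bare-bones pipeline: cull, clear, draw opaque geometry, draw the skybox.
    public class MinimalPipeline : RenderPipeline
    {
        protected override void Render(ScriptableRenderContext context, Camera[] cameras)
        {
            foreach (var camera in cameras)
            {
                if (!camera.TryGetCullingParameters(out var cullingParams))
                    continue;

                CullingResults cullResults = context.Cull(ref cullingParams);
                context.SetupCameraProperties(camera);

                var cmd = new CommandBuffer { name = "Clear" };
                cmd.ClearRenderTarget(true, true, camera.backgroundColor);
                context.ExecuteCommandBuffer(cmd);
                cmd.Release();

                var sorting = new SortingSettings(camera);
                var drawing = new DrawingSettings(new ShaderTagId("SRPDefaultUnlit"), sorting);
                var filtering = new FilteringSettings(RenderQueueRange.opaque);
                context.DrawRenderers(cullResults, ref drawing, ref filtering);

                if (camera.clearFlags == CameraClearFlags.Skybox)
                    context.DrawSkybox(camera);

                context.Submit();
            }
        }
    }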



URP 3D Sample available in the Unity Hub

The aim of SRP architecture is to provide deep flexibility and customization, enhanced performance across the gamut of supported and future platforms, and quick iteration to unleash your creativity.

Multiplatform deployment is a key factor in the success of many games. Players often play the same game on different devices, such as console and mobile, meaning Unity developers require rendering options that scale up and down for numerous devices, with as few steps and as little complexity as possible.

After several years of dedicated development, URP technology is now solid and production-ready. This guide will help you leverage its benefits for the successful development of your game.

Author and contributors

Nik Lever, the author of this e-book, has been creating real-time 3D content
since the mid-nineties and using Unity since 2006. For over 30 years he’s led
the small development company, Catalyst Pictures, and has provided courses
since 2018 with the aim of helping game developers expand their knowledge
in a rapidly evolving industry.



Unity contributors

Felipe Lira is a senior manager of graphics and the URP. With over 13 years
of experience as a software engineer in the games industry, he specializes
in graphics programming and multiplatform game development.

Ali Mohebali is the graphics product management lead for Unity Runtime and
Editor. Ali has 20 years of experience working in the games industry, and has
contributed to hit titles such as Fruit Ninja and Jetpack Joyride, both by Halfbrick
Studios.

Steven Cannavan is a senior development consultant on the Accelerate Solutions Games team, specializing in the Scriptable Render Pipelines. He has over 15 years of experience in the game development industry.

Important contributions were also made by Unity URP engineering and sample
project teams.

A scene made with URP



Evolution of rendering:
From Built-in Render
Pipeline to SRP

One of Unity’s biggest strengths is its platform reach. The ideal for all game
studios is to create once and efficiently deploy their game to their desired range
of platforms, from high-end PCs to low-end mobile.

The Built-in Render Pipeline was developed to be a turnkey solution for all
platforms supported by Unity. It supports a mix of graphics features and is
convenient to use with Forward and Deferred pipelines.

However, as Unity continues to add support for more platforms, the following shortcomings of the Built-in Render Pipeline have become apparent:

— The bulk of the code is written in C++ and can't be modified by developers, making it a black-box system

— The render flow and render passes are prestructured

— The rendering algorithm is hardcoded

— Unconstrained customization makes achieving good performance on all platforms difficult

— It exposes callbacks in the rendering code that trigger sync points in the pipeline; these callbacks prevent multithreaded rendering optimizations because they allow state to be injected dynamically at any point in the frame by calling into C#

— Caching data to manage persistent state for user injection is difficult

The solution: Scriptable Render Pipelines

The SRPs were developed to support an efficient multiplatform workflow by providing:



— Intelligent and reliable scaling for the maximum number of hardware platforms, from high- to low-end devices

— The ability to customize rendering processes using C#, not C++. Using C# means a new executable does not need to be compiled for every change

— Flexibility to support architecture evolution

— Flexibility to create sharp graphics that are performant across many platforms

The image below illustrates how SRPs work: you use C# to control and customize the render passes and rendering logic, together with HLSL shaders that can be created using artist-friendly tools such as Shader Graph. The shaders give you access to the lower-level API and engine-layer abstractions.

The new graphics programmable model for the Scriptable Render Pipelines

An advanced user can create a new SRP from scratch or modify the HDRP
or URP. The graphics stack is open source and available for use on GitHub.

Why choose URP

— Accessible to a wide range of users: URP is configurable by artists and technical artists alike, providing more flexibility for prototyping and refining rendering techniques for full game production.

— Extendable and customizable: URP allows users to modify existing capabilities and extend the pipeline with new ones, making it a solid choice for advanced users, including Asset Store and third-party package creators, experienced studios, and advanced teams.

While the low-level rendering API is written in C++ for performance purposes, a URP developer can write a simple C# script to be called during the render pipeline, enabling high-level customization without sacrificing performance.

— Multiple rendering options: URP provides a Universal Renderer that supports Forward, Forward+, and Deferred rendering paths, as well as a 2D Renderer.

These renderers can be extended with additional features and Scriptable Render Passes. The Render Objects feature can be used to render objects from a given Layer Mask at different events in the rendering pipeline. It also allows you to override the material and other render states when rendering those objects, making it possible to customize the rendering without code. URP can be extended with custom renderers to suit specific needs.

— Better performance: URP provides equal, if not better, performance than the Built-in Render Pipeline for comparable Quality settings in the majority of cases. In particular:

— URP evaluates real-time lighting more efficiently. In Forward rendering it evaluates all lighting in a single pass. Forward+ improves upon standard Forward rendering by culling lights spatially rather than per object, which increases the overall number of lights that can be used to render a frame. In Deferred rendering it supports the Native RenderPass API, allowing the G-buffer and lighting passes to be combined into a single render pass.

— There are CPU and GPU improvements when drawing meshes, thanks to the SRP Batcher, which reduces draw calls and improves how depth is handled.

— URP makes more efficient use of tile memory on mobile devices, leading to lower power consumption, longer battery life, and therefore the possibility of longer play sessions.

— URP comes with an integrated post-processing stack that allows for better performance compared to the Built-in Render Pipeline. Using the Volume framework, you can create post-processing effects that depend on the Camera position without writing any code.

— Compatible with the latest tools: URP supports the latest artist-friendly
tools, such as Shader Graph, VFX Graph, and the Rendering Debugger.

Most Unity projects are now built on URP or HDRP; however, the Built-in Render Pipeline remains an available option in Unity 2022 LTS and Unity 6.

Follow this link for a comprehensive comparison of the Built-in Render Pipeline
and URP capabilities.



The conversion process

This section covers the steps for starting a new project with URP or converting
an existing project to URP.

How to open a new URP project

Open a new project using URP via the Unity Hub. Click on New and verify that
the Unity version selected at the top of the window is 2022.2 or newer. Choose
a name and location for the project, select the 3D (URP) template or 3D Sample
Scene (URP), and click Create.

Creating a new project with the URP template, which might require you to download the template for the first time



Note: The template ensures that your project is set to use a linear color
space, which is required for calculating lighting correctly.

One of the four environments included in the URP 3D Sample, available in the Unity Hub

You can create new scenes via File > New Scene, with essential GameObjects
such as Camera and Directional light, and even create your own scene
template with prepopulated objects. Read more in the URP Scene Templates
documentation.

The New Scene dialog displaying Scene Templates

Go to Edit > Project Settings and open the Graphics panel. To use URP in the Editor, you must select a URP Asset under Scriptable Render Pipeline Settings. The URP Asset controls the global rendering and Quality settings of a project and creates the rendering pipeline instance, while the instance itself contains the intermediate resources and the render pipeline implementation.

UniversalRP-HighFidelity is the default URP Asset selected, but you can switch
to UniversalRP-Balanced or UniversalRP-Performant.
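These assets can also be swapped from a script, for example in a graphics options menu. A minimal sketch, assuming the target asset is assigned in the Inspector:

    using UnityEngine;
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.Universal;

    public class PipelineAssetSwitcher : MonoBehaviour
    {
        // Assign, e.g., UniversalRP-Balanced or UniversalRP-Performant here.
        [SerializeField] UniversalRenderPipelineAsset pipelineAsset;

        public void Apply()
        {
            // Replaces the asset chosen in Project Settings > Graphics.
            GraphicsSettings.renderPipelineAsset = pipelineAsset;

            // Reads back the pipeline actually in use, including any
            // per-Quality-Level override set in the Quality panel.
            Debug.Log(GraphicsSettings.currentRenderPipeline);
        }
    }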
The Graphics panel in Project Settings

A later section of this guide details how to adjust the settings of a URP Asset.

How to add URP to an existing Built-in Render Pipeline project

Important: Be sure to back up your project using source control before following the steps in this section. This process converts assets, and Unity does not provide an undo option. If you use source control, you will be able to revert to previous versions of the assets if necessary.

If you upgrade an existing Built-in Render Pipeline project, you’ll need to add
the URP package to your project as it’s not included in Unity 2022.2 or 2022 LTS.

The Package Manager displaying the Unity Registry packages

Go to Window > Package Manager and click the Packages drop-down to add URP to your project. Make sure to select the Unity Registry, followed by Universal RP. Click Download in the lower-right corner of the window if the URP package is not yet installed on your development computer, then click Install once it's downloaded.



Installing URP via the Package Manager

To create a URP Asset, right-click in the Project window and choose Create >
Rendering > URP Asset (with Universal Renderer). Name the asset.

Creating a URP Asset

Remember: If you create a new project using the Universal Render Pipeline or 3D (URP) templates, these URP Assets and the URP package are already available in the project.

Rather than creating a single file, this actually produces two files, each with the .asset extension.

Two Assets in URP, one for URP settings and the other for Renderer Data



One is called UniversalRP_Renderer, a Renderer Data Asset that you can use to filter the layers the renderer works on, and to intercept the rendering pipeline to customize how the scene is rendered, making it easier to create high-quality effects. See the section on Pipeline callbacks for more information.

Additionally, the UniversalRP_Renderer controls the high-level rendering logic and passes for URP. It supports the Forward and Deferred paths, plus a 2D Renderer that enables features such as 2D lights, 2D shadows, and Light Blend Styles. You can even extend URP to create your own renderers.

The Inspector for the UniversalRP_Renderer Data Asset
The other URP Asset serves to control settings for Quality, Lighting, Shadows, and
Post-processing. You can use different URP Assets to control the Quality settings,
a process outlined further down in this section. This Settings Asset is linked to the
Renderer Data Asset via the Renderer List. When you create a new URP Asset, the
Settings Asset will have a Renderer List containing a single item – the Renderer
Data Asset created at the same time, set as the default. You can add alternative
Renderer Data Assets to this list.

The default renderer is used for all Cameras, including the Scene view.
A Camera can override the default renderer by selecting another one from
the Renderer List. This can be done through the use of a script, as needed.
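A minimal sketch of that script, using the camera's Universal Additional Camera Data component (the index refers to a position in the URP Asset's Renderer List):

    using UnityEngine;
    using UnityEngine.Rendering.Universal;

    [RequireComponent(typeof(Camera))]
    public class RendererSwitcher : MonoBehaviour
    {
        void Start()
        {
            // URP stores per-camera settings on this additional data component.
            var cameraData = GetComponent<Camera>().GetUniversalAdditionalCameraData();

            // Select the second renderer in the URP Asset's Renderer List.
            cameraData.SetRenderer(1);
        }
    }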

A URP Asset in the Inspector

Despite following these steps to create a URP Asset, an open scene in the
Scene or Game view will still use the Built-in Render Pipeline. You must
complete one last step to make the switch to URP: Go to Edit > Project
Settings and open the Graphics panel. Click the small dot next to None
(Render Pipeline Asset). In the open panel, select UniversalRP.



Selecting a Scriptable Render Pipeline Asset

A warning message will pop up regarding the switch, but just press Continue.

As there is no content in your project yet, changing the render pipeline will be
almost instantaneous. You’re now ready to use URP.

Converting the scenes of an existing project

After you complete the above steps, you’ll find that your beautiful scenes are
suddenly colored magenta. This is because the shaders used by the materials
in a Built-in Render Pipeline project are not supported in URP. Fortunately,
there is a method for restoring your scenes to their original quality.

Materials in a scene appear in magenta because their Built-in Render Pipeline-based shaders must be converted for use in URP.

Go to Window > Rendering > Render Pipeline Converter. Choose Convert Built-in to 2D (URP) for a 2D project, or Built-in to URP for a 3D project. Assuming that your project is 3D, you'll need to select the appropriate converters:



— Rendering Settings: Select this to create multiple Render Pipeline setting assets that match the Built-in Render Pipeline Quality settings as closely as possible. This lets you test different Quality Levels more efficiently. See the section on comparing Built-in Render Pipeline and URP Quality options for more details.

— Material Upgrade: Use this to convert materials from the Built-in Render Pipeline to URP.

— Animation Clip Converter: This converts animation clips. It runs once the Material Upgrade converter finishes.

— Read-only Material Converter: This converts the prebuilt, read-only Materials included in a Unity project. It indexes the project and creates a temporary .index file. Note that it can take significant time.

Converting custom shaders

Custom shaders are not converted using the Material Upgrade converter.
The Shaders and New tools sections outline the steps for converting custom
Built-in Render Pipeline shaders to URP. Using Shader Graph is often the
quickest way to update a custom shader to URP.

There are several different URP shaders including:

— Universal Render Pipeline/Lit: This physically based rendering (PBR) shader, similar to the built-in Standard Shader, can be used to represent most real-life materials. It supports all the Standard Shader features with both metallic and specular workflows.

— Universal Render Pipeline/Simple Lit: This uses a Blinn-Phong model, and is suitable for low-end mobile devices or games that don't use PBR workflows.

— Universal Render Pipeline/Unlit: This is a GPU-performant shader that doesn't use lighting equations.

— Universal Render Pipeline/Terrain/Lit: This is suitable for use with the Terrain Tools package.

— Universal Render Pipeline/Particles/Lit: This particle shader uses a PBR lighting model.

— Universal Render Pipeline/Particles/Unlit: This unlit particle shader is light on the GPU.

Although Simple Lit replaces many legacy/mobile shaders, the performance is not the same. Legacy/mobile shaders only partially evaluate lighting, whereas Simple Lit considers all lights as defined by the URP Asset.

Refer to this table in our URP documentation to see how each URP shader maps
to its Built-in Render Pipeline equivalent.

Once you select one or more of the above converters, either click Initialize Converters or Initialize And Convert. Whichever option you choose, the project will be scanned and the assets that need converting will be added to each of the converter panels. If you choose Initialize Converters, you can limit the conversions by deselecting items using the checkbox provided for each one; at this stage, click Convert Assets to start the conversion process. If you choose Initialize And Convert, the conversion starts automatically after the converters are initialized. Once it's complete, you might be asked to reopen the scene that is active in the Editor.

The Render Pipeline Converter



Comparing Quality options between the Built-in Render Pipeline
and URP

There are several default Quality options available in the Built-in Render Pipeline, from Very Low to Ultra. The Quality settings impact the fidelity of the scene, including Texture resolution, lighting, shadow rendering, and so on.

Go to Edit > Project Settings and select the Quality panel. Here, you can switch
between these Quality options by picking the current quality. This will change
the render settings used by the Scene and Game views. You can also edit each
of the Quality options from this panel.

If you select the Rendering Settings option while using the Render Pipeline
Converter and switching from the Built-in Render Pipeline to URP, a set of URP
Assets that attempt to match the Built-in Render Pipeline Quality options will
be created. The first table below shows how the Built-in Render Pipeline maps
to URP for Low settings, while the second table displays a comparison for High
settings. In both the Built-in Render Pipeline and URP, settings are chosen via
the Quality panel. The URP Asset settings are available via the Inspector when
selecting a URP Asset. Refer to the URP documentation for more details.

Built-in Render Pipeline to URP: Low settings

Setting | Built-in Render Pipeline | URP | URP Asset settings

Rendering
Pixel Light Count | 0 | Not applicable (NA)* | NA
Anti-aliasing | Disabled | NA | Disabled
Render Scale | NA | NA | 1
Real-time Reflection Probes | No | No |
Resolution Scaling Fixed DPI Factor | 1 | 1 | NA
VSync Count | Don't sync | Don't sync |
Depth Texture | NA | NA | No
Opaque Texture | NA | NA | No
Opaque Downsampling | NA | NA | NA
Terrain Holes | NA | NA | Yes
HDR | NA | NA | Yes

Textures
Texture Quality | Half res | Half res | NA
Anisotropic Textures | Disabled | Disabled | NA
Texture Streaming | No | No | NA

Particles
Soft Particles | No | NA | NA
Particle Raycast Budget | 16 | 16 | NA

Terrain
Billboards Face Camera Position | No | No | NA

Shadows
Shadowmask Mode | Shadowmask | Shadowmask | NA
Shadows | Disabled | NA | NA
Shadow Resolution | Low resolution | NA | NA
Shadow Projection | Stable fit | NA | NA
Shadow Distance | 20 | NA | NA
Shadow Near Plane Offset | 3 | NA | NA
Shadow Cascades | No Cascades | NA | NA
Cascade splits | NA | NA | NA
Working unit | NA | NA | NA
Depth Bias | NA | NA | NA
Normal Bias | NA | NA | NA
Soft Shadows | NA | NA | NA

Async Asset Upload
Time Slice | 2 | 2 | NA
Buffer Size | 16 | 16 | NA
Persistent Buffer | Yes | Yes | NA

Level of Detail
LOD Bias | 0.4 | 0.4 | NA
Maximum LOD level | 0 | 0 | NA

Meshes
Skin Weights | 4 bones | 4 bones | NA

Lighting
Main Light | NA | NA | Per pixel
• Cast Shadows | NA | NA | No
• Shadow Resolution | NA | NA | NA
Additional Lights | NA | NA | Disabled
• Per Object Limit | NA | NA | NA
• Cast Shadows | NA | NA | NA
• Shadow Atlas Resolution | NA | NA | NA
• Shadow Resolution tiers | NA | NA | NA
• Cookie Atlas Resolution | NA | NA | NA
• Cookie Atlas Format | NA | NA | NA
Reflection Probes | NA | NA | NA
• Probe Blending | NA | NA | No
• Box Projection | NA | NA | No

Post-processing
Grading Mode | NA | NA | Low Dynamic Range
LUT size | NA | NA | 16
Fast sRGB/Linear conversion | NA | NA | No

* In URP, Pixel Light Count is handled using Additional Lights > (Per pixel) > Per Object Limit.


Built-in Render Pipeline to URP: High settings

Setting | Built-in Render Pipeline | URP | URP Asset settings

Rendering
Pixel Light Count | 2 | Not applicable (NA) | NA
Anti-aliasing | Disabled | NA | 2x
Render Scale | NA | NA | 1
Real-time Reflection Probes | Yes | Yes | NA
Resolution Scaling Fixed DPI Factor | 1 | 1 | NA
VSync Count | Every V Blank | Every V Blank | NA
Depth Texture | NA | NA | No
Opaque Texture | NA | NA | No
Opaque Downsampling | NA | NA | NA
Terrain Holes | NA | NA | Yes
HDR | NA | NA | Yes

Textures
Texture Quality | Full res | Full res | NA
Anisotropic Textures | Disabled | Disabled | NA
Texture Streaming | No | No | NA

Particles
Soft Particles | No | NA | NA
Particle Raycast Budget | 256 | 256 | NA

Terrain
Billboards Face Camera Position | Yes | Yes | NA

Shadows
Shadowmask Mode | Distance Shadowmask | Distance Shadowmask | NA
Shadows | Hard and Soft Shadows | NA | NA
Shadow Resolution | Medium resolution | NA | 2048
Shadow Projection | Stable fit | NA | NA
Shadow Distance | 40 | NA | 50
Shadow Near Plane Offset | 3 | NA | NA
Shadow Cascades | 2 Cascades | NA | 2
Cascade splits | 33/67 | NA | 12.5/33.8/3.8
Working unit | Percent | Percent | Metric
Depth Bias | NA | NA | 1
Normal Bias | NA | NA | 1
Soft Shadows | NA | NA | Yes

Async Asset Upload
Time Slice | 2 | 2 | NA
Buffer Size | 16 | 16 | NA
Persistent Buffer | Yes | Yes | NA

Level of Detail
LOD Bias | 1 | 1 | NA
Maximum LOD level | 0 | 0 | NA

Meshes
Skin Weights | Unlimited | Unlimited | NA

Lighting
Main Light | NA | NA | Per pixel
• Cast Shadows | NA | NA | Yes
• Shadow Resolution | NA | NA |
Additional Lights | NA | NA | Per pixel
• Per Object Limit | NA | NA | 4
• Cast Shadows | NA | NA | Yes
• Shadow Atlas Resolution | NA | NA | 2048
• Shadow Resolution tiers | NA | NA | 512/1024/2048
• Cookie Atlas Resolution | NA | NA | 2048
• Cookie Atlas Format | NA | NA | Color high
Reflection Probes | NA | NA |
• Probe Blending | NA | NA | Yes
• Box Projection | NA | NA | No

Post-processing
Grading Mode | NA | NA | Low Dynamic Range
LUT size | NA | NA | 32
Fast sRGB/Linear conversion | NA | NA | No


An option available in URP in Unity 2022 LTS is Temporal Anti-aliasing (TAA), selectable per Camera via Camera > Rendering > Anti-aliasing.

TAA selected

How to work with Quality settings

Quality settings were previously handled in the Quality panel of the Project
Settings dialog box. When using URP, settings are divided between the Quality
panel and those for each URP Asset. The following table shows where each
setting can be found.

Quality settings when using URP

Setting | Quality panel | URP Asset

Rendering
Anti-aliasing | | √
Render Scale | | √
Resolution Scaling Fixed DPI Factor | √ |
VSync Count | √ |
Depth Texture | | √
Opaque Texture | | √
Opaque Downsampling | | √
Terrain Holes | | √
HDR | | √

Textures
Texture Quality | √ |
Anisotropic Textures | √ |
Texture Streaming | √ |

Particles
Particle Raycast Budget | √ |

Terrain
Billboards Face Camera Position | √ |

Shadows
Shadowmask Mode | √ |
Shadow Resolution | | √
Shadow Distance | | √
Shadow Cascades | | √
Cascade splits | | √
Working unit | | √
Depth Bias | | √
Normal Bias | | √
Soft Shadows | | √

Async Asset Upload
Time Slice | √ |
Buffer Size | √ |
Persistent Buffer | √ |

Level of Detail
LOD Bias | √ |
Maximum LOD level | √ |

Meshes
Skin Weights | √ |

Lighting
Main Light | | √
• Cast Shadows | | √
• Shadow Resolution | | √
Additional Lights | | √
• Per Object Limit | | √
• Cast Shadows | | √
• Shadow Atlas Resolution | | √
• Shadow Resolution tiers | | √
• Cookie Atlas Resolution | | √
• Cookie Atlas Format | | √
Reflection Probes | | √
• Probe Blending | | √
• Box Projection | | √

Post-processing
Grading Mode | | √
LUT size | | √
Fast sRGB/Linear conversion | | √

If you switch between Quality options, choose a Quality Level for the Render Pipeline Asset in the Quality panel via Project Settings. Note that if the Quality Level is not set, the Render Pipeline Asset defaults to the one set as the Scriptable Render Pipeline Asset in the Graphics panel. This can cause some confusion as you attempt to adjust the Quality settings of a URP Asset: for instance, you might mistakenly assume that the URP Asset you are editing is the one currently used by the Scene and Game views.

Setting the Quality Level for the Render Pipeline Asset
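The same switch can be performed from a script; a minimal sketch using the QualitySettings API:

    using UnityEngine;

    public class QualityLevelSwitcher : MonoBehaviour
    {
        public void SetLevel(int index)
        {
            // 'true' also applies expensive changes such as anti-aliasing.
            QualitySettings.SetQualityLevel(index, true);

            // The URP Asset assigned to the now-active Quality Level;
            // null means it falls back to the Graphics panel default.
            Debug.Log(QualitySettings.renderPipeline);
        }
    }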

Modifying a URP Asset

Note: If you have the URP 2D Renderer enabled, some of the options related to 3D rendering in the URP Asset will not impact your final app or game. The 2D Renderer Asset is available under Scriptable Render Pipeline Settings via Edit > Project Settings > Graphics.

A URP Asset in the Inspector



This image shows a URP Asset in the Inspector with all of its available settings.
See the URP documentation to learn more about each setting.

The Quality panel for a URP Asset allows you to set the HDR format to 64-bit
for better fidelity. However, be aware that this results in a performance hit and
requires additional memory, so avoid this setting on low-end hardware.

Another feature of the Quality panel is the option to enable LOD Cross Fade.
LOD is a technique to reduce the GPU cost needed to render distant meshes.
As the Camera moves, different LODs will be swapped to provide the right level
of quality. LOD Cross Fade allows for smoother transitions of different LOD
geometries and avoids the harsh snapping and popping that occurs during a
swap.

Converting an example project from the Built-in Render Pipeline to URP

The Unity demo project Viking Village URP shows off URP capabilities with
Light Probes, Reflection Probes, water special effects that use a custom
ScriptableRenderPass, shaders converted via Shader Graph, and URP
post-processing. The project is available for free on the Unity Asset Store.

Open Viking Village URP in the Editor to follow along with the steps in this
section. Start by clicking Add to My Assets to add this demo to the Packages
list available in-Editor.

Viking Village URP on the Unity Asset Store

Then create a new 3D project from the Unity Hub (you don’t need to use the
URP template). Go to Window > Package Manager, select My Assets > Viking
Village URP from the Packages drop-down, and click Import.



Viking Village URP in Package Manager

A couple of warning messages will appear (see below). The first warns that importing a complete project will affect your current Project Settings; since you've created an empty project, it's safe to proceed. The second alerts you about installing or upgrading certain packages. Click the default blue button. This is required to avoid an incorrect lighting setup, because URP defaults to a linear color space, unlike the Built-in Render Pipeline, which defaults to a gamma color space.

Warning messages that appear while importing Viking Village URP

Once the download is complete, the panel shown below will open.
Make sure to leave everything selected and click Import.



Importing the demo project

Wait for all the assets to finish importing, then go to the demo located in Viking
Village > Scenes > The_Viking_Village. Click Window > Package Manager, and
in the drop-down select Unity Registry, followed by Universal RP. Update the
URP package to 14.x.



Viking Village in Game view

The URP Asset set in the Graphics panel, via the Scriptable Render Pipeline
Asset, is named Viking Village > Rendering > VikingVillageUniversal. It is
configured for high-end hardware, and therefore, might play at a low frame rate
on older hardware.

Follow these steps to test different Quality Levels:

1. Generate a set of assets via Window > Rendering > Render Pipeline Converter.

2. Choose the Built-in Render Pipeline to URP option, then select Rendering Settings.

3. Click Initialize Converters.

4. A number of Settings options will appear in the panel; click Convert Assets to create the URP Assets.

5. The URP Assets will be assigned to the available Quality Levels via the Project Settings > Quality panel.

6. The highest quality asset will replace VikingVillageUniversal in the Graphics panel. The Viking Village > Rendering > VikingVillageUniversal_Renderer makes use of Renderer Features and water effects.

7. To restore these, add the above renderer to the Renderer List and set it as the default for each URP Asset used in the Quality Levels. Now you can quickly switch Quality Levels in the Quality panel.

Viking Village in Scene view


Lighting in URP

This section shows how lighting in URP works and describes the differences
between the workflows of the two rendering pipelines.

Start with these resources if you are new to lighting in Unity:

— Lighting documentation

— Introduction to lighting and rendering

— The art of lighting game environments

— Real-time lighting in Unity

— Harnessing light with the URP and the GPU Lightmapper

If you convert a project from the Built-in Render Pipeline to URP, you might
notice differences in the lighting. This is because the Built-in Render Pipeline
uses a gamma lighting model by default and URP uses a linear model. As such,
any light with an intensity value differing from 1.0 will need to be adjusted.
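There is no automatic tool for this step. The editor helper below is a hedged sketch: it raises every light's intensity to the power 2.2 as a rough gamma-to-linear correction. That exponent is an illustrative assumption, not an official conversion, so verify the results visually.

    using UnityEngine;
    using UnityEditor;

    public static class LightIntensityConverter
    {
        [MenuItem("Tools/Approximate Gamma To Linear Light Intensities")]
        static void Convert()
        {
            foreach (var light in Object.FindObjectsOfType<Light>())
            {
                Undo.RecordObject(light, "Convert light intensity");
                // Rough remap of a gamma-era intensity into linear space.
                light.intensity = Mathf.Pow(light.intensity, 2.2f);
            }
        }
    }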

There are also differences in where to find the Settings controls in-Editor,
as well as how to handle the challenge of widely differing hardware specs.
The rest of this section covers some tricks you can use to achieve balance
between graphic fidelity and performance.

As before, you’ll set properties in the three places listed here. A and B are
essentially the same for both render pipelines, while C applies to URP only:



A. Window > Rendering > Lighting: This panel allows you to set lightmapping and environment settings, and view real-time and baked lightmaps. It is unchanged from the Built-in Render Pipeline to URP.

B. Light Inspector: There are significant differences between the Built-in Render Pipeline and URP Inspectors. See the Light Inspector section for details.

C. URP Asset Inspector: This is the principal place where you will set up shadows. Lighting in URP relies heavily on the settings chosen in this panel.

Quality settings are handled via Edit > Project Settings > Quality in the Built-in
Render Pipeline. In URP, this depends on the URP Asset settings which can be
swapped using the Quality panel (see the Quality settings section).

As the focus here is on lighting, the methods apply to materials that use the
shaders in the following table.

URP shaders for lit scenes

Shader | Description

Complex Lit | Has all the features of the Lit Shader. Select it when using the Clear Coat option to give a metallic sheen to a car, for example. The specular reflection is calculated twice: once for the base layer, and again to simulate a transparent thin layer on top of the base layer.

Lit | Lets you render real-world surfaces, such as stone, wood, glass, plastic, and metals, with photorealistic quality. The light levels and reflections look lifelike and react across various lighting conditions, from bright sunlight to a dark cave. This is the default choice for most materials that use lighting. It supports baked, mixed, and real-time lighting, and works with Forward or Deferred rendering. It uses a physically based shading (PBS) model; due to the complexity of the shading calculations, it's best to avoid using this shader on low-end mobile hardware.

Simple Lit | Not physically based. It uses a non-energy-conserving Blinn-Phong shading model and gives a less photorealistic result. Nonetheless, it can provide an excellent visual appearance, and is better suited to non-physically based projects targeting low-end mobile devices.

Baked Lit | Provides a performance boost for objects that don't need to support real-time lighting, including distant static objects that will never be affected by dynamic objects, real-time lights, or dynamic shadows.



Built-in Render Pipeline vs URP lighting falloff and attenuation

Another difference between the Built-in Render Pipeline and URP is how they compute the light falloff/attenuation that applies to Spot and Point lights.

URP uses the physically based InverseSquared falloff, described here. The Built-in Render Pipeline uses the Legacy falloff, which is not physically based, as described on the same page. With the Legacy falloff, the light radius affects the falloff curve, which can result in lights with a big radius and can impact culling performance, as the light touches more objects than necessary.
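As a rough sketch of the difference, the URP distance attenuation for punctual lights has the form below, where $d$ is the distance from the shaded point to the light and $r$ is the light's range. The windowing term is an assumption based on typical URP shader code, not something specified in this guide:

    $\mathrm{atten}(d) = \dfrac{1}{d^{2}}\,\Big(\operatorname{saturate}\big(1 - (d/r)^{4}\big)\Big)^{2}$

The $1/d^{2}$ factor is the physically based inverse-squared falloff; the windowing term only fades the light smoothly to zero at the range boundary so it can be culled cleanly. The Legacy falloff instead bakes the radius into the brightness curve itself, which is why enlarging a light's radius changes both its look and its culling cost.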

Lit or Simple Lit?

The choice between a Lit Shader and Simple Lit Shader is largely an artistic
decision. It is easier for artists to get a realistic render using the Lit Shader,
but if a more stylized render is desired, Simple Lit provides stellar results.

Comparing scenes rendered using different shaders: The top-left image uses the Lit Shader, the top-right, the Simple Lit
Shader, and the bottom image, the Baked Lit Shader.

It’s possible to implement your own custom lighting model by writing


a custom shader or using Shader Graph (see the Additional tools chapter).

Lighting overview

Lights are divided into Main Light and Additional Lights in URP. The Main Light
is the most significant Directional light. This is either the brightest light or the
one set via Window > Rendering > Lighting > Environment > Sun Source.



Setting the Sun Source

Later in the guide, you'll learn how to use the URP Asset settings to set the number of dynamic lights that affect an object via the Per Object Limit, which is capped at eight for the URP Forward Renderer. However, the number of dynamic lights that can be used per Camera is also limited, and varies by hardware, as shown in the following table.

Camera light limits when using the URP Forward Renderer

Light type | Category | Maximum lights rendered (non-mobile) | Maximum lights rendered (mobile) | Maximum lights rendered (OpenGL ES 2.0) | Supports shadows

Directional | Main | 1 | 1 | 1 | True
Spot | Additional | 256* | 32* | 16* | True
Point | Additional | 256* | 32* | 16* | True
Directional | Additional | 256* | 32* | 16* | False

* All Additional Lights share the same budget.

When a scene is culled, Unity chooses up to the maximum number of supported dynamic lights for that frame. Then, when rendering an object, it selects only the most significant of these lights to light that object dynamically.

Projects with a small number of dynamic lights might not encounter any issues; however, as you add more lights, you might experience light popping as different lights are dynamically culled. Of course, there is a performance cost to having more dynamic lights in a scene: each dynamic light needs to be culled against the Camera and then sorted by priority, and there is also the cost of rendering each light per object. As always, try to maintain a balance between fidelity and performance.



Real-time and Mixed Mode lights

Real-time and Mixed Mode lights are first culled against the Camera
frustum. If occlusion culling is enabled, lights hidden by other objects in
the scene are also culled.

The visible list of lights that survive the culling process is then sorted
by each light’s distance from the Camera. If there are visible lights, the
limits in the table above come into play. For example, if you have 1,000
lights in the scene but only 200 are visible to the Camera, all those
would fit the limit for non-mobile platforms.

Now the list of visible lights is culled per object. Lights are sorted
by intensity at the pivot of the object – this way, brighter lights are
prioritized first. If an object is affected by more than the maximum
number of lights allowed per object, the excessive lights are discarded.

Consider the following options if you are hitting light limits:

— If the light’s position and intensity are static, bake it and use Light Probes
to add the light to the rendering of dynamic geometry. See the section on
Light Probes for more information.

— Use Light Layers to limit which geometry is affected by which light.

— Limit the range of the light (see the sketch after this list). This option does not apply to Directional lights, as they're global.

— Fake the lighting using emissive materials.
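As a small illustration of the range option, the following sketch clamps the range of every non-Directional light in the scene; the 25-unit cap is an arbitrary example value:

    using UnityEngine;

    public class LightRangeClamper : MonoBehaviour
    {
        const float MaxRange = 25f; // Example cap; tune to your scene's scale.

        void Start()
        {
            foreach (var light in FindObjectsOfType<Light>())
            {
                // Directional lights are global and have no range to limit.
                if (light.type == LightType.Directional)
                    continue;

                light.range = Mathf.Min(light.range, MaxRange);
            }
        }
    }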

The light limits discussed here are those that apply with the Forward Renderer, the default renderer when using URP.

The Forward Renderer uses a single-pass approach to calculate the lighting of an object in a single draw call. This is a performant option; however, GPU limitations restrict the number of lights that an object can consider when setting the color for a pixel. If you're targeting high-end hardware, you can avoid these limitations by using the Deferred Rendering Path in URP.



Rendering path comparison

Unity 2022 LTS provides three options for rendering: Forward, Forward+, and
Deferred.

Feature | Forward | Forward+ | Deferred

Maximum number of real-time lights per object | 9 | Unlimited; a per-Camera limit applies | Unlimited
Per-pixel normal encoding | No encoding (accurate normal values) | No encoding (accurate normal values) | Two options: quantization of normals in the G-buffer (loss of accuracy, better performance), or octahedron encoding (accurate normals, might have a significant performance impact on mobile GPUs). For more information, see Encoding of normals in G-buffer.
MSAA | Yes | Yes | No
Vertex lighting | Yes | No | No
Camera stacking | Yes | Yes | Supported with a limitation: Unity renders only the base Camera using the Deferred path, and renders all overlay Cameras using the Forward Rendering path

Use the Universal Renderer Data asset to switch between the rendering paths.

Choosing a rendering path



When using Forward+, a number of URP Asset Lighting settings are overridden:

— Main Light: The value of this property is Per Pixel, regardless of the value
you select.

— Additional Lights: The value of this property is Per Pixel, regardless of the
value you select.

— Additional Lights > Per Object Limit: Unity ignores this property.

— Reflection Probes > Probe Blending: Reflection Probe blending is always on.

Light Inspector

The Light Inspector is one of three places where you can set up lighting.

Just as with the Built-in Render Pipeline, URP supports Directional, Spot, Point,
and Area lights, though Area lights only work in Baked Indirect Mode. See the
Light Mode section for more details.

A side-by-side comparison of the Light Inspector panel in URP (left) and the Built-in Render Pipeline (right)

The image above shows how light properties are presented in the two versions
of the Light Inspector. The URP version has five groupings of controls, based
on whether the light is Directional or Point, and an additional Shape grouping
for Spot and Area lights.

This table lists the differences between the URP and Built-in Render Pipeline Light Inspectors:

URP Light Inspector property | Description | Built-in Render Pipeline Light Inspector property

Light Appearance | Choose between Color or Filter and Temperature. Color sets the emitted light color. Filter and Temperature use both a color (filter) and a temperature to switch between cool and warm lighting. | NA
Bias | Controls shadow acne. The default is to use the URP Asset settings; alternatively, you can set custom values in this Inspector. | Bias/Normal Bias
Light Cookie | If a texture is set to use a light cookie and the light type is Directional, a new panel lets you control the x and y size of the cookie, as well as its offset. A cookie for a Point light must be a cubemap. URP supports colored cookies, whereas the Built-in Render Pipeline is greyscale only. | Cookie
Shape: Spot | You can now control both the inner and outer cone angles for Spot lights. | Spot Angle, Range
Shape: Area | Controls the shape of an Area light. | Shape, Width, Height, Radius
NA | A halo is easily reproduced using a billboard or a Fresnel shader controlling the alpha value of a sphere that sits at the center of the light. See the Light Halos section for more information. | Draw Halo
NA | Check out the Lens Flare section to see how to implement a lens flare in URP. | Flare

Lighting a new scene

Creating a Lighting Settings Asset

The first step to lighting a new scene for URP is to create a new Lighting
Settings Asset (see image above). Open Window > Rendering > Lighting, and
once you’re on the Scene tab, click New Lighting Settings, and give the new
asset a name. The settings that you apply in Lighting panels are now saved to it.
Switch between settings by switching the Lighting Settings Asset.
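This can also be scripted in the Editor. A minimal sketch that creates a Lighting Settings Asset and assigns it to the active scene (the asset name and path are arbitrary examples):

    using UnityEngine;
    using UnityEditor;

    public static class LightingSettingsCreator
    {
        [MenuItem("Tools/Create And Assign Lighting Settings")]
        static void CreateAndAssign()
        {
            var settings = new LightingSettings { name = "MyLightingSettings" };

            // Persist the settings as a project asset.
            AssetDatabase.CreateAsset(settings, "Assets/MyLightingSettings.lighting");

            // Assign it to the active scene, as the Lighting window does.
            Lightmapping.lightingSettings = settings;
        }
    }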

Ambient or Environment lighting

There is no change in the way that Ambient/Environment lighting is defined from the Built-in Render Pipeline to URP. The main ambient light is calculated from the panel accessible via Window > Rendering > Lighting > Environment.



The available settings for lighting in the Environment panel

You can set Environment Lighting to use the scene’s Skybox, with an option to
adjust the Intensity, Gradient, or Color.

Environment Lighting options
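If you need to drive these options from a script, they map to the RenderSettings API; a minimal sketch:

    using UnityEngine;
    using UnityEngine.Rendering;

    public class AmbientLightingController : MonoBehaviour
    {
        void Start()
        {
            // Use a flat ambient color instead of the Skybox...
            RenderSettings.ambientMode = AmbientMode.Flat;
            RenderSettings.ambientLight = new Color(0.2f, 0.2f, 0.25f);

            // ...or switch back to Skybox lighting and scale its intensity.
            RenderSettings.ambientMode = AmbientMode.Skybox;
            RenderSettings.ambientIntensity = 1.2f;
        }
    }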



Shadows

The biggest change from working with the Built-in Render Pipeline to URP
lies in how you set up shadows.

Shadow settings are no longer available via Project Settings > Quality.
As discussed earlier, you need a Renderer Data object and a Render Pipeline
Asset when using URP. The section on setting up a project for URP covers how
to view your scene via Render Pipeline Asset, which you can use to define the
fidelity of your shadows.

The URP Asset

Main Light: Shadow Resolution

The Lighting and Shadow groups in the URP Asset are key to setting up shadows in your scene. First, set Main Light to Disabled or Per Pixel, then enable the Cast Shadows checkbox. The last setting is the resolution of the shadow map.

If you’ve worked with shadows in Unity before, you know that real-time shadows
require rendering a shadow map that contains the depth of objects from the
perspective of the light. The higher the resolution of this shadow map, the
higher the visual fidelity – though both more processing power and increased
memory are required. Factors that increase shadow processing include:

— The number of Shadow Casters rendered in the shadow map; for the Main Light, this number depends on the Shadow Distance (the far plane of the shadow frustum)

— Shadow Receivers that are visible (you have to encompass them all)

— Shadow Cascades splits

— Shadow filtering (Soft Shadows)

The highest resolution isn’t always ideal. For example, the Soft Shadows option
has the effect of blurring the map. In the following image of a cartoon-like
haunted room, you can see that the chair in the foreground casts a shadow on the
desk drawers, which appears too crisp when the resolution is greater than 1024.
Setting the Shadow Resolution for the Main Light: The resolution is set to 256 in the top-left image, 512 in the top-right image, 1024 in the
middle-left image, 2048 in the middle-right image, and 4096 in the bottom image.

Main Light: Shadow Max Distance

Varying Max Distance for the Main Light Shadow: Top-left image – 10, top-right image – 30, bottom-left image – 60,
bottom-right image – 400



Another important setting for the Main Light Shadow is Max Distance. This is set
in scene units. In the image above, the poles are 10 units apart. The Max Distance
varies from 10 to 400 units. Notice that only the first pole casts a shadow, and this
is cut short at 10 units from the Camera location. At 60 units (bottom-left image),
all shadows are in view – the shadow fidelity is adequate. When the Max Distance
is much greater than the visible assets, the shadow map is being spread over too
large an area. This means that the region in-shot has a much lower resolution than
required.

The Max Distance property needs to relate directly to what the user can see, as well as the units used in the scene. Aim for the minimum distance that gives acceptable shadows (see the note below). If the player only sees shadows from dynamic objects up to 60 units from the Camera, then set Max Distance to 60. When the Lighting Mode for Mixed Lights is set to Shadowmask, the shadows of objects beyond the Shadow Distance are baked. If this were a static scene, you would see shadows on all objects, but only dynamic shadows would be drawn up to the Shadow Distance.
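Max Distance can also be set from a script, which is useful for scaling shadows per platform. A minimal sketch using the active URP Asset (note that in the Editor this modifies the asset itself, so the change persists):

    using UnityEngine;
    using UnityEngine.Rendering.Universal;

    public class ShadowDistanceScaler : MonoBehaviour
    {
        void Start()
        {
            // The URP Asset in use for the current Quality Level.
            UniversalRenderPipelineAsset urpAsset = UniversalRenderPipeline.asset;
            if (urpAsset == null)
                return;

            // Shorter shadow distance on handheld devices, longer elsewhere.
            urpAsset.shadowDistance =
                SystemInfo.deviceType == DeviceType.Handheld ? 30f : 60f;
        }
    }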

Note: URP only supports Stable Fit Shadow Projection, which relies
on the user to set up the Max Distance. The Built-in Render Pipeline
supports both Stable Fit and Close Fit for the Shadow Projection
property. In the latter mode, the bottom-left and -right images would
have the same quality, as Close Fit reduces the shadow distance plane
to fit the last caster. The disadvantage is that, since Close Fit changes
the shadow frustum “dynamically,” it can cause a shimmer/dancing
effect in the shadows.

Shadow Cascades

As assets disappear into the distance due to perspective, it is convenient to decrease shadow resolution, thereby devoting more of the shadow map to shadows closer to the Camera. Shadow Cascades can help with this.

The images below show the shadow map of the scene with the chair and desk in
the haunted room. The cascade count is 1 in the image to the left. The map takes
up the whole area. In the image to the right, the cascade count is 4. Notice that the
map includes four different maps, with each area receiving a lower resolution map.

A cascade count of 1 is likely to give the best result for small scenes like
this. But if your Max Distance is a large value, then a cascade count of 2 or
3 will give better shadows for foreground objects, as these receive a larger
proportion of the shadow map. Notice that the chair in the left image is much
bigger, resulting in a sharper shadow.



Shadow map when cascade count is set to 1 (left image) and 4 (right image)

You can adjust the start and end ranges for each section of the cascade
using the draggable pointers, or by setting the units in the relevant fields (see
following image). Always adjust Max Distance to a value that is a close fit for
your scene and choose the slider positions carefully. If you use metric as the
working unit, always choose the last cascade to be, at most, the distance of the
last Shadow Caster.

Additional Light Shadows

Settings available for Additional Lights in URP Asset



Note: URP does not support shadows for additional Directional lights.
Remember, the Main light is always the brightest Directional light. For
Additional lights with shadows, use a Point or Spot light.

Having sorted the shadows for the Main Light, it’s time to move on to Additional
Lights Mode. Enable additional lights to cast shadows by setting the Additional
Lights Mode for the URP Asset to Per Pixel. While the mode can be set to Disabled,
Per Vertex, or Per Pixel (see above image), only the latter works with shadows.

Check the Cast Shadows box. Then, select the resolution of the Shadow Atlas.
This is the map that will be used to combine all the maps for every light casting
shadows. Bear in mind that a Point light casts six shadow maps, creating a
cubemap, since it casts light in all directions. This makes a Point light the most
demanding performance-wise. The individual resolution of an additional light
shadow map is set using a combination of the three Shadow Resolution tiers,
plus the resolution chosen via the Light Inspector when selecting the light in the
Hierarchy window.

Shadows group in the Light Inspector

In the haunted room, there is a Spot light over the mirror and a Point light over
the desk, giving seven shadow maps in total (six for the Point light, one for the
Spot light). To fit these seven maps onto a 1024px square map, the size of each
map needs to be 256px or smaller. If you exceed this size, the resolution of
shadow maps will shrink to fit the atlas, resulting in a warning message in the console.

Number of maps    Atlas tiling    Atlas size (multiply shadow tier size by)
1                 1x1             1
2–4               2x2             2
5–16              4x4             4

Setting the Shadow Atlas size based on the number of Additional Lights shadow maps and the tier size chosen per map



Shadow Atlas for Additional Lights

The image above shows the six maps used by the Point light where the
resolution is set to medium and the tier value to 256px. The Spot light has a
resolution set to high, with a tier value of 512px.

This is a low-polygon version of the haunted room, lit with a Main Directional light, a Point light over the desk, and a Spot
light over the mirror. All lights are real-time and casting shadows.



Light Modes

Environments have predominantly static geometry, so if a light is static,
you don't need to calculate the lighting and shadows for it repeatedly. You can
calculate this once at design time, and then use that data when rendering the
geometry. This is called lightmapping or baking.

The workflow for lightmapping is unchanged between the Built-in Render Pipeline
and URP. Let’s go through the steps using an FPS Sample project by Unity.

Note: Low frequency refers to the fact that lightmaps are updated at
a much lower rate than screen updates. Specular Lobes can only be
computed by real-time lights. You can apply Global Illumination (GI) to
dynamic objects by using Light Probes, but those also only capture low
frequency diffuse light. The Built-in Render Pipeline supports Light Probe
Proxy Volume (LPPV), which provides the same level of quality for Light
Probes as lightmaps do for dynamic objects. However, in URP, LPPV is
not supported due to it being a relatively slow system that doesn’t scale.
Instead, URP plans to support Adaptive Probe Volumes, which could
replace lightmaps and work for both static and dynamic objects.

The following screenshots are from the Unity project FPS Sample: The Inspection,
which you can download here. The scene demonstrates how to use real-time and
baked lighting in URP.

The scene from FPS Sample: The Inspection by Unity

1. The scene from the FPS sample project contains largely static geometry. To
include the geometry in lightmapping, click the Static box to the right side
of the Inspector.



2. Choose the lightmapping settings via Window > Rendering > Lighting >
Scene. Keep the Lightmap Resolution low while adjusting the settings. Once
you have your desired settings, increase the value when generating the final
lightmaps. Choose Progressive GPU (Preview) to speed up the lightmap
generation, if your GPU supports it.



3. Filtering blurs the map to minimize noise. This can result in gaps in a
shadow where one object meets another. Use A-Trous filtering to minimize
this artifact. See Progressive Lightmapping documentation for more details
on the settings available for lightmapping.

How filtering affects the shadow between objects

4. Make sure all static geometry has no overlapping UV values, or is generating
lighting UVs on import.

5. Set Light Mode to Baked or Mixed. Select the light in the Hierarchy window
and use the Inspector. Mixed Lights will illuminate dynamic objects as well as
static ones.



6. When using Mixed Lights, set the Light Mode to Baked Indirect,
Subtractive, or Shadowmask via Window > Rendering > Lighting > Scene.

a. Baked Indirect: Only the indirect light contribution will be baked into
the lightmaps and Light Probes (the bounces of the lights only). Direct
lighting and shadows will be real-time. This is an expensive option
and not ideal for mobile platforms. However, it does mean that you get
correct shadows and direct light for both static and dynamic geometry.

b. Subtractive: Here, you bake the direct lighting from a Directional
light set to Mixed into the static geometry, and subtract the lighting
from shadows cast by dynamic geometry. This means the static
geometry cannot cast a shadow on dynamic objects unless Light
Probes are used, which can cause unpleasant visual discontinuities.
URP calculates an estimate of the contribution of the light from the
Directional Light and subtracts that from the baked Global Illumination.
The estimate is clamped by the Real-time Shadow Color setting in the
Environment section of the Lighting window, so the color subtracted is
never darker than this color. URP then uses the minimum of the
subtracted value and the original baked color. This is the most suitable
option for low-end hardware.

c. Shadowmask: Though similar to Baked Indirect Mode, Shadowmask
combines both dynamic and baked shadows, rendering shadows at
a distance. It does this by using an additional Shadowmask texture
and storing additional information in the Light Probes. This provides
the highest fidelity shadows, but is also the most expensive option in
terms of memory use and performance. Visually, it’s identical to Baked
Indirect for shots up close. The difference is apparent when looking in
the far distance, making it well-suited for open-world scenes. Due to the
processing cost, it’s recommended for mid- to high-end hardware only.

7. Adjust the Lightmap Scale via Asset > Inspector > Mesh Renderer >
Lightmapping > Scale In Lightmap, so that distant objects take up less
space on the lightmap. The following image shows the texel size of the
background rock lightmap with a setting varying from 0.05 to 0.5.



8. Click Generate Lighting to bake. The baking time depends on the number
of static objects, lights set to Mixed or Baked mode, and the settings
chosen for lightmapping, particularly the Max Lightmap Size and the
Lightmap Resolution.


More resources:

— Lightmapping documentation

— Lighting Settings Asset documentation

— Lighting Explorer documentation

— 5 common lightmapping problems and tips to help you fix them



Rendering Layers

The Rendering Layers feature lets you configure certain lights to affect only
specific GameObjects so you can emphasize and draw attention to them in a
scene. In the image below, the syringe, a key collectable, appears in a shaded
part of the scene. With a Rendering Layer, it becomes visible and helps ensure
that the player doesn’t miss picking it up.

Highlighting an object using Rendering Layers

Here are the steps for setting up Rendering Layers.

1. Select the URP Asset. In the Lighting section, click the vertical ellipsis icon
and select Show Additional Properties.

2. A new setting, Use Rendering Layers, will appear under the Lighting section.

3. Rename a Rendering Layer via Project Settings > Graphics > URP Global
Settings.



4. Now that Rendering Layers are enabled, the Light Inspector will include a
Rendering Layer drop-down. A light can contribute to more than one layer.

5. With Rendering Layers enabled, you need to set up a custom shadow layer.
The new light can cast shadows from the scene’s Main Light or from its own
frustum.

6. Lastly, select the object this applies to in the Hierarchy window and then
set the Rendering Layer Mask.



This can also be set dynamically in code:

Renderer renderer = GetComponent<Renderer>();

// Convert the layer index into a bitmask and assign it to the renderer
int layerID = 1;
int mask = 1 << layerID;
renderer.renderingLayerMask = (uint)mask;
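
The same bitmask convention applies on the light side. Here is a small sketch, assuming the Light should affect only the layer index used above (the class name is illustrative):

using UnityEngine;

// Minimal sketch: restrict a Light to the Rendering Layer with index 1,
// matching the renderer mask set above.
public class LightLayerSetup : MonoBehaviour
{
    void Start()
    {
        Light highlight = GetComponent<Light>();
        highlight.renderingLayerMask = 1 << 1; // bitmask for layer index 1
    }
}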

Light Probes

As covered in an earlier section, you can combine baked and dynamic objects
in the Light Mode section using Mixed Lighting Mode. It’s recommended to also
add Light Probes to your scene when using this mode. Light Probes save the light
data at a particular position within an environment when you bake the lighting
by clicking Generate Lighting via Window > Rendering > Lighting panel. This
ensures that the illumination of a dynamic object moving through an environment
reflects the lighting levels used by the baked objects. In a dark area it will be dark,
and in a lighter area it will be brighter. Below, you can see the robot character
inside and outside of the hangar in the FPS Sample: The Inspection.

The robot inside and outside of the cave, with lighting level affected by Light Probes



To create Light Probes, right-click in the Hierarchy window and choose Light >
Light Probe Group.

Creating a new GameObject for the Light Probe Group

Initially, there will be a cube of Light Probes, eight in total. To view and edit the
positioning of the Light Probes and add additional ones, select the Light Probe
Group in the Hierarchy window, and in the Inspector click Light Probe Group >
Edit Light Probes.

Add or remove Light Probes and modify their position from the Inspector.

The Scene view will now be in an editing mode where only Light Probes can be
selected. Use the Move tool to move them around.

Moving a Light Probe


Light Probes should be positioned, first, in an area where a dynamic object
might move to, and second, where there is a significant change in lighting level.
When calculating the lighting level for an object, the engine finds a tetrahedron
of the nearest Light Probes and uses those to determine an interpolated value for
the illumination level.

The nearest Light Probes for the selected crate

Positioning Light Probes can be time-consuming, but a code-based approach
such as this one can speed up your editing, especially for a large scene.
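
To illustrate the idea, here is a minimal sketch that fills a Light Probe Group with a regular grid of probes; the class, fields, and spacing values are hypothetical, and the grid is placed in the group's local space:

using System.Collections.Generic;
using UnityEngine;

// Minimal sketch: populate a Light Probe Group with a regular grid of probes.
[ExecuteAlways]
public class ProbeGridPlacer : MonoBehaviour
{
    public LightProbeGroup probeGroup;               // assign in the Inspector
    public Vector3 size = new Vector3(10f, 4f, 10f); // volume to fill, in local units
    public float spacing = 2f;                       // distance between probes

    [ContextMenu("Place Probes")]
    void PlaceProbes()
    {
        var positions = new List<Vector3>();
        for (float x = 0; x <= size.x; x += spacing)
            for (float y = 0; y <= size.y; y += spacing)
                for (float z = 0; z <= size.z; z += spacing)
                    positions.Add(new Vector3(x, y, z)); // positions are local to the group

        probeGroup.probePositions = positions.ToArray();
    }
}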

Further details on how a Mesh Renderer works with Light Probes and how to
adjust the configuration can be found in this documentation.

Reflection Probes

A ray-tracing tool, such as Maya or Blender, can take the time needed to accurately
calculate the values for each frame pixel of a reflective surface. This process
takes far too long for a real-time renderer, which is why shortcuts are often used.

Reflections in a real-time renderer use environment maps (pre-rendered
cubemaps). Unity supplies a default map using the SkyManager. Having a
single map as the source of reflections from all locations in a scene can lead to
unconvincing reflections. Take the example of the robot shown in this section.
If the metal parts of this character always reflect the sky, then it will look very
strange when inside the hangar where the sky is not visible. This is where
Reflection Probes are helpful.

A Reflection Probe is simply a pre-rendered cubemap placed at a key position in the
scene. You can use several Reflection Probes in a single scene. As a dynamic object
moves through the scene, it can select the nearest Reflection Probe and use that as
the basis of its reflections. You can also set up the scene to blend between probes.

To create a Reflection Probe, right-click the Hierarchy window and select Light
> Reflection Probe.

Creating a Reflection Probe

Then position the probe and adjust its settings. Once the probe is placed
correctly and the settings are adjusted, click Bake to generate a cubemap.

Reflection Probe settings

The following image shows the two Reflection Probes used in FPS Sample:
The Inspection, one inside the hangar and one outside.



Each Reflection Probe captures an image of its surroundings in a cubemap texture.

Reflection Probe blending

Blending is a great feature of Reflection Probes. You can enable blending via the
Renderer Asset Settings panel. Blending is always on when the Forward+ path
is chosen, regardless of the Renderer Asset setting.

Blending gradually fades out one probe’s cubemap, while fading in the other as
the reflective object passes from one zone to the other. This gradual transition
avoids the situation where a distinctive object suddenly “pops” into the
reflection as an object crosses the zone boundary.

Box Projection

Normally, the reflection cubemap is assumed to be at an infinite distance from
any given object. Different angles of the cubemap will be visible as the object
turns, but it's not possible for the object to move closer or further away from the
reflected surroundings. While this works well for outdoor scenes, its limitations
show in an indoor scene. The interior walls of a room are clearly not an infinite
distance away, and the reflection of a wall should get larger as the object nears it.

The Box Projection option enables you to create a reflection cubemap at a
finite distance from the probe, allowing objects to show reflections of different
sizes according to their distance from the cubemap's walls. The size of the
surrounding cubemap is determined by the probe's zone of effect, depending
on its Box Size property. For example, with a probe that reflects the interior
of a room, you should set the size to match the dimensions of the room.
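
These settings are also exposed to scripts. A minimal sketch, with the room dimensions as placeholder values:

using UnityEngine;

// Minimal sketch: configure a Reflection Probe for an interior room from code.
public class RoomProbeSetup : MonoBehaviour
{
    void Start()
    {
        ReflectionProbe probe = GetComponent<ReflectionProbe>();
        probe.boxProjection = true;           // reflections respect the room's walls
        probe.size = new Vector3(8f, 3f, 6f); // match the dimensions of the room
        probe.center = Vector3.zero;          // probe volume centered on the probe
    }
}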

Lens Flare

The workflow for creating a Lens Flare has been updated for URP. The first step in
configuring it is to create a Lens Flare (SRP) Data asset. Right-click in the Project
window, in a suitable Assets folder, and select Create > Lens Flare (SRP).



Creating a Lens Flare (SRP) Data asset

Use this asset to configure the shape of your flare by setting each element's Type
to Circle, Polygon, or Image, and adjusting its Tint and Intensity.

Adding and configuring Lens Flare elements



To render a Lens Flare, choose the light source that will cause the flare and then
select Add Component > Rendering > Lens Flare (SRP).

Setting up rendering for a Lens Flare

In the Settings panel for this component (see following image), assign the Lens
Flare Data asset you created to the Lens Flare Data property.

Settings for the Lens Flare (SRP) component

The system is very flexible, letting you combine multiple elements of different shapes, tints, and intensities into a single flare.
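
The component can also be driven from a script. A small sketch, where the data asset reference and intensity value are illustrative:

using UnityEngine;
using UnityEngine.Rendering;

// Minimal sketch: assign flare data and intensity to a Lens Flare (SRP) component.
public class FlareSetup : MonoBehaviour
{
    public LensFlareDataSRP flareData; // the Lens Flare (SRP) Data asset created earlier

    void Start()
    {
        var flare = GetComponent<LensFlareComponentSRP>();
        flare.lensFlareData = flareData;
        flare.intensity = 0.8f; // illustrative value
    }
}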



An example of a Lens Flare

Light Halos

The Draw Halo option is not available for lights in URP, but it’s easily mimicked
with a billboard. Another option is to set the alpha transparency of a sphere.
The first image below shows the Shader Graph for such a shader, and the
second image depicts the result. For more information on using Shader Graph
to create this shader, see the Additional tools chapter.

Fresnel transparency using Shader Graph

Light Halo using a sphere with a material using the Shader Graph shader from above



Screen Space Ambient Occlusion

Since ambient light does not consider geometry by default, high levels of
ambient light can lead to unconvincing renders. In the real world, a narrow gap
between two objects is likely to be darker than a much wider gap. Ambient
Occlusion can help deal with this issue in your Unity project. To use it with URP,
select the Renderer that the URP Asset is using. Go to Add Renderer Feature
and choose Screen Space Ambient Occlusion (SSAO).

Add Renderer Feature

Either use the default SSAO settings or adjust as needed:

The SSAO settings



— Method: This property defines the type of noise the SSAO effect uses.

— Intensity: This property defines the intensity of the darkening effect.

— Radius: When Unity calculates the Ambient Occlusion value, the SSAO
effect takes samples of the normal texture within this radius from the
current pixel. A lower Radius value improves performance because the
SSAO Renderer Feature samples pixels closer to the source pixel.

— Falloff Distance: SSAO does not apply to objects farther than this distance
from the Camera. A lower value increases performance in scenes that
contain many distant objects.

— Direct Lighting Strength: This property defines how visible the effect is in
areas exposed to direct lighting.

A scene with only an Ambient Occlusion texture demonstrating a varying falloff distance

SSAO adds shading to narrow gaps. Let’s look at the following three images.



The top image has no SSAO. The middle image shows the calculated SSAO texture,
while the bottom image shows the final render with SSAO applied. Notice that the
grinder and scales have a stronger edge where they meet the desk.

The haunted room screenshot, with no SSAO at the top, with SSAO applied in the middle, and rendered with SSAO at the bottom

SSAO is a post-processing technique, the details of which are covered later in
this guide.



Decals

Decal Projectors are a great way of adding detail to a mesh. Use them for elements
such as bullet holes, footsteps, signage, cracks, and more. Because they use a
projection framework, they conform to an uneven or curved surface. To use a Decal
Projector with URP, you need to locate your Renderer Data asset and add the Decal
Renderer Feature.

Adding the Decal Renderer Feature



For most purposes, you can accept the default settings.

Now your scene is ready for Decals. Create a Decal by right-clicking in the
Hierarchy view and selecting Rendering > URP Decal Projector. By default,
the projector uses the material Decal, which will project a white square onto a
surface. Use the usual tools to position and orientate the projector. Adjust the
Width, Height, and Projection Depth in the Inspector.

To customize the Decal, create a material using the Shader Graph > Decal
shader. This shader has three inputs: Base Map, Normal Map, and Normal Blend.
Once the material is prepared, assign it to the Decal Projector.
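
Decal Projectors can also be spawned at runtime, for example to leave a bullet hole where a raycast hits. A minimal sketch, with hypothetical names and sizes:

using UnityEngine;
using UnityEngine.Rendering.Universal;

// Minimal sketch: spawn a decal at a raycast hit point, facing into the surface.
public class BulletHoleSpawner : MonoBehaviour
{
    public Material bulletHoleMaterial; // a material using the Shader Graph > Decal shader

    public void SpawnDecal(RaycastHit hit)
    {
        var go = new GameObject("BulletHole");
        go.transform.position = hit.point;
        go.transform.rotation = Quaternion.LookRotation(-hit.normal); // project onto the surface

        DecalProjector projector = go.AddComponent<DecalProjector>();
        projector.material = bulletHoleMaterial;
        projector.size = new Vector3(0.2f, 0.2f, 0.5f); // width, height, projection depth
    }
}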

Decal Projector settings

The Inspector for a Decal Projector includes three Editing Mode buttons: Scale,
Crop, and Pivot/UV, which you can read about here.



By default, the projector will affect any surface within its frustum. The Decal
Renderer Feature includes the setting Use Rendering Layers. Enable this to
facilitate targeting specific meshes.

Decal Renderer Feature settings

Refer back to the Rendering Layers section to learn about setting up and using
this rendering option. Here are the steps to set up a Decal:

1. Use Edit > Project Settings … > Graphics > URP Global Settings to name a
Rendering Layer.

2. Select the mesh/meshes that you want to receive the projector. In the
Inspector, find Mesh Renderer > Additional Settings > Rendering Layer
Mask, and add the named Rendering Layer to the mask.

Adding Rendering Layer to the Mesh Renderer Rendering Layer Mask



3. Select the URP Decal Projector and, in the Inspector, select the named
Rendering Layer for the Rendering Layers property.

The image below shows the scene with and without a Decal, and with a wall
projection limited by using Rendering Layers.

From left to right: No decal in the image, the decal hitting all objects, and the decal applied to the wall only using Rendering Layers



Shaders

This section is for users who want to convert an existing custom shader to work
with URP and/or want to write a custom shader in code without using Shader
Graph. It provides the information required to port both basic and advanced
shaders to URP from the Built-in Render Pipeline. The tables included show helpful
samples of available HLSL shader functions, macros, and so on. In each case, a link
is provided to the relevant include containing many other useful functions.

For those who already have experience coding shaders, the includes provide
you with a clear idea of what’s available in HLSL to write compact and efficient
shaders. After considering the information here, hopefully porting your shaders
to URP won’t seem so daunting.

Another approach is to use Shader Graph to create versions of your custom shaders.
An introduction to Shader Graph is provided in the Additional tools section.

Comparing URP and Built-in Render Pipeline shaders

URP shaders use the ShaderLab structure, as seen in the code snippet below. As
such, Property, SubShader, Tags, and Pass will all be familiar to shader coders.

SubShader {
Tags {"RenderPipeline" = "UniversalPipeline" }
Pass {
HLSLPROGRAM
...
ENDHLSL
}
}

The basic structure of a SubShader block

The first thing to notice when comparing a URP shader with a Built-in Render
Pipeline shader is the use of the key-value pair "RenderPipeline" =
"UniversalPipeline" in the SubShader tag.
A SubShader tag with the name RenderPipeline tells Unity which render
pipelines to use this SubShader with. The value of UniversalPipeline
indicates that Unity should use this SubShader with URP.

Looking at the render Pass code, you’ll see the shader code contained between
the HLSLPROGRAM / ENDHLSL macros. This indicates the former CG (C for
Graphics) shader programming language has been replaced by HLSL (High
Level Shading Language) although the shader syntax and functionality are
near-identical. Unity switched to HLSL a long time ago, so this shouldn’t come
as a surprise, but now the CGPROGRAM / ENDCG macros are not recommended.
Using these macros implies using UnityCG.cginc. Mixing the SRP and Built-in
Render Pipeline shader libraries in this way can cause several problems.

For URP, the shader code inside those passes is written in HLSL. Although most of
the ShaderLab hasn’t changed compared to the Built-in Render Pipeline, shaders
written for the Built-in Render Pipeline are automatically disabled by URP.
The reason for this is the change in the internal lighting process. While the Built-
in Render Pipeline performs separate shader passes for every light that reaches
an object (multipass), the URP Forward Renderer evaluates all lighting in a light
loop in a single pass. This change leads to different data structures that store
light data and new shading libraries with new conventions.

Unity will use the first SubShader block that is supported on the GPU. If the first
SubShader block doesn’t have a “RenderPipeline” = ”UniversalPipeline”
tag, it won’t run in the URP. Instead, Unity will try to run the next SubShader, if
any. If none of the SubShaders are supported, Unity will render the well-known
magenta error shader.

A SubShader can contain multiple Pass blocks, but each of them should be
tagged with a specific LightMode. As URP uses a single-pass Forward Renderer,
only the first “UniversalForward” Pass supported by the GPU will be used to
render objects in Forward rendering.

As covered earlier, using Window > Rendering > Render Pipeline Converter
converts Built-in Render Pipeline shaders to URP shaders for all materials
automatically. But what about custom shaders?

Custom shaders

Custom shaders require some work when upgrading to URP. Listed below are the
actions you need to perform on legacy shaders as part of the upgrading process.

We made this step-by-step video tutorial on how to convert a custom unlit
Built-in shader to URP, including a Unity project to follow.



Includes

Replace .cginc include files with the following HLSL equivalents:

Built-in Render Pipeline    HLSL
UnityCG.cginc               Github link
AutoLight.cginc             Github link
                            Github link

Other useful HLSL includes

Space transform-related functions are found in this include, which is added by
default when you use Core.hlsl.

HLSL space transform functions

float4x4 GetObjectToWorldMatrix()
Returns the UNITY_MATRIX_M matrix that converts from Object to World Space. This is the equivalent of the Built-in Render Pipeline's unity_ObjectToWorld.

float4x4 GetWorldToObjectMatrix()
Returns the UNITY_MATRIX_I_M matrix that converts from World to Object Space. This matrix is the inverse of UNITY_MATRIX_M. It is the equivalent of the Built-in Render Pipeline's unity_WorldToObject.

float4x4 GetWorldToHClipMatrix()
Returns the UNITY_MATRIX_VP matrix that converts from World to Clip Space.

float4x4 GetViewToHClipMatrix()
Returns the UNITY_MATRIX_P matrix that converts from View to Clip Space.

float3 TransformObjectToWorld(float3 positionOS)
Given a position in Object Space, returns the position in World Space.

float3 TransformObjectToWorldDir(float3 dirOS, bool doNormalize = true)
Given a direction in Object Space, returns the direction in World Space.

float3 TransformWorldToObject(float3 positionWS)
Given a position in World Space, returns the position in Object Space.

float3 TransformWorldToView(float3 positionWS)
Given a position in World Space, returns the position in View Space.

real3x3 CreateTangentToWorld(real3 normal, real3 tangent, real flipSign)
Creates a Tangent to World matrix given a normal and a tangent.

real3 TransformTangentToWorld(real3 normalTS, real3x3 tangentToWorld, bool doNormalize = false)
Given a normal in Tangent Space, returns a normal in World Space.

real3 TransformWorldToTangent(real3 normalWS, real3x3 tangentToWorld, bool doNormalize = true)
Given a normal in World Space, returns a normal in Tangent Space.

Notation for the space type at the end of the variable name:

— WS: World Space

— TS: Tangent Space

— VS: View Space

— OS: Object Space

Other shader functions, including fog and UV, can be found in this include, which is
added by default when you use Core.hlsl. The following table lists a few examples.

VertexPositionInputs GetVertexPositionInputs(float3 positionOS)
Given a position in Object Space, returns a struct containing the position in World, View, and Clip Space. This function should be used only in a vertex shader.

VertexNormalInputs GetVertexNormalInputs(float3 normalOS)
Given a normal in Object Space, returns a struct with the World Space normal, tangent, and bitangent vectors. These vectors can be used to construct a Tangent to World matrix using CreateTangentToWorld. Returns input.tangentWS, input.bitangentWS, input.normalWS.

float3 GetCameraPositionWS()
Returns the Camera position in World Space. This is similar to the Built-in Render Pipeline's _WorldSpaceCameraPos variable.

float3 GetViewForwardDir()
Returns the forward (central) direction of the current view in World Space.

float3 GetWorldSpaceViewDir(float3 positionWS)
Computes the World Space view direction (pointing toward the viewer).



Shader helper functions are fundamental for shader coding. They not only
save you time, but are highly optimized implementations of commonly used
calculations. This include contains many helper functions related to:

— Platform-specific functions

— Common math functions

— Texture utilities

— Texture format sampling

— Depth encoding/decoding

— Space transformations

— Terrain/brush heightmap encoding/decoding and miscellaneous utilities

Some of them are listed in the table below. The type real is set in the file;
depending on various flags, it could be a half or a float.

Helper function                          Helper function
real DegToRad(real deg)                  real RadToDeg(real rad)
bool IsPower2(uint x)                    real FastACosPos(real inX)
real FastASin(real x)                    real FastATan(real x)
uint FastLog2(uint x)                    real3 Orthonormalize(real3 tangent, real3 normal)
real Pow4(real x)                        float4x4 Inverse(float4x4 m)
float ComputeTextureLOD(float2 uv, float bias = 0.0)
float Linear01Depth(float depth, float4 zBufferParam)

Preprocessor macros

Preprocessor macros are handy and regularly used. When porting the
Built-in Render Pipeline shaders to new URP shaders, you’ll need to replace
the Built-in Render Pipeline macros with their URP equivalents.

This table highlights a few examples.

Built-in Render Pipeline                           URP
UNITY_PROJ_COORD(a)                                Replace with a.xy/a.w
UNITY_INITIALIZE_OUTPUT(type, name)                ZERO_INITIALIZE(type, name)

Shadow mapping*
UNITY_DECLARE_SHADOWMAP(tex)                       TEXTURE2D_SHADOW_PARAM(textureName, samplerName)**
UNITY_SAMPLE_SHADOW(tex, uv)                       SAMPLE_TEXTURE2D_SHADOW(textureName, samplerName, coord3)
UNITY_SAMPLE_SHADOW_PROJ(tex, uv)                  SAMPLE_TEXTURE2D_SHADOW(textureName, samplerName, coord4.xyz/coord4.w)

Texture/sampler declaration***
UNITY_DECLARE_TEX2D(name)                          TEXTURE2D(textureName); SAMPLER(samplerName);
UNITY_DECLARE_TEX2D_NOSAMPLER(name)                TEXTURE2D(textureName);
UNITY_SAMPLE_TEX2D_SAMPLER(name, samplername, uv)  SAMPLE_TEXTURE2D(textureName, samplerName, coord2)

Notes for table:

* Shadow mapping macros need this shadow include.

** The _PARAM are macros that can be used to declare functions with texture
and sampler arguments. Check out this document for more information.

*** For Built-in Render Pipeline texture/sampler declaration, read this
documentation.

LightMode tags

The LightMode tag defines the role of Pass in the lighting pipeline. In the Built-in
Render Pipeline, most shaders that need to interact with lighting are written as
Surface Shaders with all the necessary details taken care of. However, custom
shaders in the Built-in Render Pipeline need to use the LightMode tag to
specify how the Pass is considered in the lighting pipeline.

The table below indicates the correspondence between the LightMode tags
used in the Built-in Render Pipeline and the tags that URP expects. Several
legacy Built-in Render Pipeline tags are not supported in URP: PrepassBase,
PrepassFinal, Vertex, VertexLMRGBM, and VertexLM. At the same time,
there are other tags in URP with no equivalent in the Built-in Render Pipeline.

When writing a shader for URP, it’s a good idea to look at the provided shaders
and the Passes they use. The following code example shows some of the code
from the Lit Shader. The complete shader is here.



Built-in Render Pipeline (read more here) | URP (read more here) | Description
Always | - | Always rendered; no lighting applied
ForwardBase | UniversalForward | Used in Forward rendering; Ambient, main Directional light, vertex/SH lights, and lightmaps applied
ForwardAdd | UniversalForward | Used in Forward rendering; Additive per-pixel lights applied, one Pass per light
Deferred | UniversalGBuffer | Used in Deferred Shading; renders G-buffer
ShadowCaster | ShadowCaster | Renders object depth into the shadow map or a depth texture
MotionVectors | MotionVectors | Used to calculate per-object motion vectors
- | UniversalForwardOnly | URP uses this tag value in the Forward Rendering Path; the Pass renders object geometry and evaluates all light contributions
- | Universal2D | URP uses this tag value in the 2D Renderer; the Pass renders objects and evaluates 2D light contributions
- | DepthOnly | The Pass renders only depth information from the perspective of a Camera into a depth texture
- | Meta | This Pass is executed only when baking lightmaps in the Unity Editor; Unity strips this Pass from shaders when building a Player
- | SRPDefaultUnlit | Use this tag value to draw an extra Pass when rendering objects; it is valid for both the Forward and Deferred Rendering Paths. URP uses this tag value as the default when a Pass does not have a LightMode tag

The UniversalForward and ShadowCaster Passes involve many pragmas and
two include files. Examining the code in the include files will help you create the
custom version that suits your needs.



// Forward pass. Shades all light in a single pass. GI + emission + Fog
Pass
{
    // Lightmode matches the ShaderPassName set in
    // UniversalRenderPipeline.cs. SRPDefaultUnlit and passes with
    // no LightMode tag are also rendered by Universal Render Pipeline
    Name "ForwardLit"
    Tags{"LightMode" = "UniversalForward"}

    Blend[_SrcBlend][_DstBlend], [_SrcBlendAlpha][_DstBlendAlpha]
    ZWrite[_ZWrite]
    Cull[_Cull]
    AlphaToMask[_AlphaToMask]

    HLSLPROGRAM
    #pragma exclude_renderers gles gles3 glcore
    #pragma target 4.5

    #pragma vertex LitPassVertex
    #pragma fragment LitPassFragment

    #include "Packages/com.unity.render-pipelines.universal/Shaders/LitInput.hlsl"
    #include "Packages/com.unity.render-pipelines.universal/Shaders/LitForwardPass.hlsl"
    ENDHLSL
}

Pass
{
    Name "ShadowCaster"
    Tags{"LightMode" = "ShadowCaster"}

    ZWrite On
    ZTest LEqual
    ColorMask 0
    Cull[_Cull]

    HLSLPROGRAM
    #pragma exclude_renderers gles gles3 glcore

    #pragma vertex ShadowPassVertex
    #pragma fragment ShadowPassFragment

    #include "Packages/com.unity.render-pipelines.universal/Shaders/LitInput.hlsl"
    #include "Packages/com.unity.render-pipelines.universal/Shaders/ShadowCasterPass.hlsl"
    ENDHLSL
}

Note: A great resource for users planning to write shaders for URP
is this tutorial by Cyanilux.



Pipeline callbacks

A great feature of SRPs is that you can add code at just about any stage of the
rendering process using a C# script. Scripts can be injected at stages such as:

— Rendering shadows

— Rendering prepasses

— Rendering G-buffer

— Rendering Deferred lights

— Rendering opaques

— Rendering Skybox

— Rendering transparents

— Rendering post-processing

You can inject scripts in the rendering process via the Add Renderer Feature
option in the Inspector for the Universal Renderer Data Asset. Remember, when
using URP, there is a Universal Renderer Data object and a URP Asset. The
URP Asset has a Renderer List with at least one Universal Renderer Data object
assigned. It is the asset you assign in Project Settings > Graphics > Scriptable
Render Pipeline Settings.

If you are experimenting with multiple setting assets for different scenes, then
attaching the following script to your Main Camera can be useful. Set the
Pipeline Asset in the Inspector. Then it will switch the asset when the new
scene is loaded.



using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

[ExecuteAlways]
public class AutoLoadPipelineAsset : MonoBehaviour
{
    public UniversalRenderPipelineAsset pipelineAsset;

    // OnEnable is called when the object is loaded, in builds and in the Editor
    void OnEnable()
    {
        if (pipelineAsset)
        {
            GraphicsSettings.renderPipelineAsset = pipelineAsset;
        }
    }
}

Script to switch Universal Render Pipeline Asset on scene load

The next section covers two different types of Renderer Features, one for artists
and the other for experienced programmers.

Render Objects

A common problem in games is losing sight of the player character as they


disappear behind environment objects. You could attempt to move the Camera
so that the character is always in view, or adjust the environment to be as open
as possible. But such options are not always available. A good trick is to show a
silhouette of the character when an environment model appears between
the character and the Camera, as shown in the image below.

Showing a silhouette when an environment model masks the character

Here’s how you can create this silhouette:

1. First, you need a material to use when the character is masked. Create a
material and set the shader to Universal Render Pipeline > Lit or Unlit (the
previous image shows the Lit option). Set the Surface Inputs > Base Map
color. In this example, the material is called Character.



2. To avoid rendering the character more times than necessary, let’s place it
on a special layer. Select the character, add a SeeBehind layer to the Layers
list and select it for the character.

3. Select the Renderer Data object used by the URP Asset. Go to the Opaque
Layer Mask and exclude the SeeBehind layer. The character will then
disappear.

4. Click Add Renderer Feature and select Render Objects (Experimental).



5. Fill out the settings for this Render Object's Pass. Give it a name and
choose the event that triggers the render. In this example, the event is
AfterRenderingOpaques.

Set the Layer Mask to the SeeBehind layer, which was the layer chosen for the
character. Expand the Overrides and set the Override Mode to Material. Select
the material created in step 1. You’ll want to use Depth when rendering, without
having to update the depth buffer by writing to it. Set the Depth Test to Greater
so that this Pass only renders when the distance to the rendered pixel is further
from the Camera than the distance currently stored in the depth buffer.

6. At this stage, you only see the silhouette of the character when it’s behind
another object. You don’t see the character at all when it’s in full view. To
fix this, add another Render Objects feature. This time you don’t need to
update the Overrides panel. This Pass will draw the character when not
masked by another object.



The silhouette trick is a good example of using the URP workflow to add effects
that are difficult to achieve with the Built-in Render Pipeline workflow due to its
reliance on coding.

Renderer Feature

A Renderer Feature can be used at any stage in URP to affect the final render.
Let's go through a simple example of adding a post-processing effect. In a
project using the Built-in Render Pipeline, you would have to add a Graphics.Blit
using the OnRenderImage callback. This example uses the version of the
function with a material to process each pixel in the image.

1. Start by finding a suitable folder in the project Assets folder. Right-click
and choose Create > Rendering > URP Renderer Feature. Give it the name
TintFeature.

2. Double-click the default TintFeature file. It is a C# script containing
boilerplate for a Renderer Feature.

Default code for a Renderer Feature



3. Rename CustomRenderPass to TintPass and add these properties to the
CustomRenderPass class. The material will contain the shader you apply to
the current state of the rendered image.

Material material;
RTHandle cameraColorTarget;
Color color;

4. Add a constructor to the TintPass to initialize the material, and set the
position of this pass in the render pipeline.

public TintPass(Material mat)
{
    material = mat;
    renderPassEvent = RenderPassEvent.BeforeRenderingPostProcessing;
}

5. Add a SetTarget method to initialize the cameraColorTarget and color
properties of the TintPass class.

public void SetTarget(RTHandle colorHandle, Color col)
{
    cameraColorTarget = colorHandle;
    color = col;
}

6. Create a new shader, name it TintBlit, and copy the code below. Notice
the RenderPipeline tag. ZWrite and Cull are both off. Core.hlsl is
imported from com.unity.render-pipelines.universal and Blit.hlsl from
com.unity.render-pipelines.core. If you select the Opaque Texture in
the URP Asset Inspector, then the pipeline creates a Render Texture,
_CameraOpaqueTexture.

Selecting Opaque Texture for the URP Asset



The shader samples this and modulates it using the _Color value.

Shader "Custom/TintBlit"
{
SubShader
{
Tags { "RenderType"="Opaque" "RenderPipeline" = "UniversalPipe-
line"}
LOD 100
ZWrite Off Cull Off
Pass
{
Name "TintBlitPass"

HLSLPROGRAM
#include "Packages/com.unity.render-pipelines.universal/Shad-
erLibrary/Core.hlsl"
// The Blit.hlsl file provides the vertex shader (Vert),
// input structure (Attributes) and output structure (Vary-
ings)
#include "Packages/com.unity.render-pipelines.core/Runtime/
Utilities/Blit.hlsl"

#pragma vertex Vert


#pragma fragment frag

TEXTURE2D(_CameraOpaqueTexture);
SAMPLER(sampler_CameraOpaqueTexture);

float4 _Color;

half4 frag (Varyings input) : SV_Target


{
float4 color = SAMPLE_TEXTURE2D(_CameraOpaqueTexture,
sampler_CameraOpaqueTexture, input.texcoord);
return color * _Color;
}
ENDHLSL
}
}
}

7. Replace CustomRenderPass m_ScriptablePass with the following properties.
You can set Shader and Color in the Inspector for the Renderer Data asset.

public Shader shader;
public Color color;

Material material;
TintPass renderPass = null;



8. Add the following code to the TintFeature Create method. This function is
called when the TintFeature is created. It initializes a Material from the
supplied shader, then creates a new instance of the TintPass class using the
custom constructor, passing in the Material just created from the TintBlit
shader.

material = CoreUtils.CreateEngineMaterial(shader);
renderPass = new TintPass(material);

9. A SetupRenderPasses override needs to be added to prepare the render
pass. You only want the tinting in the Game view, so the code is wrapped in
an if statement. Calling ConfigureInput with the ScriptableRenderPassInput.Color
argument ensures that the opaque texture is available to the
Render Pass. Finally, you ensure the renderPass has the necessary
cameraColorTarget and color by calling the SetTarget method created
earlier.

public override void SetupRenderPasses(ScriptableRenderer renderer,
    in RenderingData renderingData)
{
    if (renderingData.cameraData.cameraType == CameraType.Game)
    {
        renderPass.ConfigureInput(ScriptableRenderPassInput.Color);
        renderPass.SetTarget(renderer.cameraColorTargetHandle, color);
    }
}

10. Now that you have created and initialized an instance of the TintPass, add
it to the render queue. Add the next code snippet in the AddRenderPasses
method, and, once again, wrap the code inside an if statement, checking
the current camera type is Game.

if (renderingData.cameraData.cameraType == CameraType.Game)
renderer.EnqueuePass(renderPass);

11. Since a material is created, it should be destroyed by adding a Dispose
override.

protected override void Dispose(bool disposing)
{
    CoreUtils.Destroy(material);
}



12. Back to the TintPass. You need to configure the cameraColorTarget. Add the
next code snippet to OnCameraSetup:

ConfigureTarget(cameraColorTarget);

13. Now that everything is initialized, you can do the actual work of copying the
current Render Texture using a material to process the result. Add the code
below to the Execute method.

var cameraData = renderingData.cameraData;
if (cameraData.camera.cameraType != CameraType.Game)
    return;

if (material == null)
    return;

CommandBuffer cmd = CommandBufferPool.Get();

material.SetColor("_Color", color);
Blit(cmd, cameraColorTarget, cameraColorTarget, material, 0);

context.ExecuteCommandBuffer(cmd);
cmd.Clear();

CommandBufferPool.Release(cmd);

14. To see the effect in action, select the Renderer Data object and click Add
Renderer Feature. TintFeature will appear in the list.

15. Here is the complete TintFeature code, with the final result shown below.

using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

public class TintFeature : ScriptableRendererFeature
{
    class TintPass : ScriptableRenderPass
    {
        Material material;
        RTHandle cameraColorTarget;
        Color color;

        public TintPass(Material mat)
        {
            material = mat;
            renderPassEvent = RenderPassEvent.BeforeRenderingPostProcessing;
        }

        public void SetTarget(RTHandle colorHandle, Color col)
        {
            cameraColorTarget = colorHandle;
            color = col;
        }

        public override void OnCameraSetup(CommandBuffer cmd, ref RenderingData renderingData)
        {
            ConfigureTarget(cameraColorTarget);
        }

        public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
        {
            var cameraData = renderingData.cameraData;
            if (cameraData.camera.cameraType != CameraType.Game)
                return;

            if (material == null)
                return;

            CommandBuffer cmd = CommandBufferPool.Get();

            material.SetColor("_Color", color);
            Blit(cmd, cameraColorTarget, cameraColorTarget, material, 0);

            context.ExecuteCommandBuffer(cmd);
            cmd.Clear();

            CommandBufferPool.Release(cmd);
        }
    }

    public Shader shader;
    public Color color;

    Material material;

    TintPass renderPass = null;

    public override void Create()
    {
        material = CoreUtils.CreateEngineMaterial(shader);
        renderPass = new TintPass(material);
    }

    public override void SetupRenderPasses(ScriptableRenderer renderer,
        in RenderingData renderingData)
    {
        if (renderingData.cameraData.cameraType == CameraType.Game)
        {
            // Calling ConfigureInput with the ScriptableRenderPassInput.Color argument
            // ensures that the opaque texture is available to the Render Pass.
            renderPass.ConfigureInput(ScriptableRenderPassInput.Color);
            renderPass.SetTarget(renderer.cameraColorTargetHandle, color);
        }
    }

    public override void AddRenderPasses(ScriptableRenderer renderer,
        ref RenderingData renderingData)
    {
        if (renderingData.cameraData.cameraType == CameraType.Game)
            renderer.EnqueuePass(renderPass);
    }

    protected override void Dispose(bool disposing)
    {
        CoreUtils.Destroy(material);
    }
}

Effect of TintFeature: Unprocessed to the left, tinted on the right

If you want to allow a user to choose where in the render pipeline to use this
effect, you could add an additional property, as outlined below.

public RenderPassEvent renderEvent;

//Create method
renderPass = new TintPass(material, renderEvent);

//TintPass constructor
public TintPass(Material mat, RenderPassEvent renderEvent)
{
    material = mat;
    renderPassEvent = renderEvent;
}



Use the Renderer Data asset to assign the properties in the Inspector.

In this video tutorial, we show you three practical exercises using Renderer
Features, namely how to create a custom post-processing effect, a stencil
effect, and characters occluded by their environment.

You can find more community-driven examples of Renderer Feature best
practices, including this video tutorial on how to control a custom Renderer
Feature by Ned Makes Games.



Post-processing

The Built-in Post-Processing Stack v2 package is not compatible with URP.
URP does not require an additional package for post-processing effects. Instead,
it uses a Volume framework. When you add Volumes to a scene, you can choose
which post-processing effects apply to the Volume. A Volume can be Global or
Local. If Global, the Volume affects the Camera everywhere in the scene.
With the Mode set to Local, Volumes affect the Camera if it's within the
bounds of the Collider.

Applying post-processing effects: The top-left image has no effects applied, the top-right image has Bloom applied, the
bottom-left has Vignette applied, and the bottom-right has Color Adjustment added.



Using the URP post-processing framework

1. The first step is to make sure your Main Camera has post-processing enabled.
Select the Main Camera in the Hierarchy window, go to the Inspector, and
expand the Rendering panel. Check the Post Processing option.

2. Right-click the Hierarchy window and select Create > Volume > Global
Volume to create a Global Volume.

3. With the Global Volume selected in the Hierarchy window, find the Volume
panel in the Inspector and create a new Profile by clicking on New.

4. Start adding post-processing effects. See the table further down that lists
available effects. Click Add Override and select Post-processing. In this
example, the Bloom effect is chosen.



Selecting the Bloom effect

5. Each effect has a dedicated Settings panel. The image here shows the
settings for Bloom.



6. You can easily add multiple effects (such as Vignette in this example) and
configure each one using their Settings panel.

Adding a Local Volume

With the Volume framework, you can configure the scene so that as a Camera
moves around it, different post-processing profiles are triggered. This is achieved
by adding a Local Volume. Let’s go through the steps for setting this up.

1. In the Hierarchy window, right-click and choose Create > Volume > Box
Volume. Alternatively, choose Sphere Volume if this shape is more suited
to your purpose, or Convex Mesh Volume for a tighter control over the
shape of the Collider that defines the Volume region.



2. From the Volume panel in the Inspector, create a new Profile to store this
Volume data. The panel can also be used to set:

a. Blend Distance: This is the furthest distance from the Volume's Collider
at which URP starts blending the profile. The effects fade in gradually
over this distance, reaching full strength at the edge of the Collider.

b. Weight: Weight defines the maximum strength of the post-processing
effects. If Weight is set to 1, then the effect will reach full strength. A
setting of 0 means there is no effect, while 0.5 sets the strength of the
effect at a maximum of 50%.

c. Priority: Use this value to determine which Volume URP uses when
multiple Volumes have an equal amount of influence on the scene. The
higher the number, the higher the Priority. If you are merging Global
and Local, then keep Global at the default 0 setting and set the Local
Volume(s) to 1 or more.

Settings for a Local Volume

3. Position the Volume and control its dimensions using the Box Collider
component, as shown in the image below.

Positioning and sizing a Box Volume using the attached Box Collider component
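
The same properties are exposed on the Volume component, so a script can configure a Local Volume. A minimal sketch with illustrative values:

using UnityEngine;
using UnityEngine.Rendering;

// Minimal sketch: configure a Local Volume's blending behavior from code.
public class VolumeSetup : MonoBehaviour
{
    void Start()
    {
        Volume volume = GetComponent<Volume>();
        volume.isGlobal = false;   // Local Volume bounded by its Collider
        volume.blendDistance = 2f; // begin blending two units outside the Collider
        volume.weight = 1f;        // effects at full strength
        volume.priority = 1;       // override a Global Volume left at priority 0
    }
}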



Post-processing can weigh heavily on your processor, so carefully consider the
effects on low-end hardware and mobile devices. If your project must use it,
then test on the target hardware. Some filters are less processor intensive than
others. This document outlines the mobile-friendly effects.

These are the available post-processing effects in URP.

Effect                        Description
Bloom                         Adds a glow around pixels above a defined brightness level.
Channel Mixer                 Modifies the influence of each input color channel on the overall mix.
Chromatic Aberration          Creates fringes of color along boundaries that separate dark and light parts of the image.
Color Adjustments             Use this effect to tweak the overall tone, brightness, and contrast of the final rendered image.
Color Curves                  Grading curves are an advanced way to adjust specific ranges in hue, saturation, or luminosity.
Color Lookup                  This maps the colors of each pixel to a new value using a Lookup Texture.
Depth of Field                This effect simulates the focus properties of a camera lens.
Film Grain                    This simulates the random optical texture of photographic film.
Lens Distortion               Distorts the final rendered picture to simulate the shape of a real-world camera lens.
Lift Gamma Gain               Use the different trackballs to affect different ranges within the image. Adjust the slider under the trackball to offset the color lightness of that range.
Motion Blur                   This simulates the blur that occurs in an image when a real-world camera films objects moving faster than the camera's exposure time.
Panini Projection             This effect helps you render perspective views in scenes with a very large field of view.
Shadows Midtones Highlights   This effect separately controls the shadows, midtones, and highlights of the render.
Split Toning                  Use this to add different color tones to the shadows and highlights in your scene.
Tonemapping                   Tonemapping is the process of remapping the HDR values of an image to a new range of values.
Vignette                      This effect comprises darkening toward the edges of an image compared to the center.
White Balance                 Removes unrealistic color casts, so items that would appear white in real life render as white in your final image.



Controlling post-processing with code

You can also dynamically adjust your post-processing profile using a C# script.
The following code example shows how to adjust the intensity of the Bloom
effect. If a Vignette is applied, you can control the vignetting color via code.
For example, if the player character takes damage, you can temporarily tint it red.

using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

public class PPController : MonoBehaviour
{
    // Start is called before the first frame update
    void Start()
    {
        Volume volume = GetComponent<Volume>();
        Bloom bloom;
        if (volume.profile.TryGet<Bloom>(out bloom))
        {
            bloom.intensity.value = 0;
        }
    }
}
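
Following the same pattern, here is a sketch of the red damage tint mentioned above; the class name, method, and values are illustrative:

using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

// Minimal sketch: tint the Vignette red when the player takes damage.
public class DamageVignette : MonoBehaviour
{
    Vignette vignette;

    void Start()
    {
        Volume volume = GetComponent<Volume>();
        volume.profile.TryGet<Vignette>(out vignette);
    }

    // Call this from your damage-handling code
    public void OnDamage()
    {
        if (vignette != null)
        {
            vignette.color.value = Color.red;
            vignette.intensity.value = 0.45f; // illustrative strength
        }
    }
}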



Camera Stacking

A common requirement in games is the ability to combine geometry viewed
from different cameras in a single render. The image above shows a shelf in the
foreground acting as an inventory within the game. Collected items are added
to the shelf and can be selected at key points by the player. Notice that it has a
different field of view, as well as different lighting and post-processing.
This has been set up using the Camera Stacking feature in URP.

An example using Camera Stacking



Let’s look at how to set this feature up.

1. Create a Camera by right-clicking the Hierarchy view and choosing
Create > Camera. Remove the audio listener component.

2. Use the Inspector > Camera Settings panel to set this Camera as Render
Type Overlay.

3. Create a new Layer for the Camera and the GameObjects it renders.

4. Update the Rendering > Culling Mask for the Camera using the Inspector.



5. Move the Camera to a suitable place in the scene, then add and position
GameObjects by placing them in Layer Overlay.

6. Make sure the Main Camera does not render Overlay by updating its
Rendering > Culling Mask.

7. In the Stack panel, use the “+” button to add the Overlay Camera.

Controlling a stack with code

As with post-processing, you can control the stack from code, and add
or remove cameras dynamically during runtime. See this code example:

using UnityEngine;
using UnityEngine.Rendering.Universal;

public class StackController : MonoBehaviour
{
    public Camera overlayCamera;

    // Start is called before the first frame update
    void Start()
    {
        Camera camera = GetComponent<Camera>();
        var cameraData = camera.GetUniversalAdditionalCameraData();
        cameraData.cameraStack.Remove(overlayCamera);
    }
}
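
Adding a camera back works the same way through the cameraStack list. A small sketch, with illustrative names:

using UnityEngine;
using UnityEngine.Rendering.Universal;

// Minimal sketch: toggle an Overlay Camera in and out of the stack at runtime.
public class StackToggle : MonoBehaviour
{
    public Camera overlayCamera;

    public void SetOverlayVisible(bool visible)
    {
        var cameraData = GetComponent<Camera>().GetUniversalAdditionalCameraData();
        if (visible && !cameraData.cameraStack.Contains(overlayCamera))
            cameraData.cameraStack.Add(overlayCamera);
        else if (!visible)
            cameraData.cameraStack.Remove(overlayCamera);
    }
}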

Post-processing and Camera Stacking, both easily configured using URP,
are powerful tools for creating rich, atmospheric effects in your games.



The SubmitRenderRequest API

Sometimes you might want to render your game to a different destination than
the user’s screen. The SubmitRenderRequest API is designed with this purpose
in mind. Let’s look at a possible use case.

Coding a screengrab

The script below will render the game to an off-screen RenderTexture when
the user presses the onscreen GUI. The script should be attached to the Main
Camera. A RenderTexture is created in the Start callback. It is 1920 x 1080
pixels with a bit depth of 24. When the user presses the “Render Request”
button, the RenderRequest method is called.

In the RenderRequest method, there's a reference to the Camera component.
Create a RenderPipeline.StandardRequest instance, then check whether
the current pipeline supports the RenderRequest framework. If it does,
you set the RenderTexture that was initialized in the Start callback as the
destination of this request object and initialize the render using
RenderPipeline.SubmitRenderRequest. This method takes a camera instance and
a request object. At this point, the RenderTexture contains a render of the
current scene. To save this to a file, you first need to convert the RenderTexture
to a Texture2D instance. The method ToTexture2D shows one possible route.
Once you have a Texture2D, you can use the EncodeToPNG method of a
Texture2D instance to get a byte array. You can then use the System.IO.File
method WriteAllBytes to save the byte array to a file.



If you use the script directly, the screengrab will be saved in a newly created
folder called RenderOutput in the Assets folder of your game. The file name is
R_ followed by a randomly chosen integer between 0 and 100,000.

using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Rendering;

[RequireComponent(typeof(Camera))]
public class StandardRenderRequest : MonoBehaviour
{
    [SerializeField]
    RenderTexture texture2D;

    private void Start()
    {
        texture2D = new RenderTexture(1920, 1080, 24);
    }

    // When user clicks on GUI button,
    // Render Requests are sent with various output textures to render given frame
    private void OnGUI()
    {
        GUILayout.BeginVertical();
        if (GUILayout.Button("Render Request"))
        {
            RenderRequest();
        }
        GUILayout.EndVertical();
    }

    void RenderRequest()
    {
        Camera cam = GetComponent<Camera>();

        RenderPipeline.StandardRequest request = new RenderPipeline.StandardRequest();

        if (RenderPipeline.SupportsRenderRequest(cam, request))
        {
            // 2D Texture
            request.destination = texture2D;
            RenderPipeline.SubmitRenderRequest(cam, request);

            SaveTexture(ToTexture2D(texture2D));
        }
    }

    void SaveTexture(Texture2D texture)
    {
        byte[] bytes = texture.EncodeToPNG();
        var dirPath = Application.dataPath + "/RenderOutput";
        if (!System.IO.Directory.Exists(dirPath))
        {
            System.IO.Directory.CreateDirectory(dirPath);
        }
        System.IO.File.WriteAllBytes(dirPath + "/R_" + Random.Range(0, 100000) + ".png", bytes);
        Debug.Log(bytes.Length / 1024 + "Kb was saved as: " + dirPath);
#if UNITY_EDITOR
        UnityEditor.AssetDatabase.Refresh();
#endif
    }

    Texture2D ToTexture2D(RenderTexture rTex)
    {
        Texture2D tex = new Texture2D(rTex.width, rTex.height, TextureFormat.RGB24, false);
        RenderTexture.active = rTex;
        tex.ReadPixels(new Rect(0, 0, rTex.width, rTex.height), 0, 0);
        tex.Apply();
        Destroy(tex); // deferred destruction prevents a memory leak; the texture is still valid this frame
        return tex;
    }
}

Additional tools
compatible with URP

Another benefit of using URP is its compatibility with Unity’s latest authoring
tools that bring complex creation tasks into the reach of technical artists.
This chapter unpacks how to create shaders using Shader Graph, and how
to create particle effects using the Visual Effects (VFX) Graph.

Shader Graph

Shader Graph brings custom shaders to an artist’s workflow. The Shader Graph
tool is included when you start a project using the URP template or import the
URP package.

Covering Shader Graph warrants a separate guide, but let's go over some basic
yet crucial steps by creating the Light Halo shader from the Lighting chapter.

1. Right-click in the Project window, find a suitable folder, and choose Create
> Shader Graph > URP > Unlit Shader Graph. For this example, choose
Unlit. Name the new asset FresnelAlpha.

2. Double-click the new Shader Graph Asset to launch the Shader Graph
editor.

If you’re familiar with shaders, then you’ll recognize the Vertex and Fragment nodes. By default, this shader ensures that any model using a material with this shader is correctly placed in the Camera view via the Vertex node, and that each pixel is set to a grey color via the Fragment node.

3. This shader is going to set the alpha transparency of the object. It therefore
needs to apply to the Transparent queue. Change the Graph Inspector
> Graph Settings > Surface Type to Transparent. You’ll see that the
Fragment node now has an Alpha input as well as Base Color.

4. Add properties to the shader. For instance, add Color as a Color, and
Power and Strength as Float values.

5. Set the default values using Graph Inspector > Node Settings > Default.
Set Color to white, Power to 4, and Strength to 1.

6. Shader Graph functions by joining nodes together. A node has one or more inputs and an output. To add a node, right-click and choose Create Node, then enter Fre in the Search panel at the top. The results will show a Fresnel Effect node.

7. A node shows a preview of its effect. Notice that the Fresnel Effect is bright toward the edge. The value is based on the angle between the View direction and the Normal direction – and for a sphere, this is greatest at the edge.

The alpha value should be lowest at the edge. You can flip the result using a One Minus node. To do this, click Create Node and enter One. Select the One Minus node. Now drag from Out(1) on the Fresnel Effect node to In(1) on the One Minus node. The 1 means that the value type is a single float. If it were 3, it would be a vector with three components.

The nodes should be joined like this:

8. Let’s look at how to control the size of the gradient and the overall transparency. Use a Power node for sizing the gradient. Create a Power node and connect One Minus Out(1) to Power A(1). Drag the Power property to the graph and join it to Power B(1). The graph should now look like this:

9. Control the overall transparency using a Multiply node. Create it and connect
Power Out(1) to Multiply A(1). Drag the Strength property to the graph and
join it to Multiply B(1). Then join the Multiply Out(1) to Fragment Alpha(1) and
drag the Color(4) property to the graph and join it to Fragment Base Color(3).

Notice here that the property Color comprises a four-component vector, while Base Color is a three-component vector. Shader Graph will map the first three components of Color to the Base Color vector.

10. Save the asset and create a new material. Assign the shader, listed under Shader Graphs/FresnelAlpha, to the new material.

11. Now you can apply the material to an object, controlling its visibility at the
edges.

A shader is applied to a sphere-shaped Point light to provide a halo effect around it

Fullscreen Shader Graph

The Fullscreen Shader Graph is new to Unity URP 2022 LTS. It allows you to
create custom post-processing passes. Right-click in the Project pane and
select Create > Shader Graph > URP > Fullscreen Shader Graph.

Creating a Fullscreen Shader Graph

You can access a pixel’s color in the fragment shader using a URP Sample Buffer node with its Source Buffer set to BlitSource. The graph below shows a simple tint example. The URP Sample Buffer node also gives access to world normals and motion vectors, which are useful for edge detection and motion trails.

A simple tinting example

To use this example, you need a way to Blit the result of the current camera
render texture using a material that uses this shader.

With the active Renderer Data asset selected, use the Inspector to add a
Renderer Feature. Select Full Screen Pass Renderer Feature.

Adding the Full Screen Pass Renderer Feature

All that remains is to update the settings for this Renderer Feature. Set the material you created that uses the Fullscreen Shader Graph, then select the position in the render pipeline where the pass should be injected.

The Renderer Feature settings

The image below shows the tint effect on the left. The Fullscreen Shader Graph
is a useful way to create custom post-processing effects.

The tint effect

Related links:

— This blog post goes through the Shader Graph process with an example
project and some advanced suggestions.

— Check out Shader Graph on the Unity website.

VFX Graph

The Visual Effect (VFX) Graph enables you to create myriad particle effects
with an artist-friendly, node-based graph. Use a VFX Graph to add fire, smoke,
mist, sparks, magic orbs, and many other effects to your project.

The target devices for any games containing effects created with VFX Graph
must be compute-capable because VFX Graph uses compute shaders running on
the GPU to ensure the best possible performance. Test your code and include a
non-compute fallback, and use VFX Graph sparingly for games targeting low-end
mobile devices.

To get better acquainted with VFX Graph, let’s go through the steps for creating
a smoke effect:

1. VFX Graph can be downloaded as a package using Package Manager.

2. Once VFX Graph is installed, there will be a new option when you right-click
in the Project window > Assets folder. Choose Create > Visual Effects >
Visual Effect Graph, and name the new asset Smoke.

3. Create an empty GameObject and select it in the Hierarchy window. In the Inspector, choose Add Component > Effects > Visual Effect.

Alternatively, you can add the Visual Effect Graph Asset to the Hierarchy view in the Editor. This will add the component with the asset already assigned, allowing you to skip steps 3 and 4.

4. Select the Smoke VFX Graph as the Asset Template using the Component
Settings panel.

5. Now you can edit the VFX Graph. Double-click to launch the Visual Effect
Graph window. There you’ll find Spawn, Initialize, Update, and Output
Context nodes already prepopulated.

You’ll use a Texture in the form of an Atlas that contains an animated smoke
sprite. A series of 64 images in an 8x8 grid will act as the source for an
individual particle. At any single frame, a single particle will display just one
image from the grid. It will cycle through the images at a predefined rate
as each frame is rendered. Here is the Smoke Sprite Atlas:

6. Click the “+” button and add a Color property. This will allow the user to manipulate the color of the smoke in the Inspector – or from a script, as in the sketch below.
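
As a side note, an exposed property like this can also be set at runtime through the VisualEffect component API. A minimal sketch, assuming the graph exposes a Color property named exactly "Color"; the class name is illustrative:

using UnityEngine;
using UnityEngine.VFX;

[RequireComponent(typeof(VisualEffect))]
public class SmokeTint : MonoBehaviour
{
    void Start()
    {
        var vfx = GetComponent<VisualEffect>();

        // Only set the property if the graph actually exposes it
        if (vfx.HasVector4("Color"))
            vfx.SetVector4("Color", new Vector4(0.8f, 0.8f, 0.9f, 1f)); // bluish-grey smoke
    }
}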

7. Let’s look at the Spawn block. The default Spawn block comes with a
Constant Spawn Rate node. Set this to 20.

8. The next block, Initialize, defines how to handle a particle when it’s first created. Remove the Set Lifetime Random node. Then add a Set Tex Index node and set it to a random value from 0 to 63, so that each particle starts on a different image of the 64-image Smoke Sprite sheet shown earlier, giving each smoke particle a different look.

Then add a Set Lifetime node set to 1.5 seconds. To add some variation in the speed at which a particle is launched, use the Set Velocity Random node: set A to -0.1, 0.4, -0.1 and B to 0.1, 1, 0.1. To let the particle color brighten or darken the Sprite, add a Set Color node and drag the Color property created earlier to its input.

9. The next block, Update, defines what happens at each frame update.
By default, this appears as an empty block, but it actually contains some
implicit hidden blocks that can be disabled in the Inspector when
Update is selected.

Recall that you’re using a Sprite sheet for the image of each particle. In VFX
Graph, this means you’re using a Flipbook. Add a Flipbook Player node,
set its Mode to Constant, and the Frame Rate to 16. It will cycle through
consecutive frames in the Flipbook at 16 frame changes per second.

10. Next, set the final output of the Particle. Set the UV Mode to Flipbook (or Flipbook Blend for a smoother transition between frames) and the Flipbook Layout to Texture 2D. Set the Flipbook Size to 8x8 to match the Sprite sheet, and set the Main Texture to this Texture. Replace Set Color Over Life with Set Alpha Over Life. The default curve will blend the particle in and out over its lifetime.

11. Select the GameObject with this VFX Graph attached. In the Scene view, a panel should be visible that you can use to demo the effect outside of runtime. If you don’t see it, make sure the toggle for visualizing Particle Systems is on.

Here’s an image of the final smoke effect in action:

2D Renderer and 2D lights

If you are working on a 2D game, you’ll be pleased to know there is a dedicated URP 2D Renderer. The simplest way to get started is to use the 2D URP template from the Unity Hub. This template ensures that your project has a URP 2D Renderer assigned via Project Settings > Graphics > Scriptable Render Pipeline Settings. All verified and precompiled 2D packages are installed with the 2D URP template, with default settings optimized for 2D projects. This also ensures that the project loads faster than if you installed all the packages manually.

The 2D URP template in the Unity Hub

If you’re upgrading an existing project, then you need to find a suitable folder in
your project’s Assets folder. Right-click and select Create > Rendering > URP
Asset (with 2D Renderer). Give it a name, and select it using Project Settings >
Graphics > Scriptable Render Pipeline Settings. In the Scene view, be sure to
select the 2D button when editing.

Creating a 2D Renderer and Settings Asset

If you’re updating an existing project, then you might find switching to the URP
2D Renderer gives a classic magenta render error.

Updating an existing project with URP 2D Renderer can result in rendering errors in your scene.

Fortunately, the Window > Rendering > Render Pipeline Converter has you covered. Select Built-in to 2D (URP) and check the Material and Material Reference Upgrade panel. Then click Initialize Converters if you want to deselect some items before clicking Convert Assets, or click Initialize And Convert to handle the whole process with one click. If you still see magenta-colored sprites, you might need to manually replace the shader in some of your materials. Choose one of the shaders in the following table.

2D shaders available in URP

Shader                 Description
Sprite-Lit-Default     Uses 2D lights when rendering
Sprite-Mask-Default    Works with the stencil buffer
Sprite-Unlit-Default   Uses only the texture colors when rendering

Converting a Built-in Render Pipeline 2D project to URP 2D

2D lights are available with the URP 2D Renderer. These offer enhanced performance and flexibility. Using the new tools, you can create a more immersive experience, save time preparing different Sprite variations, and use baked lights to create new gameplay possibilities. If you have migrated an existing project, there will be no URP 2D lights in your scene. If your Sprites use the Sprite-Lit-Default shader, you might be surprised to see a lit render: with no lights in the scene, Unity assigns a default Global Light so that Sprites render with an unlit appearance.

With no lights in the scene, the render defaults to Unlit.

Add a light using the Hierarchy window. Right-click and choose Light > Global
Light 2D.

Now you can adjust the Settings, Color, and Intensity, as well as the Target Sorting Layers the light affects.

In the Global Light 2D Settings, the character uses an Unlit shader.

The 2D URP framework includes four light types:

— Sprite: Uses a Sprite to control the illumination level.

— Freeform: For creating a polygonal-shaped light.

— Spot: Provides great control over the angle and direction of the selected
light. Use it as a Point light. By default, the inner and outer cones span 360
degrees. You can also adjust the inner and outer radius and decide whether
the light casts shadows, as well as the strength of those shadows.

— Global: Lights all objects on targeted sorting layers.

Editing a Spot light 2D
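
You can also create 2D lights from a script. A minimal sketch, assuming the URP 2D Renderer is active; the class name is illustrative:

using UnityEngine;
using UnityEngine.Rendering.Universal;

public class CreateGlobalLight2D : MonoBehaviour
{
    void Start()
    {
        // Create a GameObject carrying a Global Light 2D
        var go = new GameObject("Global Light 2D");
        var light = go.AddComponent<Light2D>();
        light.lightType = Light2D.LightType.Global;
        light.color = Color.white;
        light.intensity = 1f;
    }
}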

If a Sprite casts a shadow, then it needs a Shadow Caster 2D component added.

Adding a Shadow Caster 2D component

The URP 2D Renderer provides all the tools necessary to create first-class 2D
games that will perform well on even low-end hardware.

An image from the Unity 2D demo Dragon Crashers; Unity’s 2D development e-book, 2D game art, animation, and
lighting for artists, was authored by the creative director of Dragon Crashers.

Related links:

— The Unity 2D demo Dragon Crashers is available on the Unity Asset Store.

— The free e-book 2D game art, animation, and lighting for artists is an
advanced development guide created for Unity developers and artists
planning to make a commercial 2D game.

Performance

Performance is highly dependent on the project you’re working on. Always profile and test your game throughout the development cycle. Open the Profiler via Window > Analysis > Profiler, and follow the suggestions in this chapter.

The Profiler window

This section looks at seven ways to improve the performance of your games:

— Managing your lighting
— Light Probes
— Reflection Probes
— Camera settings
— Pipeline settings
— Frame Debugger
— Profiler

These optimizations are also covered in this tutorial.

Optimizing lighting and rendering in URP

URP is built with optimized real-time lighting in mind. The URP Forward Renderer supports up to eight real-time lights per object, and up to 256 real-time lights per camera for desktop games or 32 real-time lights per camera for mobile and other handheld platforms. URP also allows for configurable per-object light settings inside the Pipeline Asset for refined control over lighting.

As explained in the Lighting chapter, baked lighting is one of the best ways to
improve the performance of your scene. Real-time lighting can be expensive,
whereas baking lights can help you gain back performance, assuming the lights
in your scene are static. The baked lighting textures are batched into a single
draw call, without needing to be continuously calculated. This is especially
useful if your scene uses multiple lights. Another great reason to bake your
lighting is that it allows you to render bounced or indirect lighting in your
scene and improve the visual quality of the render.

Global Illumination is similarly covered in the Lighting section. This process simulates rays of light bouncing around the environment and illuminating other nearby objects with the bounced light. The figure below shows three lighting setups for the same scene: with no baked light data, with baked lighting, and with post-processing applied.

From left to right: no lighting data, baked lighting, post-processing added

When baked, areas of shadow in a scene receive the bounced light and are
illuminated. It can be subtle, but this technique spreads the light around a
scene more realistically and improves its overall appearance.

In the previous image, you can see that the specular highlights on the ground are lost when baking. Baked lights only contain diffuse lighting. Whenever possible, compute the direct lighting contribution with real-time lights, and have Global Illumination come from Image Based Lighting (IBL), shadow maps, and Probes.

The effect of light baking on shadows: before baking on the left, and after baking on the right

Use the lowest possible Lightmap Resolution and Lightmap Size when baking
your lights; go to Window > Rendering > Lighting > Scene. This helps to lower
the texture memory requirement.

Setting the Lightmap Resolution and Max Lightmap Size

Light Probes

As explained in the Lighting section, Light Probes sample the lighting data in
the scene during baking and allow the bounced light information to be used by
dynamic objects as they move or change. This helps them blend into and feel
more natural in the baked lighting environment.

Light Probes add naturalism to a render without increasing the processing time
for a rendered frame. This makes them suitable for all hardware, even low-end
mobile devices.

The effect of using Light Probes when rendering a dynamic object: with Light Probe on the left, and without on the right

Reflection Probes

You can also use Reflection Probes to optimize your scene. Reflection Probes
project parts of the environment onto nearby geometry to create more realistic
reflections. By default, Unity uses the Skybox as the reflection map. But by using
one or more Reflection Probes, the reflections will match their surroundings more
closely.

The effect of using Reflection Probes on smooth surfaces: with Reflection Probes on the left and without on the right

The size of the cubemap generated when baking the Reflection Probes depends
on how close the Camera gets to a reflective object. Always make sure to use
the smallest map size that suits your needs to optimize your scene.

Adjusting the size of the Reflection Probe cubemap

Camera settings

URP enables you to disable unwanted rendering processes on your cameras for performance optimization. This is useful if you’re targeting both high- and low-end devices in your project. Disabling expensive processes, such as post-processing, shadow rendering, or the depth texture, can reduce visual fidelity but improve performance on low-end devices.
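
These per-camera toggles can also be driven from a script via the camera’s URP data. A minimal sketch; the class name is illustrative:

using UnityEngine;
using UnityEngine.Rendering.Universal;

[RequireComponent(typeof(Camera))]
public class LowEndCameraSetup : MonoBehaviour
{
    void Start()
    {
        var data = GetComponent<Camera>().GetUniversalAdditionalCameraData();

        data.renderPostProcessing = false; // skip post-processing
        data.renderShadows = false;        // skip shadow rendering for this camera
        data.requiresDepthTexture = false; // skip the depth texture copy
        data.requiresColorTexture = false; // skip the opaque texture copy
    }
}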

Occlusion culling

Another great way to optimize your Camera is with occlusion culling. By default,
the Camera in Unity will always draw everything in the Camera’s frustum,
including geometry that might be hidden behind walls or other objects.
There’s no point in drawing geometry that the player can’t see, and that takes
up precious milliseconds. This is where occlusion culling comes in.

Occlusion culling is best suited to a scene where significant numbers of objects might be masked when another item appears between them and the Camera. A cellular corridor maze-type game is an ideal candidate for using occlusion culling, as seen in the images below.

Frustum culling in the image on left, and occlusion culling in the image on right

By baking occlusion data, Unity ignores the parts of your scene that are
blocked. Reducing the geometry being drawn per frame provides a significant
performance boost.

To enable occlusion culling in your scene, mark any geometry as either Occluder Static or Occludee Static. Occluders are medium to large objects that can occlude objects marked as Occludees. To be an Occluder, an object must be opaque, have a Terrain or Mesh Renderer component, and not move at runtime. Occludees can be any object with a Renderer component, including small and transparent objects, that similarly does not move at runtime.

You set the static properties using the usual drop-down.

Settings for an object included in occlusion data
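
If you have many objects to flag, the same static flags can also be set from an Editor script. A minimal sketch, assuming it runs in the Editor; the menu path and class name are illustrative:

#if UNITY_EDITOR
using UnityEditor;
using UnityEngine;

public static class OcclusionFlagger
{
    [MenuItem("Tools/Mark Selection Occluder And Occludee Static")]
    static void MarkSelection()
    {
        foreach (var go in Selection.gameObjects)
        {
            // Add the occlusion flags without clearing any existing static flags
            var flags = GameObjectUtility.GetStaticEditorFlags(go);
            GameObjectUtility.SetStaticEditorFlags(go,
                flags | StaticEditorFlags.OccluderStatic | StaticEditorFlags.OccludeeStatic);
        }
    }
}
#endif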

Open Window > Rendering > Occlusion Culling, and select the Bake tab.
In the bottom-right corner of the Inspector, press Bake. Unity generates
occlusion data, saving the data as an asset in your project and linking the
asset to the current scene.

Occlusion culling Bake tab

You can see occlusion culling in action using the Visualization tab. Select the Camera in the scene and use the Occlusion Culling pop-up window in the Scene view to configure the visualization. The pop-up might be hidden behind the small Camera view window; if so, right-click the double-line icon and choose Collapse. Move the pop-up, then restore the Camera view by right-clicking and choosing Expand.

Visualization tab and Occlusion Culling pop-up

As you move the Camera, you should see objects popping on and off.

The effect of occlusion culling off in the left image, and on in the right image

Pipeline settings

While the effects of changing the settings for the URP Asset and using different
Quality tiers were previously covered, here are some additional tips
for experimenting with Quality tiers to get the best results for your project:

— Reduce Shadow Resolution and distance for performance gains.

— Disable features that your project does not require, such as depth texture
and opaque texture.

— Enable the SRP Batcher to use the new batching method. The SRP Batcher will automatically batch together meshes that use the same shader variant, thereby reducing draw calls. If you have numerous dynamic objects in your scene, this can be a useful way to gain performance. If the SRP Batcher checkbox is not visible, click the three vertical dots icon and select Show Additional Properties.

Enabling additional properties for the URP Asset Inspector

Frame Debugger

Use the Frame Debugger to gain a better understanding of what’s happening during rendering. To view additional information in the Frame Debugger window, adjust the Debug Level using the URP Asset. As with the SRP Batcher checkbox, this is only visible in the Inspector with Show Additional Properties enabled.

Setting the Debug Level

Adjusting the Debug Level can affect performance. Always turn it off when the
Frame Debugger is not in use.

The Frame Debugger shows a list of all the draw calls made before rendering
the final image and can help you pinpoint why certain frames are taking a long
time to render. It can also identify why your scene’s draw call count is so high.

Open the Frame Debugger by going to Window > Analysis > Frame Debugger.
When your game is playing, select the Enable button. This will pause the game
and let you examine the draw calls.

Frame Debugger detail

Clicking a stage in the render pipeline (left pane) will show a preview of this
stage in Game view.

The Frame Debugger shows every step of the rendering process in the Game View – in this case, the SSAO generation step.

Unity Profiler

Like the Frame Debugger, the Profiler is a great way to determine how long
it takes to complete a frame cycle in your project. It provides an overview of
rendering, memory, and scripting. You can identify scripts that take a long time
to complete, helping you to pinpoint potential bottlenecks in your code.

Open the Profiler via Window > Analysis > Profiler. When in Play Mode, the
window provides an overview of the overall performance of your game. You can
also pause the live view and use the Hierarchy Mode to get a breakdown of the
time taken to complete a single frame. The Profiler will show you each call Unity
has made during the frame.
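
Before reaching for the native API described next, you can also add your own entries to this breakdown from C# using ProfilerMarker. A minimal sketch; the marker and class names are illustrative:

using Unity.Profiling;
using UnityEngine;

public class ProfiledBehaviour : MonoBehaviour
{
    // Appears as "ProfiledBehaviour.Update" in the Profiler's Hierarchy view
    static readonly ProfilerMarker s_UpdateMarker = new ProfilerMarker("ProfiledBehaviour.Update");

    void Update()
    {
        using (s_UpdateMarker.Auto())
        {
            // Work measured by the marker goes here
        }
    }
}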

For an even more detailed analysis, use the low-level native plug-in Profiler API. You can use this Profiler API to extend the Profiler and profile the performance of native plug-in code, or to prepare profiling data to send to third-party profiling tools such as Razor for Sony PlayStation, PIX for Microsoft (Windows and Xbox), as well as Chrome Tracing, ETW, ITT, VTune, or Telemetry.

The Profiler window using the low-level native plug-in Profiler API

Below is an example of using the low-level native plug-in Profiler API:

#include <IUnityInterface.h>
#include <IUnityProfiler.h>

static IUnityProfiler* s_UnityProfiler = NULL;
static const UnityProfilerMarkerDesc* s_MyPluginMarker = NULL;
static bool s_IsDevelopmentBuild = false;

static void MyPluginWorkMethod()
{
    if (s_IsDevelopmentBuild)
        s_UnityProfiler->BeginSample(s_MyPluginMarker);

    // Code I want to see in Unity Profiler as "MyPluginMethod".
    // ...

    if (s_IsDevelopmentBuild)
        s_UnityProfiler->EndSample(s_MyPluginMarker);
}

extern "C" void UNITY_INTERFACE_EXPORT UNITY_INTERFACE_API UnityPluginLoad(IUnityInterfaces* unityInterfaces)
{
    s_UnityProfiler = unityInterfaces->Get<IUnityProfiler>();
    if (s_UnityProfiler == NULL)
        return;
    s_IsDevelopmentBuild = s_UnityProfiler->IsAvailable() != 0;
    s_UnityProfiler->CreateMarker(&s_MyPluginMarker, "MyPluginMethod", kUnityProfilerCategoryOther, kUnityProfilerMarkerFlagDefault, 0);
}

extern "C" void UNITY_INTERFACE_EXPORT UNITY_INTERFACE_API UnityPluginUnload()
{
    s_UnityProfiler = NULL;
}

Additional resources

If you’re interested in building advanced profiling skills in Unity, start by downloading the free e-book, Ultimate guide to profiling Unity games. This guide brings together advanced advice and knowledge on how to profile an application in Unity, manage its memory, and optimize its power consumption from start to finish.

A couple of other useful resources recommended by Nik include Measuring Performance by Catlike Coding, and Unity Draw Call Batching by The Gamedev Guru.

URP 3D Sample

A new URP 3D Sample is available via the Unity Hub. This sample project
replaces the construction scene that will be familiar to many developers who
have been using URP for a few years. The URP 3D Sample contains four distinct
environments that illustrate the capabilities of URP in Unity 2022 LTS.

Let’s go through each environment.

The URP 3D Sample is available in the Unity Hub when you start a new project in Unity 2022 LTS. You can learn more about the sample on this website.

The garden

This scene illustrates how you can efficiently scale your content with URP to
suit multiple platforms, from mobile and console to high-end gaming desktops.
It features stylized PBR rendering, customizable vegetation, and rendering
numerous lights with the new Forward+ renderer that surpasses previous light
count limits.

The oasis

This is a photorealistic scene with highly detailed textures, VFX Graph effects,
SpeedTree, and a custom water solution. It targets devices that support
compute shaders.

The cockpit

This scene uses custom lighting code with Shader Graph. It’s designed for
untethered VR devices such as Meta Quest 2.

The terminal

This scene is the link between the other sample scenes, providing a transition effect to move from one scene to the next. It also provides the perfect setting for you to drop in assets for look-dev.

Moving between the environments

The sample project uses a transition effect to move between scenes. The
transition effect uses an off-screen render target to render the incoming scene
before the transition is complete. The incoming scene is then rendered to large
monitors placed in the outgoing scene using a custom shader created with
Shader Graph, and the full-screen swap is handled using a stencil via a Render
Objects Renderer Feature.

Screen Stencil Renderer Feature

To see the effect in action, walk toward the pedestal until the Unity logo is
displayed, then keep the logo in the center of the screen. This will trigger the
transition.

All scene assets are loaded at load time, but only a single scene is enabled. The
cameras used at runtime, when starting from The Terminal scene, are the same
as those found in the FPS_Controller GameObject. MainCamera renders the
active scene, and ScreenCamera the scene displayed on the monitors.

FPS_Controller for The Terminal Scene

During a transition, the incoming scene camera is rendered to the render target.
This creates a potential problem since URP only supports one main directional
light. A script called Scripts > SceneManagement > SceneTransitionManager.cs
runs before rendering, enabling the active scene’s main light and disabling the
other to keep to this restriction.

Take a look at the script below. In the OnBeginCameraRendering method, we first check whether we’re rendering the main camera. If isMainCamera is true, the ToggleMainLight calls activate the main directional light for the currentScene and disable the main directional light for the screenScene, the incoming scene. If isMainCamera is false, the reverse applies.
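
The original script isn’t reproduced here, but a minimal sketch of the idea might look like the following. The field names are illustrative, and the ToggleMainLight behavior is reduced to a simple enabled toggle:

using UnityEngine;
using UnityEngine.Rendering;

public class SceneTransitionLights : MonoBehaviour
{
    public Light currentSceneMainLight; // main directional light of the outgoing scene
    public Light screenSceneMainLight;  // main directional light of the incoming scene
    public Camera mainCamera;

    void OnEnable()  => RenderPipelineManager.beginCameraRendering += OnBeginCameraRendering;
    void OnDisable() => RenderPipelineManager.beginCameraRendering -= OnBeginCameraRendering;

    void OnBeginCameraRendering(ScriptableRenderContext context, Camera camera)
    {
        // Only one main directional light may be enabled at a time in URP
        bool isMainCamera = camera == mainCamera;
        currentSceneMainLight.enabled = isMainCamera;
        screenSceneMainLight.enabled = !isMainCamera;
    }
}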

The same script handles switching the fog, reflection, and skybox to suit the
scene being rendered by adjusting the settings of the RenderSettings object.

The transition between the incoming and outgoing scenes is handled using a
Render Objects Renderer Feature. By writing a value to the stencil buffer, this
can be checked in a subsequent pass. If the pixel being rendered has a certain
stencil value, then you keep what is already in the color buffer; otherwise, you
can freely overwrite it. Renderer Features are a highly flexible way to build a
final render using combinations of passes.

Renderer Features for the Mobile Forward+ Renderer

To match camera positions during a transition, the project has a SceneMetaData script for each scene that stores an offset Transform, while a SceneTransitionManager script handles the incoming and outgoing scenes during the transition. The Update method tracks the progress of the transition. When ElapsedTimeInTransition is greater than m_TransitionTime, TriggerTeleport is called, which in turn calls the Teleport method. This repositions and orients the player to create a seamless switch from the outgoing scene to the incoming scene.

SceneTransitionManager.cs Update method

Scalability

URP supports a wide range of hardware, and the new sample scenes illustrate several ways to work with different devices. You’ll notice the different options in Project Settings > Quality.

Quality levels

Each option uses a different Render Pipeline Asset. As explained in the Quality
section, URP handles Quality using a combination of this panel together with the
settings of the Render Pipeline Asset.
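
Quality tiers can also be switched from a script, which swaps in the URP Asset referenced by the chosen level. A minimal sketch; the class name is illustrative:

using UnityEngine;

public class QualitySwitcher : MonoBehaviour
{
    // Index into the list shown in Project Settings > Quality
    public void SetTier(int index)
    {
        QualitySettings.SetQualityLevel(index, applyExpensiveChanges: true);
    }
}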

Standalone VR headsets present a significant challenge when displaying real-time 3D graphics. They have high-resolution screens, and each eye must be handled separately, so each rendered frame requires twice the work. Additionally, with a minimum target of 72 fps, you’ll need to push a lot of pixels per second. A workaround for this challenge is to use stylized lighting. The Cockpit scene below uses a Toon Shaded lighting model.

The Cockpit sample scene

The custom lighting is handled using Shader Graph, with no coding necessary.

As usual for a Toon shader, it combines the normal vector and the Main light
direction using a dot product to determine the lighting level. It then uses a
ramp to set staged levels of light rather than smoothly changing values. The
lighting model used in The Cockpit scene also uses Baked Global Illumination
in the calculation and does some edge detection to add a subtle outline effect.
Custom lighting is handled using Shader Graph.

See The Universal Render Pipeline cookbook for a tutorial about creating Toon
Shaders.

Running the sample project on a mobile device

A common problem for game developers is getting a game running smoothly on a mobile device. The new sample project includes a Mobile Forward+ URP Asset in the Settings folder. Remember that the URP Asset is the principal way you can adjust quality settings. Forward+ relies on the CPU to do significant culling operations per frame, so it is not necessarily the best option for a low-end mobile device. The best option for such devices is the Deferred renderer, which is used by a URP asset in the sample project.

The screengrab on the next page shows the settings for the Mobile Forward+
asset.

The Mobile Forward+ URP Asset

The Renderer List has two Universal Renderer Data assets: one for the active
scene, Mobile Forward+_Renderer, and the other for rendering the screen
scene, Forward+_Screen_Renderer. The Depth Texture is enabled. Note that
Additional Lights do not cast shadows. This is a very expensive option and for
mobile devices can often be mimicked using light cookies. The Garden scene
in particular has lots of lights, and many use cookies to give a suggestion of
shadows. Notice the lighting on the rocks in the bottom left of the next image
with and without cookies.

Here are three top tips when targeting mobile platforms.

— Reduce the number of pixels rendered. Most modern mobiles have a high DPI (dots-per-inch) count. For most games, a DPI of 96 is sufficient. If Screen.dpi is 300, for example, then a render scale of 96/300 on a 2400 x 1200 screen would mean rendering 768 x 384 pixels – almost a tenth of the pixels, which is a massive performance boost. You can set the render scale in the URP Asset or adjust the value at runtime, as in the sketch after this list.

— Notice that the Mobile Forward+_Renderer asset has a Decal Renderer Feature with its Technique option set to Automatic. This will switch to Screen Space on GPUs with hidden surface removal, providing a performance boost by avoiding a depth prepass, which is a waste of resources on these devices.

— Use Deferred rendering on devices where the CPU overhead of Forward+ is too expensive.
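
Here is the render-scale sketch referenced in the first tip. It assumes the active pipeline is a UniversalRenderPipelineAsset; the class name and target DPI are illustrative:

using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

public class RenderScaleByDPI : MonoBehaviour
{
    const float targetDPI = 96f;

    void Start()
    {
        var urpAsset = GraphicsSettings.currentRenderPipeline as UniversalRenderPipelineAsset;
        if (urpAsset == null || Screen.dpi <= 0f)
            return;

        // Lower the render scale so the effective DPI approaches the target
        urpAsset.renderScale = Mathf.Clamp(targetDPI / Screen.dpi, 0.1f, 1f);
    }
}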

A careful study of these four scenes alongside their URP Asset settings and
documentation will help you learn how to use the techniques on display in your
own projects.

Garden Scene Point Light with and without cookies

Conclusion

For developers and artists looking to switch to URP, be sure to check out the full Unity Documentation, as well as Unity Learn, the Unity Blog, and the URP Forum.

The Unity Product Board provides an overview of current URP features being developed, in addition to what’s coming up next. You can even add your own feature requests.

To wrap up this e-book, here are just a few of the stunning and original games made with the rendering power and flexibility of Unity’s URP.

Good luck with your game development.

Dave the Diver by MINTROCKET

Lost in Random by Thunderful Games, published by Electronic Arts, for console and PC

Neon White by Angel Matrix and Ben Esposito, published by Annapurna Interactive

Pixel Ripped 1978 by ARVORE Immersive Experiences

Death in the Water 2 by Lighthouse Games Studio

Bare Butt Boxing by Tuatara

Can’t Live Without Electricity by MELOVITY
