Unity
Version 1.8.0
2|Introduction|Unity
OCULUS VR, OCULUS, and RIFT are trademarks of Oculus VR, LLC. (C) Oculus VR, LLC. All rights reserved.
BLUETOOTH is a registered trademark of Bluetooth SIG, Inc. All other trademarks are the property of their
respective owners. Certain materials included in this publication are reprinted with the permission of the
copyright holder.
Contents
Getting Started.................................................................................................... 9
Unity VR Support............................................................................................................................................... 9
Importing the Oculus Utilities Package.............................................................................................................9
Migrating to Utilities from the Integration Package....................................................................................... 10
OVRInput............................................................................................................ 22
OVRInput Usage.............................................................................................................................................. 22
Touch Input Mapping......................................................................................................................................25
Rift Remote Input Mapping.............................................................................................................................27
Xbox Input Handling....................................................................................................................................... 27
OVRHaptics........................................................................................................ 29
VR Compositor Layers....................................................................................... 31
OVRBoundary..................................................................................................... 36
Cubemap Screenshots....................................................................................... 38
Unity Sample Framework................................................................................... 40
Configuring for Build......................................................................................... 44
PC Build Target: Microsoft Windows and Mac OS X..................................................................................... 44
Build Settings and Player Settings.............................................................................................................44
Quality Settings.......................................................................................................................................... 46
Running the Build.......................................................................................................................................46
Mobile Build Target: Android......................................................................................................................... 47
Build Settings............................................................................................................................................. 47
Player Settings............................................................................................................................................ 47
Running the Build.......................................................................................................................................48
Unity 5.1 and later provide built-in virtual reality support for the Oculus Rift and Samsung Gear VR, so basic
development is possible without any further downloads or assets. We provide the optional Oculus Utilities for
Unity 5 package to assist with development - it includes prefabs, C# scripts, sample scenes, and more. Be sure
to check out the other resources that can help Unity developers, such as our audio spatialization plugin and
Unity Sample Framework - see Contents and Other Resources on page 7 for more information.
Projects using Unity 4 should use our Unity 4.x Legacy Integration package instead. New developers should
use Unity 5.1+.
For information on recommended and supported Unity versions, see Compatibility and Requirements.
Most information in this guide applies to the use of the Utilities package for both Rift and Mobile development.
Any exceptions are clearly indicated where they occur.
For information on integrating the Oculus Platform SDK in Unity applications, see our Platform SDK guide.
When developing for both Rift and mobile platforms, keep in mind that the requirements for PC and mobile
VR applications differ substantially. If you would like to generate builds for both PC and mobile from a single
project, it is important to follow the more stringent mobile development best practices, as well as to meet the
90 FPS requirement of the Rift.
Before beginning development, download and install the appropriate Unity version (indicated below).
6|Introduction and Recommended Versions|Unity
OS Compatibility
Windows: Windows 7, 8, 10
Mac: OS X Yosemite, El Capitan
VR support for Unity 4 is provided by assets included with the Oculus Unity Legacy Integration package.
Developers using Unity 4 and the Legacy Integration should migrate to Unity 5. We are maintaining the Unity
Integration for developers who are not ready to upgrade, but it will eventually be discontinued.
If you are starting a new project, we strongly recommend using Unity 5.x and the Utilities package.
Note: Due to issues with earlier releases, we now recommend all developers update to 5.3.6p5 or
version 5.4.1p1 or later.
Unity 5.3    Unity 5.3.6p5 and later    Provides first-party support for virtual reality development. For optional supplementary plugins and scripts, use the latest Oculus Utilities for Unity.
Unity 5.4    Unity 5.4.1p1 and later    Provides first-party support for virtual reality development. For optional supplementary plugins and scripts, use the latest Oculus Utilities for Unity.
Unity 4      4.7.0f1                    Enabled by importing custom assets from the latest Oculus Legacy Integration.
For complete details on Oculus SDK or Integration version compatibility with Unity, see Unity-SDK Version
Compatibility.
A feature set comparison between Unity Professional and Personal editions may be found here: http://unity3d.com/unity/licenses
Gamepad Controller
You may wish to have a compatible gamepad controller for use with the supplied demo applications, such
as the Xbox 360 controller for Windows, an HID-compliant game controller for Mac, or a Moga Pro or other
compatible controller for Gear VR.
Samples
Oculus Sample Framework for Unity 5 - sample scenes and guidelines for common features; see Unity
Sample Framework for more information.
Mobile Resources
Oculus Remote Monitor debugging client for mobile development; see Oculus Remote Monitor (Mobile) on
page 52 for more information.
Oculus Utilities for Unity 5 SDK Examples: Examples designed for use with the Mobile SDK; see Mobile SDK
Examples (Deprecated) on page 19 for more information.
Audio Resources
Oculus Native Spatializer Plugin for Unity (included with Oculus Audio SDK Plugins) - provides easy-to-use
high-quality audio spatialization; see Oculus Native Spatializer for Unity for more information.
Oculus OVRLipSync (a Unity 5 plugin for animating avatar mouths to match speech sounds)
Built-in audio spatialization: Unity Editor v5.4.1p1 and later feature a basic version of our Oculus Native
Spatializer Plugin for Unity 5 (ONSP). It makes it trivially easy to add basic spatialization (HRTF transforms) to
audio point sources in your Unity project. For more information, see First-Party Audio Spatialization (Beta) in our
Oculus Native Spatializer for Unity guide.
Recommended Configuration
We recommend the following settings in your project:
On Windows, enable Direct3D 11. D3D 11 exposes the most advanced VR rendering capabilities. D3D
9 and OpenGL do not currently work in VR with Unity. D3D 12 is currently available as an experimental
feature.
Use the Linear Color Space. Linear lighting is not only more correct for shading, it also causes Unity to
perform sRGB read/write to the eye textures. This helps reduce aliasing during VR distortion rendering,
where the eye textures are interpolated with slightly greater dynamic range.
Never clone displays. When the Rift is cloned with another display, the application may not vsync properly.
This leads to visible tearing or judder (stuttering or vibrating motion).
When developing for mobile, please be sure to review all of the relevant performance and design
documentation, especially Unity Best Practices: Mobile. Mobile apps are subject to more stringent performance
requirements and computational limitations, which should be taken into consideration from the ground up.
Application Signing
Mobile applications require two different signatures at different stages of development. Be sure to read the
Application Signing section of the Mobile SDK documentation for more information.
Getting Started
This section describes steps taken to begin working in Unity.
Unity VR Support
Unity 5.1 and later offer first-party virtual reality support, enabled with the Virtual Reality Supported checkbox in
the Other Settings > Configuration tab of Player Settings.
When Unity virtual reality support is enabled, any camera with no render texture is automatically rendered in
stereo to your device. Positional and head tracking are automatically applied to your camera, overriding your
camera's transform.
Unity applies head tracking to the VR camera within the reference frame of the camera's local pose when
the application starts. If you are using OVRCameraRig, that reference frame is defined by the TrackingSpace
GameObject, which is the parent of the CenterEyeAnchor GameObject that has the Camera component.
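As a brief illustrative sketch (assuming the standard OVRCameraRig layout described above; the class name here is hypothetical), a script can read the tracked head pose expressed in the TrackingSpace reference frame:

```csharp
using UnityEngine;

// Sketch only: assumes an OVRCameraRig is present in the scene, as described above.
public class HeadPoseLogger : MonoBehaviour
{
    private OVRCameraRig rig;

    void Start()
    {
        // Find the rig that owns the TrackingSpace and CenterEyeAnchor GameObjects.
        rig = FindObjectOfType<OVRCameraRig>();
    }

    void Update()
    {
        if (rig == null) return;
        // CenterEyeAnchor's local pose is the tracked head pose expressed
        // in the TrackingSpace reference frame.
        Vector3 localHeadPos = rig.centerEyeAnchor.localPosition;
        Quaternion localHeadRot = rig.centerEyeAnchor.localRotation;
        Debug.Log("Head pose in tracking space: " + localHeadPos + " / " + localHeadRot.eulerAngles);
    }
}
```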
The Unity Game View does not apply lens distortion. The image corresponds to the left eye buffer and uses
simple pan-and-scan logic to correct the aspect ratio.
For more information and instructions for using Unity's VR support, see the Virtual Reality section of the Unity
Manual.
Note: Unity's VR support is not compatible with Oculus Legacy Integrations for Unity 4.
If you are migrating from the legacy Integration package, delete the following legacy assets before importing
the Utilities package:

Assets/Plugins             Oculus.*, OVR.*
Assets/Plugins/Android/    *Oculus*, AndroidManifest.xml, *vrapi*, *vrlib*
Assets/Plugins/x86/        Oculus.*, OVR.*
Assets/Plugins/x86_64/     Oculus.*, OVR.*
Unity Project
If you are already working in a Unity project, save your work before beginning.
Otherwise, create a new project into which you will import the Oculus assets:
Utilities Package
Note: If you are importing the Utilities package into a project that previously used our legacy
Integration package, see Migrating to Utilities from the Integration Package for important steps to take
before importing the custom assets package.
To import the package into Unity, select Assets > Import Package > Custom Package... and select the Utilities for
Unity .unitypackage to import the assets into your new project. Alternately, you can locate the .unitypackage
file in your file system and double-click to launch.
When the Importing package dialog box opens, leave all of the boxes checked and select Import. The import
process may take a few minutes to complete.
Please let us know about any issues you encounter in the Oculus Unity Forum, and keep your eye out for
updates.
Upgrade Procedure
1. Replace any usage of OVRManager.instance.virtualTextureScale or
OVRManager.instance.nativeTextureScale with UnityEngine.VR.VRSettings.renderScale. The value of
renderScale is equal to nativeTextureScale * virtualTextureScale. If you set renderScale to a value that is less
than or equal to any value it has had since the application started, virtual texture scaling is used. If you
increase it to a new maximum value, the native scale is increased and the virtual scale is set to 1.
2. Replace any usage of OVRManager.instance.eyeTextureAntiAliasing with
UnityEngine.QualitySettings.antiAliasing. Instead of multisampling the back-buffer when VR is enabled, Unity
multisamples the eye buffers.
3. Remove any usage of OVRManager.instance.timeWarp and OVRManager.instance.freezeTimeWarp.
TimeWarp is always on and cannot be frozen.
4. Do not assume there are Cameras on OVRCameraRig.leftEyeAnchor or rightEyeAnchor. Instead
of calling GetComponent<Camera>(), use Camera.main or, for backward compatibility, use
OVRCameraRig.leftEyeCamera or rightEyeCamera.
5. Move any scripts, image effects, tags, or references from the Cameras on OVRCameraRig.leftEyeAnchor and
rightEyeAnchor to the one on centerEyeAnchor.
6. Remove any usage of OvrCapi.cs. The CAPI C# binding is no longer available. If you need to access CAPI,
use UnityEngine.VR.VRDevice.GetNativePtr() to get an ovrHmd pointer and then pass it to a native plugin
that uses the Oculus SDK corresponding to your Unity version. For more on which Unity versions correspond
to which SDKs, see "Integration Versions" in Compatibility and Requirements.
Note: Don't forget to move any scripts, image effects, tags, or references from the left and right eye
anchors to the center eye anchor as noted above.
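As a hedged sketch of upgrade steps 1, 2, and 4 above (assuming the UnityEngine.VR APIs named in this section are available in your Unity version; the class name and the values assigned are illustrative):

```csharp
using UnityEngine;
using UnityEngine.VR;

// Sketch of upgrade steps 1, 2, and 4. Values shown are examples only.
public class UpgradeExamples : MonoBehaviour
{
    public OVRCameraRig rig; // assign in the Inspector

    void Start()
    {
        // Step 1: use VRSettings.renderScale instead of the removed
        // virtualTextureScale / nativeTextureScale properties.
        VRSettings.renderScale = 1.0f;

        // Step 2: eye-buffer antialiasing is now controlled through
        // the standard quality settings.
        QualitySettings.antiAliasing = 4;

        // Step 4: do not assume leftEyeAnchor/rightEyeAnchor carry Cameras;
        // use the backward-compatibility properties instead.
        Camera left = rig.leftEyeCamera;
        Camera right = rig.rightEyeCamera;
        Debug.Log("Eye cameras: " + left + ", " + right);
    }
}
```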
12|A Detailed Look at Oculus Utilities for Unity|Unity
Contents
OVR
The contents of the OVR folder in OculusUtilities.unitypackage are uniquely named and should be safe to
import into an existing project.
Editor Scripts that add functionality to the Unity Editor and enhance several C# component
scripts.
Materials Materials used for graphical components within the Utilities package, such as the main
GUI display.
Prefabs The main Unity prefabs used to provide the VR support for a Unity scene:
OVRCameraRig and OVRPlayerController.
Scripts C# files used to tie the VR framework and Unity components together. Many of these
scripts work together within the various Prefabs.
Note: We strongly recommend that developers not directly modify the included OVR scripts.
Plugins
The Plugins folder contains the OVRGamepad.dll, which enables scripts to communicate with the Xbox
gamepad on Windows (both 32 and 64-bit versions).
This folder also contains the plugin for Mac OS: OVRGamepad.bundle.
Prefabs
Utilities for Unity 5 provides prefabs in Assets/OVR/Prefabs:
OVRCameraRig
OVRPlayerController
OVRCubemapCaptureProbe
To use, simply drag and drop one of the prefabs into your scene.
OVRCameraRig
OVRCameraRig replaces the regular Unity Camera within a scene. You can drag an OVRCameraRig into your
scene and you will be able to start viewing the scene with the Gear VR and Rift.
Note: Make sure to turn off any other Camera in the scene to ensure that OVRCameraRig is the only
one being used.
OVRCameraRig contains one Unity camera, the pose of which is controlled by head tracking; two anchor
GameObjects for the left and right eyes; and one tracking space GameObject that allows you to fine-tune the
relationship between the head tracking reference frame and your world. The rig is meant to be attached to a
moving object, such as a character walking around, a car, a gun turret, et cetera. This replaces the conventional
Camera.
OVRCameraRig.cs
OVRManager.cs
OVRPlayerController
The OVRPlayerController is the easiest way to start navigating a virtual environment. It is basically an
OVRCameraRig prefab attached to a simple character controller. It includes a physics capsule, a movement
system, a simple menu system with stereo rendering of text fields, and a cross-hair component.
To use, drag the player controller into an environment and begin moving around using a gamepad, or a
keyboard and mouse.
Note: Make sure that collision detection is active in the environment.
OVRPlayerController.cs
Figure 2: OVRPlayerController
OVRCubemapCaptureProbe
This prefab allows you to capture a static 360 screenshot of your application while it is running,
either at a specified time after launch, when a specified key is pressed, or when the static function
OVRCubemapCapture.TriggerCubemapCapture is called. For more information on this function, see our Unity
Developer Reference.
OVRCubemapCaptureProbe is based on OVR Screenshot (see Cubemap Screenshots on page 38 for more
information).
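The static trigger function mentioned above can also be called from your own scripts. A minimal sketch (the method name is taken from the text above; the capture position, size, and path arguments are illustrative assumptions):

```csharp
using UnityEngine;

// Sketch: triggers a cubemap screenshot when the user presses C.
// The arguments shown here are assumptions for illustration.
public class CaptureOnKey : MonoBehaviour
{
    void Update()
    {
        if (Input.GetKeyDown(KeyCode.C))
        {
            // Capture a 2048 x 2048 cubemap from the main camera's position.
            OVRCubemapCapture.TriggerCubemapCapture(
                Camera.main.transform.position, 2048, "Screenshots/capture.png");
        }
    }
}
```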
Screenshots are taken from the perspective of your scene camera. They are written to a specified directory and
may be either JPEG or PNG. File type is specified by the file extension entered in the Path Name field; default
is PNG. Resolution is configurable.
Basic Use
Drag OVRCubemapCaptureProbe into the scene and set the parameters as desired in the Inspector view.
Parameters
Auto Trigger After Launch    Select to enable capture after a delay specified in Auto Trigger Delay. Otherwise, capture is triggered by the keypress specified in Triggered By Key.
Auto Trigger Delay           Specifies the delay after application launch before the cubemap is taken (requires Auto Trigger After Launch to be selected).
Triggered By Key             Specifies the key that triggers image capture (requires Auto Trigger After Launch to be deselected).
Path Name                    Specifies the directory, file name, and file type (JPEG or PNG) for the screen capture.
Cubemap Size                 Specifies the size (default is 2048 x 2048, the resolution required for preview cubemaps submitted to the Oculus Store).
Unity Components
This section gives a general overview of the Components provided by the Utilities package.
OVRCameraRig
OVRCameraRig is a Component that controls stereo rendering and head tracking. It maintains three child
"anchor" Transforms at the poses of the left and right eyes, as well as a virtual "center" eye that is halfway
between them.
This Component is the main interface between Unity and the cameras. It is attached to a prefab that makes it
easy to add comfortable VR support to a scene.
Important: All camera control should be done through this component. You should understand this script when
implementing your own camera control mechanism.
Updated Anchors Allows clients to filter the poses set by tracking. Used to modify or ignore positional
tracking.
GameObject Structure
TrackingSpace A GameObject that defines the reference frame used by tracking. You can move this
relative to the OVRCameraRig for use cases in which the rig needs to respond to
tracker input. For example, OVRPlayerController changes the position and rotation of
TrackingSpace to make the character controller follow the yaw of the current head pose.
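The Updated Anchors hook described above can be sketched as follows (a rough illustration, assuming the UpdatedAnchors event passes the rig to its handler; the class name and the counter-translation trick are assumptions):

```csharp
using UnityEngine;

// Sketch: subscribes to OVRCameraRig's UpdatedAnchors callback, which runs
// after tracking updates the anchors each frame. This is the place to
// inspect or override the tracked poses.
public class AnchorFilter : MonoBehaviour
{
    void Start()
    {
        OVRCameraRig rig = GetComponent<OVRCameraRig>();
        rig.UpdatedAnchors += OnUpdatedAnchors;
    }

    void OnUpdatedAnchors(OVRCameraRig rig)
    {
        // Example: counter-translate TrackingSpace to ignore positional
        // tracking while keeping head orientation (a rough illustration).
        rig.trackingSpace.localPosition = -rig.centerEyeAnchor.localPosition;
    }
}
```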
OVRManager
OVRManager is the main interface to the VR hardware. It is a singleton that exposes the Oculus SDK to Unity,
and includes helper functions that use the stored Oculus variables to help configure camera behavior.
This component is added to the OVRCameraRig prefab. It can be part of any application object. However, it
should only be declared once, because it includes public members that allow for changing certain values in the
Unity Inspector.
Monoscopic                      If true, rendering will try to optimize for a single viewpoint rather than rendering once for each eye. Not supported on all platforms.
Queue Ahead (Deprecated)        When enabled, distortion rendering work is submitted a quarter-frame early to avoid pipeline stalls and increase CPU-GPU parallelism.
Use Recommended MSAA Level      When enabled, Unity will use the optimal antialiasing level for quality/performance on the current hardware.
Enable Adaptive Resolution      Enable to configure app resolution to scale down as GPU utilization exceeds 85%, and to scale up as it falls below 85% (range 0.5 - 2.0; 1 = normal density). Requires Unity 5.4 or later.
Max Render Scale                Sets the maximum bound for Adaptive Resolution (default = 1.0).
Min Render Scale                Sets the minimum bound for Adaptive Resolution (default = 0.7).
Tracking Origin Type            Set to Eye Level to track position and orientation relative to the HMD's position. Set to Floor Level to track position and orientation relative to the floor, based on the user's standing height as specified in the Oculus Configuration Utility. Default is Eye Level.
Use Position Tracking           When disabled, the IR tracker is ignored and head position is inferred from the current rotation using the head model.
Use IPD in Position Tracking    If enabled, the distance between the user's eyes will affect the position of each OVRCameraRig's cameras.
Reset Tracker On Load           When disabled, subsequent scene loads will not reset the tracker. This keeps the tracker orientation the same from scene to scene, and keeps magnetometer settings intact.
Helper Classes
In addition to the above components, your scripts can always access the HMD state via static members of
OVRManager.
OVRTracker Provides the pose, frustum, and tracking status of the infrared tracking sensor.
Rift Recentering
OVRManager.display.RecenterPose() recenters the head pose and the tracked controller pose, if
present (see OVRInput on page 22 for more information on tracking controllers).
Recenter requests are passed to the Oculus C API. For a more detailed description of what happens
subsequently, please see VR Focus Management in our PC SDK Developer Guide.
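The recentering call named above can be wired to user input with a short script (the key binding and class name here are illustrative assumptions):

```csharp
using UnityEngine;

// Sketch: recenters the head (and any tracked controller) pose when the
// user presses the R key, using the OVRManager API named above.
public class RecenterOnKey : MonoBehaviour
{
    void Update()
    {
        if (Input.GetKeyDown(KeyCode.R))
        {
            // Resets the tracking origin to the current head pose.
            OVRManager.display.RecenterPose();
        }
    }
}
```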
Utilities
OVRPlayerController contains a few variables attached to sliders that change the physics
properties of the controller. These include Acceleration (how fast the player increases
speed), Dampening (how fast the player decreases speed when movement input is not
activated), Back and Side Dampen (how much to reduce side and back Acceleration),
Rotation Amount (the amount in degrees per frame to rotate the user about the Y axis),
and Gravity Modifier (how fast the player accelerates downward when in the air). When
HMD Rotates Y is set, the actual Y rotation of the cameras will set the Y rotation value
of the parent transform that it is attached to.
OVRGridCube      OVRGridCube is a helper class that shows a grid of cubes when activated. Its main
purpose is to indicate the ideal center location for the user's eye position, which is
especially useful when positional tracking is activated. The cubes change color to red
when positional data is available, and remain blue when position tracking is not
available (or change back to blue if vision is lost).
Sample Scenes
Trivial          An empty scene with one cube and a plain Unity camera.
These scripts for assisting with mobile development are located in Assets/OVR/Scripts/:
OVROverlay.cs        Add to an object with a Quad mesh filter to have the quad rendered as a TimeWarp
overlay instead of being drawn into the eye buffer.
OVRPlatformMenu.cs   Helper component for detecting a Back Key long-press to bring up the Universal Menu
and a Back Key short-press to bring up the Confirm-Quit to Home Menu. Additionally
implements a Wait Timer for displaying Long Press Time. For more information on
interface guidelines and requirements, please review Interface Guidelines and Universal
Menu in the Mobile SDK documentation.
These simple scripts for assisting with mobile development are located in Assets/OVR/Scripts/Util:
OVRChromaticAberration.cs    Drop-in component for toggling chromatic aberration correction on and off for Android.
OVRDebugGraph.cs Drop-in component for toggling the TimeWarp debug graph on and off. Information
regarding the TimeWarp Debug Graph may be found in the TimeWarp technical note in
the Mobile SDK documentation.
OVRModeParms.cs Example code for de-clocking your application to reduce power and thermal load as
well as how to query the current power level state.
OVRMonoscopic.cs Drop-in component for toggling Monoscopic rendering on and off for Android.
See our Oculus Utilities for Unity Reference Manual for a more detailed look at these and other C# scripts.
Undocumented scripts may be considered internal, and should generally never be modified.
Note: SDK Examples has been replaced by the Unity Sample Framework - we recommend you use that
instead. See Unity Sample Framework on page 40 for more information.
The Examples include sample scenes, scripts that allow you to toggle features including chromatic aberration
correction, TimeWarp debug graph, and monoscopic rendering, and samples that illustrate typical
implementations of touchpad input, volume control, and more.
To download, select Platform: Game Engines from our Downloads page here: https://developer.oculus.com/
downloads/
To import SDKExamples into Unity, begin by creating a new, empty project. Then select Assets > Import
Package > Custom Package... and select SDKExamples.unityPackage to import the assets into your project.
Alternately, you can locate the SDKExamples.unityPackage and double-click to launch, which will have the
same effect.
Once imported, replace your Unity project's ProjectSettings folder with the ProjectSettings folder included with
SDKExamples.
Note: If you don't replace the ProjectSettings folder, imported scenes will show console errors.
30Hz_Sample An example of how to set the TimeWarp vsync rate to support 30Hz apps, as well
as how to enable Chromatic Aberration Correction and Monoscopic Rendering
for Android. For more information on 30Hz TimeWarp and Chromatic Aberration
Correction for Android, please review the TimeWarp technical note in the Mobile SDK
documentation.
Crosshair_Sample An example of how to use a 3D cursor in the world with three different modes.
GlobalMenu_Sample An example demonstrating Back Key long-press action and the Universal Menu.
Additionally demonstrates a gaze cursor with trail. For more information on Interface
Guidelines and requirements, please review the following documents: Interface
Guidelines and Universal Menu in the Mobile SDK documentation.
LayeredCameras Illustrates how to render nearby and far-away content with separate cameras.
Menu_Sample An example demonstrating a simple in-game menu activated by Back Key short-press
action. The menu also uses the Battery Level API for displaying the current battery level
and temperature.
SaveState_Sample An example demonstrating saving the state of the game on pause and loading it on
resume. Click on the objects in the scene to change their color. When you run the scene
again, the objects should be in the color you had selected before exiting.
Startup_Sample An example of a quick, comfortable VR app loading experience utilizing a black splash
screen, VR enabled logo scene, and an async main level load. For more information
on interface guidelines, please review Interface Guidelines and Universal Menu in the
Mobile SDK documentation.
Crosshair3D.cs Detailed code for how to create judder-free crosshairs tied to the camera view.
MoviePlayerSample.cs Example code and documentation for how to play an in-game video on a textured quad
using Android MediaPlayer (mobile) or Unity's native media rendering (PC).
StartupSample.cs Example code for loading a minimal-scene on startup while loading the main scene in
the background.
TimeWarp30HzSample.cs Example code for setting up TimeWarp to support 30Hz apps as well as toggling
Chromatic Aberration Correction and Monoscopic Rendering on and off.
OVRInput
OVRInput exposes a unified input API for multiple controller types. It may be used to query virtual or raw
controller state, such as buttons, thumbsticks, triggers, and capacitive touch data. It currently supports
the Oculus Touch, Microsoft Xbox controllers, and the Oculus Remote on desktop platforms. Gamepads
compatible with Samsung Gear VR, such as the Samsung EI-GP20 and Moga Pro, must be Android compatible
and support Bluetooth 3.0. For more details on supported mobile gamepad features, see System and Hardware
Requirements in our Mobile SDK documentation.
When used with tracked controllers such as Oculus Touch, OVRInput provides position and orientation data
through GetLocalControllerPosition() and GetLocalControllerRotation(), which return a
Vector3 and Quaternion, respectively.
Controller poses are returned by the constellation tracking system and are predicted simultaneously with
the headset. These poses are reported in the same coordinate frame as the headset, relative to the initial
center eye pose, and may be used for rendering hands or objects in the 3D world. They are also reset by
OVRManager.display.RecenterPose(), similar to the head and eye poses.
OVRInput provides control of haptic vibration feedback on compatible controllers. For example,
SetControllerVibration() sets vibration frequency and amplitude. SetControllerVibration()
support for Oculus Touch is now deprecated; please use OVRHaptics on page 29 instead.
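A minimal sketch of gamepad vibration using the SetControllerVibration call named above (the frequency/amplitude values, timing approach, and class name are illustrative assumptions):

```csharp
using UnityEngine;

// Sketch: briefly vibrates the active controller. Values are examples only.
public class RumbleOnPress : MonoBehaviour
{
    void Update()
    {
        if (OVRInput.GetDown(OVRInput.Button.One))
        {
            // Start vibration at half frequency and full amplitude...
            OVRInput.SetControllerVibration(0.5f, 1.0f);
            // ...and stop it shortly afterwards.
            Invoke("StopVibration", 0.2f);
        }
    }

    void StopVibration()
    {
        OVRInput.SetControllerVibration(0f, 0f);
    }
}
```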
For keyboard and mouse control, we recommend using the UnityEngine.Input scripting API (see Unity's Input
scripting reference for more information).
Mobile input bindings are automatically added to InputManager.asset if they do not already exist.
For more information, see OVRInput in the Unity Developer Reference. Unity's input system and Input Manager
are documented here: http://docs.unity3d.com/Manual/Input.html and http://docs.unity3d.com/ScriptReference/Input.html.
Note: The term Touch in OVRInput refers to actual Oculus Touch controllers.
See OVRTouchpad.cs in Assets/OVR/Scripts for our interface class to the touchpad. The Gear VR HMD
touchpad is not currently exposed by OVRInput.
OVRInput Usage
The primary usage of OVRInput is to access controller input state through Get(), GetDown(), and GetUp().
Control Enumerations
OVRInput.Button
OVRInput.Touch
OVRInput.NearTouch
OVRInput.Axis1D
OVRInput.Axis2D
OVRInput.RawButton
OVRInput.RawTouch
OVRInput.RawNearTouch
OVRInput.RawAxis1D
OVRInput.RawAxis2D
The first set of enumerations provides a virtualized input mapping that is intended to assist developers with
creating control schemes that work across different types of controllers. The second set of enumerations
provides raw unmodified access to the underlying state of the controllers. We recommend using the first set of
enumerations, since the virtual mapping provides useful functionality, as demonstrated below.
More on Controls
Example Usage:
// returns true if the primary button (typically A) was pressed this frame.
OVRInput.GetDown(OVRInput.Button.One);
// returns a Vector2 of the primary (typically the left) thumbstick's current state.
// (X/Y range of -1.0f to 1.0f)
OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick);
// returns true if the primary thumbstick has been moved upwards more than halfway.
// (Up/Down/Left/Right - interpret the thumbstick as a D-pad).
OVRInput.Get(OVRInput.Button.PrimaryThumbstickUp);
// returns a float of the secondary (typically the right) index finger trigger's current state.
// (range of 0.0f to 1.0f)
OVRInput.Get(OVRInput.Axis1D.SecondaryIndexTrigger);
// returns true if the left index finger trigger has been pressed more than halfway.
// (interpret the trigger as a button).
OVRInput.Get(OVRInput.RawButton.LIndexTrigger);
// returns true if the secondary gamepad button, typically B, is currently touched by the user.
OVRInput.Get(OVRInput.Touch.Two);
In addition to specifying a control, Get() also takes an optional controller parameter. The list of supported
controllers is defined by the OVRInput.Controller enumeration (for details, refer to OVRInput in the Unity
Developer Reference).
Specifying a controller can be used if a particular control scheme is intended only for a certain controller type.
If no controller parameter is provided to Get(), the default is to use the Active controller, which corresponds
to the controller that most recently reported user input. For example, a user may use a pair of Oculus Touch
controllers, set them down, and pick up an Xbox controller, in which case the Active controller will switch to
the Xbox controller once the user provides input with it. The current Active controller can be queried with OVRInput.GetActiveController(), and a bitmask of all connected controllers can be queried with OVRInput.GetConnectedControllers().
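For example, the Active controller and the connected-controller bitmask might be checked as follows (a sketch; Controller values are bit flags):

```csharp
// Query the Active controller and the set of connected controllers.
OVRInput.Controller active = OVRInput.GetActiveController();
OVRInput.Controller connected = OVRInput.GetConnectedControllers();

// Controller values are bit flags, so test membership with a mask.
bool touchConnected =
    (connected & OVRInput.Controller.Touch) == OVRInput.Controller.Touch;
```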
Example Usage:
// returns a float of the hand trigger's current state on the Left Oculus Touch controller.
OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, OVRInput.Controller.Touch);
// returns a float of the hand trigger's current state on the Right Oculus Touch controller.
OVRInput.Get(OVRInput.Axis1D.SecondaryHandTrigger, OVRInput.Controller.Touch);
Note that the Oculus Touch controllers may be specified either as the combined pair (with
OVRInput.Controller.Touch), or individually (with OVRInput.Controller.LTouch and RTouch). This
is significant because specifying LTouch or RTouch uses a different set of virtual input mappings that allow
more convenient development of hand-agnostic input code. See the virtual mapping diagrams in Touch Input
Mapping for an illustration.
Example Usage:
// returns a float of the hand trigger's current state on the Left Oculus Touch controller.
OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, OVRInput.Controller.LTouch);
// returns a float of the hand trigger's current state on the Right Oculus Touch controller.
OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, OVRInput.Controller.RTouch);
This can be taken a step further to allow the same code to be used for either hand by specifying the controller
in a variable that is set externally, such as on a public variable in the Unity Editor.
Example Usage:
// public variable that can be set to LTouch or RTouch in the Unity Inspector
public OVRInput.Controller controller;
// returns a float of the hand trigger's current state on the Oculus Touch controller
// specified by the controller variable.
OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, controller);
// returns true if the primary button (A or X) is pressed on the Oculus Touch controller
// specified by the controller variable.
OVRInput.Get(OVRInput.Button.One, controller);
This is convenient since it avoids the common pattern of if/else checks for Left/Right hand input mappings.
Raw Mapping
The raw mapping directly exposes the Touch controllers. The layout of the Touch controllers closely matches
the layout of a typical gamepad split across the Left and Right hands.
Raw Mapping
The raw mapping directly exposes the Xbox controller.
OVRHaptics
This guide reviews OVRHaptics and OVRHapticsClip, two C# scripts that programmatically control haptics
feedback for the Oculus Touch controller.
Haptics Clips
Haptics clips specify the data used to control haptic vibrations in Touch controllers.
Vibrations are specified by an array of bytes, or samples, each specifying a vibration strength from 0 to 255. This data can be sent to the left and right Touch controllers independently, which process the amplitudes at a sample rate of 320 Hz. The duration of the vibration is determined by the number of bytes sent to the device.
Haptics clips may be created in different ways, depending on your needs. For example, you may manually
create a clip with a pre-allocated fixed size buffer, and then write in bytes procedurally. This allows you to
generate vibrations on a frame-by-frame basis.
The OVRHaptics class is used to produce the actual vibrations. It defines a LeftChannel and a RightChannel.
You can also access these channels through the aliased Channels property, where Channels[0] maps to
LeftChannel, and Channels[1] maps to RightChannel. This alias is useful when using a variable for the channel index in a script that can be associated with either hand.
Once you have selected a haptics channel, you may perform four operations with the following
OVRHapticsChannel member functions:
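As a sketch, assuming the channel API shipped with the Utilities (the method names Preempt, Queue, Mix, and Clear are taken from the OVRHaptics script; verify them against your Utilities version):

```csharp
// Build a one-second clip of constant mid-strength vibration
// (320 samples at 320 Hz; the capacity constructor is assumed).
OVRHapticsClip clip = new OVRHapticsClip(320);
for (int i = 0; i < 320; i++)
{
    clip.WriteSample((byte)128);
}

var channel = OVRHaptics.RightChannel;
channel.Preempt(clip); // flush anything pending and play this clip immediately
channel.Queue(clip);   // append the clip after any pending samples
channel.Mix(clip);     // blend the clip into any pending samples
channel.Clear();       // stop vibration and flush the channel
```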
See our Developer Reference for API documentation and details on the relevant classes and members.
OVRHapticsClip reads in an audio clip, downsamples the audio data to a sequence of bytes with the expected sample rate and amplitude range, and feeds that data into the clip's internal amplitude buffer.
We generally recommend AudioClip-generated haptics clips for static sound effects such as gunshots or
music that do not vary at runtime. However, you may wish to write your own code to pipe the audio output of an audio source in realtime to an OVRHapticsClip, allowing near-realtime conversion of audio into corresponding haptics data.
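For instance, a gunshot effect might pair an AudioClip with a haptics clip derived from it. This is a sketch; the class and field names are hypothetical, and it assumes an OVRHapticsClip(AudioClip) constructor as described above:

```csharp
using UnityEngine;

public class GunshotFeedback : MonoBehaviour
{
    public AudioClip gunshotSound; // assigned in the Inspector (hypothetical field)
    private OVRHapticsClip gunshotHaptics;

    void Start()
    {
        // Downsamples the audio data to the expected haptics sample rate and range.
        gunshotHaptics = new OVRHapticsClip(gunshotSound);
    }

    public void Fire()
    {
        AudioSource.PlayClipAtPoint(gunshotSound, transform.position);
        OVRHaptics.RightChannel.Preempt(gunshotHaptics);
    }
}
```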
Best Practices
It is important to keep your sample pipeline at around the right size. Assuming a haptics frequency of 320 Hz and an application frame rate of 90 Hz, we recommend targeting a buffer size of around 10 samples. This allows you to queue 3-4 samples per frame, while preserving a buffer zone to account for any asynchronous
interruptions. The more bytes you queue, the safer you are from interruptions, but you add additional latency
before newly queued vibrations will be played.
Note: For use with Oculus Touch only.
VR Compositor Layers
OVROverlay is a script in OVR/Scripts that allows you to render Game Objects as VR Compositor Layers instead
of drawing them to the eye buffer.
OVROverlay
Game Objects rendered as VR compositor layers render at the frame rate of the compositor instead of
rendering at the application frame rate. They are less prone to judder, and they are raytraced through the
lenses, improving the clarity of textures displayed on them. This is useful for displaying easily-readable text.
Quadrilateral compositor layers are currently supported on the Rift and mobile, while cubemap and cylinder compositor layers are currently available on mobile only.
All layer types support both stereoscopic and monoscopic rendering, though stereoscopic rendering only
makes sense for cubemaps in most cases. Stereoscopically-rendered overlays require two textures, specified by
setting Size to 2 in the Textures field of OVROverlay in the Inspector.
Gaze cursors and UIs are good candidates for rendering as quadrilateral compositor layers. Cylinders may be
useful for smooth-curve UI interfaces. Cubemaps may be used for startup scenes or skyboxes.
We recommend using a cubemap compositor layer for your loading scene, so it will always display at a steady
minimum frame rate, even if the application performs no updates whatsoever.
Applications may add up to three compositor layers to a scene. You may use no more than one cylinder and one cubemap compositor layer per scene.
Note that if a compositor layer fails to render (e.g., you attempt to render more than three compositor layers),
only quads will currently fall back and be rendered as scene geometry. Cubemaps and cylinders will not display
at all, but similar results can be achieved with scene geometry such as Unity's Skybox component or Cylinder
MeshFilter.
You may use OVRRTOverlayConnector to render textures to a compositor layer. See OVRRTOverlayConnector
below for more information.
The display order of compositor layers is determined by two factors:
1. Whether objects are rendered in front of or behind the scene geometry rendered to the eye buffer, and
2. The sequence in which the compositor layers are enabled in the scene.
By default, VR compositor layers are displayed as overlays in front of the eye buffer. To place them behind the
eye buffer, set Current Overlay Type to Underlay in the Inspector. Note that underlay compositor layers are
more bandwidth-intensive, as the compositor must punch a hole in the eye buffer with an alpha mask so
that underlays are visible. Texture bandwidth is often a VR bottleneck, so use them with caution and be sure to
assess their impact on your application.
Underlays depend on the alpha channel of the render target. If a scene object that should occlude an underlay
is opaque, set its alpha to 1. If the occluder is transparent, you must use the OVRUnderlayTransparentOccluder
shader provided in the Utilities in Assets/OVR/Shaders. Overlays do not require any special handling for
transparency.
Compositor layers are depth ordered by the sequence in which they are enabled in the scene, but the order is
reversed for overlays and underlays. Underlays should be enabled in the scene in the sequence in which you
want them to appear, enabling the underlays in front first and the layers in the back last. Overlays should be
enabled in the opposite order.
Basic Usage
Example
In this example, most of the scene geometry is rendered to the eye buffer. The application adds a gaze cursor
as a quadrilateral monoscopic overlay and a skybox as a monoscopic cubemap underlay behind the scene.
Note the dotted sections of the eye buffer, indicating where OVROverlay has punched a hole to make the
skybox visible behind scene geometry.
In this scene, the quad would be set to Current Overlay Type: Overlay and the cubemap would be set to
Current Overlay Type: Underlay. Both would be disabled, then the quad overlay enabled, then the skybox
enabled.
Note that if the cubemap in our scene were transparent, we would need to use the
OVRUnderlayTransparentOccluder, which is required for any underlay with alpha less than 1. If it were
stereoscopic, we would need to specify two textures and set Size to 2.
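The enable sequence from this example might be scripted as follows (a sketch; the object references are hypothetical, and each object is assumed to carry an OVROverlay component configured in the Inspector):

```csharp
using UnityEngine;

public class LayerEnableOrder : MonoBehaviour
{
    public GameObject gazeCursorQuad; // Current Overlay Type: Overlay
    public GameObject skyboxCubemap;  // Current Overlay Type: Underlay

    void Start()
    {
        // Both start disabled; enable the quad overlay first, then the skybox.
        gazeCursorQuad.SetActive(false);
        skyboxCubemap.SetActive(false);
        gazeCursorQuad.SetActive(true);
        skyboxCubemap.SetActive(true);
    }
}
```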
The center of a cylinder overlay Game Object is used as the cylinder's center. The dimensions of the cylinder are encoded in transform.scale as follows:
Only half of the cylinder may be displayed, so the arc angle must be smaller than 180 degrees.
OVRRTOverlayConnector
OVRRTOverlayConnector is a helper class in OVR/Scripts/Util used to link a Render Texture to an OVROverlay
Game Object. Attach this script to your camera object, and specify your overlay owner object in Ovr Overlay
Obj.
The overlay camera must use Render Texture, and must be rendered before the Main Camera (e.g., using
camera depth), so the Render Texture will be available before being used.
OVRRTOverlayConnector triple-buffers the render results before sending them to the overlay, which is a
requirement for time warping a render target. It also clears the Render Texture's border to alpha = 0 to avoid artifacts on mobile.
For more information, see "OVRRTOverlayConnector" in our Unity Developer Reference.
OVRBoundary
OVRBoundary exposes an API for interacting with the Oculus Guardian System.
Note: The Guardian System is not yet supported by public versions of the Oculus runtime.
During Touch setup, users define an interaction area by drawing a perimeter called the Outer Boundary in
space with the controller. An axis-aligned bounding box called the Play Area is calculated from this perimeter.
When tracked devices approach the Outer Boundary, the Oculus runtime automatically provides visual cues to
the user demarcating the Outer Boundary. This behavior may not be disabled or superseded by applications,
though the Guardian System visualization may be disabled via user configuration in the Oculus App.
Possible use cases include pausing the game if the user leaves the Play Area, placing geometry in the world
based on boundary points to create a natural integrated barrier with in-scene objects, disabling UI when the
boundary is being rendered to avoid visual discomfort, et cetera.
Basic Use
Two boundary types are defined: BoundaryType.OuterBoundary and BoundaryType.PlayArea.
Applications may query the location of nodes relative to the Outer Boundary or Play Area by using
OVRBoundary.BoundaryTestResult TestNode(), which takes the node and boundary type as
arguments.
Applications may also query arbitrary points relative to the Play Area or Outer Boundary using
OVRBoundary.BoundaryTestResult TestPoint(), which takes the point coordinates in the tracking
space as a Vector3 and boundary type as arguments.
Results are returned as a struct called OVRBoundary.BoundaryTestResult, which includes the following
members:
IsTriggering (bool): true if the node or point triggers the queried boundary type.
ClosestDistance (float): distance between the node or point and the closest point of the test area.
ClosestPoint (Vector3): the location in tracking space of the closest boundary point to the queried node or point.
ClosestPointNormal (Vector3): the normal of the boundary point that is closest to the queried node or point.
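A sketch of both queries (this assumes the boundary API is reached through OVRManager.boundary and that tracked nodes are identified by OVRBoundary.Node values; verify against your Utilities version):

```csharp
using UnityEngine;

public class BoundaryCheck : MonoBehaviour
{
    void Update()
    {
        // Test a tracked node against the Outer Boundary.
        OVRBoundary.BoundaryTestResult nodeResult = OVRManager.boundary.TestNode(
            OVRBoundary.Node.HandRight, OVRBoundary.BoundaryType.OuterBoundary);

        // Test an arbitrary tracking-space point against the Play Area.
        OVRBoundary.BoundaryTestResult pointResult = OVRManager.boundary.TestPoint(
            Vector3.zero, OVRBoundary.BoundaryType.PlayArea);

        if (nodeResult.IsTriggering)
        {
            Debug.Log("Right hand boundary distance: " + nodeResult.ClosestDistance);
        }
    }
}
```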
Applications may request that the boundary visualization be shown or hidden with OVRBoundary.SetVisible(). Note that the Oculus runtime retains final control of the boundary display, and setting the visibility to true will fail if the user has disabled the visual display of the boundary system.
Applications may query the current state of the boundary system using OVRBoundary.GetVisible().
Additional Features
You may set the boundary color of the automated Guardian System visualization using
OVRBoundary.SetLookAndFeel(). Alpha is unaffected. Use ResetLookAndFeel() to reset.
OVRBoundary.GetGeometry() returns an array of up to 256 points that define the Boundary Area or Play
Area in clockwise order at floor level. You may query the dimensions of a Boundary Area or Play Area using
OVRBoundary.GetDimensions(), which returns a Vector3 containing the width, height, and depth in
tracking space units, with height always returning 0.
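A brief sketch of these queries (again assuming access through OVRManager.boundary):

```csharp
// Width, height (always 0), and depth of the Play Area in tracking space units.
Vector3 dimensions = OVRManager.boundary.GetDimensions(
    OVRBoundary.BoundaryType.PlayArea);

// Up to 256 floor-level points outlining the Outer Boundary, in clockwise order.
Vector3[] geometry = OVRManager.boundary.GetGeometry(
    OVRBoundary.BoundaryType.OuterBoundary);
```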
Cubemap Screenshots
The OVR Screenshot Wizard allows you to easily export a 360 screenshot in cubemap format.
Cubemap previews may be submitted with applications to provide a static in-VR preview for the Oculus Store.
For more information, see Oculus Store Art Guidelines (PDF).
You may also use OVRCubemapCaptureProbe to take a 360 screenshot from a running Unity app (see Prefabs on page 13 for more information).
Basic Usage
When you import the Oculus Utilities OVRScreenshotWizard into your project, it will add a new Tools pull-down
menu to your menu bar. Select Tools > Oculus > OVR Screenshot Wizard to launch the tool.
By default, the screenshot will be taken from the perspective of your Main Camera. To set the perspective to a
different position, assign any Game Object to the Render From field in the Wizard and click Render Cubemap
to save.
The generated cubemap may be saved either as a Unity Game Object, or as a horizontal 2D atlas texture in
PNG or JPG format with the following face order (Horizontal left to right): +x, -x, +y, -y, +z, -z.
Options
Render From: You may use any Game Object as the "camera" that defines the position from which the
cubemap will be captured.
To assign a Game Object to function as the origin perspective, select any instantiated Game Object in the
Hierarchy View and drag it here to set it as the rendering position in the scene. You may then position the
Game Object anywhere in the scene.
If you do not specify a Game Object in this field, the screenshot will be taken from the Main Camera.
Note: If the Game Object extends into the visible area of the scene, it will be included in the capture. This may be useful if you wish to lock art to the origin point, e.g., to show the scene as viewed from inside a cage. If you do not want the Game Object to be visible, be sure to use a simple object like a cube or a sphere, or simply use the scene Main Camera.
Size: Sets the resolution for each "tile" of the cubemap face. For submission to the Oculus Store, select 2048
(default, see Oculus Store Art Guidelines for more details).
Save Mode
Cube Map Folder: The directory where OVR Screenshot Wizard creates the Unity format Cubemap. The path must be under the root asset folder "Assets".
Texture Format: Sets the image format of 2D atlas texture (PNG or JPEG).
Unity Sample Framework
The Unity Sample Framework can guide developers in producing reliable, comfortable applications and
avoiding common mistakes. It is available as a Unity project for developers who wish to examine how the
sample scenes were implemented, and as binaries for the Rift and Gear VR for developers to explore the
sample scenes entirely in VR. It is available as a separate download from our Downloads Center.
The Unity Sample Framework requires Unity v5.3 or later. Please check Compatibility and Requirements for up-to-date version recommendations.
Sample Scenes
In the Unity project, the following scenes are found in /Assets/SampleScenes:
A Note on Comfort
These samples are intended to be tools for exploring design ideas in VR, and should not necessarily be
construed as design recommendations. The Sample Framework allows you to set some parameters to values
that will reliably cause discomfort in most users - they are available precisely to give developers an opportunity
to find out how much is too much.
It is also important to play test your game on a range of players throughout development to ensure your game is a comfortable experience. We have provided in-game warnings to alert you to potentially uncomfortable scenes.
Version Compatibility
Oculus Rift Executable: Sample Framework versions 1.3 and later are compatible with the Oculus Rift CV1
runtime.
Gear VR Executable: Download the latest version of the Sample Framework from the Concepts section of
the Oculus Store to be sure you're up to date.
Unity Project File: Check the Sample Framework Unity Project File Release Notes and Compatibility and
Requirements for information on which Unity 5 versions are compatible with the project file.
1. Verify that you have installed the latest recommended version of Unity 5 (see Compatibility and Requirements for up-to-date information).
2. Download the Unity Sample Framework Project File from our Downloads Center.
3. Copy the zip to the appropriate directory and extract its contents.
4. Launch the Unity Editor and open the Sample Framework project.
Note: You will need to enable running applications from unknown sources in the Oculus app settings.
Launch the Oculus app, and in the gear pull-down menu in the upper right, select Settings > General
and toggle Unknown Sources on to allow. You may wish to disable this setting after use for security
reasons.
1. Open the Sample Framework project as described above.
2. From the Editor menu bar, select OVR > Samples Build Config > Configure Rift Build.
3. Build and run the project normally.
We have provided a Windows executable for use with the Oculus Rift or DK2. A Gear VR version may be downloaded for free from the Concepts section of the Oculus Store. These applications are simply builds of the Unity project.
Navigation
Launch the Sample Framework on Rift or Gear VR to load the startup scene. You will see the Inspector, a three-
pane interface providing controls for scene settings, documentation, and navigation controls for browsing
to other scenes. Making selections on the top-level menu on the left panel changes the content of the other
two panels. The center panel is a contextual menu, and the right panel displays notes and instructions for the
current scene.
Inspector navigation is primarily gaze-controlled, supplemented by a mouse and keyboard, a gamepad (PC or
Gear VR), or the Gear VR touchpad.
To launch a scene from the center panel, you may select and click the scene with a mouse, gaze at the scene
name and press the A button on a gamepad, or tap the Gear VR touchpad.
Some scenes are grouped into folders (displayed as buttons). When browsing from a folder, select the ..
button to navigate one level up in the scene hierarchy.
Scrolling
Some panels support vertical scrolling. Several methods of scrolling are supported in order to illustrate some of
the available options for implementing this feature. The following methods are supported:
Build Settings
Click on File > Build Settings... and select one of the following:
For Windows, set Target Platform to Windows and set Architecture to either x86 or x86_64.
Note: Be sure to add any scenes you wish to include in your build to Scenes In Build.
In the Build Settings pop-up, select Build. If prompted, specify a name and location for the build.
If you are building on the same OS, the demo should start to run in full screen mode as a standalone application.
Quality Settings
You may notice that the graphical fidelity is not as high as the pre-built demo. You will need to change some
additional project settings to get a better looking scene.
Navigate to Edit > Project Settings > Quality. Set the values in this menu to the following:
The most important value to modify is Anti-aliasing. The anti-aliasing must be increased to compensate for the stereo rendering, which reduces the effective horizontal resolution by 50%. An anti-aliasing value of 2x is ideal; 4x may be used if you have performance to spare, and 8x usually isn't worth it.
Now rebuild the project again, and the quality should be at the same level as the pre-built demo.
PC builds create a single executable file that may be used in either Direct Display or Extended Display modes.
Android Manifest
The manifests of projects built with Unity's first-party VR support enabled are automatically updated during
build to meet our requirements (landscape orientation, vr_only, et cetera). All other values, such as Entitlement
Check settings, will not be modified. Do not add the noHistory attribute to your manifest.
Build Settings
From the File menu, select Build Settings. From the Build Settings menu, select Android as the
platform. Set Texture Compression to ASTC.
Player Settings
1. Click the Player Settings button and select the Android tab. In the Other Settings frame, select
Virtual Reality Supported. All required settings are enforced automatically, but you may wish to make
additional settings as appropriate, such as enabling Multithreaded Rendering and setting Graphics APIs to
OpenGLES3.
2. Select the Splash Image section. For Mobile Splash image, choose a solid black texture.
Note: Custom Splash Screen support is not available with Unity Personal edition.
Applications written for development are not launched through the Oculus Home menu system. Instead, build
the application directly to your phone, and you will be prompted to insert your phone into the Gear VR headset
to launch the application automatically.
To run the application in the future, remove your phone from the Gear VR headset, launch the application from
the phone desktop or Apps folder, and insert the device into the Gear VR when prompted to do so.
1. Copy an Oculus Signature File specific to your mobile device to the folder <project>/Assets/Plugins/
Android/assets/ or the application will not run. If this folder does not exist, go ahead and create it. See
"Create Your Signature Files" in the Oculus Mobile Submission Guidelines for more information.
2. Be sure the project settings from the steps above are saved with File > Save Project.
3. If you are not already connected to your phone via USB, connect now. Unlock the phone lock screen.
4. From the File menu, select Build Settings. While in the Build Settings menu, add the Main.scene to Scenes
in Build. Next, verify that Android is selected as your Target Platform and select Build and Run. If asked,
specify a name and location for the .apk.
The .apk will be installed and launched on your Android device.
General Tips
VR application debugging is a matter of getting insight into how the application is structured and executed,
gathering data to evaluate actual performance, evaluating it against expectation, then methodically isolating
and eliminating problems.
When analyzing or debugging, it is crucial to proceed in a controlled way so that you know specifically what
change results in a different outcome. Focus on bottlenecks first. Only compare apples to apples, and change
one thing at a time (e.g., resolution, hardware, quality, configuration).
Always be sure to profile, as systems are full of surprises. We recommend starting with simple code and optimizing as you go - don't try to optimize too early.
We recommend creating a 2D, non-VR version of your camera rig so you can swap between VR and non-VR
perspectives. This allows you to spot check your scenes, and it may be useful if you want to do profiling with
third-party tools (e.g., Adreno Profiler).
It can be useful to disable Multithreaded Rendering in Player Settings during performance debugging. This will
slow down the renderer, but also give you a clearer view of where your frame time is going. Be sure to turn it back on when you're done!
Performance Targets
Before debugging performance problems, establish clear targets to use as a baseline for calibrating your
performance.
These targets can give you a sense of where to aim, and what to look at if you're not making frame rate or are having performance problems.
Below you will find some general guidelines for establishing your baselines, given as approximate ranges unless
otherwise noted.
Mobile
60 FPS (required by Oculus)
50-100 draw calls per frame
50,000-100,000 triangles or vertices per frame
PC
90 FPS (required by Oculus)
500-1,000 draw calls per frame
1-2 million triangles or vertices per frame
Unity Profiler
Unity comes with a built-in profiler (see Unity's Profiler manual). The Unity Profiler provides per-frame performance metrics, which can be used to help identify bottlenecks.
PC Setup
To use Unity Profiler with a Rift application, select Development Build and Autoconnect Profiler in Build
Settings and build your application. When you launch your application, the Profiler will automatically open.
Mobile Setup
You may profile your application as it is running on your Android device using adb or Wi-Fi. For steps on
how to set up remote profiling for your device, please refer to the Android section of the following Unity
documentation: https://docs.unity3d.com/Documentation/Manual/Profiler.html.
The Unity Profiler displays CPU utilization for the following categories: Rendering, Scripts, Physics,
GarbageCollector, and Vsync. It also provides detailed information regarding Rendering Statistics, Memory
Usage (including a breakdown of per-object type memory usage), Audio and Physics Simulation statistics.
The Unity profiler only displays performance metrics for your application. If your app isn't performing as expected, you may need to gather information on what the entire system is doing.
In this mode, translucent colors accumulate, providing an overdraw heat map in which more saturated colors represent areas with the most overdraw.
To use this profiler, connect to your device over Wi-Fi using ADB over TCP/IP as described in the Wireless usage section of Android's adb documentation. Then run adb logcat while the device is docked in the headset.
See Unity's Measuring Performance with the Built-in Profiler for more information. For more on using adb and logcat, see Android Debugging in the Mobile SDK documentation.
Oculus Remote Monitor is available from our Downloads page. For more information about setup, features, and
use, see Oculus Remote Monitor in our Mobile SDK guide.
Feature Highlights
The Frame Buffer Viewer provides a mechanism for inspecting the frame buffer as the data is received in
real-time, which is particularly useful for monitoring play test sessions. When enabled, the Capture library
will stream a downscaled pre-distortion eye buffer across the network.
The Performance Data Viewer provides real-time and offline inspection of the following on a single,
contiguous timeline:
CPU/GPU events
Sensor readings
Console messages, warnings, and errors
Frame buffer captures
The Logging Viewer provides raw access to various messages and errors tracked by thread IDs.
Nearly any constant in your code may be turned into a knob that can be updated in real-time during a play
test.
Event Tracing for Windows (ETW) is a trace utility provided by Windows for performance analysis. GPUView provides a window into both GPU and CPU performance with DirectX applications. It is precise, has low overhead, covers the whole Windows system, and supports custom event manifests.
ETW profiles the whole system, not just the GPU. For a sample debug workflow using ETW to investigate
queuing and system-level contention, see Example Workflow: PC below.
Windows 10 introduces TraceLogging, which builds on ETW.
Systrace
Reports complete Android system utilization. Available here: http://developer.android.com/tools/help/systrace.html
NVIDIA NSight
NSight is a CPU/GPU debug tool for NVIDIA users, available in a Visual Studio version and an Eclipse version.
APITrace
https://apitrace.github.io/
Analyzing Slowdown
In this guide, we take a look at three of the areas commonly involved with slow application performance: pixel
fill, draw call overhead, and slow script execution.
Pixel Fill
Pixel fill is a function of overdraw and of fragment shader complexity. Unity shaders are often implemented
as multiple passes (draw diffuse part, draw specular part, and so forth). This can cause the same pixel to be
touched multiple times. Transparency does this as well. Your goal is to touch almost all pixels on the screen
only one time per frame.
Unity's Frame Debugger (described in Unity Profiling Tools on page 50) is very useful for getting a sense
of how your scene is drawn. Watch out for large sections of the screen that are drawn and then covered, or for
objects that are drawn multiple times (e.g., because they are touched by multiple lights).
Z-testing is faster than drawing a pixel. Unity does culling and opaque sorting via bounding box. Therefore,
large background objects (like your Skybox or ground plane) may end up being drawn first (because the
bounding box is large) and filling a lot of pixels that will not be visible. If you see this happen, you can move
those objects to the end of the queue manually. See Material.renderQueue in Unity's Scripting API Reference
for more information.
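A minimal sketch of this adjustment (the value is illustrative; 2000 is Unity's default opaque "Geometry" queue, and larger values draw later):

```csharp
using UnityEngine;

public class DrawBackgroundLate : MonoBehaviour
{
    void Start()
    {
        // Push this large background object later in the opaque queue so that
        // closer geometry fills the depth buffer first and its pixels are Z-rejected.
        GetComponent<Renderer>().material.renderQueue = 2100;
    }
}
```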
Frame Debugger will clearly show you shadows, offscreen render targets, et cetera.
Draw Calls
Modern PC hardware can push a lot of draw calls at 90 fps, but the overhead of each call is still high enough
that you should try to reduce them. On mobile, draw call optimization is your primary scene optimization.
Draw call optimization is usually about batching multiple meshes together into a single VBO with the same
material. This is key in Unity because the state change related to selecting a new VBO is relatively slow. If you
select a single VBO and then draw different meshes out of it with multiple draw calls, only the first draw call is
slow.
Unity batches well when given properly formatted source data. Generally:
Batching is only possible for objects that share the same material pointer.
Batching doesn't work on objects that have multiple materials.
Implicit state changes (e.g. lightmap index) can cause batching to end early.
Script Performance
Unity's C# implementation is fast, and slowdown from script is usually the result of a mistake and/or an
inadvertent block on slow external operations such as memory allocation. The Unity Profiler can help you find
and fix these scripts.
Try to avoid foreach, lambda, and LINQ structures, as these can allocate memory needlessly at runtime. Use a for loop instead. Also, be wary of loops that concatenate strings.
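For example, a plain for loop over a list avoids the enumerator and closure allocations that foreach and LINQ can introduce on Unity's scripting runtime (a sketch):

```csharp
using System.Collections.Generic;

List<int> values = new List<int> { 1, 2, 3 };
int sum = 0;
for (int i = 0; i < values.Count; i++)
{
    sum += values[i];
}
// sum is now 6
```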
GameObject creation and destruction takes time. If you have a lot of objects to create and destroy (say, several
hundred in a frame), we recommend pooling them.
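A minimal pooling sketch (hypothetical class; it reuses deactivated instances instead of calling Instantiate and Destroy every frame):

```csharp
using System.Collections.Generic;
using UnityEngine;

public class SimplePool : MonoBehaviour
{
    public GameObject prefab;
    private readonly Stack<GameObject> pool = new Stack<GameObject>();

    public GameObject Spawn(Vector3 position)
    {
        // Reuse a pooled instance if one is available; otherwise create one.
        GameObject go = pool.Count > 0 ? pool.Pop() : Instantiate(prefab);
        go.transform.position = position;
        go.SetActive(true);
        return go;
    }

    public void Despawn(GameObject go)
    {
        // Deactivate and keep the instance for later reuse.
        go.SetActive(false);
        pool.Push(go);
    }
}
```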
Don't move colliders unless they have a rigidbody on them. Creating a rigidbody and setting isKinematic
will stop physics from doing anything but will make that collider cheap to move. This is because Unity maintains
two collider structures, a static tree and a dynamic tree, and the static tree has to be completely rebuilt every
time any static object moves.
Note that coroutines execute in the main thread, and you can have multiple instances of the same coroutine
running on the same script.
We recommend targeting around 1-2 ms maximum for all Mono execution time.
PC Debug Workflow
In this guide, we'll use the example of a hypothetical stuttering app scene and walk through basic
debugging steps.
Where to Start
Begin by running the scene with the Oculus Performance HUD.
If the scene drops more than one frame every five seconds, check the render time. If it's more than 8 ms, have a
look at GPU utilization. Otherwise, look at optimizing CPU utilization. If observed latency is greater than 30 ms,
have a look at queuing.
If you find garbage collection spikes, avoid allocating memory each frame.
Check for hogs in your hierarchy or timeline view, such as any single object that takes 8 ms to render. The GPU
may also stall for long periods waiting on the CPU. Other potential problem areas are mesh rendering, shadows,
vsync, and subsystems.
Mobile Tips
Use Oculus Remote Monitor (Mobile) on page 52 for VRAPI, render times, and latency. Systrace shows CPU
queueing.
It is a common problem to see Gfx.WaitForPresent appear frequently in Oculus Remote Monitor. This reports
the amount of time the render pipeline is stalled, so begin troubleshooting by understanding how your scene is
assembled by Unity - the Unity Frame Debugger is a good starting place. See Unity Profiling Tools on page
50 for more information.
General Issues
Unity 5 hangs while importing assets from SDKExamples.
Be sure to delete any previously-imported Utilities packages from your Unity project before importing a new
version. If you are receiving errors and have not done so, delete the relevant folders in your project and re-
import Utilities. For more information, please see Importing the Oculus Utilities Package on page 9.
PC Issues
The app does not launch as a VR app.
Verify that you are using a compatible runtime - see Compatibility and Requirements for more details.
Ensure you have administrator rights to the system where you are installing the integration. Verify that the HMD
is plugged in and working normally, and that you have installed the Oculus runtime. Also verify that you have
not selected D3D 9 or Windows GL as the renderer (Legacy Integration only).
OS X Issues
Mac Tearing (Unity VR with Utilities)
Editor preview and standalone players do not vsync properly, resulting in a vertical tear and/or judder on DK2.
Android players are unaffected, even if built on a Mac.
Mobile
The app does not launch as a VR app.
Verify that you have selected Virtual Reality Enabled in Player Settings.
Contact Information
Questions?
Good performance is critical for all VR applications, but the limitations inherent to mobile development warrant
special consideration.
We recommend that you also review Design Guidelines and Mobile VR Design and Performance Guidelines in
the Mobile SDK documentation.
Design Considerations
Startup Sequence
For good VR experiences, all graphics should be rendered such that the user is always viewing a proper three-
dimensional stereoscopic image. Additionally, head-tracking must be maintained at all times. We recommend
considering using a cubemap overlay for your startup screen (see VR Compositor Layers on page 31), which will
render at a consistent frame rate even if the application is unavailable to update the scene.
An example of how to do this during application start up is demonstrated in the SDKExamples Startup_Sample
scene:
Solid black splash image is shown for the minimum time possible.
A small test scene with 3D logo and 3D rotating widget or progress meter is immediately loaded.
While the small start up scene is active, the main scene is loaded in the background.
Once the main scene is fully loaded, the start scene transitions to the main scene using a fade.
Universal Menu
Applications must handle the Back Key long-press action which launches the Universal Menu as well as the Back
Key short-press action which launches the Confirm-Quit to Home Menu and exits the current application,
returning to the Oculus Home application.
For sample implementation of Universal Menu support for mobile, have a look at GlobalMenu_Sample,
included in Oculus Mobile SDK Examples (available from our Downloads page). For more information, see our
Oculus Mobile SDK Examples guide.
See the class description of OVRPlatformMenu in our Unity Developer Reference for details about the relevant
public members.
Volume
The volume buttons are reserved, and volume adjustment on the Samsung device is handled automatically.
The volume control dialog is also handled automatically by the VrApi as of Mobile SDK 1.0.3. Be sure that you
don't implement your own volume handling display, or users will see two juxtaposed displays.
Best Practices
Be Batch Friendly. Share materials and use a texture atlas when possible.
Prefer lightmapped, static geometry.
Prefer lightprobes instead of dynamic lighting for characters and moving objects.
Bake as much detail into the textures as possible. E.g., specular reflections, ambient occlusion.
Only render one view per eye. No shadow buffers, reflections, multi-camera setups, et cetera.
Keep the number of rendering passes to a minimum. No dynamic lighting, no post effects, don't resolve
buffers, don't use grabpass in a shader, et cetera.
Avoid alpha tested / pixel discard transparency. Alpha-testing incurs a high performance overhead.
Replace with alpha-blended if possible.
Keep alpha blended transparency to a minimum.
Use Texture Compression. Favor ASTC.
Check the Disable Depth and Stencil checkbox in the Resolution and Presentation pane in Player Settings.
Recommendations
Be mindful of the total number of GameObjects and components your scenes use.
Model your game data and objects efficiently. You will generally have plenty of memory.
Minimize the number of objects that actually perform calculations in Update() or FixedUpdate().
Reduce or eliminate physics simulations when they are not actually needed.
Use object pools to respawn frequently-used effects or objects instead of allocating new ones at runtime.
Use pooled AudioSources versus PlayOneShot sounds, as the latter allocate a GameObject and destroy it
when the sound is done playing.
Avoid expensive mathematical operations whenever possible.
Cache frequently-used components and transforms to avoid lookups each frame.
Use the Unity Profiler to identify expensive code and track per-frame memory allocation and garbage collection.
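The component-caching recommendation above can be sketched as follows; the class name and component choices are illustrative assumptions, not a prescribed pattern:

```csharp
using UnityEngine;

public class CachedMover : MonoBehaviour
{
    // Looked up once in Awake instead of on every frame in Update.
    private Transform cachedTransform;
    private Rigidbody cachedBody;

    void Awake()
    {
        cachedTransform = transform;
        cachedBody = GetComponent<Rigidbody>();
    }

    void Update()
    {
        // Uses the cached references; no GetComponent call per frame.
        if (cachedBody != null && cachedBody.IsSleeping())
            return;
        cachedTransform.Translate(Vector3.forward * Time.deltaTime);
    }
}
```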
Rendering Optimization
Be conservative on performance from the start.
Keep draw calls down.
Be mindful of texture usage and bandwidth.
Keep geometric complexity to a minimum.
Be mindful of fillrate.
Unity provides several built-in features to help reduce draw calls such as batching and culling.
Static batching is used for objects that will not move, rotate or scale, and must be set explicitly per object. To
mark an object static, select the Static checkbox in the object Inspector.
Dynamic batching is used for moving objects and is applied automatically when objects meet certain criteria,
such as sharing the same material, not using real-time shadows, and not using multipass shaders. More
information on dynamic batching criteria may be found here: https://docs.unity3d.com/Documentation/Manual/
DrawCallBatching.html
Culling
Unity offers the ability to set manual per-layer culling distances on the camera via Per-Layer Cull Distance.
This may be useful for culling small objects that do not contribute to the scene when viewed from a given
distance. More information about how to set up culling distances may be found here: https://docs.unity3d.com/
Documentation/ScriptReference/Camera-layerCullDistances.html.
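As a minimal sketch, assuming your scene assigns small props to a dedicated layer (the layer name and cull distance below are illustrative assumptions):

```csharp
using UnityEngine;

// Culls objects on the "SmallProps" layer beyond 30 m while leaving all
// other layers at the camera's far clip plane.
public class SetupCullDistances : MonoBehaviour
{
    void Start()
    {
        Camera cam = GetComponent<Camera>();
        float[] distances = new float[32]; // one entry per layer; 0 means "use far plane"
        distances[LayerMask.NameToLayer("SmallProps")] = 30f;
        cam.layerCullDistances = distances;
    }
}
```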
Unity also has an integrated Occlusion Culling system. The advice to early VR titles is to favor modest scenes
instead of open worlds, and Occlusion Culling may be overkill for modest scenes. More information about
the Occlusion Culling system can be found here: http://blogs.unity3d.com/2013/12/02/occlusion-culling-in-
unity-4-3-the-basics/.
Texture Sizes: Favor texture detail over geometric detail, e.g., use high-resolution textures over more
triangles. We have a lot of texture memory, and it is pretty much free from a performance standpoint. That
said, textures from the Asset Store often come at resolutions which are wasteful for mobile. You can often
reduce the size of these textures with no appreciable difference.
Framebuffer Format: Most scenes should be built to work with a 16 bit depth buffer resolution.
Additionally, if your world is mostly pre-lit to compressed textures, a 16 bit color buffer may be used.
Screen Resolution: Calling Screen.SetResolution to render at a lower resolution may provide a sizeable
speedup for most Unity apps.
Verify model vert counts are mobile-friendly. Typically, assets from the Asset Store are high-fidelity and will
need tuning for mobile.
Unity Pro provides a built-in Level of Detail System (not available in Unity Free), allowing lower-resolution
meshes to be displayed when an object is viewed from a certain distance. For more information on how
to set up a LODGroup for a model, see the following: https://docs.unity3d.com/Documentation/Manual/
LevelOfDetail.html
Verify your vertex shaders are mobile friendly. And, when using built-in shaders, favor the Mobile or Unlit
version of the shader.
Bake as much detail into the textures as possible to reduce the computation per vertex: https://
docs.unity3d.com/430/Documentation/Manual/iphone-PracticalRenderingOptimizations.html
Be mindful of GameObject counts when constructing your scenes. The more GameObjects and Renderers in
the scene, the more memory consumed and the longer it will take Unity to cull and render your scene.
Verify your fragment shaders are mobile friendly. And, when using built-in shaders, favor the Mobile or Unlit
version of the shader.
Overdraw: Objects in the Unity opaque queue are rendered in front to back order using depth-testing to
minimize overdraw. However, objects in the transparent queue are rendered in a back to front order without
depth testing and are subject to overdraw.
Avoid overlapping alpha-blended geometry (e.g., dense particle effects) and full-screen post processing
effects.
62|Tutorial: Build a Simple VR Unity Game|Unity
This tutorial is intended to serve as a basic introduction for developers who are new to VR development and to Unity.
Once the necessary tools are set up, this process should take a few hours to complete. By the end, you will
have a working mobile application that you can play and demonstrate on your Oculus Rift or Gear VR device, to
the amazement of your friends and loved ones.
We will build and modify the Unity game Roll-a-ball to add VR capability. The game is controllable by keyboard
or by the Samsung EI-GP20 gamepad.
Requirements
Oculus Rift or Gear VR with compatible Samsung phone
Samsung EI-GP20 gamepad (required for Mobile; optional for Desktop)
PC running Windows 7, 8, or 10, or a Mac running OS X 10
Unity 5 (see Compatibility and Requirements for specific version recommendations)
You will also need to refer to the relevant Oculus SDK documentation, available for download here: https://
developer.oculus.com/documentation/
Desktop: Download and install the Oculus PC SDK and Utilities package from Oculus PC SDK Downloads.
Prepare for development as described in the Oculus Rift Getting Started Guide. By the time you have
completed this process, you should be able to run the Demo Scene as described in that guide.
Mobile: Download and install the Oculus Mobile SDK from Oculus Mobile SDK Downloads. Prepare for
development as described by the Mobile SDK Setup Guide. By the time you have completed this process,
you should be able to communicate with your Samsung phone via USB. To verify this, retrieve the device
ID from your phone by connecting via USB and sending the command adb devices from a command
prompt. If you are communicating successfully, the phone will return its device ID. You may wish to make a
note of it - you will need it later to request an Oculus Signature File (see step four in Modify Roll-a-ball for VR
for more information).
2. Install Unity.
Check which version of the Unity editor you should download and install in our Compatibility and Version
Requirements on page 5, then download the appropriate version here: http://docs.unity3d.com/Manual/
index.html. Unity provides extensive documentation to introduce users to the environment. You may wish
to begin by reviewing their documentation to gain a basic familiarity with core concepts such as the Editor,
GameObjects, prefabs, projects, and scenes.
3. Build the Unity Roll-a-ball application.
Unity provides a number of video tutorials that walk you through the process of creating a simple game. The
first in the series provides instructions for creating the Roll-a-ball application, in which you use the keyboard
or gamepad to control a ball that rolls around a game board and picks up floating token counters: http://
unity3d.com/learn/tutorials/projects/roll-a-ball
The development process is covered in eight short video tutorials which run from around five to fifteen
minutes in length. Allow for a few hours to complete the procedure.
The final video in the series, "107. Publishing the game," describes building the Roll-a-ball game for play
in a web browser. You may skip this lesson if you wish for the purposes of this exercise, as we will follow a
different procedure for building a playable application (PC/Mac) or APK (Android).
Note: We refer to the assets, folders, and so forth by the names used in the Unity tutorial, so it is
helpful to follow the names they use in their example.
4. Duplicate your Roll-a-ball project (optional).
Once you have completed building Roll-a-ball, you may wish to create a duplicate Roll-a-ball project
specifically for VR development. It can be useful to retain a backup of the original unmodified Roll-a-ball
project in case you make mistakes or wish to work with it later without the VR assets.
To duplicate the Roll-a-ball project, simply navigate in your OS to the Unity project folder containing your
Roll-a-ball project, copy the folder and all of its contents, and rename it. For this tutorial, we will use the
project folder name Roll-a-ball-VR.
5. Launch the new project and prepare the game scene.
1. Launch Unity and select File > Open Project... and select the project folder location for Roll-a-ball-VR in
order to launch the project.
2. In your Project tab, open Assets > _Scenes and select "MiniGame."
3. Press F2 and rename the scene "VRMiniGame."
4. Open the scene "VRMiniGame."
Select Main Camera in the Hierarchy view and set the Position fields of the Main Camera Transform in the
Unity Inspector to the following values: X = 0; Y = 10; Z = -15.
2. Rotate camera forward for a better view.
Set the Rotation field of the OVRCameraRig Transform to the following value: X = 35; Y = 0; Z = 0.
Enter Play mode by pressing the play button. The Unity Game View preview will show the image
corresponding to the left eye buffer. If you are using the PC SDK, you will see the Health and Safety
Warning appear over the game; press any key to continue past it.
Go to Edit > Project Settings > Player and select PC or Android as appropriate. In the Other Settings pane,
check the Virtual Reality Supported checkbox.
5. Sign your application (Mobile Only).
To access your Samsung phone's VR capabilities, you will need to sign your application with an Oculus
Signature File (osig). If you recorded your device ID earlier, you may use it now to request your osig file.
Note that you need only one osig per mobile device.
You may obtain an osig from our self-service portal here: https://dashboard.oculus.com/tools/osig-
generator/. Once you have received an osig, copy it to your Unity project folder in /Roll-a-ball-VR/Assets/
Plugins/Android/assets/.
More information may be found on application signing in "Creating Your Signature File" in the Mobile App
Preparation and Submission Guidelines.
Play
Go ahead and try it out! You may use your keyboard or a paired Samsung gamepad to control your ball and
collect the game pickup objects.
Note: Because the GUIText display we built in the Roll-a-ball tutorial will not work with OVRCameraRig
without substantial modification, you will not see the score counter or "You win!" message when all the
pieces have been collected.
More detailed information about all of these topics may be found in our Developer Documentation. Also be
sure to check out Unity's Virtual Reality Documentation.
Question: What's the best way to get started if you're a complete beginner?
Answer: We recommend reading through this FAQ, and reading through our Unity documentation - especially
the Introduction and Getting Started sections. Find out what the latest-recommended version of Unity 5 is
at our Compatibility and Requirements page, then download and install it. Browse around on our forums.
Download Utilities for Unity 5 from our Downloads page.
Then, read through Unity's excellent documentation and try out some of their introductory tutorials to get
acquainted with Unity development. Build your own simple Unity VR game by following the instructions in our
Getting Started Tutorial.
Question: What is the difference between the Oculus Unity 4 Legacy Integration and the Oculus Utilities for
Unity 5? How do the differences between Unity 4 and 5 affect development?
Answer: In Unity 4, you must import our Legacy Integration for Unity 4 unitypackage and add the supplied VR camera
and player controller to your application to add Rift or Gear VR support.
Unity 5.1+ provides VR support for the Oculus Rift and Samsung Gear VR, enabled by checking a box in Player
Settings. We offer a Utilities for Unity 5 unitypackage that includes useful scripts, scenes, and prefabs to assist
development, but unlike Unity 4, it is not required to create a VR application.
All new developers should use Unity 5 and the optional Oculus Utilities package.
Question: What are the system requirements for Unity development for Oculus? What operating systems are
supported for Unity development?
Answer: For the most up-to-date information, see Unity Compatibility and Requirements. We currently support
Windows and OS X for development. The Oculus Rift requires Windows 7, 8 or 10.
Answer: Our latest version recommendations may be found in our Unity Compatibility and Requirements
document. Be sure to check back regularly, as we update it frequently as new SDKs and Unity versions are
released. You can find an archive of information in our Unity-SDK Version Compatibility list.
Question: What other tools and resources do you provide to Unity developers?
Answer: To find the latest tools we provide, browse our Downloads page. You will find our Oculus Spatializer
Plugins to easily add spatialization to your Unity app. The Unity Sample Framework explores solutions
to common VR design problems. Mobile developers will find our SDK Examples dealing with mobile VR
development issues, as well as our Oculus Remote Monitor performance and debugging tool.
Note that some assets are only available to some Unity versions or platforms. For example, the Unity Sample
Framework is available for Unity 5 only, and the Oculus Remote Monitor is only for mobile development.
Question: What do I need to run Rift applications that I build with Unity?
Answer: You will need a compatible Windows PC, an Oculus Rift, and the appropriate version of the Oculus
Runtime, available from our Downloads page.
Unity|Getting Started FAQ|67
Question: I want to focus on mobile development for the Samsung Gear VR. What do I need to do to get
started? Do I need to download the Oculus Mobile SDK?
Answer: The Android SDK is required for mobile development with Unity. However, most Unity developers
do not need to download the Oculus Mobile SDK, or to install Android Studio or NDK. Follow the instructions
in our Device Setup guide to install the Java Development Kit (JDK) and Android SDK before beginning
development, and then use Unity with our Legacy Integration or Utilities package, and target the Android
platform when you build.
See Unity's Getting Started with Android Development for more information.
Question: Can I develop a single application for both Samsung Gear VR and the Oculus Rift?
Answer: Yes, but when developing for both Rift and mobile platforms, keep in mind that the requirements for
PC and mobile VR applications differ substantially. If you would like to generate builds for both PC and mobile
from a single project, it is important to follow the more stringent mobile development best practices, as well as
to meet the 90 fps requirement of the Rift.
Answer: Visit our developer support forums at https://developer.oculus.com. Our Support Center can be
accessed at https://support.oculus.com.
68|Unity-SDK Version Compatibility|Unity
Note: Due to issues with earlier releases, we now recommend all developers update to 5.3.6p5 or
version 5.4.1p1 or later.
Release Notes
This section describes changes for each version release.
New Features
Added Adaptive Resolution to OVRManager, which automatically scales down app resolution when GPU
utilization exceeds 85%. See OVRManager in Unity Components for details. (Rift only, requires Unity v 5.4
or later)
OVR Screenshot Wizard size parameter is now freeform instead of dropdown selection for greater flexibility.
Added recommended anti-aliasing level to help applications choose the right balance between performance
and quality.
Added support for more than one simultaneous OVROverlay. Now apps can show up to 3 overlay quads on
Gear VR and 15 on Rift.
API Changes
Added OVRHaptics.cs and OVRHapticsClip.cs to programmatically control haptics for Oculus Touch
controller. See OVRHaptics on page 29 for more information.
Added public members Enable Adaptive Resolution, Max Render Scale, and Min Render Scale to
OVRManager.
Added OVRManager.useRecommendedMSAALevel to enable auto-selection of anti-aliasing level based on
device performance.
Added OVRManager.useIPDInPositionTracking to allow apps to separately disable head position tracking
(see OVRManager.usePositionTracking) and stereopsis.
Bug Fixes
Fixed bug preventing power save from activating on Gear VR.
Fixed counter-intuitive behavior where disabling OVRManager.usePositionTracking prevented proper eye
separation by freezing the eye camera positions at their original offset.
Known Issues
Gear VR
Unity 5 automatically generates manifest files with Android builds that will cause them to be
automatically rejected by the Oculus Store submission portal. If this is blocking your application
submission, please let us know on our Developer Forum and we will work with you on a temporary
workaround.
Unity|Release Notes|71
Gear VR developers using Unity 5.3.4 or later, or using Unity 5.4.0b16 and later: Do not set DSP Buffer
Size to Best in Audio Manager in the Inspector for now or you will encounter audio distortion. Set it to
Good or Default instead.
New Features
Added OVR Screenshot and OVR Capture Probe tools, which exports a 360 screenshot of game scenes in
cube map format. See Cubemap Screenshots on page 38 for more information.
Switched to built-in volume indicator on mobile.
Exposed OVRManager.vsyncCount to allow half or third-frame rate rendering on mobile.
Added bool OVRManager.instance.isPowerSavingActive (Gear VR).
Bug Fixes
Repeatedly changing resolution or MSAA level no longer causes slowdown or crashing.
Fixed scale of OVRManager.batteryLevel and OVRManager.batteryTemperature.
Fixed race condition leading to black screens on Rift in some CPU-heavy cases.
Fixed memory bloat due to unpooled buffers when using MSAA.
Known Issues
Gear VR developers using Unity 5.3.4 or later, or using Unity 5.4.0b16 and later: Do not set DSP Buffer Size
to Best in Audio Manager in the Inspector for now or you will encounter audio distortion. Set it to Good or
Default instead.
from our Downloads Page, to use these versions. For more information, see Utilities 1.3.2 and OVRPlugin on
page 74.
New Features
OVRInput may now be used without an OVRManager instance in the scene.
API Changes
Restored OVRVolumeControl.
Bug Fixes
OVRManager.instance.usePositionTracking now toggles the head model on Gear VR.
Fixed incorrect fog interaction with transparent UI shader.
Fixed crash on start with Unity 5.4.0b14 and b15 on Gear VR.
Restored OVRVolumeControl, which was accidentally removed in 1.3.0.
Known Issues
Utilities 1.3.0: Volume control will be missing on mobile applications until the release of Mobile SDK 1.0.2.
OVRVolumeControl is available with Utilities v 0.1.3.0 and earlier. It was also restored in Utilities v 1.3.2.
Note: Floor-level tracking will often be used with standing experiences, but there may be situations in
which eye-level tracking is a better fit for a standing experience, or floor-level tracking is a better fit for a
seated experience.
Any application running Unity should now be able to pull the correct height information.
New Features
Added support for PC SDK 1.3, including support for Rift consumer version hardware.
Added support for Asynchronous TimeWarp and Phase Sync.
Added Rift Remote controller support.
Added application lifecycle management, including VR-initiated backgrounding and exit.
Exposed proximity sensor.
Added support for multiple trackers.
API Changes
OVRTracker.GetPose() no longer takes a prediction time. It now takes an optional index specifying the
tracker whose pose you want.
OVRTracker.frustum has been replaced by OVRTracker.GetFrustum(), which takes an optional
index specifying the tracker whose frustum you want.
OVRManager.isUserPresent is true when the proximity sensor detects the user.
OVRInput.GetControllerLocal[Angular]Velocity/Acceleration exposes the linear and angular
velocity and rotation of each Touch controller.
OVRDisplay.velocity exposes the head's linear velocity.
OVRDisplay.angularAcceleration exposes the head's angular acceleration.
Removed OVRGamepadController.cs and OVRInputControl.cs scripts, which have been replaced by the new
OVRInput.cs script. Refer to OVRInput for more information.
Added public member Tracking Origin Type to OVR Manager.
Added floor level reference frame for apps that need accurate floor height.
Removed OVRVolumeControl in favor of Universal Menus built-in volume meter.
OVRManager.queueAhead now controls Gear VR latency mode, allowing you to trade latency for CPU-GPU
parallelism. Queue-ahead is now automatically managed on Rift.
Events OVRManager.VrFocusLost and VrFocusAcquired occur when the app loses and regains VR
focus (visibility on the HMD).
Events OVRManager.AudioOutChanged and AudioInChanged occur when audio devices change and
make audio playback impossible without a restart.
OVRManager.cpuLevel controls CPU power-saving vs performance trade-off on Gear VR.
OVRManager.gpuLevel controls GPU power-saving vs performance trade-off on Gear VR.
Added ability to hold a named mutex throughout runtime.
Bug Fixes
Removed redundant axial deadzone handling from Xbox gamepad input.
Fixed OVRManager.monoscopic to display left eye buffer to both eyes and use center eye pose.
Application lifetime management now works, even without the Utilities.
Fixed crash when running VR apps with Rift disconnected.
OVRManager.isUserPresent now correctly reports proximity sensor output.
Known Issues
Volume control will be missing on mobile applications until the release of Mobile SDK 1.0.2. To restore
OVRVolumeControl, please use an older copy of the Utilities.
You must download and install OVRPlugin from our website if you are using the following Unity versions:
After you have downloaded and installed Unity, take these steps to install OVRPlugin:
Note: Do not install OVRPlugin version 1.3.2 with any version of Unity 5.3 prior to 5.3.3p3 or it will not
work properly.
Note: Do not install OVRPlugin version 1.3.2 with any version of Unity 5.4 prior to 5.4.0b11, or it will not
work properly.
To use Unity 5.3.4p1 with the Oculus Rift or Samsung Gear VR, you must download and install our OVRPlugin
for Unity 1.3.0, available from our Downloads Page.
After you have downloaded and installed Unity 5.3.4p1, take these steps to install OVRPlugin:
This document provides an overview of new features, improvements, and fixes included in the latest version of
the Utilities for Unity 5.x. For information on first-party changes to Unity VR support for Oculus, see the Unity
Release Notes for the appropriate version.
Utilities for Unity 0.1.3 extends OVRInput support to mobile. OVRInputControl and OVRGamepadController are
now deprecated and will be removed in a future release.
Mobile input bindings are now automatically added to InputManager.asset if they do not already exist - it is
no longer required to replace InputManager.asset with the Oculus version. However, this asset is still provided
for now to maintain legacy support for the deprecated OVRGamepadController and OVRInputControl scripts.
New Features
Default mobile input bindings are now programmatically generated when projects are imported if they do
not already exist.
Replacing InputManager.asset is no longer required to enable gamepad support on mobile.
API Changes
Added mobile support to OVRInput.
Deprecated OVRInputControl and OVRGamepadController.
Bug Fixes
Fixed mobile gamepad thumbstick deadzone/drift handling and axis scaling.
Fixed mobile gamepad support when multiple gamepads are paired.
Fixed mobile gamepad bindings for triggers, D-pad, thumbstick presses, etc.
New Features
Redesigned input API for Oculus Touch controllers and Xbox gamepads.
Added h264 hardware-decoder plugin for Gear VR.
Added face-locked layer support to OVROverlay when parented to the camera.
Reduced latency in the pose used by the main thread for raycasting, etc.
Updated to PC SDK 0.7 and Mobile SDK 0.6.2.0.
Enabled VRSettings.renderScale on Gear VR.
Several minor performance optimizations.
SDKExamples
Restored MoviePlayerSample
API Changes
The Utilities package now requires Unity 5.1 or higher.
Added OVRInput API alpha. Refer to documentation for usage.
Exposed LeftHand/RightHand anchors for tracked controllers in OVRCameraRig.
Bug Fixes
Restored ability to toggle settings such as monoscopic rendering and position tracking.
HSWDismissed event is now correctly raised when the HSW is dismissed.
Fixed handedness of reported velocity and acceleration values.
OVRPlayerController now moves at a consistent speed regardless of scale.
Known Issues
Tearing in OS X: Editor preview and standalone players do not vsync properly, resulting in a vertical tear
and/or judder on DK2.
When switching between a mobile application and System Activities screen, the back button becomes stuck
in the "down" state. For more information and workarounds, please see Troubleshooting and Known Issues.
Overview
This is the initial release of Oculus Utilities for Unity, for use with Unity versions 5.1.2 and later. The Utilities
extend Unity's built-in virtual reality support with the following features:
The Oculus Utilities for Unity expose largely the same API as the Oculus Unity Integration, but they offer all the
benefits of Unity's built-in VR support:
Improved rendering efficiency with less redundant work performed for each eye.
Seamless integration with the Unity Editor, including in-Editor preview and direct mode support.
Improved stability and tighter integration with features like anti-aliasing and shadows.
Non-distorted monoscopic preview on the main monitor.
Oculus SDK 0.6.0.1 support (PC and mobile).
Known Issues
Pitch, roll, and translation are off for the tracking reference frame in Unity 5.1.1, especially in apps with
multiple scenes.
Mac OS X tearing. VSync is currently broken on the Mac, but works when you build for Gear VR.
Performance loss. CPU utilization may be slightly higher than in previous versions of Unity.
OVRPlayerController might end up in an unexpected rotation after OVRDisplay.RecenterPose() is called. To
fix it, call RecenterPose() again.
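The workaround above can be sketched as follows, assuming the Utilities' OVRDisplay API:

```csharp
// Workaround sketch: if OVRPlayerController ends up in an unexpected
// rotation after recentering, a second RecenterPose() call fixes it.
if (OVRManager.display != null)
{
    OVRManager.display.RecenterPose();
    OVRManager.display.RecenterPose(); // second call restores the expected rotation
}
```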
New Features
Disabled eye texture anti-aliasing when using deferred rendering. This fixes the black screen issue.
Eliminated the need for the DirectToRift.exe in Unity 4.6.3p2 and later.
Removed the hard dependency on the Oculus runtime. Apps now render in mono without tracking when no
VR hardware is present.
As with Mobile SDK v 0.5.0, Unity developers using this SDK version must install the Oculus Runtime for
Windows or OS X. This requirement will be addressed in a future release of the SDK.
Bug Fixes
Reworked System Activities event handling to prevent per-frame allocations that could trigger the garbage
collector.
Known Issues
For use with the Mobile SDK, we recommend Unity version 4.6.3. The Mobile SDK is compatible with Unity
5.0.1p2, which addresses a problem with OpenGL ES 3.0, but there is still a known Android ION memory
leak. Please check back for updates.
VrPlatform entitlement checking is now disabled by default in Unity; handling for native development is
unchanged. If your application requires this feature, please refer to the Mobile SDK Documentation for
information on how to enable entitlement checking.
New Features
Bug Fixes
Health and Safety Warning no longer displays in editor Play Mode if a DK2 is not attached.
Known Issues
For use with the Mobile SDK, we recommend Unity version 4.6.3, which includes Android 5.0 (Lollipop)
support as well as important Android bug fixes. While the Mobile SDK is compatible with Unity 5.0.0p2
and higher, several issues are still known to exist, including an Android ION memory leak and compatibility
issues with OpenGL ES 3.0. Please check back for updates.
VrPlatform entitlement checking is now disabled by default in Unity; handling for native development is
unchanged. If your application requires this feature, please refer to the Mobile SDK Documentation for
information on how to enable entitlement checking.
New Features
Synced with the Oculus PC SDK 0.5.0.1 Beta.
VrPlatform entitlement checking is now disabled by default.
Bug Fixes
Health and Safety Warning no longer displays in editor Play Mode if a DK2 is not attached.
Known Issues
For use with the Mobile SDK, we recommend Unity version 4.6.3, which includes Android 5.0 (Lollipop)
support as well as important Android bug fixes. While the Mobile SDK is compatible with Unity 5.0.0p2
and higher, several issues are still known to exist, including an Android ION memory leak and compatibility
issues with OpenGL ES 3.0. Please check back for updates.
New Features
New Mobile Unity Integration Based on Oculus PC SDK 0.4.4
We would like to highlight the inclusion of the new Mobile Unity Integration with full DK2 support based on
the Oculus PC SDK 0.4.4. As this is a significant API refactor, please refer to the Unity Development Guide:
Migrating From Earlier Versions section for information on how to upgrade projects built with previous versions
of the Mobile Unity Integration.
API Changes
Fixed camera height discrepancies between the Editor and the Gear VR device.
Moonlight Debug Util class names are now prefixed with OVR to prevent namespace pollution.
Provided a callback for configuring VR Mode Parms on OVRCameraController; see OVRModeParms.cs for an
example.
New Features
Added Unity Free support for Gear VR developers.
Bug Fixes
Updated Unity vignette rendering to match native rendering (slightly increases effective FOV).
Updated Unity volume pop-up distance to match native.
API Changes
The following are changes to Unity components:
Table 5: Events
Behavior Changes
OVRCameraRig's position is always the initial center eye position.
Eye anchor Transforms are tracked in OVRCameraRig's local space.
OVRPlayerController's position is always at the user's feet.
IPD and FOV are fully determined by profile (PC only).
Layered rendering: multiple OVRCameraRigs are fully supported (not advised for mobile).
OVRCameraRig.*EyeAnchor Transforms give the relevant poses.
Upgrade Procedure
To upgrade, follow these steps:
1. Ensure you didn't modify the structure of the OVRCameraController prefab. If your eye cameras are on
GameObjects named CameraLeft and CameraRight, which are children of the OVRCameraController
GameObject (the default), then the prefab should cleanly upgrade to OVRCameraRig and continue to work
properly with the new integration.
2. Write down or take a screenshot of your settings from the inspectors for OVRCameraController,
OVRPlayerController, and OVRDevice. You will have to re-apply them later.
3. Remove the old integration by deleting the following from your project:
OVR folder
OVR Internal folder (if applicable)
Any file in the Plugins folder with Oculus or OVR in the name
Android-specific assets in the Plugins/Android folder, including: vrlib.jar, libOculusPlugin.so, res/raw and
res/values folders
4. Import the new integration.
5. Click Assets -> Import Package -> Custom Package
6. Open OculusUnityIntegration.unitypackage
7. Click Import All.
8. Fix any compiler errors in your scripts. Refer to the API changes described above. Note that the substitution
of prefabs does not take place until after all script compile errors have been fixed.
9. Re-apply your previous settings to OVRCameraRig, OVRPlayerController, and OVRManager. Note that the
runtime camera positions have been adjusted to better match the camera positions set in the Unity editor. If
this is undesired, you can get back to the previous positions by adding a small offset to your camera:
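The original code sample for this offset appears to have been lost in extraction; a minimal sketch of the idea, assuming an OVRCameraRig in the scene (the offset value is illustrative, not taken from the original notes):

```csharp
// Hypothetical sketch: nudge the rig by a small local offset so the runtime
// camera matches the position previously set in the Unity Editor.
OVRCameraRig cameraRig = FindObjectOfType<OVRCameraRig>();
cameraRig.transform.localPosition += new Vector3(0.0f, 0.15f, 0.0f);
```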
----------------------------------------------------------------------
if ( cameraController.GetCameraForward( ref cameraForward ) &&
     cameraController.GetCameraPosition( ref cameraPosition ) )
{
    ...
to
if (OVRManager.display.isPresent)
{
    ...
----------------------------------------------------------------------
OVRDevice.ResetOrientation();
to
OVRManager.display.RecenterPose();
----------------------------------------------------------------------
cameraController.ReturnToLauncher();
to
OVRManager.instance.ReturnToLauncher();
----------------------------------------------------------------------
OVRDevice.GetBatteryTemperature();
OVRDevice.GetBatteryLevel();
to
OVRManager.batteryTemperature
OVRManager.batteryLevel
----------------------------------------------------------------------
OrientationOffset
----------------------------------------------------------------------
FollowOrientation
----------------------------------------------------------------------
Unity beta versions 5.4.0b16 and later feature a basic version of our Oculus Native Spatializer Plugin for Unity
5 (ONSP). It makes it trivially easy to add basic spatialization (HRTF transforms) to audio point sources in your
Unity project. For more information, see First-Party Audio Spatialization (Beta) in our Oculus Native Spatializer
for Unity guide.
New Features
If you assign Unity's VR splash image, it will display as a world-locked quad using Asynchronous TimeWarp,
which allows full position-tracked rendering while Unity prioritizes loading. To improve the user's
experience, the splash screen appears within 0.5s of app launch.
Bug Fixes
Fixed a number of OpenGL ES errors, such as EGL_BAD_ALLOC due to implicit management of Unity's
WindowSurface.
Fixed extra 180 degree yaw rotation on all OVROverlay quads.
Fixed a few cases where D3D texture references were not released.
Fixed start-up crash in Unity 4 in some applications.
This download includes a Unity Project of the Sample Framework. VR applications of the Sample Framework are
also available for the Oculus Rift from our Downloads page, and for the Samsung Gear VR from the Concepts
section of the Oculus Store.
This version of the Oculus Sample Framework for Unity 5 pulls in Utilities for Unity version 1.5. Importing
Utilities for Unity 5 separately is no longer required. It is compatible with Unity 5.3 and up - please check
Compatibility and Version Requirements on page 5 for up-to-date Unity version recommendations.
For complete instructions on downloading and using the Sample Framework, see Unity Sample Framework on
page 40 in our developer documentation.
New Features
Added Movie Player, Per-Eye Cameras, Multiple Cameras, and Render Frame Rate scenes.
Reorganized in-VR scenes list structure to de-clutter the scenes menu.
Now includes Utilities for Unity 5 v 1.5; separately importing the Utilities unitypackage is no longer required.
Known Issues
In Oculus Rift builds, Oculus Remote support is currently limited to opening and closing the menu system.
The Oculus Unity Sample Framework consists of a Unity project as well as application binaries for playing the
sample scenes in VR. Sample scenes are navigated and controlled in-app with a simple UI, which also provides
explanatory notes.
This download includes a Unity Project of the Sample Framework. VR applications of the Sample Framework
are also available for the Oculus Rift/DK2 from our Downloads page, and for the Samsung Gear VR from the
Concepts section of the Oculus Store.
For complete instructions on downloading and using the Sample Framework, see Unity Sample Framework on
page 40 in our developer documentation.