Module 2
RENDERING, MODELLING
AND PROGRAMMING
VR Toolkits
System architecture
The Rendering Pipeline
Introduction
• For smooth simulations
  • Need to display at least 24, or better 30, frames/sec
  • Total latency should not exceed 100 msec
• Rendering
  • The process of converting the 3D geometrical models populating a virtual world into a 2D scene presented to the user
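As a quick sanity check on these numbers, the per-frame time budget at a given frame rate can be computed directly. This is a minimal illustrative sketch (not part of any VR toolkit); the method names are hypothetical:

```java
// Illustrative sketch: per-frame time budgets for smooth VR rendering.
class FrameBudget {
    // Milliseconds available per frame at a given frame rate.
    static double frameBudgetMs(double fps) {
        return 1000.0 / fps;
    }

    // Do `stages` sequential stages of `stageMs` each fit under the latency cap?
    static boolean meetsLatency(int stages, double stageMs, double capMs) {
        return stages * stageMs <= capMs;
    }

    public static void main(String[] args) {
        System.out.printf("24 fps -> %.1f ms/frame%n", frameBudgetMs(24)); // ~41.7 ms
        System.out.printf("30 fps -> %.1f ms/frame%n", frameBudgetMs(30)); // ~33.3 ms
        // Two sequential 30 fps stages (~66.7 ms) still fit under the 100 ms cap.
        System.out.println(meetsLatency(2, frameBudgetMs(30), 100.0));
    }
}
```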
The Graphics Rendering Pipeline
Three functional stages
• Graphics rendering has three functional stages.
[Figure: User input → Application stage → FIFO buffer → Geometry stage → FIFO buffer → Rasterizer stage → Output buffer → Video controller]
Application stage
• Implemented in software
• Runs on one or several CPUs
• Reads the world geometry database & the user's input
• Basic operations:
  • handling the user's input
  • acceleration algorithms
• Output: 3D geometric primitives (polygons, meshes)
[Figure: User input → Application stage (CPU1, CPU2) → Geometry stage → Rasterizer stage → Video controller]
Geometry stage
• Implemented in software or hardware
• Runs on the geometry engines
• Basic operations:
[Figure: Model transformation → Light computation → Scene projection → Clipping → Screen mapping]
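The first of these operations, the model transformation, multiplies each vertex (in homogeneous coordinates) by a 4×4 model matrix. A minimal illustrative sketch, independent of any toolkit:

```java
// Illustrative sketch of the geometry stage's model transformation:
// each vertex is multiplied by a 4x4 model matrix (here, a translation).
class ModelTransform {
    static double[] transform(double[][] m, double[] v) {
        double[] out = new double[4];
        for (int row = 0; row < 4; row++)
            for (int col = 0; col < 4; col++)
                out[row] += m[row][col] * v[col];
        return out;
    }

    public static void main(String[] args) {
        // Translate by (2, 3, 4); vertices use homogeneous coordinates (w = 1).
        double[][] translate = {
            {1, 0, 0, 2},
            {0, 1, 0, 3},
            {0, 0, 1, 4},
            {0, 0, 0, 1}
        };
        double[] vertex = {1, 1, 1, 1};
        double[] moved = transform(translate, vertex);
        System.out.printf("(%.0f, %.0f, %.0f)%n", moved[0], moved[1], moved[2]); // (3, 4, 5)
    }
}
```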
Rasterizer stage
• Implemented in hardware
• Converts the vertex information output by the geometry stage into the pixel information needed by the video display
• Basic operations:
  • Scan conversion (rasterization)
  • Z-buffering
  • Anti-aliasing
  • Texture mapping
• Output: pixel values
[Figure: User input → Application stage → Geometry stage → Rasterizer stage (RU1, RU2) → Video controller]
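Z-buffering, listed above, keeps for every pixel the depth of the nearest fragment drawn so far; a new fragment is written only if it is closer than what is already stored. A minimal sketch of the idea (illustrative only, not WTK or Java3D code):

```java
// Illustrative z-buffer sketch: a fragment updates a pixel only when it is
// closer to the viewer (smaller depth) than the fragment already stored.
class ZBufferDemo {
    final double[] depth;   // per-pixel nearest depth so far
    final int[] color;      // per-pixel color

    ZBufferDemo(int pixels) {
        depth = new double[pixels];
        color = new int[pixels];
        java.util.Arrays.fill(depth, Double.POSITIVE_INFINITY);
    }

    // Returns true if the fragment won the depth test and was written.
    boolean writeFragment(int pixel, double z, int rgb) {
        if (z >= depth[pixel]) return false; // hidden behind an existing fragment
        depth[pixel] = z;
        color[pixel] = rgb;
        return true;
    }

    public static void main(String[] args) {
        ZBufferDemo zb = new ZBufferDemo(4);
        zb.writeFragment(0, 5.0, 0xFF0000);               // red at depth 5
        boolean won = zb.writeFragment(0, 2.0, 0x00FF00); // green is closer, wins
        System.out.println(won + " " + Integer.toHexString(zb.color[0]));
    }
}
```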
Bottlenecks
• Transform-limited
  • Bottleneck is in the geometry stage
• Fill-limited
  • Bottleneck is in the rasterizer stage
Optimization
• Application stage: CPU-limited
  • Replace the CPU with a faster one, or add another CPU
  • Reduce the CPU's load:
    • Reduce the scene complexity by using 3D models with a lower polygon count
    • Optimize the simulation software
Optimization
• Rasterizer stage: fill-limited
  • Reduce the number of pixels in the displayed image:
    • Reduce the size of the display window
    • Reduce the window resolution
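The payoff of these reductions is quadratic: halving both window dimensions cuts the number of pixels the rasterizer must fill to one quarter. A small illustrative calculation (resolutions chosen arbitrarily):

```java
// Illustrative sketch: pixel fill load before and after shrinking the window.
class FillLoad {
    static long pixels(int width, int height) {
        return (long) width * height;
    }

    public static void main(String[] args) {
        long full = pixels(1280, 1024);
        long half = pixels(640, 512); // both dimensions halved
        System.out.println(full / half); // fill load drops by a factor of 4
    }
}
```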
The Haptics Rendering Pipeline
The Haptics Rendering Pipeline
[Figure: User input → Compute force (CPU1) → Force smoothing → Force mapping → Haptic texturing (CPU2) → Haptic interface]
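Force smoothing, shown in the pipeline above, filters the computed force so the user does not feel abrupt jumps. One common and simple approach — an illustrative sketch, not the toolkit's actual filter — is a moving average over the last few force samples:

```java
// Illustrative force-smoothing sketch: a moving average over recent samples
// removes abrupt jumps in the computed force before it reaches the device.
import java.util.ArrayDeque;

class ForceSmoother {
    private final ArrayDeque<Double> window = new ArrayDeque<>();
    private final int size;
    private double sum;

    ForceSmoother(int size) { this.size = size; }

    double smooth(double force) {
        window.addLast(force);
        sum += force;
        if (window.size() > size) sum -= window.removeFirst();
        return sum / window.size();
    }

    public static void main(String[] args) {
        ForceSmoother s = new ForceSmoother(3);
        s.smooth(0.0);
        s.smooth(0.0);
        // A sudden 3 N spike reaches the device as a gentler 1 N step.
        System.out.println(s.smooth(3.0)); // 1.0
    }
}
```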
The stages of the Haptics Rendering Pipeline
Conceptual Model of VR
[Figure: the human (cognition) interacts with the virtual world (avatar, agent) through H-effectors, P-sensors, L-sensors and V-effectors — logical devices for displacements, angles, and events]
Functional model
[Figure: functional model of VR — sensing (chap. 4), virtual perception (Sec. 3-3), interaction (chap. 5), simulation (Sec. 6-3,4), VW database (Sec. 6-1), VW authoring (Sec. 6-2), rendering and displaying (Sec. 3-4,5,6,7)]
VR Programming Toolkits
Are extensible libraries of object-oriented functions designed to help the VR developer:
▪ Support various common i/o devices used in VR (so drivers need not be written by the developer);
▪ Allow import of CAD models (saves time), editing of shapes, specification of object hierarchies, collision detection, multiple levels of detail, shading and texturing, and run-time management;
▪ Have built-in networking functions for multi-user interactions, etc.
VR Toolkits can be classified by:
✓ Whether text-based or graphical-programming;
✓ The type of language used and the library size;
✓ The type of i/o devices supported;
✓ The type of rendering supported;
✓ Whether general-purpose or application-specific;
✓ Whether proprietary (more functionality, better documented) or public domain (free, but with less documentation and functionality).
VR Toolkits in the Early 90s
▪ RenderWare (Canon), VRT3/Superscape (Dimension Ltd.), Cyberspace Developer Kit (Autodesk), Cosmo Authoring Tool (SGI/Platinum/CA), Rend386 and others;
▪ They allowed either text-based programming (RenderWare, CDK and Rend386), or graphical programming (Superscape and Cosmo);
▪ They were platform-independent and generally did not require graphics acceleration hardware;
▪ As a result they tended to use "low-end" I/O devices (mouse) and to support flat shading to maintain fast rendering.
Rend386 scene
VR Toolkits discussed
[Table comparing the toolkits by Name, Application Area, Proprietary status, Library size, and Language]
The scene graph:
✓ Is a hierarchical organization of objects (visible or not) in the virtual world (or "universe"), together with the view to that world;
✓ Scene graphs are represented by a tree structure, with nodes connected by branches;
✓ Visible objects are represented by external nodes, called leaves (they have no children) — example: nodes F, G, H, I;
✓ Internal nodes represent transformations (which apply to all their children).

          A          (root node)
        /   \
       B     C       (internal nodes)
       |    / \
       D   E   J
          /|\ \
         F G H I     (external nodes)
Scene graphs are not static
[Figure: when the ball is grasped, the Ball node is re-parented from the Scene to the palm node]
Model Geometry
Define action functions
Define networking
WTK geometry:
✓ Geometries are the only visible objects in the scene (others, like viewpoints, serial ports, etc., are not);
✓ Geometries are either imported from CAD (e.g. dxf or 3ds formats), from VRML (wrl), or through the neutral file format (nff);
✓ Custom geometry is created through polygons and vertices.

Imported geometry:    WTgeometrynode_load(hand)
Geometry primitive:   WTgeometry_newsphere()
Custom geometry:      WTgeometry_begin
                      …..
                      WTpoly_addvertex()
                      WTgeometry_save
WTK object appearance:
To load a material table: WTmtable_load(filename)
Model Geometry
Define action
functions
Define networking
WTK scene graph:
✓ The scene consists of various objects, some visible (geometry), some not (viewpoint, transforms, etc.); these objects are nodes in a scene graph;
✓ The scene graph is the hierarchical arrangement of nodes that expresses the nodes' spatial organization and relationship to each other;
✓ Each scene graph has only one root node.
WTK scene graph terminology:
✓ If a node has a sub-tree that includes another node, it is that node's ancestor. Example: node A is an ancestor of E;
✓ A parent node is a node's direct ancestor. C is the parent of E but not of I; C is an ancestor of I;
✓ Siblings are children of the same parent. F, G, H, I are sibling nodes;
✓ If a node is rendered before another, it is its predecessor (it need not be its ancestor). B is a predecessor of J, but not its ancestor; node B can affect the rendering of node J.

Scene graph tree (sub-trees are rooted at the internal nodes):
          A
        /   \
       B     C
       |    / \
       D   E   J
          /|\ \
         F G H I
WTK scene graph traversal:
✓ The order in which nodes appear in the scene graph determines the order in which they are rendered. This is because at each frame the scene graph is traversed top-to-bottom, left-to-right;
✓ Advantages of using a scene graph include object grouping, level-of-detail switching, instancing of geometry and sub-trees (better memory usage), increased frame rate (better culling), and multiple scene graphs.

Traversal order is A, B, C…..
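A top-to-bottom, left-to-right walk of this kind is commonly implemented as a pre-order depth-first traversal. A plain-Java sketch (illustrative, not the WTK API; node names follow the example tree):

```java
// Illustrative sketch: a pre-order, left-to-right scene-graph traversal,
// which fixes the order in which the nodes are rendered each frame.
import java.util.ArrayList;
import java.util.List;

class SceneGraphTraversal {
    static class Node {
        final String name;
        final List<Node> children = new ArrayList<>();
        Node(String name) { this.name = name; }
        Node add(Node child) { children.add(child); return this; }
    }

    // Visit the node, then its children left to right (pre-order DFS).
    static void traverse(Node n, List<String> order) {
        order.add(n.name);
        for (Node c : n.children) traverse(c, order);
    }

    // Build the example tree A -> (B -> D), (C -> E -> (F, G, H, I), J).
    static List<String> exampleOrder() {
        Node e = new Node("E");
        for (String s : new String[]{"F", "G", "H", "I"}) e.add(new Node(s));
        Node root = new Node("A")
            .add(new Node("B").add(new Node("D")))
            .add(new Node("C").add(e).add(new Node("J")));
        List<String> order = new ArrayList<>();
        traverse(root, order);
        return order;
    }

    public static void main(String[] args) {
        System.out.println(exampleOrder()); // [A, B, D, C, E, F, G, H, I, J]
    }
}
```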
WTK node types:
✓ Geometry nodes – used for visible objects;
✓ Attribute nodes (fog, light, transform) – affect the way the geometry nodes are rendered; they need to be placed in the graph before the geometry they affect;
✓ Procedural nodes (root, level-of-detail, separator, switch, etc.) – control the way the scene graph is processed.
[Figure: a Separator node with Transform and Content children]
WTK movable node hierarchy:
✓ To create a robot arm, each of the objects needs to be created separately and loaded as a movable node (base, lower arm, middle arm, effector);
✓ Then they need to be linked in a scene graph (base → lower → middle → effector):

WTmovnode_load(base)
WTmovnode_load(lower)
WTmovnode_load(middle)
WTmovnode_load(effector)
WTmovnode_attach(base, lower);
WTmovnode_attach(lower, middle);
WTmovnode_attach(middle, effector);
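The point of such a movable-node chain is that each node's motion is expressed relative to its parent, so moving the base carries the whole arm along. A plain-Java sketch of that idea (illustrative, not the WTK API; a 1-D offset stands in for a full transform):

```java
// Illustrative sketch: in a movable-node chain each node's position is
// expressed relative to its parent, so moving an ancestor moves the chain.
class MovableChain {
    static class MovNode {
        double localOffset;   // 1-D offset relative to the parent
        MovNode parent;
        MovNode(double off) { localOffset = off; }
        MovNode attach(MovNode child) { child.parent = this; return child; }
        double worldPosition() {
            return (parent == null ? 0 : parent.worldPosition()) + localOffset;
        }
    }

    // Returns the effector's world position before and after moving the base.
    static double[] demo() {
        MovNode base = new MovNode(0);
        MovNode effector =
            base.attach(new MovNode(1)).attach(new MovNode(1)).attach(new MovNode(1));
        double before = effector.worldPosition(); // 0 + 1 + 1 + 1 = 3
        base.localOffset = 5;                     // translate only the base...
        double after = effector.worldPosition();  // ...and the whole arm follows
        return new double[]{before, after};
    }

    public static void main(String[] args) {
        double[] d = demo();
        System.out.println(d[0] + " -> " + d[1]); // 3.0 -> 8.0
    }
}
```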
WTK virtual hand hierarchy:
[Figure: the palm is the parent node; each finger is a chain of phalanx nodes, e.g. palm → thumb proximal → thumb distal, palm → index proximal → index middle → index distal, etc.]
WTK virtual hand loading:
/* Load the hand model */
Palm = WTmovnode_load(Root, "Palm.nff", 1.0);
WTK virtual hand hierarchy:
WTmovnode_attach(Palm,ThumbProximal);
WTmovnode_attach(ThumbProximal, ThumbDistal);
WTmovnode_attach(Palm, IndexProximal);
WTmovnode_attach(IndexProximal, IndexMiddle);
WTmovnode_attach(IndexMiddle, IndexDistal);
WTmovnode_attach(Palm, MiddleProximal);
WTmovnode_attach( MiddleProximal, MiddleMiddle);
WTmovnode_attach(MiddleMiddle, MiddleDistal);
WTmovnode_attach(Palm, RingProximal);
WTmovnode_attach( RingProximal, RingMiddle);
WTmovnode_attach(RingMiddle, RingDistal);
WTmovnode_attach(Palm, SmallProximal);
WTmovnode_attach( SmallProximal, SmallMiddle);
WTmovnode_attach(SmallMiddle, SmallDistal);
Model Geometry
Define action
functions
Define networking
WTK sensors:
✓ Allow the user to interact dynamically with the simulation by providing input and receiving feedback from the simulation. Some of the supported sensors are:
▪ track balls (Spaceball, Geometry Ball Jr.);
▪ trackers (Polhemus Fastrak, Isotrak, Insidetrak; Ascension Bird and Flock of Birds);
▪ sensing gloves (5DT serial glove, Pinch Glove, CyberGlove);
▪ displays (CrystalEyes glasses, BOOM display, Virtual i/o HMD, CyberMaxx2 HMD);
▪ etc.
Camera "fly-by" using the trackball:
✓ We can use the spaceball to interactively change the viewpoint to the scene;
✓ The spaceball needs to be declared as a sensor and linked to the serial port;
✓ Then the sensor needs to be attached to the viewpoint.
[Figure: x, y, z viewpoint axes]
int main() {
    WTsensor *spaceball;
    WTnode *root, *scene;
    /* start simulation */
    WTuniverse_ready();
    WTuniverse_go();
    /* stop simulation */
    WTuniverse_delete();
    return 0;
}
Model Geometry
Define action
functions
Define networking
WTK action functions:
✓ To do the ball grasping we need to check for collision between the hand and the ball, and then make the ball a child of the palm;
✓ WTK action functions are user-defined functions that are executed at every simulation loop (frame). Examples of such functions are collision detection and collision response;
✓ In our case a sound also needs to be played as a form of collision response.
[Figure: the ball's bounding sphere, defined by its midpoint and radius, in the world coordinate axes X, Y, Z]
WTK action functions:

spring = WTsound_load("spring");

void action()
{
    /* Check for collision */
    if (WTnodepath_intersectbbox(HandNP, BallNP))
    {
        /* play spring sound */
        WTsound_play(spring);
        /* Remove the Ball from the scene graph and immediately
           reattach it as a child of the Palm */
        WTnode_remove(Ball);
        WTmovnode_attach(Palm, Ball, 0);
    }
}
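The same grasp logic — detect a collision, respond, re-parent the ball under the palm — can be sketched independently of the WTK API. A plain-Java illustration, with a 1-D bounding-interval test standing in for the bounding-box intersection:

```java
// Illustrative grasp sketch: detect a hand/ball overlap, then re-parent the
// ball under the palm so it follows the hand (as the WTK action function does).
import java.util.ArrayList;
import java.util.List;

class GraspDemo {
    static class Node {
        final String name;
        Node parent;
        final List<Node> children = new ArrayList<>();
        double center, halfSize;   // 1-D bounding interval
        Node(String name, double center, double halfSize) {
            this.name = name; this.center = center; this.halfSize = halfSize;
        }
        // Re-parent: detach from the old parent first, then attach here
        // (mirroring WTnode_remove followed by WTmovnode_attach).
        void attach(Node child) {
            if (child.parent != null) child.parent.children.remove(child);
            child.parent = this;
            children.add(child);
        }
    }

    // Stand-in for the bounding-box intersection test, in one dimension.
    static boolean intersect(Node a, Node b) {
        return Math.abs(a.center - b.center) <= a.halfSize + b.halfSize;
    }

    static String demo() {
        Node scene = new Node("scene", 0.0, 100.0);
        Node palm = new Node("palm", 1.0, 0.5);
        Node ball = new Node("ball", 1.2, 0.3);
        scene.attach(palm);
        scene.attach(ball);
        if (intersect(palm, ball))   // collision detected: grasp the ball
            palm.attach(ball);       // the ball now follows the palm
        return ball.parent.name;
    }

    public static void main(String[] args) {
        System.out.println(demo()); // palm
    }
}
```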
WTK scene graph extension – the haptic node:
✓ Another form of collision response is force feedback, if the user has a haptic glove (such as the Rutgers Master II);
✓ This extension is compatible with VRML;
✓ The fields of the haptic node are stiffness, viscosity, friction, and haptic effect (indicating a force profile – square, sine, constant, ramp).
[Figure: a haptic node grouped with the Ball geometry]

Model Geometry
Define action functions
Define networking
WTK networking:
✓ Uses the "World2World" library extension of WTK;
✓ A typical client-server architecture uses a single server that does "double duty", managing connections as well as data sharing. The simulation stops when a new client requests a connection.
WTK two-tier client-server architecture:
[Figure: Clients X and Y exchange actions and information with a single server]
[Figure: simulation main loop – Start Simulation → Render scene (graphics, audio, haptics) → Exit Simulation]
Java and Java 3D
✓ Java
  ✓ object-oriented programming language
  ✓ developed for network applications
  ✓ platform independence
  ✓ slower than C/C++
✓ Java 3D
  ✓ Java hierarchy of classes that serves as an interface to 3D graphics rendering and sound rendering systems
  ✓ Tightly integrated with Java
  ✓ Strong object-oriented architecture
  ✓ Powerful 3D graphics API
Model Geometry
Networking
Java 3D geometry:
✓ Geometry can be imported from various file formats:
  loader.load("Hand.wrl")
Java 3D object appearance:
✓ The appearance of a geometry is specified using an Appearance object
✓ An Appearance-class object stores information about the material (diffuse, specular, shininess, opacity, …) and texture

Mat = new Material();
Mat.setDiffuseColor(r, g, b);
Mat.setAmbientColor(r, g, b);
Mat.setSpecularColor(r, g, b);
TexLd = new TextureLoader("checkered.jpg", ...);
Tex = TexLd.getTexture();
Geom.setAppearance(Appr);
Model Geometry
Networking
Java3D node types:
✓ BranchGroup – compilable sub-graph
✓ Switch – selects which of the children are visible (useful for LOD)
Java3D scene graph
[Figure: Java3D scene graph Node hierarchy]

Loading objects from files
Java3D model loading
Adding the model to the scene graph:

Scene Sc = loader.load("Hand.wrl");
BranchGroup Bg = Sc.getSceneGroup();
RootNode.addChild(Bg);
Java3D virtual hand loading:
Palm = loader.load("Palm.wrl").getSceneGroup();
ThumbProximal = loader.load("ThumbProximal.wrl").getSceneGroup();
ThumbDistal = loader.load("ThumbDistal.wrl").getSceneGroup();
IndexProximal = loader.load("IndexProximal.wrl").getSceneGroup();
IndexMiddle = loader.load("IndexMiddle.wrl").getSceneGroup();
IndexDistal = loader.load("IndexDistal.wrl").getSceneGroup();
MiddleProximal = loader.load("MiddleProximal.wrl").getSceneGroup();
MiddleMiddle = loader.load("MiddleMiddle.wrl").getSceneGroup();
MiddleDistal = loader.load("MiddleDistal.wrl").getSceneGroup();
RingProximal = loader.load("RingProximal.wrl").getSceneGroup();
RingMiddle = loader.load("RingMiddle.wrl").getSceneGroup();
RingDistal = loader.load("RingDistal.wrl").getSceneGroup();
SmallProximal = loader.load("SmallProximal.wrl").getSceneGroup();
SmallMiddle = loader.load("SmallMiddle.wrl").getSceneGroup();
SmallDistal = loader.load("SmallDistal.wrl").getSceneGroup();
Java3D virtual hand hierarchy:
Palm.addChild(ThumbProximal);
ThumbProximal.addChild(ThumbDistal);
Palm.addChild(IndexProximal);
IndexProximal.addChild(IndexMiddle);
IndexMiddle.addChild(IndexDistal);
Palm.addChild(MiddleProximal);
MiddleProximal.addChild(MiddleMiddle);
MiddleMiddle.addChild(MiddleDistal);
Palm.addChild(RingProximal);
RingProximal.addChild(RingMiddle);
RingMiddle.addChild(RingDistal);
Palm.addChild(SmallProximal);
SmallProximal.addChild(SmallMiddle);
SmallMiddle.addChild(SmallDistal);
Model Geometry
Define behaviors
Networking
Input devices in Java3D
✓ The only input devices supported by Java3D are the mouse and the keyboard
✓ The integration of the input devices currently used in VR applications (position sensors, track balls, joysticks, …) relies entirely on the developer
✓ Usually the drivers are written in C/C++. One needs either to rewrite the driver in Java or to use JNI (the Java Native Interface) to call the C/C++ version of the driver. The latter solution is more desirable.
✓ Java3D provides a general-purpose input device interface that can be used to integrate sensors. However, developers often prefer custom-made approaches.
Java3D general-purpose sensor interface
✓ class PhysicalEnvironment – stores information about all the input devices and sensors involved in the simulation
✓ class InputDevice – interface for an input device driver
✓ class Sensor – class for objects that provide real-time data
[Figure: the PhysicalEnvironment aggregates InputDevices and Sensors]
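The division of labor above — a driver object that polls the hardware and a sensor object that exposes the latest reading — can be sketched in plain Java. All names below are hypothetical; these are not the actual `javax.media.j3d` signatures:

```java
// Illustrative sketch of a driver/sensor split: the device driver polls the
// hardware; the sensor hands the most recent reading to the simulation.
class SensorSketch {
    interface Device {                 // stands in for an InputDevice driver
        void pollAndProcessInput();    // read the hardware, update its sensors
    }

    static class TrackerSensor {       // stands in for a Sensor
        private double[] lastRead = {0, 0, 0};
        void setRead(double[] v) { lastRead = v.clone(); }
        double[] getRead() { return lastRead.clone(); }
    }

    static class FakeTracker implements Device {
        final TrackerSensor sensor = new TrackerSensor();
        private double t = 0;
        public void pollAndProcessInput() {
            t += 1;                                  // pretend the hand moves along x
            sensor.setRead(new double[]{t, 0, 0});
        }
    }

    // Poll the fake tracker once per frame and return the latest x reading.
    static double xAfterFrames(int frames) {
        FakeTracker tracker = new FakeTracker();
        for (int i = 0; i < frames; i++) tracker.pollAndProcessInput();
        return tracker.sensor.getRead()[0];
    }

    public static void main(String[] args) {
        System.out.println(xAfterFrames(3)); // 3.0
    }
}
```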
Model Geometry
Define behaviors
Networking
Java3D - Animating the simulation
✓ Java3D offers Behavior objects for controlling the simulation
✓ A Behavior object contains a set of actions performed when the object receives
a stimulus
✓ A stimulus is sent by a WakeupCondition object
✓ Some wakeup classes:
✓WakeupOnCollisionEntry
✓WakeupOnCollisionExit
✓WakeupOnCollisionMovement
✓WakeupOnElapsedFrames
✓WakeupOnElapsedTime
✓WakeupOnSensorEntry
✓WakeupOnSensorExit
✓WakeupOnViewPlatformEntry
✓WakeupOnViewPlatformExit
Java3D - Behavior usage
• We define a behavior Bhv that rotates the sphere by 1 degree
• We want this behavior to be called each frame, so that the sphere will be spinning
[Figure: scene graph with Universe and Root nodes; see VC 6.4 on the book CD]
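The effect of such a per-frame behavior can be sketched without the Java3D runtime. Plain Java, illustrative only: the loop stands in for the scheduler that delivers a WakeupOnElapsedFrames stimulus each frame:

```java
// Illustrative sketch: a per-frame behavior that rotates a sphere by 1 degree,
// as a Java3D Behavior woken every frame would.
class SpinBehavior {
    double angleDeg = 0;

    // Stands in for Behavior.processStimulus(), run once per frame.
    void processStimulus() {
        angleDeg = (angleDeg + 1.0) % 360.0;
    }

    static double angleAfterFrames(int frames) {
        SpinBehavior b = new SpinBehavior();
        for (int i = 0; i < frames; i++) b.processStimulus(); // scheduler loop
        return b.angleDeg;
    }

    public static void main(String[] args) {
        System.out.println(angleAfterFrames(90));  // 90.0 degrees
        System.out.println(angleAfterFrames(450)); // wraps around: 90.0 again
    }
}
```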
Model Geometry
Define behaviors
Networking
Java3D - Networking
✓ Java3D does not provide a built-in solution for networked virtual environments
✓ However, its tight integration with the Java language allows the developer to use the powerful networking features offered by Java
✓ Java3D applications can run as stand-alone applications or as applets in a web browser
[Figure: clients connect to a central server]
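What actually travels between clients and server in such a setup are state updates, e.g. an object's position. A minimal sketch of encoding one update for the wire and decoding it on the receiving side, using Java's standard `ByteBuffer` (illustrative message layout, not any toolkit's protocol):

```java
// Illustrative sketch: encode a position update into bytes (as a networked VR
// application would send it) and decode it back on the receiving side.
import java.nio.ByteBuffer;

class PositionUpdate {
    // Message layout (16 bytes): int objectId, then float x, y, z.
    static byte[] encode(int objectId, float x, float y, float z) {
        ByteBuffer buf = ByteBuffer.allocate(16);
        buf.putInt(objectId).putFloat(x).putFloat(y).putFloat(z);
        return buf.array();
    }

    // Returns {objectId, x, y, z} recovered from the message bytes.
    static float[] decode(byte[] msg) {
        ByteBuffer buf = ByteBuffer.wrap(msg);
        int id = buf.getInt();
        return new float[]{id, buf.getFloat(), buf.getFloat(), buf.getFloat()};
    }

    public static void main(String[] args) {
        float[] out = decode(encode(7, 1.0f, 2.5f, -3.0f));
        System.out.println(out[0] + " " + out[1] + " " + out[2] + " " + out[3]);
    }
}
```

In a real application the byte array would be written to a `java.net.Socket` (or a `DatagramPacket`) by the sender and read back by the receiver.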
Java3D and VRML
Comparison between Java3D and WTK
Java3D – WTK Comparison
✓ Java3D Release 3.1 Beta 1 has lower system latencies than WTK Release 9;
✓ But Java3D has more variability in the scene rendering time;
✓ WTK does not have spikes in the scene rendering time.
General Haptics Open Software Toolkits
The world of haptic technology is evolving rapidly, and several open-source toolkits empower developers to create innovative interactive experiences.
High-Level Toolkits:
• CHAI 3D: This C++ toolkit is widely used for research and development in haptics. It offers a rich library of haptic effects, collision detection, and physics simulation, making it well suited to creating realistic virtual environments and simulations.
• gVR Haptics: This API is designed for integrating haptics into Google VR applications. It provides a simple interface for controlling vibration patterns and forces on Daydream and Cardboard controllers.
Low Level Libraries