Module 2

VR ARCHITECTURE,

RENDERING, MODELLING
AND PROGRAMMING

1
VR Toolkits

System architecture
2
The Rendering Pipeline

3
Introduction
• For smooth simulations
• Need to display at least 24, or better 30, frames/sec
• Total latency must not exceed 100 msec

• Low latency and fast graphics require a VR engine that has a powerful computer architecture
• These architectures are designed around a rendering pipeline

• Rendering
• The process of converting the 3D geometrical models
populating a virtual world into a 2D scene presented to the
user
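As a back-of-the-envelope check of these numbers, the frame-rate and latency requirements can be combined; this is a minimal sketch using the values from the slide (the helper name is mine):

```java
class FrameBudget {
    // Time available per frame, in milliseconds, at a given frame rate.
    static double budgetMs(double fps) { return 1000.0 / fps; }

    public static void main(String[] args) {
        System.out.printf("24 fps -> %.1f ms per frame%n", budgetMs(24));
        System.out.printf("30 fps -> %.1f ms per frame%n", budgetMs(30));
        // The 100 ms latency cap leaves room for only a few frames of
        // pipeline delay at 30 frames/sec.
        System.out.println(Math.floor(100.0 / budgetMs(30))); // 3.0
    }
}
```

So at 30 frames/sec each stage of the pipeline has roughly 33 ms to do its work, and at most about three frames can be in flight before the 100 ms latency budget is exceeded.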

4
The Graphics Rendering Pipeline

5
Three functional stages
• Graphics rendering has three functional stages.

[Figure: the three pipeline stages: application stage, geometry stage, and rasterizer stage, connected by FIFO buffers; user input enters the application stage and the output buffer is read by the video controller]
6
Application stage
• Implemented in software
• Run on several CPUs
• Read the world geometry database & the user’s
input
• Basic operations
• process the user's input
• run acceleration algorithms
• Output: 3D geometric primitives (polygons, meshes)

[Figure: the application stage running on CPUs (CPU1, CPU2), taking user input and feeding the geometry and rasterizer stages, whose output goes to the video controller]

7
Geometry stage
• Implemented in software or hardware
• Run on the geometry engines

• Basic operations: model transformation, light computation, scene projection, clipping, screen mapping
• Output: 2D geometric primitives (2D polygons)

[Figure: the geometry stage running on geometry engines (GE1, GE2), between the application stage and the rasterizer stage]
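A minimal sketch of what the geometry stage does to a single vertex, assuming a pinhole camera at the origin looking down -z; the transform, window size, and vertex values are illustrative choices of mine, not from the slides:

```java
class GeometryStage {
    // Model transformation: here just a translation (assumed example values).
    static double[] modelTransform(double[] v, double tx, double ty, double tz) {
        return new double[] { v[0] + tx, v[1] + ty, v[2] + tz };
    }

    // Scene projection: perspective divide onto the z = -1 plane.
    static double[] project(double[] v) {
        return new double[] { v[0] / -v[2], v[1] / -v[2] };
    }

    // Screen mapping: normalized coordinates [-1,1] to pixels, y flipped.
    static double[] screenMap(double[] ndc, int w, int h) {
        return new double[] { (ndc[0] + 1) / 2 * w, (1 - (ndc[1] + 1) / 2) * h };
    }

    public static void main(String[] args) {
        double[] v = modelTransform(new double[] { 0, 0, -5 }, 1, 2, 0); // (1,2,-5)
        double[] s = screenMap(project(v), 640, 480);
        System.out.printf("%.1f %.1f%n", s[0], s[1]); // 384.0 144.0
    }
}
```

Lighting and clipping are omitted here; the point is only the order of the basic operations listed above.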

8
Rasterizer stage
• Implemented in hardware
• Convert the vertex information output into pixel
information needed by the video display
• Basic operations:
• Scan conversion (rasterization)
• Z-buffering
• Anti-aliasing
• Texture mapping
• Output: pixel values

[Figure: the rasterizer stage running on rasterizer units (RU1, RU2), after the application and geometry stages, feeding the video controller]
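Of the basic operations above, z-buffering is easy to sketch: a toy depth buffer (all names are mine, not part of any rendering API) that keeps only the nearest fragment per pixel:

```java
class ZBuffer {
    final double[] depth;
    final int[] color;
    final int w;

    ZBuffer(int w, int h) {
        this.w = w;
        depth = new double[w * h];
        color = new int[w * h];
        // Initially every pixel is "infinitely far away".
        java.util.Arrays.fill(depth, Double.POSITIVE_INFINITY);
    }

    // Write a fragment only if it is closer than what is already stored.
    void plot(int x, int y, double z, int c) {
        int i = y * w + x;
        if (z < depth[i]) { depth[i] = z; color[i] = c; }
    }

    int colorAt(int x, int y) { return color[y * w + x]; }

    public static void main(String[] args) {
        ZBuffer zb = new ZBuffer(4, 4);
        zb.plot(1, 1, 5.0, 0xFF0000); // far red fragment
        zb.plot(1, 1, 2.0, 0x00FF00); // nearer green fragment wins
        zb.plot(1, 1, 9.0, 0x0000FF); // farther blue fragment rejected
        System.out.printf("%06X%n", zb.colorAt(1, 1)); // 00FF00
    }
}
```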

9
Bottleneck

• One of the three stages will be the slowest: the bottleneck stage
• CPU-limited
• In the application stage

• Transform-limited
• In the geometry stage

• Fill-limited
• In the rasterizer stage

10
Optimization
• Application stage : CPU-limited
• Replace the CPU with a faster one or add another CPU
• Reduce CPU’s load
• Reduce the scene complexity by using 3D models with a lower
polygonal count
• Optimize the simulation software

• Geometry stage : Transform-limited
• Need to look at the computation load assigned to the GEs
• Reduce the number of virtual light sources
• Use the simple shading mode
• Use the type of polygon for which the rendering hardware was optimized

11
Optimization
• Rasterizer stage : fill-limited
• Reduce the number of pixels in the displayed image
• Reduce the size of the display window
• Reduce the window resolution
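The effect of these fixes can be quantified: the rasterizer's fill load scales with the pixel count. A sketch with assumed resolution, depth-complexity, and frame-rate values:

```java
class FillRate {
    // Pixels the rasterizer must fill per second:
    // resolution x depth complexity x frame rate.
    static double pixelsPerSecond(int w, int h, double depthComplexity, double fps) {
        return w * (double) h * depthComplexity * fps;
    }

    public static void main(String[] args) {
        double full = pixelsPerSecond(1280, 1024, 3.0, 30);
        double half = pixelsPerSecond(640, 512, 3.0, 30);
        // Halving each window dimension quarters the fill load.
        System.out.println(full / half); // 4.0
    }
}
```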

12
The Haptics Rendering Pipeline

13
The Haptics Rendering Pipeline

[Figure: the haptics rendering pipeline: a collision detection stage, a force computation stage (force smoothing, force mapping), and a tactile computation stage (haptic texturing), running on CPUs; user input enters the pipeline and the computed force drives the haptic interface]
14
The stages of Haptics Rendering Pipeline

• Collision detection stage
• Load the physical characteristics of the 3D objects from the database
• Perform collision detection to determine which virtual objects collide

• Force computation stage
• Compute the collision forces
• Force smoothing
• Force mapping
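A common (though not the only) way to compute collision forces is a penalty method, with a low-pass filter standing in for force smoothing; this is a sketch with an assumed stiffness, not the pipeline's actual algorithm:

```java
class HapticForce {
    // Penalty-based contact force: F = stiffness * penetration depth,
    // zero when there is no penetration.
    static double contactForce(double stiffness, double penetration) {
        return penetration > 0 ? stiffness * penetration : 0.0;
    }

    // Force smoothing: a simple exponential filter across frames,
    // avoiding abrupt jumps that would feel like vibration.
    static double smooth(double previous, double target, double alpha) {
        return previous + alpha * (target - previous);
    }

    public static void main(String[] args) {
        double raw = contactForce(500.0, 0.01); // 5 N for 1 cm penetration
        double f = 0.0;
        for (int i = 0; i < 5; i++) f = smooth(f, raw, 0.5);
        System.out.printf("%.4f%n", f); // 4.8438, approaching the raw 5 N
    }
}
```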

15
The stages of Haptics Rendering Pipeline

• Tactile computation stage
• Render the touch feedback component of the simulation
• The computed effects are added to the force vector sent to the haptics output display

• The haptics rendering pipeline has a much less standardized architecture than its graphics counterpart.

16
Conceptual Model of VR
[Figure: conceptual model of VR. On the human side: sensing, perception, cognition, motion control, and action, with physical effectors and sensors (P-effector, H-sensor, H-effector, P-sensor). Logical devices (L-effector, L-sensor) for displacements, angles, and events connect these to the virtual side's sensors and effectors (V-sensor, V-effector), which act on the virtual object, avatar, and agent]
17
Functional model

[Figure: functional model: rendering and displaying (Sec. 3-4,5,6,7), simulation (Sec. 6-3,4), the VW database (Sec. 6-1), VW authoring (Sec. 6-2), virtual perception (Sec. 3-3), sensing (chap. 4), and interaction (chap. 5)]
18
VR Programming Toolkits
Are extensible libraries of object-oriented functions
designed to help the VR developer;
▪ Support various common i/o devices used in VR (so
drivers need not be written by the developer);
▪ Allow import of CAD models (saves time), editing of
shapes, specifying object hierarchies, collision detection
and multi-level of detail, shading and texturing, run-time
management;
▪ Have built-in networking functions for multi-user
interactions, etc.

19
VR Toolkits can be classified by:
✓ Whether text-based or graphical-programming;
✓ The type of language used and the library size;
✓ The type of i/o devices supported;
✓ The type of rendering supported;
✓ Whether general-purpose or application specific;
✓ Whether proprietary (more functionality, better
documented) or public domain (free, but less
documentation and functionality)

20
VR Toolkits in Early 90s
▪ RenderWare (Canon), VRT3/Superscape (Dimension
Ltd.), Cyberspace Developer Kit (Autodesk), Cosmo
Authoring Tool (SGI/Platinum/CA), Rend386 and others;
▪ They allowed either text-based programming
(RenderWare, CDK and Rend386), or graphical
programming (Superscape and Cosmo);
▪ They were platform-independent and generally did not
require graphics acceleration hardware;
▪ As a result they tended to use “low-end” I/O devices
(mouse) and to support flat shading to maintain fast
rendering.
21
Rend386 scene
22
VR Toolkits discussed
Name                                  Application area     Proprietary   Library size / language
WorldToolKit (WTK) (Sense8/EAI/EDS)   General purpose      yes           "C", >1,000 functions
Java3D (Sun Microsystems)             General purpose      no            Implemented in C, programming in Java; 19 packages, 275 classes
GHOST (SensAble Technologies)         Haptics for Phantom  yes           C++
PeopleShop (Boston Dynamics)          Military/civilian    yes           C/C++

23
The scene graph:
✓ Is a hierarchical organization of objects (visible or not) in the
virtual world (or “universe”) together with the view to that world;
✓ Scene graphs are represented by a tree structure, with nodes
connected by branches.
✓ Visible objects are represented by external nodes, which are
called leaves (they have no children). Example nodes F, G, H, I
✓ Internal nodes represent transformations (which apply to all their
children)

[Figure: example scene graph: root node A with children B and C; internal nodes D and E under B; external (leaf) nodes F, G under D, H, I under E, and J under C]
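The tree above can be sketched as a minimal node class; the traversal method follows the top-to-bottom, left-to-right (depth-first) order that scene graphs are rendered in. The class and method names are mine; the tree shape is taken from the figure:

```java
import java.util.ArrayList;
import java.util.List;

class SceneNode {
    final String name;
    final List<SceneNode> children = new ArrayList<>();

    SceneNode(String name) { this.name = name; }

    SceneNode add(SceneNode c) { children.add(c); return this; }

    // Depth-first, pre-order traversal: a node is visited (rendered)
    // before its children, children left to right.
    void traverse(List<String> out) {
        out.add(name);
        for (SceneNode c : children) c.traverse(out);
    }

    public static void main(String[] args) {
        SceneNode a = new SceneNode("A");
        SceneNode b = new SceneNode("B"), c = new SceneNode("C");
        SceneNode d = new SceneNode("D"), e = new SceneNode("E");
        a.add(b).add(c);
        b.add(d).add(e);
        c.add(new SceneNode("J"));
        d.add(new SceneNode("F")).add(new SceneNode("G"));
        e.add(new SceneNode("H")).add(new SceneNode("I"));
        List<String> order = new ArrayList<>();
        a.traverse(order);
        System.out.println(order); // [A, B, D, F, G, E, H, I, C, J]
    }
}
```

Note that transformations stored in internal nodes apply to everything visited below them, which is why the traversal order matters.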
24
Scene
Scene graphs are not static

[Figure: "palm" and "Ball" shown as siblings under the scene node]

The scene graph shows that the ball is a child of "scene"
25
Scene

[Figure: "Ball" now shown under "palm"]

The scene graph has been modified, such that the ball is now a child of the palm

26
[Roadmap: WTK initiation: model geometry, define scene graph, define and link sensors, define action functions, define networking]

27
[Roadmap: WTK initiation: model geometry, define scene graph, define and link sensors, define action functions, define networking]

28
WTK geometry:
✓ Geometry nodes are the only visible objects in the scene (others, like the viewpoint, serial ports, etc., are not);
✓ Geometries are either imported from CAD (e.g. dxf or 3ds formats), from VRML (wrl), or through the neutral file format (nff);
✓ Custom geometry is created through polygons and vertices.

Imported geometry:
WTgeometrynode_load(hand)

Geometry primitive:
WTgeometry_newsphere()

Custom geometry:
WTgeometry_begin
…
WTpoly_addvertex()
WTgeometry_save
29
WTK object appearance:
✓ Objects have material properties such as the way they reflect light (ambient, diffuse, specular, shininess, emissive, opacity); these properties are specified using material tables;
✓ Textures are loaded from files or created, and then filtered (scaled) to the object size.

To load a material table:
WTmtable_load(filename)

Applying texture:
WTtexture_load
WTgeometry_settexture_
WTtexture_setfilter

30
[Roadmap: WTK initiation: model geometry, define scene graph, define and link sensors, define action functions, define networking]

31
WTK scene graph:
✓ The scene consists of various objects, some visible (geometry),
some not (viewpoint, transforms, etc.); These objects are nodes in a
scene graph;
✓ The scene graph is the hierarchical arrangement of nodes that expresses the nodes' spatial organization and relationship to each other.
✓ Each scene graph has only one root node.
Root node

32
WTK scene graph terminology:
✓ If a node has a sub-tree that includes another node, it is its
ancestor. Example node A is ancestor of E;
✓ A parent node is a node's direct ancestor. C is parent of E but not of I; C is an ancestor of I;
✓ Siblings are children nodes of the same parent. F,G,H,I are sibling
nodes;
✓ If a node is rendered before another, it is its predecessor (need not
be its ancestor). B is a predecessor of J, but not its ancestor. Node B
can affect the rendering of node J.
[Figure: scene graph tree and sub-tree: A with children B and C; B has D and E; D has F, G; E has H, I; C has J]
33
WTK scene graph traversal:
✓ The order in which nodes appear in the scene graph determines
the order in which they are rendered. This is because at each frame
the scene graph is traversed top-to-bottom, left-to-right;
✓ Advantages of using scene graph include object grouping, level-
of-detail switching, instancing of geometry and sub-trees (better
memory usage), increased frame rate (better culling), multiple scene
graphs.
Traversal order is A, B, C, …

[Figure: the same example tree: A with children B and C; B has D and E; D has F, G; E has H, I; C has J]
34
WTK node types:
✓ Geometry nodes – used for visible objects;
✓ Attribute nodes (fog, light, transform) – affect the way the
geometry nodes are rendered; Need to be placed in the graph before
the geometry they affect;
✓ Procedural nodes (root, level-of-detail, separator, switch, etc) –
control the way the scene graph is processed

[Figure: tank scene graph: a root node with a transform group node; under it, geometry nodes "tank body" and "tank turret" and a separator node; the separator contains transform nodes "track 1 position" and "track 2 position", each with a "track" geometry node. The separator node allows tracks 1 and 2 to move independently, and prevents the track transforms from influencing the tank turret geometry]
35
WTK movable node:
✓ To help manage the state of the geometry nodes, and simplify
scene graph construction, WTK has a self-contained kind of node
called movable node;
✓ A movable node has its own separator, transform and content
(geometry, light, switch, level-of-detail) nodes;
✓ There can be several movable nodes arranged in a hierarchy
[Figure: a movable node: a parent node with its own separator, transform, and content nodes]
36
WTK movable node hierarchy:
✓ To create a robot arm, each of the objects need to be created
separately and loaded as movable nodes (base, lower arm, middle
arm, effector);
✓ Then they need to be linked in a scene graph

WTmovnode_load(base)
WTmovnode_load(lower)
WTmovnode_load(middle)
WTmovnode_load(effector)
WTmovnode_attach(base, lower);
WTmovnode_attach(lower, middle);
WTmovnode_attach(middle, effector);

[Figure: robot arm hierarchy: base, lower, middle, effector in a chain]
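The point of such a hierarchy is that a transform applied to a parent moves all of its children. A planar forward-kinematics sketch (my own simplification, not WTK code) shows this accumulation of rotations down the chain:

```java
class ArmHierarchy {
    // Planar forward kinematics: each link inherits its ancestors'
    // rotations, just as a child node inherits its parent's transform.
    static double[] effectorPosition(double[] linkLengths, double[] jointAngles) {
        double x = 0, y = 0, angle = 0;
        for (int i = 0; i < linkLengths.length; i++) {
            angle += jointAngles[i]; // parent rotations accumulate
            x += linkLengths[i] * Math.cos(angle);
            y += linkLengths[i] * Math.sin(angle);
        }
        return new double[] { x, y };
    }

    public static void main(String[] args) {
        // Two unit links, both joints at 0: effector straight out at (2, 0).
        double[] p = effectorPosition(new double[] { 1, 1 }, new double[] { 0, 0 });
        System.out.printf("%.2f %.2f%n", p[0], p[1]); // 2.00 0.00
    }
}
```

Rotating only the base joint by 90 degrees moves every link after it, which is exactly the behavior the scene graph hierarchy provides for free.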
37
WTK virtual hand hierarchy:

[Figure: virtual hand hierarchy: the palm is the parent of each finger's proximal segment; each proximal segment parents a middle segment, which parents a distal segment (e.g. palm, thumb proximal, thumb distal; palm, index proximal, index middle, index distal)]

38
WTK virtual hand loading:
/* Load the hand model */
Palm = WTmovnode_load(Root, "Palm.nff", 1.0);
ThumbProximal = WTmovnode_load(Palm, "ThumbProximal.nff", 1.0);
ThumbDistal = WTmovnode_load(ThumbProximal, "ThumbDistal.nff", 1.0);

IndexProximal = WTmovnode_load(Palm, "IndexProximal.nff", 1.0);
IndexMiddle = WTmovnode_load(IndexProximal, "IndexMiddle.nff", 1.0);
IndexDistal = WTmovnode_load(IndexMiddle, "IndexDistal.nff", 1.0);

MiddleProximal = WTmovnode_load(Palm, "MiddleProximal.nff", 1.0);
MiddleMiddle = WTmovnode_load(MiddleProximal, "MiddleMiddle.nff", 1.0);
MiddleDistal = WTmovnode_load(MiddleMiddle, "MiddleDistal.nff", 1.0);

RingProximal = WTmovnode_load(Palm, "RingProximal.nff", 1.0);
RingMiddle = WTmovnode_load(RingProximal, "RingMiddle.nff", 1.0);
RingDistal = WTmovnode_load(RingMiddle, "RingDistal.nff", 1.0);

SmallProximal = WTmovnode_load(Palm, "SmallProximal.nff", 1.0);
SmallMiddle = WTmovnode_load(SmallProximal, "SmallMiddle.nff", 1.0);
SmallDistal = WTmovnode_load(SmallMiddle, "SmallDistal.nff", 1.0);

39
WTK virtual hand hierarchy:

WTmovnode_attach(Palm, ThumbProximal);
WTmovnode_attach(ThumbProximal, ThumbDistal);

WTmovnode_attach(Palm, IndexProximal);
WTmovnode_attach(IndexProximal, IndexMiddle);
WTmovnode_attach(IndexMiddle, IndexDistal);

WTmovnode_attach(Palm, MiddleProximal);
WTmovnode_attach(MiddleProximal, MiddleMiddle);
WTmovnode_attach(MiddleMiddle, MiddleDistal);

WTmovnode_attach(Palm, RingProximal);
WTmovnode_attach(RingProximal, RingMiddle);
WTmovnode_attach(RingMiddle, RingDistal);

WTmovnode_attach(Palm, SmallProximal);
WTmovnode_attach(SmallProximal, SmallMiddle);
WTmovnode_attach(SmallMiddle, SmallDistal);

40
[Roadmap: WTK initiation: model geometry, define scene graph, define and link sensors, define action functions, define networking]

41
WTK sensors:
✓ Allow the user to interact dynamically with the simulation by
providing input and receiving feedback from the simulation. Some
of the supported sensors are:
▪ track balls (spaceball, geometry ball Jr);
▪ trackers (Polhemus Fastrack, Isotrack, Insidetrack; Ascension Bird
and Flock of Birds);
▪ sensing gloves (5DT serial glove, Pinch glove, CyberGlove);
▪ displays (CrystalEyes glasses, BOOM display, Virtual i/o HMD,
CyberMaxx2 HMD)
Etc.

42
Camera "fly-by" using the trackball:
✓ We can use the spaceball to interactively change the viewpoint to the scene;
✓ The spaceball needs to be declared as a sensor and linked to the serial port;
✓ Then the sensor needs to be attached to the viewpoint.

43
main() {
    WTsensor *spaceball;
    WTnode *root, *scene;

    /* initialize the universe */
    WTuniverse_new(WTDISPLAY_DEFAULT, WTWINDOW_DEFAULT);

    /* load scene at the root */
    root = WTuniverse_getrootnodes();
    scene = WTnode_load(root, "myscene", 1.0);

    /* attach sensor to the serial port */
    spaceball = WTspaceball_new(SERIAL2);

    /* attach viewpoint to the spaceball */
    WTviewpoint_addsensor(WTuniverse_getviewpoints(), spaceball);

    /* start simulation */
    WTuniverse_ready();
    WTuniverse_go();

    /* stop simulation */
    WTuniverse_delete();
    return 0;
}

44
[Roadmap: WTK initiation: model geometry, define scene graph, define and link sensors, define action functions, define networking]

45
WTK action functions:
✓ To do the ball grasping we need to check for collision between
the hand and the ball, and then we need to make the ball a child of
the palm.
✓ WTK action functions are user defined functions that are
executed at every simulation loop (frame). Such functions are
collision detection and collision response.
✓ In our case a sound also needs to be played as a form of collision response

[Figure: a ball defined by its midpoint and radius in the world coordinate axes X, Y, Z]
46
WTK action functions:
WTsound_load("spring");

void action()
{
    /* Check for collision detection */
    if (WTnodepath_intersectbbox(HandNP, BallNP))
    {
        /* play spring sound */
        WTsound_play(spring);

        /* Remove the Ball from the scene graph and immediately
           reattach it as a child of the Palm */
        WTnode_remove(Ball);
        WTmovnode_attach(Palm, Ball, 0);

        /* stop playing spring sound */
        WTsound_stop(spring);
    }
}
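The remove-and-reattach step can be mimicked with a toy parent/child table (plain Java of mine, not the WTK API), showing that reparenting first detaches the ball from its old parent:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

class Reparent {
    static final Map<String, List<String>> children = new HashMap<>();
    static final Map<String, String> parent = new HashMap<>();

    // Attach c under p, detaching it from its previous parent first
    // (the equivalent of WTnode_remove followed by WTmovnode_attach).
    static void attach(String p, String c) {
        String old = parent.get(c);
        if (old != null) children.get(old).remove(c);
        children.computeIfAbsent(p, k -> new ArrayList<>()).add(c);
        parent.put(c, p);
    }

    public static void main(String[] args) {
        attach("scene", "palm");
        attach("scene", "ball");
        attach("palm", "ball"); // grasp: ball becomes a child of the palm
        System.out.println(parent.get("ball"));    // palm
        System.out.println(children.get("scene")); // [palm]
    }
}
```

After the grasp, any transform applied to the palm now carries the ball along with it.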

47
WTK scene graph extension – the haptic node:
✓ Another form of collision response is force feedback if the user has a
haptic glove (such as Rutgers Master II);
✓ This is compatible with VRML;
✓ The fields of the haptics node are stiffness, viscosity, friction and
haptic effect (indicating a force profile – square, sine, constant, ramp)

[Figure: the Ball group node with Transform, Geometry, Haptics, and Dynamics children; from (Popescu, 2001)]


48
[Roadmap: WTK initiation: model geometry, define scene graph, define and link sensors, define action functions, define networking]

49
WTK networking:
✓ Uses the “World2World” library extension of WTK;
✓ A typical client-server architecture uses a single server
that does “double duty” managing connections as well as
data sharing. Simulation stops when a new client requests
connection.

Typical client-server architecture


50
✓ WTK/W2W uses a single server manager and several
simulation servers to improve scalability and system
response:
▪ The server manager is the initial point of contact of a
new client connecting to the simulation – administration
tasks performed transparently of the simulation;
▪ The simulation servers interact directly with the assigned
clients, once handed over by the manager.
▪ This way the ongoing simulation is not disrupted when a
new client is requesting connection.

51
WTK two-tier client-server architecture

[Figure: from World2World release 1, Sense8 Co.]


52
WTK Simulation Servers:
✓ Shared properties are organized in shared groups. Client X and
Client Y are interested in the position property of the ball object.
✓ The simulation Server manages the distribution of shared
properties to clients that registered interest in that shared group

[Figure: "Position" is the shared property and "Ball" is the object; the simulation server sends information to Client X and Client Y, which send actions back]

53
WTK run-time loop (repeats every frame):
Start Simulation → Read Sensor Data → Update Objects (from sensors and intelligent actions) → Render Scene (graphics, audio, haptics) → Exit Simulation
54
Java and Java 3D
✓ Java
✓ object oriented programming language
✓ developed for network applications
✓ platform independence
✓ slower than C/C++
✓ Java 3D
✓ Java hierarchy of classes that serves as an interface to 3D
graphics rendering and sound rendering systems
✓ Perfectly integrated with Java
✓ Strong object oriented architecture
✓ Powerful 3D graphics API
55
[Roadmap: Java3D initiation: model geometry, define scene graph, setup sensors, define behaviors, networking]

56
[Roadmap: Java3D initiation: model geometry, define scene graph, setup sensors, define behaviors, networking]

57
Java 3D geometry:
✓ Geometry can be imported from various file formats (e.g. 3DS, DXF, LWS, NFF, OBJ, VRT, VTK, WRL)
✓ Can be created as a primitive geometry (e.g. sphere, cone, cylinder, …)
✓ Custom geometry is created by specifying the vertices, edges, normals, and texture coordinates using specially defined classes

Imported geometry:
loader.load("Hand.wrl")

Geometry primitive:
new Sphere(radius)

Custom geometry:
new GeometryArray(…)
new LineArray(…)
new QuadArray(…)
new TriangleArray(…)

58
Java 3D object appearance:
✓ The appearance of a geometry is specified using an appearance object
✓ An appearance-class object stores information about the material (diffuse, specular, shininess, opacity, …) and texture

Mat = new Material();
Mat.setDiffuseColor(r, g, b);
Mat.setAmbientColor(r, g, b);
Mat.setSpecularColor(r, g, b);

TexLd = new TextureLoader("checkered.jpg", ...);
Tex = TexLd.getTexture();

Appr = new Appearance();
Appr.setMaterial(Mat);
Appr.setTexture(Tex);

Geom.setAppearance(Appr);
59
[Roadmap: Java3D initiation: model geometry, define scene graph, setup sensors, define behaviors, networking]

60
Java3D node types:
✓ Group nodes:
▪ BranchGroup: compilable sub-graph
▪ TransformGroup: transform + child nodes
▪ Switch: selects which of the children are visible (useful for LOD)
✓ Leaf nodes:
▪ Background: universe background; can be a color or an image
▪ Behavior: actions to be performed by the simulation
▪ Fog: fog node
▪ Light: light node; special derived classes: AmbientLight, PointLight, DirectionalLight
▪ Shape3D: geometry + appearance + BoundingBox
61
Java3D scene graph

[Figure: Java3D scene graph node hierarchy]

62
Loading objects from files

✓ Java3D offers by default support for Lightwave and Wavefront


model files
✓ Loaders for other file formats can be downloaded for free from
the web http://www.j3d.org/utilities/loaders.html
✓ Loaders add the content of the read file to the scene graph as a
single object. However, they provide functions to access the
subparts individually
[Figure: a Universe root with Cube, Sphere, and Hand objects; the Hand's subparts are Thumb, Index, Middle, Ring, and Small]
63
Java3D model loading
Adding the model to the scene graph
Scene Sc = loader.load("Hand.wrl");
BranchGroup Bg = Sc.getSceneGroup();
RootNode.addChild(Bg);

Accessing subparts of the loaded model
Scene Sc = loader.load("Hand.wrl");
BranchGroup Bg = Sc.getSceneGroup();
Thumb = Bg.getChild(0);
Index = Bg.getChild(1);
Middle = Bg.getChild(2);
Ring = Bg.getChild(3);
Small = Bg.getChild(4);

64
Java3D virtual hand loading:

Palm = loader.load("Palm.wrl").getSceneGroup();
ThumbProximal = loader.load("ThumbProximal.wrl").getSceneGroup();
ThumbDistal = loader.load("ThumbDistal.wrl").getSceneGroup();
IndexProximal = loader.load("IndexProximal.wrl").getSceneGroup();
IndexMiddle = loader.load("IndexMiddle.wrl").getSceneGroup();
IndexDistal = loader.load("IndexDistal.wrl").getSceneGroup();
MiddleProximal = loader.load("MiddleProximal.wrl").getSceneGroup();
MiddleMiddle = loader.load("MiddleMiddle.wrl").getSceneGroup();
MiddleDistal = loader.load("MiddleDistal.wrl").getSceneGroup();
RingProximal = loader.load("RingProximal.wrl").getSceneGroup();
RingMiddle = loader.load("RingMiddle.wrl").getSceneGroup();
RingDistal = loader.load("RingDistal.wrl").getSceneGroup();
SmallProximal = loader.load("SmallProximal.wrl").getSceneGroup();
SmallMiddle = loader.load("SmallMiddle.wrl").getSceneGroup();
SmallDistal = loader.load("SmallDistal.wrl").getSceneGroup();

65
Java3D virtual hand hierarchy:
Palm.addChild(ThumbProximal);
ThumbProximal.addChild(ThumbDistal);

Palm.addChild(IndexProximal);
IndexProximal.addChild(IndexMiddle);
IndexMiddle.addChild(IndexDistal);

Palm.addChild(MiddleProximal);
MiddleProximal.addChild(MiddleMiddle);
MiddleMiddle.addChild(MiddleDistal);

Palm.addChild(RingProximal);
RingProximal.addChild(RingMiddle);
RingMiddle.addChild(RingDistal);

Palm.addChild(SmallProximal);
SmallProximal.addChild(SmallMiddle);
SmallMiddle.addChild(SmallDistal);

66
[Roadmap: Java3D initiation: model geometry, define scene graph, setup sensors, define behaviors, networking]

67
Input devices in Java3D

✓ The only input devices supported by Java3D are the mouse and
the keyboard
✓ The integration of the input devices currently used in VR
applications (position sensors, track balls, joysticks…) relies
entirely on the developer
✓ Usually the drivers are written in C/C++. One needs either to re-
write the driver using Java or use JNI (Java Native Interface) to
call the C/C++ version of the driver. The latter solution is more
desirable.
✓ Java3D provides a general-purpose input device interface that can be used to integrate sensors; however, developers often prefer custom-made approaches
68
Java3D General purpose sensor interface
class PhysicalEnvironment - stores information about all the input devices
and sensors involved in the simulation
class InputDevice - interface for an input device driver
class Sensor - class for objects that provide real time data

One input device can provide one or more sensors.
A Sensor object need not be associated with an input device (VRML-style sensors).

[Figure: the PhysicalEnvironment holds the InputDevices and Sensors]

69
[Roadmap: Java3D initiation: model geometry, define scene graph, setup sensors, animating the scene, networking]

70
Java3D - Animating the simulation
✓ Java3D offers Behavior objects for controlling the simulation
✓ A Behavior object contains a set of actions performed when the object receives
a stimulus
✓ A stimulus is sent by a WakeupCondition object
✓ Some wakeup classes:
✓WakeupOnCollisionEntry
✓WakeupOnCollisionExit
✓WakeupOnCollisionMovement
✓WakeupOnElapsedFrames
✓WakeupOnElapsedTime
✓WakeupOnSensorEntry
✓WakeupOnSensorExit
✓WakeupOnViewPlatformEntry
✓WakeupOnViewPlatformExit

71
Java3D - Behavior usage
• We define a behavior Bhv that rotates the sphere by 1 degree
• We want this behavior to be called each frame so that the sphere will be spinning

WakeupOnElapsedFrames Wup = new WakeupOnElapsedFrames(0);
Bhv.wakeupOn(Wup);

[Figure: the sphere under the Universe root]

VC 6.4 on book CD
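The per-frame rotation can be sketched as a plain class; the method name echoes Java3D's Behavior.processStimulus, but this sketch deliberately avoids the Java3D API so it stands alone:

```java
class SpinBehavior {
    double angleDeg = 0;

    // Called once per frame, like a Behavior woken by
    // WakeupOnElapsedFrames(0): advance the rotation by 1 degree.
    void processStimulus() { angleDeg = (angleDeg + 1.0) % 360.0; }

    public static void main(String[] args) {
        SpinBehavior b = new SpinBehavior();
        for (int frame = 0; frame < 90; frame++) b.processStimulus();
        System.out.println(b.angleDeg); // 90.0
    }
}
```

Because the wakeup condition fires every frame, the sphere completes a full revolution every 360 frames (12 seconds at 30 frames/sec).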

72
[Roadmap: Java3D initiation: model geometry, define scene graph, setup sensors, define behaviors, networking]

73
Java3D - Networking
✓ Java3D does not provide a built-in solution for networked virtual environments
✓ However, its tight integration with the Java language allows the developer to use the powerful network features offered by Java
✓ Java3D applications can run as stand-alone applications or as applets in a web browser

[Figure: a server connected to several Java3D simulations, each running as a Java applet or application]
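A minimal sketch of those Java network features: one client publishing an object's position to a server over a plain socket. The message format and all names are mine; a real system would need serialization, multiple clients, and interest management:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.io.UncheckedIOException;
import java.net.ServerSocket;
import java.net.Socket;

class SharedStateDemo {
    // Sends one hypothetical update message ("ball x y z") over a
    // loopback connection and returns what the receiving side read.
    static String roundTrip(String update) {
        try (ServerSocket server = new ServerSocket(0)) { // ephemeral port
            Thread sender = new Thread(() -> {
                try (Socket s = new Socket("127.0.0.1", server.getLocalPort());
                     PrintWriter out = new PrintWriter(s.getOutputStream(), true)) {
                    out.println(update);
                } catch (IOException e) {
                    throw new UncheckedIOException(e);
                }
            });
            sender.start();
            try (Socket s = server.accept();
                 BufferedReader in = new BufferedReader(
                         new InputStreamReader(s.getInputStream()))) {
                return in.readLine();
            }
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(roundTrip("ball 1.0 2.0 3.0")); // ball 1.0 2.0 3.0
    }
}
```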

74
Java3D and VRML

✓ VRML provides possibilities for defining the objects and animating the objects in a virtual world
✓ Graphics APIs such as Java3D or WTK load from a VRML file only the static information, ignoring the sensors, routes, scripts, etc.
✓ The Java3D structure is general enough to make the import of sensors and routes possible, but currently we are not aware of any loader that does it
✓ One of the most popular libraries of Java3D loaders is the NCSA Portfolio (http://www.ncsa.uiuc.edu/~srp/Java3D/portfolio/)

75
Comparison between Java3D and WTK

✓ A comparative study was done at Rutgers between Java3D (Version 1.3 beta 1) and WTK (Release 9);
✓ The simulation ran on a dual Pentium III 933 MHz PC (Dell) with 512 MB RAM and a Wildcat 4110 graphics accelerator with 64 MB RAM;
✓ The I/O interfaces were a Polhemus Insidetrack or the Rutgers
Master II force feedback glove;
✓ The scene consisted of several 420-polygon spheres and a virtual
hand with 2,270 polygons;
✓ The spheres rotated constantly around an arbitrary axis, while
the hand was either rotating, or driven by the user.
76
Java3D –WTK Comparison
Graphics scene used in experiments

77
Comparison between Java3D and WTK

✓ The simulation variables used to judge performance were:
▪ graphic mode (monoscopic, stereoscopic),
▪ rendering mode (wireframe, Gouraud, textured);
▪ scene complexity (number of polygons 5,000 – 50,000);
▪ lighting (number of light sources 1, 5, 10);
▪ interactivity (no interaction, hand input, force feedback)

78
Java3D –WTK Comparison

Java3D is faster on average than WTK, but has higher variability


79
Java3D –WTK Comparison

Java3D Version 1.3 Beta 1 has smaller system latencies than WTK Release 9
80
But Java3D has more variability in the scene rendering time
81
WTK does not have spikes in the scene rendering time
82
General Haptics Open Software Toolkits
The world of haptic technology is evolving rapidly, and several open-source
toolkits are empowering developers to create innovative interactive experiences.

High-Level Toolkits:

• CHAI 3D: This C++ toolkit is widely used for research and development in
haptics. It offers a rich library of haptic effects, collision detection, and
physics simulation, making it ideal for creating realistic virtual environments
and simulations.

• gVR Haptics: This API is specifically designed for integrating haptics into
Google VR applications. It provides a simple interface for controlling vibration
patterns and forces on Daydream and Cardboard controllers.

• OpenVR Haptics: This API allows developers to add haptic feedback to SteamVR applications. It supports a wide range of haptic devices and offers high-precision control over force feedback and other haptic effects.

83
Low Level Libraries

• HDAPI (OpenHaptics library): This low-level library provides direct access to haptic hardware, allowing developers to fine-tune and customize haptic feedback for a specific device. It is often used in conjunction with HLAPI (Haptic Library API) for creating more complex haptic effects.

• VRPN (Virtual Reality Peripheral Network): This toolkit provides a communication framework for connecting various VR peripherals, including haptic devices. It can be used to build custom haptic interfaces and integrate them with other VR software.

84
