Conference and Exhibition of the European Association of Virtual and Augmented Reality (2014)
G. Zachmann, J. Perret, and A. Amditis (Editors)
Classification of VR interaction techniques, based on user intention
C. Weidig 1, D.R. Mestre 2, J.H. Israel 3, F. Noël 4, V. Perrot 2 and J.C. Aurich 1
1 TU Kaiserslautern, Institute for Manufacturing Technology and Production Systems, Germany
2 Mediterranean Virtual Reality Center, Aix-Marseille University, France
3 Fraunhofer IPK, Berlin, Germany
4 G-SCOP laboratory, Grenoble INP, Université Joseph Fourier, CNRS, Grenoble, France
Abstract
The number of different virtual reality (VR) systems and corresponding interaction techniques is manifold today. Specific interaction techniques have been developed for dedicated input devices, data types and application domains. Thereby a wide audience of researchers can benefit from using VR systems. A drawback is that for non-VR specialists it is nearly impossible to get a clear overview and identify proper interaction techniques for their research questions. Therefore this article proposes a classification scheme for VR interaction techniques based on the users' purpose. It will help non-VR specialists to identify VR systems and interaction techniques matching their demands. The classification will be used within the project VISIONAIR to bring together VR experts and external users.
Categories and Subject Descriptors (according to ACM CCS): I3.6 [Computer Graphics]: Methodology and
Techniques—Interaction techniques
1. Introduction
Interaction in virtual reality (VR) is a crucial aspect that needs to be provided according to the users' needs to allow beneficial usage of VR systems [MYW∗12]. The better an interaction technique reflects the users' interaction demand, the higher the efficiency and effectiveness that can be achieved during usage of VR systems [Bux86]. Therefore an approach to classify VR interaction techniques against the users' needs is introduced in this paper.
The goal of this paper is to analyse and assess the supportive potential of interaction techniques for interaction within virtual environments. The investigation will result in a recommendation list with which researchers can identify the interaction technique that suits their demands best. By offering such a list, even non-VR experts will have the opportunity to understand the characteristics of different interaction techniques and compare them against the requirements of their own research projects.
A classification scheme has been developed which is based on the user requirements. Thereby the intention of the user (Why use the VR system?) and the targeted purpose (Which objective should be achieved using the VR system?) are the central points which influence the classification. This should facilitate the proper assignment of user demands to interaction techniques later on. By categorizing the intention behind the interaction techniques, users can best choose which interaction technique they need to support their research. The classification scheme can be understood as a common requirement specification that connects user demands and the capabilities provided by certain interaction techniques.
© The Eurographics Association 2014. DOI: 10.2312/eurovr.20141339
Moreover, due to today's fast technological progress, previous classifications are stressed to their limits. In particular, connected portable devices come in different sizes and possess significant computing power. They are input and output devices in one, and thus allow many types of applications with completely new characteristics. So a purely technically oriented approach can be too restrictive for the definition of new services and interaction techniques in VR applications.
The VISIONAIR project analyses and assesses the supportive potential of interaction techniques for interaction within immersive virtual environments. VISIONAIR's perspective is oriented towards the benefit that can be created by the end-user who utilizes the interaction techniques to conduct research in different domains. Hence providing interaction techniques can be understood as a kind of service offered to clients. Within the project the focus is laid especially on the development and beneficial use of handheld devices in VR.
2. Related work
'An interaction technique is the fusion of input and output, consisting of all software and hardware elements, that provides a way for the user to accomplish a task' [Tuc04]. Interaction techniques thereby fulfil a certain user demand arising from interaction tasks by using input and output devices in a beneficial way. Users are enabled to perform a specific task within software systems (e.g. a VR system) serving their objectives [Bea00] [DF04]. Some research work has been carried out to classify interaction techniques [Bow99] [CB02]. The objective is to categorize interaction techniques to get an overview of available techniques and to identify gaps that can initiate new design projects. In these two approaches, either the perspective of immersive VR [Bow99] or the end-user perspective [CB02] is addressed, more or less separately. A combination of both approaches, necessary to classify VR interaction techniques from the end-user's point of view, seems to be missing.
In [Bow99] interaction techniques are classified on a relatively low, technically oriented level, decomposing interaction techniques into elementary fractions. Three main categories have been identified, which cover more or less the whole range of interaction in VR. By distinguishing (1) travel, (2) selection and (3) manipulation, the user's input towards the VR is captured quite generically. The classification is completed by the category (4) system control, which includes superior functionality that is not directly related to the user interaction in VR, but is required to operate the VR system in general.
Following [NBS12], VR interaction has to address the users' wish to handle virtual objects, commonly in 3D, as if they were real. This demand can be broken down into three requirements interaction techniques must fulfil in the scope of immersive VR systems:
• The dimension of space handled can range from 1D to 3D.
• The degree of freedom (dof) devices allow is usually 2 dof to 6 dof.
• Devices and interaction techniques usually provide complex feedback to the user [NBS12].
In contrast to this technically focused approach, the classification scheme developed by [CB02] does not exclusively address VR interaction techniques, but shifts the focus to the information and content aspects of interaction. The main classification criteria (Figure 1) are directly connected to the users' behaviour and their intention behind the interaction technique.
Communication behaviours
• Medium (speech, text, video…)
• Mode (face-to-face, mediated…)
• Mapping (one-to-one, one-to-many…)
Information behaviours
• Create
• Disseminate
• Organize
• …
Objects interacted with
• Level (information, meta-information)
• Medium (image, text, speech…)
• Quantity (one object, set of objects…)
Common dimensions of interaction
• Information Object (part – whole)
• Systematicity (random – systematic)
• Degree (selective – exhaustive)
Interaction criteria
• Alphabet
• Date
• Person
• …
Figure 1: Facets of a classification of interactions [CB02]
Even though the approach introduced in [CB02] is less technical and more information-oriented, the literature review revealed that a classification approach to categorize and describe VR interaction techniques from the end-user's point of view is missing. This gap shall be closed by the approach presented in this paper.
3. Classification Approach
Within the VISIONAIR project, multiple research institutes are connected, operating a wide range of different visualization facilities and targeting highly diverse research domains. Hence the following approach summarizes the experience gathered through the usage of many different interaction techniques. The developed classification scheme is structured into three main-classes: 'purpose', 'object medium' and 'user'. Each main-class contains one sub-class that specifies the main-class in more detail (Figure 2). The objective of this structure is to classify VR interaction techniques from a generic, user-driven perspective, incorporating the intention of the end-user as a major driver. For each main-class a sub-class is defined to outline the core functionality of the main-class. Detailing the description of the 'purpose' the user wants to achieve, the sub-class 'feedback' is defined. Feedback channels are often essential for the usability and utility of interaction techniques from a user's perspective. By providing feedback on the interaction, the user gets a direct indication whether the tasks behind the interaction can be
achieved. As second main-class, the 'object medium' which is handled by the interaction technique is defined. Thereby the characteristics of the information processed are the main focus (e.g. the dimension of visual objects). The fact that, for some interaction techniques, specific device types are required is considered subsequently by the sub-class 'device type'. Thereby it is not the device specifications in detail (e.g. vendor) that are mentioned; rather, the capabilities provided are in focus (e.g. degrees of freedom).

As last class the 'user' involvement is considered. Here the working situation and the team constellation for which the VR interaction technique is used are reflected. The sub-class 'interoperability' outlines interconnections which are required and established by the interaction technique. Thereby interconnections between users, between information, and also between several interaction techniques are considered.

Figure 2: Main-classes of the classification approach ('purpose' with sub-class 'feedback', 'object medium' with sub-class 'device type', 'user' with sub-class 'interoperability')

Each main-class and each sub-class are detailed by a set of attributes differentiating the characteristics of the classes. This will be further illustrated by the description of the main-class 'purpose', as this class is the core classification measure of the approach. Based on the idea that interaction techniques are chosen due to the functionality they provide, 'purpose' is characterized by the level of creative or predetermined interaction capabilities provided. Interaction techniques which allow independent interaction are separated from interaction techniques allowing interaction only for predetermined tasks. Independent interaction is characterized as a kind of continuous interaction that allows modifications of the virtual environment in infinitesimal steps, where every configuration of virtual elements is allowed. In contrast, interaction for predetermined tasks can be understood as a kind of discrete interaction where only discrete modifications among certain steps are possible.

As shown in Figure 3, four attribute categories are proposed which outline the classification criteria and subdivide the main-class 'purpose'.

Figure 3: Attributes of the main-class 'purpose' ('creative design', 'assemble', 'manage', 'observe')

The four attribute categories, named 'creative design', 'assemble', 'manage' and 'observe', are arranged along the continuum from continuous to discrete interactions. They are each composed of several typical tasks which are the purpose behind certain interaction techniques. These attributes will be the level on which users and interaction experts classify interaction techniques and user requirements on a generic base. They shall help users to better identify the interaction capability they request. For the other main-classes, and subsequently the sub-classes, detailed attribute descriptions are made accordingly.

4. Implementation of the classification scheme

The classification scheme is by now implemented in an MS Excel based taxonomy which integrates the classes into a structured and clear table. The generic table will be completed by instances describing the concrete interaction techniques used by VISIONAIR partners. After setting up a full list of interaction technique descriptions, the classification scheme shall be published online, to be available for non-VR experts. In this way the knowledge about interaction techniques for VR systems and the classification taxonomy can be distributed. The table is structured in three hierarchical stages containing all main-classes, sub-classes and attributes of characteristics, according to the structure of the classification scheme (Figure 4).

Figure 4: Hierarchical structure of the classification scheme in MS Excel (main-classes 'purpose', 'object medium' and 'user' with sub-classes 'feedback', 'device type' and 'interoperability'; attribute categories 'creative design', 'assemble', 'manage' and 'observe' with typical tasks such as design, create, modify, author, combine, communicate, merge, attach, update, replace, select, specify, present, make available and compare)

In addition to the generic taxonomy, descriptions of interaction techniques developed and used in VISIONAIR are included in the MS Excel file. Therefore the list of generic attributes is instantiated once for each interaction technique. Further, a short prose description is added to each interaction technique, describing the idea behind the interaction
technique and its special characteristics during usage. By this, a comprehensive list of interaction techniques used and developed by VISIONAIR partners for various purposes is established.
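The three-stage structure just described (main-classes, sub-classes and attributes of characteristics, instantiated once per interaction technique) can be sketched as a small data model. The following Python sketch is only illustrative: the class and attribute names follow the figures in this paper, but the instance for 'wind back and forwards' is paraphrased, not copied from the actual VISIONAIR Excel file.

```python
# Sketch of the three-stage taxonomy: main-class -> sub-class -> attributes.
# Names follow Figures 2-4 of the paper; the concrete instance below is
# illustrative only, not the published VISIONAIR data.

TAXONOMY = {
    "purpose": {
        "attributes": ["creative design", "assemble", "manage", "observe"],
        "sub_class": {"feedback": ["external", "internal"]},
    },
    "object medium": {
        "attributes": ["2-dimensional", "3-dimensional"],
        "sub_class": {"device type": ["dimensions", "degrees of freedom"]},
    },
    "user": {
        "attributes": ["role of the user", "number of users"],
        "sub_class": {"interoperability": ["users", "information", "techniques"]},
    },
}

# One instance of the taxonomy: the 'wind back and forwards' technique,
# marked with the attribute values it provides (cf. Figure 6).
wind_back_and_forwards = {
    "purpose": {"manage", "observe"},
    "feedback": {"external", "internal"},
    "object medium": {"2-dimensional", "3-dimensional"},
    "device type": {"1 dof"},
    "user": {"observer", "operator"},
}
```

In such an encoding, each row of the Excel table corresponds to one attribute, and each classified technique is one column of marks, which keeps the comparison between techniques mechanical.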
The implementation of the classification scheme allows the comparison of interaction techniques, but also the identification of interaction techniques matching specific user demands. As analysis tool, a comparison algorithm between given user needs and the interaction techniques outlined in the list is implemented. Users have the option to mark all requested functionality based on the generic taxonomy, and the algorithm identifies the interaction technique that fits the demand best. So even non-VR experts have the ability to search for beneficial ways of interacting with their given research data.
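Such a comparison can be sketched as a simple coverage score between the attributes a user marks as required and the attributes each classified technique provides. The Python sketch below assumes a set-based encoding of the taxonomy; the function name, data layout and example attribute sets are illustrative and do not reproduce the actual MS Excel tool.

```python
# Sketch of the matching step: each technique is described by the set of
# taxonomy attributes it provides; a user marks the attributes they need,
# and the technique covering the largest share of them is recommended.
# Names and data are illustrative, not the actual VISIONAIR implementation.

def best_match(user_needs, techniques):
    """Return (name, score) of the technique covering most of user_needs."""
    def coverage(provided):
        if not user_needs:
            return 0.0
        return len(user_needs & provided) / len(user_needs)

    name = max(techniques, key=lambda n: coverage(techniques[n]))
    return name, coverage(techniques[name])

techniques = {
    "wind back and forwards": {"observe", "control", "1 dof", "internal feedback"},
    "3d sketching": {"creative design", "create", "6 dof", "3D models"},
}

name, score = best_match({"observe", "1 dof"}, techniques)
# -> 'wind back and forwards' with full coverage (score 1.0)
```

A coverage score rather than an exact match lets the tool still propose the closest technique when no technique fulfils every marked requirement, which is exactly the situation a non-VR expert is likely to face.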
5. Examples of actions conducted in VISIONAIR

In the VISIONAIR framework, we conducted a common task, consisting in analyzing the supportive potential of interaction techniques within virtual environments. As evoked above, the main objective was to help expert as well as new users of VR systems in choosing the best suiting interaction technique regarding their demand. In the following, two example interaction techniques are outlined and the classification scheme is applied to them. This will illustrate how classification is done to address the end-user perspective. In this domain, handheld devices are more and more used (being relatively cheap and portable) to support users while working with VR systems. 2D-based interaction devices in 3D environments offer new potentials, but also challenges which need to be tackled. Different partners of VISIONAIR made intensive analyses and investigations of the interaction techniques already used. Extensions towards the integration of Tablet-PCs, tangible interaction devices and device-free interaction were conducted. Among the different experimental studies that have been carried out in the recent period, we can cite a few. First, specific comparisons between 3D and 2D interaction techniques have been carried out [NBS12]. Such analysis has been specifically used in the context of manufacturing systems [SMN12]. A similar approach consisted in investigating gesture- and tool-based interaction in virtual environments [WMA13]. PC and/or Android tablets were used for interacting with a CAVE system or in generic docking tasks [MP14]. Gesture- and tool-based interaction techniques were also compared in the context of 3d sketching [IS12].

To illustrate how the classification scheme will handle the characterisation of a concrete interaction technique, an example is given in the following. The interaction technique called 'wind back and forwards' uses a Tablet-PC to control animations and predefined model movement in a CAVE by directly influencing the progress of the animation. Therefore a slider is used to control the runtime of an animation or model movement (Figure 5). By visualizing such a progress bar on a Tablet-PC, a user can tangibly control certain animations, while being in a CAVE and using other navigation techniques in addition. The virtual slider gives direct feedback on the current frame of the animation, which makes the Tablet-PC an output device in parallel. In addition, visual feedback is provided by the VR system on which the virtual environment is implemented. At the University of Kaiserslautern the interaction technique is realized by a Tablet-PC on which the 2D GUI 'Covise TabletUI' is implemented.

Figure 5: Example of the interaction technique implemented with Covise TabletUI

This prose description of the interaction technique gives a first impression of the capabilities and characteristics of the interaction technique 'wind back and forwards'. In the following, the interaction technique is sorted into the classification scheme (Figure 6). Looking at the purpose users can achieve by using 'wind back and forwards', it is obvious that the interaction technique is not very flexible and creative during usage. The objects handled and the targeted animations need to be predefined, hence observation is the major 'purpose' attribute which is addressed. The metaphor enables the control of animations of 2D and 3D models. Even if the potential movement and modifications controlled by the interaction technique take place in 2D and 3D, the interaction on the Tablet-PC itself has only 1 degree of freedom, which makes the interaction very accurate. The Tablet-PC can be handled by one person at a time, who then has the opportunity to operate the animation and investigate effects and interdependencies with other elements in the virtual environment.

Figure 6: Exemplary classification of the interaction technique 'wind back and forwards' (purpose: manage – present, measure; observe – investigate, analyze, control, comprehend; feedback: external – via rendering system, internal – via integrated display; object medium: 2-dimensional – 2D models, 3-dimensional – 3D models; device type: dimensions – manage 2D positions, manage 3D positions; degrees of freedom – 1 dof; user: role of the user – observer, operator; number of users – one)

An example application realized at the University of Kaiserslautern is the rotation of cranes in a factory layout (Figure 7). Thereby the user can control the crane rotation and assess the area the crane is able to operate in. The user has the opportunity to check the position of cranes and to compare the realized performance in the context of the design of certain workplace layouts.

Figure 7: Analysis of the area covered by a virtual crane within a factory layout

Another example describes a creative modelling technique, namely immersive '3d sketching'. The technique is used to draw three-dimensional strokes within a virtual scene (i.e. interaction purpose: creative design; design; create). The immersive sketching system runs in an immersive five-sided CAVE with 2.5 m edge length, employing a rendering cluster and an optical tracking system. It allows free-hand drawing and modelling at one-to-one scale (object medium; 3-dimensional; 3D models) by means of tangible interfaces, e.g. a stylus or bi-manual modelling tools (Figure 8) [IWMS09]. The stylus allows drawing virtual ink directly into the virtual environment, following the movements of the stylus tip (device type; degrees of freedom; 6 dof). A toggle-button inside the stylus is used to start and stop the extrusion of virtual ink at the position of the tool tip (observe). The system was evaluated in various previous user studies, e.g. in terms of usability and learnability. It could be shown that designers are able to learn 3d drawing movements but demand refinement methods [IWMS09] [WIMB10].

Figure 8: Stylus tool for immersive sketching

This interaction technique was implemented at Fraunhofer IPK Berlin. The user can draw as many strokes as she or he likes (user; role of the user; manipulate). The strokes can be arranged by using a manipulation tool. Strokes can also be grasped with the pen and extruded along the user's hand movements as long as the button is pressed in extrusion mode. As the user can simultaneously sketch and walk (navigate) through the CAVE, the system also allows for parallel activities. According to the classification scheme, the interaction technique can be described in a standardized way (Figure 9). This description can be further used by end-users to identify the interaction technique which solves the requested functionality in the best way.

Figure 9: Exemplary classification of the interaction technique '3d sketching'

The two example classifications show the theoretical appropriateness of the developed classification scheme. For further evaluation a wider database of interaction techniques implemented within the VISIONAIR project will be built up. Investigations will be made to what extent end-users can profit from the classification scheme as it is proposed by now. The classification scheme can be one part within an in-
novative knowledge provision framework to allow non-VR experts access to interaction technique know-how.

6. Conclusion and Outlook

By implementing this classification scheme two objectives can be achieved. First, existing and established interaction techniques can be classified on a generic base. This allows the comparison of interaction techniques independent of application and VR system. Furthermore, the assessment of similar interaction techniques on joint applications and problem definitions can be considered. The second objective is that non-VR experts can use the classification scheme to express their requirements in a standardized and structured way. This will help to identify established interaction techniques users need to solve their research questions in VR.

In addition this process can even reveal gaps in VR interaction support. If users specify their requirements on interaction techniques and do not succeed in finding a proper interaction technique, new research activities can be initiated to develop new interaction techniques according to users' needs. As a next step the classification scheme and the comparison algorithm shall be implemented within the VISIONAIR website. This will allow external users to browse through the interaction techniques provided by VISIONAIR and identify beneficial interaction techniques to support their research.

Acknowledgement

The research leading to these results has been funded by the European Community's 7th Framework Programme under grant agreement VISIONAIR no. 262044. The VISIONAIR project (www.infra-visionair.eu) creates a European Infrastructure for Visualization and Interaction based Research.

References

[Bea00] BEAUDOUIN-LAFON M.: Instrumental interaction: an interaction model for designing post-WIMP user interfaces. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (2000), pp. 446–453. doi:10.1145/332040.332473.

[Bow99] BOWMAN D.: Interaction Techniques for Common Tasks in Immersive Virtual Environments – Design, Evaluation, and Application. Georgia Institute of Technology, 1999.

[Bux86] BUXTON W.: There's more to interaction than meets the eye: Some issues in manual input. User Centered System Design: New Perspectives on Human–Computer Interaction (1986), 319–337.

[CB02] COOL C., BELKIN N.: A classification of interactions with information. In Emerging Frameworks and Methods. Proceedings of the Fourth International Conference on Conceptions of Library and Information Science (2002), pp. 1–15.

[DF04] DRAGICEVIC P., FEKETE J.: The input configurator toolkit: towards high input adaptability in interactive applications. In Proceedings of the Working Conference on Advanced Visual Interfaces (2004), pp. 244–247. doi:10.1145/989863.989904.

[IS12] ISRAEL J. H., SNIEGULA E.: Berührungslose und begreifbare Interaktionen des 3D-Skizzierens [Touchless and tangible interactions for 3D sketching]. In Mensch und Computer 2012 – Workshopband (2012), pp. 147–153.

[IWMS09] ISRAEL J. H., WIESE E., MATEESCU M., STARK R.: Investigating three-dimensional sketching for early conceptual design – results from expert discussions and user studies. In Computers & Graphics (2009), vol. 33, pp. 462–473.

[MP14] MESTRE D., PERROT V.: Evaluation of a smart tablet interface for 3D interaction. In Proceedings of IEEE Symposium on 3D User Interfaces 2014 (2014), pp. 135–136.

[MYW∗12] MENCK N., YANG X., WEIDIG C., WINKES P., LAUER C., HAGEN H., HAMANN B., AURICH J.: Collaborative factory planning in virtual reality. In Proceedings of the 45th CIRP Conference on Manufacturing Systems (2012), pp. 359–366.

[NBS12] NOËL F., BA N., SADEGHI S.: Qualitative comparison of 2D and 3D perception for information sharing dedicated to manufactured product design. In Proceedings of the 3rd IEEE Conference on Cognitive Infocommunications (2012), pp. 261–265.

[SMN12] SADEGHI S., MASCLET C., NOËL F.: Gathering alternative solutions for new requirements in manufacturing company: Collaborative process with data visualization and interaction support. In Proceedings of the 45th CIRP Conference on Manufacturing Systems (2012), pp. 465–470.

[Tuc04] TUCKER A.: Computer Science Handbook. Chapman and Hall/CRC, 2004.

[WIMB10] WIESE E., ISRAEL J. H., MEYER A., BONGARTZ S.: Investigating the learnability of immersive free-hand sketching. In ACM SIGGRAPH/Eurographics Symposium on Sketch-Based Interfaces and Modeling SBIM'10 (2010), pp. 135–142.

[WMA13] WEIDIG C., MENCK N., AURICH J.: Systematic development of mobile AR-applications supporting production planning. In Enabling Manufacturing Competitiveness and Economic Sustainability – Proceedings of the 5th International Conference on Changeable, Agile, Reconfigurable and Virtual Production (2013), pp. 219–224.