EXOSTATION Phase B: 7-DOF Haptic Exoskeleton
Pierre Letier, More Avraam, Mihaita Horodinca, Samuel Veillerette (1), Jean-Philippe Verschueren (2),
Tanguy Fautre, Elvina Motard, Michel Ilzkovitz (3), Andre Schiele (4), André Preumont (1)
(1) Active Structures Laboratory, Université Libre de Bruxelles, 50, Av. F.D. Roosevelt, B-1050 Brussels, Belgium. Email: scmero@ulb.ac.be
(2) Micromega Dynamics, Parc Scientifique du Sart Tilman, Rue des Chasseurs Ardennais, B-4031 Angleur, Belgium. Email: info@micromega-dynamics.com
(3) Space Applications Services, Leuvensesteenweg 325, B-1932 Zaventem, Belgium. Email: info@spaceapplications.com
(4) European Space Agency, Automation and Robotics Laboratory, P.O. Box 299, 2200 AG Noordwijk, The Netherlands. Email: andre.schiele@esa.int
Abstract
EXOSTATION is an ESA project aiming at developing
an exoskeleton for an astronaut in microgravity,
supporting the telemanipulation with force feedback of
anthropomorphic robotic arms. Such robotic arms
could be embedded in mobile robots used as first
explorers in hostile environments or as assistants for
EVA (e.g. Eurobot, Robonaut).
A complete 7-DOF haptic control chain is currently
under development for demonstration purposes. The
final haptic demonstrator will be composed of a 7-DOF
arm exoskeleton master controlling a virtual
slave robot interacting with a virtual environment.
1. Introduction
In future space missions, robots could be used as first
explorers in hostile environments or as assistants for
Extra-Vehicular Activities (EVA). This will require a
higher level of cooperation between astronauts and
robots. In this respect, a portable device providing the
robot operator with force-feedback sensations (also
called haptic sensations) would greatly ease the
command task.
In this context, ESA has launched the development of a
humanoid servicing robot, called EUROBOT. Two
control modes are foreseen to drive this robot:
autonomous control and master-slave manual control.
For the latter purpose, an arm-exoskeleton has been
developed at ESTEC [1][2][3]. However, the proposed
exoskeleton did not provide force-feedback to the
operator. The EXOSTATION project was launched to
implement a complete multi-DOF telemanipulation
system, including a force feedback exoskeleton master
arm communicating with a simulated slave robotic arm.
Due to the technological complexity, the project was
divided into two phases. Phase A was dedicated to the
development of a fully integrated 1-DOF haptic chain
representing one joint (master and slave) of the future
telemanipulation system [4]. The purpose was to
validate all the components and their integration. In
parallel, several kinds of actuation technologies were
compared to select the best fit for the future portable
device [5][6].
Based on the results and experience gained in phase A,
the aim of phase B is to develop a complete 7-DOF
haptic control chain. It is composed of four main
components that will be presented in more detail in
this paper (Figure 1).
The first component is a 7-DOF exoskeleton playing
the role of the master robot in the haptic loop. It is a
fully portable device with a specific kinematic
structure and on-joint actuation.
The second component is the exoskeleton controller
composed of the master controller (ECO) and local
integrated joint controllers. These implement the
control strategies and manage the communication links
with the slave simulator.
Figure 1: EXOSTATION Phase B, Overview
The third component is a Slave Simulator simulating,
on top of ODE (Open Dynamics Engine) [7], a 7-DOF
actuated slave arm kinematically equivalent to the
exoskeleton.
The final component is a 3D Visualization Client based
on OpenGL, which allows one or several users to
visualize the state of the slave arm and provides a GUI
to remotely control the simulator.
The complete system is under development; however,
several preliminary tests have already been conducted
on the exoskeleton master. The results obtained are
presented below.
2. System Description
2.1 Exoskeleton Mechanical Design
The master device, known as the Sensoric Arm Master
(SAM), consists of a fully portable arm-exoskeleton
mechanical interface with serial kinematics
isomorphic to the human arm (Figure 2). SAM contains
7 actuated DOF from the shoulder to the wrist, which
corresponds to the minimum number of joints allowing
full immersion of the operator while keeping an
acceptable mechanical complexity. The system also has
5 passive DOF (sliders) used to align each active
joint of SAM with the biological joints of the arm, in
order to fit a wide range of user morphologies.
The kinematic structure is chosen to be simple while,
at the same time, presenting a maximized workspace
free of singularities. As shown in Figure 3,
if three orthogonal rotational joints parallel to the
reference axes (X, Y, Z) are used to represent the
spherical joint of the shoulder (the initial
configuration considered), singularities appear in the
middle of the user workspace. In order to move these
singularities away from the useful workspace to the
frontier of the human workspace, the joint configuration
of the shoulder has been modified by two successive
rotations to reach the final configuration proposed.
The same principle has been used for the wrist joints.
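The effect described above can be reproduced numerically: three intersecting revolute axes are singular when they become coplanar, i.e. when the determinant of the matrix of instantaneous joint axes vanishes. The sketch below is a simplified model (not the actual SAM kinematics): a shoulder built from three orthogonal X, Y, Z joints becomes singular when the middle joint reaches ±90 degrees, a pose well inside the user workspace.

```python
import numpy as np

def rot(axis, q):
    """Rotation matrix about a unit axis by angle q (Rodrigues' formula)."""
    axis = np.asarray(axis, dtype=float)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(q) * K + (1.0 - np.cos(q)) * (K @ K)

def shoulder_axes(q1, q2):
    """Instantaneous joint axes of a 3R spherical shoulder whose
    home-pose axes lie along X, Y, Z (the 'initial configuration')."""
    a1 = np.array([1.0, 0.0, 0.0])
    a2 = rot(a1, q1) @ np.array([0.0, 1.0, 0.0])
    a3 = rot(a1, q1) @ rot([0.0, 1.0, 0.0], q2) @ np.array([0.0, 0.0, 1.0])
    return a1, a2, a3

def singularity_measure(q1, q2):
    """|det [a1 a2 a3]|: zero when the three axes are coplanar."""
    return abs(np.linalg.det(np.column_stack(shoulder_axes(q1, q2))))

print(singularity_measure(0.0, 0.0))        # 1.0: well conditioned at home
print(singularity_measure(0.0, np.pi / 2))  # ~0: axis 3 folds onto axis 1
```

Rotating the whole joint triad, as done for SAM, does not remove the singular pose but relocates it toward the boundary of the human arm's range of motion.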
The weight of the structure is minimized in order to
maximize user comfort. FEM analysis has been
conducted to optimize the weight of the structure
while maintaining the mechanical strength and
stiffness required for optimal haptic rendering. The
current weight of the master device is 6.5 kg.
Based on the mechanical design and the results of
Phase A of the project, force feedback is implemented
at each joint by a brushed DC motor coupled with a
cable capstan and a small-reduction-ratio gearbox.
Each motor/reducer is chosen to be able to provide
continuously around 1/20th of the maximum human
torque of the corresponding joint, or the human arm
torque in daily activities if that is lower (Table 1).
This criterion is commonly used for other haptic
devices available on the market [8].
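As an illustration of this sizing rule, the short sketch below checks a candidate actuator against the 1/20th criterion. The motor torque, gearbox ratio, capstan ratio and efficiency are hypothetical values chosen for the example; the human torque figures are those of joint 4 in Table 1.

```python
def joint_torque(motor_torque_nm, gear_ratio, capstan_ratio, efficiency=0.9):
    """Continuous joint torque delivered through a small gearbox in
    series with a cable-capstan stage (illustrative model only)."""
    return motor_torque_nm * gear_ratio * capstan_ratio * efficiency

def meets_haptic_criterion(joint_nm, max_human_nm, daily_nm):
    """Sizing rule from the text: aim for ~1/20 of the maximum human
    joint torque, or the daily-activity torque if that is lower."""
    target = min(max_human_nm / 20.0, daily_nm)
    return joint_nm >= target

# Hypothetical elbow actuator: 0.1 N.m motor, 4.5:1 gearbox, 10:1 capstan
tau = joint_torque(0.1, 4.5, 10.0)   # 4.05 N.m continuous
print(tau, meets_haptic_criterion(tau, max_human_nm=72.0, daily_nm=3.8))
```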
Table 1: Torque output of SAM compared to daily activities and maximum human torque [9][10]

Joint | Continuous/stall torque of SAM's joint (N.m) | Typical human arm torque in daily life (N.m) | Max human arm torque (N.m) | 1/20th max human arm torque (N.m)
  1   | 4.38/19.74 | 10   | 134 | 6.70
  2   | 4.38/19.74 | 9.6  | 115 | 5.75
  3   | 5.97/26.94 | 3.1  | 60  | 3.00
  4   | 3.64/16.41 | 3.8  | 72  | 3.60
  5   | 2.22/7.74  | 0.4  | 9   | 0.45
  6   | 0.37/1.73  | 0.38 | 21  | 1.05
  7   | 0.37/1.73  | 0.25 | 20  | 1.00
Figure 2: Schematic view of the Sensoric Arm Master (SAM)
Figure 3: Influence of the kinematic structure on the singularity level
Due to the use of large-size bearings with seals for
joints 3 and 5, these joints present a higher level of
friction than the others. For this reason, joints 3
and 5 are given a higher continuous torque, to be able
to provide sufficient friction compensation. The purpose
of combining the two types of reducers (cable capstan
and gearbox) is to achieve a sufficiently high torque
together with a compact, low-friction and low-backlash
transmission. To limit the weight and volume of the
actuation, a compact design is proposed with full
integration of the mechanical interfaces (Figure 4).
In order to achieve teleoperation of the slave robot,
position and torque measurements are provided at each
joint, respectively by an incremental encoder and an
integrated torque sensor based on strain-gauge
technology. All the motors of the arm-exoskeleton are
located on their corresponding joint's link in order to
reduce the complexity and the number of mechanical
parts.
Due to the kinematic equivalence, the control strategy
is based on a joint-to-joint approach. Each joint of
SAM individually controls the corresponding joint of
the slave robot by exchanging position and torque
information between the master and the slave at a
500 Hz sampling rate.
Based on the results of Phase A, the general form of the
controller for one master/slave joint is represented in
Figure 5. The principle is to exchange both torque
(τh,i, τs,i) and position (θm,i, θs,i) between master
and slave to command the opposite side. The actuator
set-points (τm and τs) are created by local position
and torque controllers. This scheme allows implementing
control strategies ranging from two channels (impedance
control) to the four channels that led to the best
results on the 1-DOF demonstrator in Phase A [4].
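As a minimal illustration of the joint-to-joint exchange, the sketch below implements one 500 Hz sample of a two-channel (position forward, force back) scheme for a single master/slave joint pair. The PD structure and gains of the local slave controller are illustrative assumptions, not the actual ECO implementation.

```python
from dataclasses import dataclass

DT = 1.0 / 500.0   # 500 Hz haptic-loop sampling period

@dataclass
class PD:
    kp: float
    kd: float
    prev_err: float = 0.0

    def step(self, err):
        """PD law on the position error; returns a torque set-point."""
        d = (err - self.prev_err) / DT
        self.prev_err = err
        return self.kp * err + self.kd * d

def two_channel_step(theta_m, theta_s, tau_env, slave_pd, k_force=1.0):
    """One sample of a two-channel scheme for one master/slave joint:
    the master position drives the slave position controller, and the
    slave's environment torque is reflected back to the master motor."""
    tau_s_cmd = slave_pd.step(theta_m - theta_s)   # slave tracks master
    tau_m_cmd = -k_force * tau_env                 # reflect contact torque
    return tau_m_cmd, tau_s_cmd

pd = PD(kp=50.0, kd=0.0)
print(two_channel_step(0.1, 0.0, 0.0, pd))   # free motion: no master torque
```

A four-channel scheme would additionally feed the master torque to the slave and the slave position back to the master, which is what gave the best Phase A results.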
Other pictures and videos of SAM can be found at
http://www.ulb.ac.be/scmero.
Figure 4: Integrated joint with DC actuator, encoder and
torque sensor/conditioning
2.2 Exoskeleton Controller
Figure 5: General form of the control strategy for one
Master/Slave joint
2.2.1. Control Strategy
The haptic control loop enables force-feedback
teleoperation between the SAM exoskeleton and the
7-DOF kinematically equivalent slave robot (section
2.3) through the exchange of information between the
two systems. Its purpose is to guarantee good
performance in terms of human sensation, and a stable
behavior in both free motion and contact of the slave.
To limit the impact of the friction forces of joints 3
and 5 on the quality of the haptic rendering, friction
compensation algorithms based on [11] are also
considered in parallel with the previous scheme. Three
different methods are implemented. The first one,
called the Motion Based method, uses a feedforward
compensation based on a model of the friction (Coulomb
and viscous) using a speed measurement:

F = c * sign(ω) + b * ω

where c is the Coulomb friction coefficient, b the
viscous friction coefficient and ω the measured joint
velocity.
The second, Force Feedback (dashed line in Figure 5),
implements a torque feedback based on a local torque
measurement. Finally, the third one, the Hybrid method,
merges the two previous methods in order to reduce the
disadvantages of each: the Force Feedback is used at
small velocities and the Motion Based method at high
velocities. Switching between the two methods is done
by using high-pass and low-pass filters on the
velocity measurement.
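The three methods can be sketched as follows. Note that the actual implementation switches with high-pass and low-pass filters on the velocity; the static velocity-dependent blend below, with illustrative parameters c, b, k_fb and omega_c, is a simplification of that idea.

```python
import numpy as np

def motion_based(omega, c, b):
    """Feedforward Coulomb + viscous model: F = c*sign(omega) + b*omega."""
    return c * np.sign(omega) + b * omega

def hybrid_compensation(omega, tau_meas, c, b, k_fb, omega_c=0.2):
    """Sketch of the Hybrid scheme: torque feedback dominates near zero
    speed, the feedforward model dominates at high speed. The blend
    weight is a simple saturation on |omega| (parameters illustrative)."""
    w = min(abs(omega) / omega_c, 1.0)   # 0 near standstill, 1 when fast
    ff = motion_based(omega, c, b)       # high-speed, motion-based term
    fb = k_fb * tau_meas                 # low-speed, force-feedback term
    return w * ff + (1.0 - w) * fb

# Near standstill only the measured torque is compensated,
# avoiding the sign() discontinuity of the Coulomb model:
print(hybrid_compensation(0.0, 1.0, c=0.5, b=0.1, k_fb=0.8))
```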
2.2.2. Control Hardware
Figure 6 represents the way the control is physically
implemented. Each joint is driven by a local, highly
integrated joint controller mounted on the exoskeleton
itself. These communicate through a joint dispatcher
with the global master controller ECO.
Figure 6: Exoskeleton controller hardware architecture
ECO implements the control strategies presented
before for all the joints, as well as the data links
with the Slave Simulator (TCP/IP) and the joint
dispatcher. It also takes care of the monitoring and
triggering of the whole system. To meet the demanding
real-time performance, ECO is implemented on a PC
running the real-time QNX™ operating system. A
sampling frequency of 500 Hz is foreseen.
The joint dispatcher has two purposes. Firstly, it
receives the motor commands for all the joints from the
ECO controller and dispatches the individual set-points
to each joint. Secondly, it collects the sensor readings
from the individual joint controllers and transmits the
whole set of readings to the ECO controller PC.
Finally, each joint controller is implemented on a
TMS320F2812 DSP/microcontroller optimized for motor
applications. Mainly, it acquires the torque sensor
and encoder signals. A "smart" torque-sensor signal
conditioning board, developed in Phase A, is embedded
in each torque sensor. A PWM current driver is also
implemented by comparing the ECO current set-point
with the current flowing through the motor windings.
A custom internal bus is used to connect each joint
controller to the joint dispatcher, both to overcome
throughput limitations (compared to the CAN bus used
in Phase A) when addressing a multi-joint architecture
and to cope with the very restricted space available.
2.3 Slave Simulator
The Slave Simulator provides a virtual slave robot and
a set of virtual worlds. It simulates the slave robot, its
controllers, and the interactions with the virtual worlds.
The slave robot is controlled by direct kinematics, with
its 7 DOF corresponding to the seven joints of SAM.
When the slave robot is immersed in a virtual world,
the Slave Simulator computes the collisions between
the robot and itself, and between the robot and its
environment. When a collision occurs, the
corresponding dynamic and kinematic response is
computed. The physics engine (collision detection,
dynamics and kinematics) is built on top of ODE.
The virtual worlds (including the slave robot) are
described by Python scripts located on the Slave
Simulator. When a simulation starts with a given virtual
world, the corresponding code is executed by the Slave
Simulator. This allows the user to have a set of
different environments and slaves, and to modify them
without restarting the whole simulator. In the scope of
this project, different worlds will be implemented to
demonstrate and measure the capabilities of the system,
including its ability to render various stiffnesses,
forces and frictions.
The simulation of the virtual worlds runs at the same
frequency as the rest of the haptic loop (i.e. 500 Hz).
The Slave Simulator runs on a Core 2 Duo 3.0 GHz PC
under Debian GNU/Linux. The operating system is not
real-time, but the simulation is precisely synchronized
by ECO to keep up with the 500 Hz frequency of the
haptic loop. Thanks to the Simulation Link protocol,
ECO and the Slave Simulator are able to recover from a
real-time failure without any significant degradation of
the haptic feeling (i.e. the haptic loop does not become
unstable).
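The catch-up behavior described above can be illustrated with a generic fixed-timestep loop. This is not the actual Simulation Link protocol (in which ECO drives the synchronization): it only shows the underlying idea that the physics always advances in exact 2 ms increments, and that a delayed process runs several steps back-to-back so a transient overrun does not accumulate into simulation-time drift.

```python
import time

DT = 1.0 / 500.0   # haptic-loop period (2 ms)

def run_synchronized(step_world, duration_s):
    """Generic fixed-timestep loop on a non-real-time OS: compute how
    many 2 ms ticks are due from wall-clock time and run any missed
    steps back-to-back, so the simulation never drifts behind."""
    t0 = time.monotonic()
    steps_done = 0
    while True:
        elapsed = time.monotonic() - t0
        steps_due = int(elapsed / DT)
        while steps_done < steps_due:    # catch up after any stall
            step_world(DT)               # advance physics one tick
            steps_done += 1
        if elapsed >= duration_s:
            return steps_done
        time.sleep(DT / 4)               # yield until the next tick

# A 0.1 s run executes at least 50 fixed-size physics steps,
# regardless of scheduling jitter:
print(run_synchronized(lambda dt: None, 0.1))
```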
2.4 3D Visualization Client
The 3D Visualization Client is a remote application
which is not part of the haptic chain, but connects to
the Slave Simulator to allow visualizing the state of
the virtual world in real time. Several clients can
connect to the simulator, allowing the simulation to be
visualized by several users at the same time, including
through an optional HMD (Head Mounted Display).
The visual description of the slave robot and its
environment is automatically sent by the Slave
Simulator to the 3D Visualization Client when
required. The idea is to avoid any hard-coded or
world-dependent visual information being stored on the
3D Visualization Client.
The 3D rendering is done in OpenGL 2.1 and features
shadows of the slave robot (using shadow mapping) to
improve depth perception. Additionally, an audio
feedback is provided when a contact occurs, to enhance
the haptic sensation.
Figure 7 shows the GUI provided by the 3D
Visualization Client for supporting the calibration of
the SAM joints, browsing a list of available virtual
environments, and starting and stopping a simulation
and/or the simulator.
Figure 7: Early prototype of the 3D Visualization Client
(showing a pull-down menu to browse worlds)
3. Preliminary Tests
3.1 Haptic rendering
The complete 7-DOF SAM is currently under
development. Preliminary experiments have already
been conducted with the first four joints, from the
shoulder to the elbow, with a simple two-channel
control strategy (Figure 8). For these tests, the master
has been used in combination with a simplified VR
(without slave robot simulation) implemented through
the exoskeleton controller instead of the VR platform.
The embedded controller being under development, the
control is currently implemented using a dSPACE
acquisition board coupled with MATLAB/Simulink.
The results obtained in terms of haptic rendering are
very promising. Firstly, they showed that SAM can
produce enough force on the operator to create a
sensation of rigid contact. Secondly, several
geometrical forms could be presented to the operator.
Figure 9 represents the position of the operator when
asked to follow a sphere and a cube in space
(projection in the XZ plane). A good match can be seen
between the real position and the virtual object. The
variations observed in the cube case are due to the
difficulty of following the surfaces near angular
discontinuities.
Figure 8: Actual prototyping status of SAM (4 joints
implemented)
Figure 9: Haptic rendering experiments, geometrical form
representation
3.2 Friction compensation tests
In parallel with the global haptic rendering tests,
experiments were also conducted to reduce the friction
in the third joint of the exoskeleton. As mentioned
before, several algorithms were implemented. Figure 10
compares, for an alternating motion (sinusoidal speed
pattern) without contact, the torque required to move
the joint with each of the three methods and without
compensation. Although the Motion Based method
reduces the friction during motion, the friction of the
joint reappears around null speed, due to the
discontinuity of the Coulomb component of the friction
model at such speeds. With Force Feedback, stability
considerations limit the feedback gains and thus the
level of compensation that can be achieved. However,
it should be noted that this level of compensation is
independent of the speed of the joint, so better results
are achieved at low speeds than with the Motion Based
method. The Hybrid method combines the advantages
of the two previous methods while limiting their
drawbacks: it switches from the Force Feedback
method, used at low speeds, to the Motion Based
method at higher speeds, in order to achieve a high
level of friction compensation at high speeds while
maintaining limited friction forces around null speed.
Figure 10: Comparison of several friction compensation
methods
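The "good matching" observed in the sphere-following test of section 3.1 could be quantified, for example, as the RMS radial deviation of the recorded operator positions from the ideal sphere. The sketch below applies this metric to synthetic data; the real experimental traces are not reproduced here.

```python
import numpy as np

def sphere_deviation(points, center, radius):
    """RMS radial deviation (in metres) of recorded contact points
    from an ideal sphere: one simple way to score how closely the
    operator's hand tracked the rendered surface."""
    d = np.linalg.norm(points - np.asarray(center, dtype=float), axis=1) - radius
    return float(np.sqrt(np.mean(d ** 2)))

# Synthetic trace: points on a 0.15 m sphere plus 1 mm sensor noise
rng = np.random.default_rng(0)
u = rng.normal(size=(200, 3))
u /= np.linalg.norm(u, axis=1, keepdims=True)
pts = 0.15 * u + rng.normal(scale=1e-3, size=(200, 3))
print(sphere_deviation(pts, center=(0, 0, 0), radius=0.15))  # ~1e-3 m
```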
4. Conclusion
This paper has introduced and described the major
components of the complete 7-DOF haptic control
chain under development in the framework of Phase B
of the EXOSTATION project. Preliminary tests have
been conducted on both hardware and software and
have led to promising results. It has been shown that
the first four joints of the exoskeleton are already able
to provide high-quality 3D haptic rendering when
interacting with a simplified Virtual Reality.
Preliminary tests conducted on the Slave Simulator
have also shown the ability to command a virtual
multi-DOF robotic arm and to compute the dynamics
associated with collisions in various virtual worlds
defined in Python scripts.
Future work will include: further development of the
exoskeleton (3 remaining joints for the wrist),
implementation of the internal bus and joint dispatcher,
and further development of the Slave Simulator and the
3D Visualization Client. Once each subsystem has
reached a sufficient level of development, integration
into the final haptic control chain will be performed
and experiments will be conducted in order to assess
the global system performance.

5. Acknowledgments
This study is funded by ESA in the framework of a
Technology Research Program (contract No.
18408/04/NL/CP) entitled: Control Stations for new
Space A & R Applications.

6. References
[1] A. Schiele and G. Visentin, The ESA human arm
exoskeleton for space robotics telepresence, Proceedings of
the 7th International Symposium on Artificial Intelligence,
Robotics and Automation in Space, 2003.
[2] A. Schiele and G. Visentin, Exoskeleton for the human
arm, in particular for space applications, US patent
application US200323844, Dec. 2003; European patent
application EP 1364755 A1, Nov. 2003.
[3] A. Schiele and F.C.T. van der Helm, Kinematic design to
improve ergonomics in human machine interaction, IEEE
Transactions on Neural Systems and Rehabilitation
Engineering, Volume 14, Issue 4, Dec. 2006.
[4] P. Letier, M. Avraam, J.P. Verschueren, T. Fautre, J.M.
Wisley and A. Schiele, EXOSTATION Phase A: a 1-DOF
haptic demonstrator, Proceedings of the 9th ESA Workshop
on Advanced Space Technologies for Robotics and
Automation, Nov. 2006.
[5] P. Letier, M. Avraam, M. Horodinca, A. Schiele and A.
Preumont, Survey of actuation technologies for
body-grounded exoskeletons, Proceedings of the Eurohaptics
2006 Conference, Paris, France, July 2006.
[6] P. Letier, A. Schiele, M. Avraam, M. Horodinca and A.
Preumont, Bowden cable actuator for torque feedback in
haptic applications, Proceedings of the Eurohaptics 2006
Conference, Paris, France, July 2006.
[7] ODE (Open Dynamics Engine), http://www.ode.org
[8] T.H. Massie and J.K. Salisbury, The Phantom Haptic
Interface: A Device for Probing Virtual Objects, Proceedings
of the ASME Winter Annual Meeting, Symposium on Haptic
Interfaces for Virtual Environment and Teleoperator
Systems, Nov. 1994, Chicago, IL.
[9] C. Carignan and M. Liszka, Design of an Arm Exoskeleton
with Scapula Motion for Shoulder Rehabilitation,
Proceedings of the International Conference on Advanced
Robotics, July 2005, Seattle, USA.
[10] J. Rosen, J. Perry, N. Manning, S. Burns and B.
Hannaford, The Human Arm Kinematics and Dynamics
during Daily Activities - Toward a 7 DOF Upper Limb
Powered Exoskeleton, Proceedings of the 12th International
Conference on Advanced Robotics, July 2005, Seattle, USA.
[11] N.L. Bernstein, D.A. Lawrence and L.Y. Pao, Friction
modeling and compensation for haptic interfaces, First Joint
Eurohaptics Conference and Symposium on Haptic
Interfaces for Virtual Environment and Teleoperator
Systems, pp. 290-295, March 2005.