The I-Walk Assistive Robot
A Multimodal Intelligent Robotic Rollator Providing Cognitive and
Mobility Assistance to the Elderly and Motor-Impaired
George Moustris1(B) , Nikolaos Kardaris1 , Antigoni Tsiami1 , Georgia Chalvatzaki1 ,
Petros Koutras1 , Athanasios Dometios1 , Paris Oikonomou1 , Costas Tzafestas1 ,
Petros Maragos1 , Eleni Efthimiou2 , Xanthi Papageorgiou2 ,
Stavroula-Evita Fotinea2 , Yiannis Koumpouros2 , Anna Vacalopoulou2 ,
Alexandra Karavasili3 , Alexandros Nikolakakis4 , Konstantinos Karaiskos4 ,
and Panagiotis Mavridis4
1 School of Electrical and Computer Engineering, National Technical University of Athens,
Athens, Greece
gmoustri@mail.ntua.gr
2 Embodied Interaction and Robotics Group, Institute for Language and Speech
Processing/ATHENA RC, Maroussi Athens, Greece
3 DIAPLASIS Rehabilitation Center, Kalamata, Greece
4 SenseWorks Ltd., Nea Smyrni, Attiki, Greece
Abstract. Robotic rollators can play a significant role as assistive devices for
people with impaired movement and mild cognitive deficit. This paper presents
an overview of the i-Walk concept: an intelligent robotic rollator offering cognitive
and ambulatory assistance to people with light to moderate movement impairment,
such as the elderly. We discuss the two robotic prototypes being developed, their
various novel functionalities, system architecture, modules and function scope,
and present preliminary experimental results with actual users.
Keywords: Assistive robotics · Intelligent system · Robotic rollator ·
Multimodal system · Elderly · Movement disorders
1 Introduction
Mobility problems, particularly concerning the elderly population, constitute a major
issue in our society. According to recent reports, approximately 20% of people aged
70 years or older, and 50% of people aged 85 and over, report difficulties in basic
activities of daily living. Mobility disabilities are common and impede many activities
important to independent living. A significant proportion of older people have serious
mobility problems. About 8% of people aged 75 years are not able to move outdoors
without help, and the percentage increases to 28% at the age of 85. The corresponding
percentages in relation to the ability to move indoors are about 5% and 14%, respectively
[21]. Furthermore, current demographics show that the elderly population (aged over 65)
in industrialized countries shows a constant increase. In the EU, rising life expectancy is
© The Author(s), under exclusive license to Springer Nature Switzerland AG 2021
M. Saveriano et al. (Eds.): HFR 2020, SPAR 18, pp. 31–45, 2021.
https://doi.org/10.1007/978-3-030-71356-0_3
estimated to have brought about a 40% increase in the population aged 80 and over during
the period 1995–2015, meaning that the aforementioned problems are expected to assume
even greater significance in our society in the years to come.
Mobility is an important activity of the elderly since it promotes physical exercise,
independence and self-esteem. Assistive robots can play a significant role since they can
incorporate features such as posture support and stability, walking assistance, navigation
in indoor and outdoor environments, health monitoring and others.
In this paper we present an overview and current results of the i-Walk project, regarding the research and development of two intelligent robotic rollators incorporating multimodal human-robot interaction functionalities, providing ambulatory and cognitive
assistance to the elderly, as well as to people with moderate motor impairment. The key
motivation for this project originates from our aspiration to devise intelligent mobile
robotic mechanisms which can monitor and understand specific forms of human activity
in their workspace, in order to deduce the users' needs, particularly regarding mobility
and ambulation, while also providing context-adapted support and intuitive assistance
in domestic environments (Fig. 1).
Fig. 1. View of the i-Walk lightweight version during an experiment with a patient
i-Walk incorporates methodologies employing multimodal signal acquisition and
processing through appropriate sensors, recognition-monitoring-analysis and prediction
of user actions, real-time gait analysis, user-adaptive motion control providing docking
and front-following behaviors, navigation in dynamic environments, cognitive assistance, human-computer interaction based on verbal and non-verbal communication, a
virtual avatar for more natural interfacing, speech synthesis and recognition, and others. The overarching goal of this effort is to provide mature prototypes, aiming at a
high technology readiness level (TRL 7: system prototype demonstration in operational
environment), which would require less R&D investment from the industrial partners of
the project to capitalize on them and bring them to the market. Having the commercial
viability of the platform as a guide, the i-Walk project offers two solutions: a lightweight
one, using a commercially available design, retrofitting it with the appropriate sensors
and electronics to provide a subset of the functionalities, and a heavier robotic rollator
incorporating all the envisaged functionalities. The former is intended for home use,
having no actuation at all while the latter is a fully-fledged motorized robotic rollator,
mainly for clinical environments.
2 Related Work
In recent years, there has been an increased interest in mass production of autonomous
walking assistants. Despite the small number of robotic rollators registered, approved and
licensed as commercial products, there are some implementations that constitute interesting approaches, e.g. the smart walker described in [37] which is especially suitable for
people who suffer from neurological disorders that affect movement, like Parkinson’s
disease, and is generally recommended for patients who present an increased risk of
falling due to an unsafe walking pattern. It is equipped with various distance sensors,
including lasers and sonars, which provide enhanced features to the user, such as the
ability to navigate autonomously in order to reach the patient’s location in an indoor environment. When the walking assistant mode is turned on, the platform performs basic gait
analysis, adjusting the walking speed, while an obstacle detection algorithm ensures a
safe navigation. It also provides cognitive support by actively notifying the user about the
agenda, either using voice commands, or through a touch screen interface. In the same
category, the system of [31, 40] is a therapy device, operated by a therapist, which is used for learning
to walk and for balance training during standing. The two independent motorized wheels
on each side provide the necessary actuation to exercise essential movement maneuvers,
with adjustable linear and angular velocity. Similar products have been developed for
use in outdoor environments [4, 16, 36], where in most cases an auxiliary torque plays a
major role when the patient walks on inclined ground. In particular, an assistive torque is
applied in order to facilitate walking uphill, whereas downhill, the electric motor brakes
automatically and independently, preventing the device from rolling away.
Interesting features, however, are not confined only to commercial robotic walking
assistants. During the last decade, an increased interest has been observed in studying,
investigating and developing smart robotic rollators in the research community. A typical
example is the robot described in [39] where a shared control strategy provides natural
response based on the user’s intention, after leveraging the interaction forces between the
user and the walker, which are interpreted as navigation commands. The implementation
of a robust autonomous navigation system, including mapping, localization and path
planning, ensures an effective robot-environment interaction, while the walker's
behavior is adapted to each patient's gait pattern by estimating and analyzing
the gait parameters. Finally, the rollator is equipped with an emergency braking system
that stops the walker, providing an additional safety level. In the same spirit, the idea
behind [9] is to augment the manual control by observing the patient and adapting the
control strategy accordingly. A similar implementation [38] provides the possibility of
choosing between autonomous and assistive modes. In the former, the robot navigates
after receiving user commands through its real-time gesture-based interface using an
RGB-D camera, while in the latter the platform controls its speed according to the
user’s gait pattern and the ground inclination. In another approach described in [2] the
walker detects the force applied by the user on the handles, and adjusts the level of
assistance of each motorized wheel.
In other cases, a robotic walking rollator [32] combines the use of both passive
(visual, acoustic and haptic) and active interfaces (electromagnetic brakes and motorized
turning wheels) in order to guide the user during the path following process. A similar
implementation that focuses on providing assistance to blind people in outdoor operation
is presented in [43], where the robot keeps the user informed about their surroundings through
a vibro-tactile belt, in order to avoid obstacles, while it guides them through said
interface to reach a designated target location using autonomous navigation. Several
works have also been presented in the literature e.g. [7, 26] which are mostly used by
therapists and aim at examining patients’ mobility, stability and strength, and eventually
training and exercising their skills to recover control of their gait velocity as well as their
balance. Other platforms focus on estimating the features of the gait pattern, e.g. in [33]
where a leap motion sensor is used for gait classification, or in [19] where force and
sonar sensors are used to observe the human-robot interaction regarding the patient’s
upper body.
The i-Walk platform incorporates many technologies and functionalities present in
related systems, and aims at combining them in a user-friendly and clinically effective
manner.
3 Design and Overall Architecture
The two proposed rollator solutions share the same pool of hardware and software, with
the lightweight one using a subset of the heavy rollator.
The utilized h/w of the lightweight version is listed below:
1. RealSense camera 435i
2. 360° RPLidar-A2
3. UM7 Orientation Sensor
4. eMeet M1 Black Conference Speakerphone
5. NVIDIA Jetson TX2
6. 10.1″ Display
The hardware architecture and connectivity layout is presented in Fig. 2-LEFT. The
RealSense camera is intended for pose estimation, while the IMU sensor and the laser
are used in navigation. The laser is also employed for gait tracking. All components
connect to the TX2 Jetson which serves as the central processing unit of the platform.
The heavy robotic rollator incorporates hardware and software that covers all the
envisaged functionalities. The additional modules include servos and encoders for the
wheels, force sensors on the handles, LIDARs, localization software and a NUC mini-PC
in addition to the Jetson. Thus, on top of the h/w in the previous list, there is also:
1. Hokuyo lidar UST-10LX for gait tracking
2. Hokuyo lidar UST-20LX for navigation
3. Mini-PC (NUC)
Fig. 2. i-Walk hardware architecture. Lightweight version (LEFT) and heavy version (RIGHT)
4. Force sensors on the handles
5. 2 Servomotors on the rear wheels
6. Differential Encoders on all wheels
7. Decawave positioning system
The advanced architecture of this version is seen in Fig. 2-RIGHT. The software
architecture of both platforms is based on the widely used Robot Operating System
(ROS), running on Ubuntu Linux. For the lightweight version (Fig. 3-UP), input from
the various sensors reaches the necessary ROS libraries, which perform signal conditioning and handle the communication between the different programming nodes. This first stage of data
processing produces an initial representation (yellow boxes) that serves as input to the upper-level
processing, which recognizes the user's state in terms of actions, movement and
communication (orange boxes). This upper level feeds the assistance decision making
(purple boxes).
The heavy version includes enhanced processing modules, such as Scene Understanding and
Mapping, Mobility and Cognitive Assistance, and a Real-Time Location System for the
execution of the robotic movement and user assistance (Fig. 3-DOWN).
4 User Needs and Clinical Use-Cases
The definition of user needs followed standard procedures, developed in two
stages. In the first stage, user needs were collected through workshops
with the users and rehabilitation experts, which involved interviews as well as the collection
of questionnaires. The second stage involved the use of well-validated evaluation scales
for the classification of ambulation and balance [3, 5, 6, 14] and cognitive capacity [17]
of the users who would form the group of data providers for the specification of user
needs as well as the pool of users to perform the platform evaluation tasks.
This work also resulted in the definition of use case scenarios which form the testbed
for the platform functionality and interaction characteristics. The adopted scenarios
involve:
Fig. 3. i-Walk software architecture. Lightweight version (UP) and heavy version (DOWN).
• Walking with a rollator. This scenario covers basic rehabilitation needs of supporting
patients with ambulation problems.
• Rehabilitation Exercises. The users are instructed to perform a suite of rehabilitation
exercises in seated and standing positions, including hand raises, torso turns, sit-to-stand
transfers, etc.
• Use of the elevator. This scenario targets the support of patients’ independent living.
• Transfer to bathroom. This scenario covers needs of patients with mobility problems
in basic activities of daily life.
5 User-Machine Interface
The user (patient)-machine interface was designed based on user-centered principles
[42]. This approach includes a cyclic design procedure of multiple phases, each of which
has users and user-needs at its core. Users are involved in a series of needs analysis and
design techniques so that high usability and accessibility products can be developed for
the specific user group.
First, in order to design the original patient-machine interface, there was extensive
analysis of user needs based on material which had been collected during on-site user
observations in DIAPLASIS rehabilitation center [15]. The next step in the process was
the development of a pilot patient-robot interface on the IrisTK platform [23]. IrisTK
is a framework which facilitates the development of dialogue systems based on spoken
interaction; it is designed to also manage multi-modal situations such as human-robot
interaction.
The implemented interface helps users with both cognitive and mobility
issues through appropriate voice messages and visual signals, which provide instructions, encouragement and cognitive support during the execution of walking and gym
exercising rehabilitation tasks. Users are able to communicate via their own voice messages, through which they express their preferences or give orders to the platform. These
voice messages are produced in a completely natural manner and result in specific types
of system reactions.
The design of the i-Walk interaction environment was the result of extensive interchange between the engineering team and the clinical team of the DIAPLASIS rehabilitation center. In the original i-Walk interaction environment, the clinical personnel may
choose any of the following button options (Fig. 4 in Greek): “Patient Information”,
“Walking Information” and “Exercise Information”.
Fig. 4. Expert interface: view of patient’s performance of rehabilitation exercising
6 Multimodal Signal Acquisition and Processing
Essential to the development of an intelligent, user-aware assistive robot is the ability
to interact with the user in an intuitive, multi-modal way. Towards this goal, we have
designed and implemented a robotic perception system which consists of three submodules: (a) Visual Action and Gesture Recognition, (b) Speech Understanding, and (c)
Mobility Analysis, which are depicted in Fig. 5.
6.1 Visual Action and Gesture Recognition
Our activity and gesture recognition module consists of two different subsystems: the
first one performs 3D human pose estimation using the RGB-D sensor mounted on the
robotic rollator, while the second one recognizes human activity by employing an LSTM-based network architecture. Gestures and rehabilitation exercises are treated as special
Fig. 5. Overview of the i-Walk perception system
types of actions: in case the recognized activity is an exercise, the exercise monitoring
module presents the corresponding recognition scores, while in case a gesture is detected,
the gesture recognition module is triggered.
3D Pose Estimation: For the detection of the 2D body keypoints on the image plane
we employ the OpenPose library [10] with the accompanying models trained on large
annotated datasets [1, 25]. The third dimension of the 3D body keypoints is obtained
from the corresponding depth maps. Subsequently, given a pair of pixel coordinates for a
body joint and the depth value at this pixel, we calculate the corresponding 3D joint’s
coordinates through the inverse perspective mapping using the calibration matrix of the
camera. For the final human skeleton we discard the keypoints of the face, hands and feet
either because in many cases they are not detected, or because the corresponding depth values
are unreliable. For activity recognition, the 3D locations of the human joints are used as
features. We transform the 3D body joint locations which are provided in the camera
coordinate system to the body coordinate system with the middle-hip joint as origin.
We also normalize them by the length between the left- and right-hip joints (BNORM
scheme). In addition, we enhance the pose feature vector with the 3D velocity and
acceleration of each joint, computed from the sequence of the normalized 3D joints’
positions.
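The back-projection and BNORM steps above can be sketched as follows; the function names and intrinsic parameter values are illustrative, not taken from the i-Walk code:

```python
import math

def backproject(u, v, z, fx, fy, cx, cy):
    """Inverse perspective mapping: lift a pixel (u, v) with depth z
    (meters) to 3D camera coordinates via the pinhole intrinsics."""
    return ((u - cx) * z / fx, (v - cy) * z / fy, z)

def bnorm(joints, mid_hip, left_hip, right_hip):
    """BNORM scheme: express joints in a body frame centered at the
    mid-hip joint, scaled by the left-right hip distance."""
    scale = math.dist(left_hip, right_hip)
    return [tuple((a - b) / scale for a, b in zip(j, mid_hip)) for j in joints]
```

For example, a keypoint at the principal point with 2 m depth back-projects to (0, 0, 2) in the camera frame; the velocity and acceleration features are then finite differences of the normalized joint positions.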
LSTM-Based Network for Activity Recognition: In our deep learning based module
for human activity recognition we employ a Neural Network architecture based on
LSTM units [22]. LSTMs constitute a special kind of recurrent neural network that can
effectively learn long-term dependencies that exist in sequential data, such as human
joint trajectories. Our network architecture consists of two LSTM layers stacked on top
of each other and a fully connected (FC) layer, followed by softmax activation, to obtain
per-class scores. The sequence of the pose features in a temporal window is used as
input to the above network. To classify the whole sequence in one of the pre-defined
classes, we apply max pooling on the hidden states of the LSTM network, which, in
our in-house experiments, has been shown to yield the best results compared to several
other pooling schemes.
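The classification head described above (temporal max pooling over the LSTM hidden states, followed by a fully connected layer with softmax) can be sketched as below; the LSTM itself is assumed to come from a deep learning framework, and the array shapes here are illustrative:

```python
import math

def classify_sequence(hidden_states, weights, bias):
    """Classification head sketch: max-pool the LSTM hidden states over
    time, then apply a fully connected layer followed by softmax.
    hidden_states: list of T per-frame hidden vectors of size H."""
    pooled = [max(h[i] for h in hidden_states)          # temporal max pooling
              for i in range(len(hidden_states[0]))]
    logits = [sum(w * p for w, p in zip(row, pooled)) + b   # FC layer
              for row, b in zip(weights, bias)]
    m = max(logits)                                     # numerically stable softmax
    exps = [math.exp(l - m) for l in logits]
    s = sum(exps)
    return [e / s for e in exps]
```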
6.2 Speech Understanding
Our Speech Understanding module includes three submodules, as depicted in Fig. 5:
An Automatic Speech Recognition (ASR) module, a Natural Language Understanding
(NLU) module, and a dialog management and text-to-speech module. For ASR, speech recorded
through a 4-channel microphone array serves as input to the state-of-the-art Google
speech-to-text API [20], and is transformed into text, thus requiring
an active internet connection. Subsequently, the transcribed text serves as input to the
NLU module, in order to be translated into a human intention. The integrated NLU
system has been built with RASA [8, 34, 35]: A set of pre-defined intentions, both
general purpose and specific to the current application, has been designed. The former
category includes 7 general intents, namely greeting, saying my name, saying goodbye,
thanking, affirming, denying, asking to repeat, while the latter one includes 7 intents
designed for the Human-Robot Interaction: standing up, sitting down, walking, stopping,
ending interaction, going to the bathroom, doing exercises. Each intention is associated
with various phrases to express this particular intention. For example, a user can express
his/her will to stand up by saying “I want to stand up”, “Can you please help me stand
up”, or any other variation. A RASA NLU pipeline called tensorflow embeddings [34]
is then employed to predict the current intention based on the speech transcription. For
the dialog part, in order to manage the several intentions and perform the specific actions
required or just define what should be played back to the user, RASA Core has been
used. Finally, for the actual speech feedback, Google text-to-speech (TTS) in Greek has
been employed. All the above-mentioned components have been integrated into the ROS
platform and communicate with each other, or with other system components when needed,
via ROS messages.
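As an illustration of the NLU step, the following toy stand-in maps a transcription to one of the predefined intents by word overlap with example phrases; the actual system trains a RASA embedding pipeline, and the example phrases here are hypothetical wordings, not the project's training data:

```python
# Toy stand-in for the RASA NLU step: each intent lists a few example
# phrases, and the intent whose examples share the most words with the
# transcription is returned.
INTENT_EXAMPLES = {
    "standing_up": ["i want to stand up", "can you help me stand up"],
    "walking":     ["let us walk", "i want to walk now"],
    "stopping":    ["stop please", "let us stop here"],
}

def match_intent(transcription):
    """Return the intent best matching the transcribed utterance,
    or None if no example shares any word with it."""
    words = set(transcription.lower().split())
    best, best_overlap = None, 0
    for intent, examples in INTENT_EXAMPLES.items():
        overlap = max(len(words & set(e.split())) for e in examples)
        if overlap > best_overlap:
            best, best_overlap = intent, overlap
    return best
```

For instance, "I want to stand up please" maps to the `standing_up` intent, which the dialog manager would then turn into the corresponding spoken feedback.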
6.3 Mobility Analysis
Gait stability and mobility assessment are important for evaluating the rehabilitation progress. The Mobility Analysis module, triggered when the activity “Walking” is
recognized, consists of the following sub-systems (Fig. 5):
Human-Centered Gait Tracking and Gait Analysis: The tracking module exploits
the RGB-D data capturing the upper body and the laser data detecting the legs. A
hierarchical tracking filter based on an Unscented Kalman Filter estimates the positions
and velocities of the human Center-of-Mass (CoM), which is computed by the estimated
3D human pose, while an Interacting Multiple Model Particle Filter performs the gait
tracking and the recognition of the gait phases at each time frame [11, 12]. Following the
gait analysis literature [45], the walking periods are segmented into distinct strides based on
the recognized gait phases, and certain gait parameters are computed [13, 30].
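The stride segmentation step can be sketched as follows, assuming per-frame gait-phase labels are available from the tracking filter; the phase label name and data layout are hypothetical simplifications of the actual pipeline:

```python
import math

def segment_strides(phases, positions, dt):
    """Segment a walking period into strides at successive 'heel_strike'
    events of the same leg, computing per-stride length and duration.
    phases: per-frame phase labels; positions: per-frame (x, y) of the
    tracked leg; dt: frame period in seconds."""
    strikes = [i for i, p in enumerate(phases) if p == "heel_strike"]
    strides = []
    for a, b in zip(strikes, strikes[1:]):
        dx = positions[b][0] - positions[a][0]
        dy = positions[b][1] - positions[a][1]
        strides.append({"length": math.hypot(dx, dy),
                        "duration": (b - a) * dt})
    return strides
```

Per-stride quantities like these (stride length, stride time, and derived gait speed) are the kind of parameters fed to the assessment modules below.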
Gait Stability Assessment: A deep neural network was designed and evaluated in [12],
as an encoder-decoder sequence-to-sequence model based on LSTMs. The input features
are the estimated positions of the CoM and the legs along with the respective gait phase
at each time frame, while the output predicts the gait stability state considering two
classes: stable walking and risk-of-fall state. In particular, the stability score used here
is the probability of performing stable walking.
Mobility Assessment: For assessing the patient’s mobility status, we compute gait
parameters, such as stride length, gait speed, etc., which serve as a feature vector for an
SVM classifier [13]. The classes are associated with the Performance Oriented Mobility
Assessment (POMA) [41].
7 Navigation and Control
The navigation task has been decomposed into three main subtasks: (a) path planning
(b) localization, and (c) path following. Complementary to these, there are two further
modules which provide assistive functionality; the “audial assistance” and the “user
following” module. Details are presented in the sequel.
7.1 Path Planning and Following
The planning module consists of two layers: a local planner and a global planner. These
are provided by the functionality given in ROS’ navigation stack. The global planner
provides a fast interpolated navigation function which can be used to create plans for
a mobile base [24]. The module employs Dijkstra’s algorithm to search through the
available working area and find the best path. It also uses a global obstacle costmap,
created using a prior known map, in order to calculate a safe global plan taking into
account the distance from the obstacles. Subsequently, given the global plan, the local
planner creates a short-term plan and the appropriate velocity commands for the robot.
The local planner provides implementations of the Trajectory Rollout and Dynamic
Window Approach [18]. It continuously tests local motion paths in the environment, by
forward-simulating the robot motion in time. Based on a score, it outputs the desired
movement command to the robot. The local planner also creates a local obstacle costmap
which follows the robot, in order to calculate a safe local motion plan. This functionality
is included in the motorized version of the i-Walk platform.
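The global planning step can be illustrated with a minimal Dijkstra search over a grid costmap; this is a sketch of the idea, not the ROS global planner itself, and the 4-connected grid and cost convention are simplifying assumptions:

```python
import heapq

def dijkstra_path(costmap, start, goal):
    """Dijkstra search over a 4-connected grid. costmap[r][c] is the
    traversal cost of entering a cell (None = lethal obstacle), so
    paths are pushed away from high-cost cells near obstacles."""
    rows, cols = len(costmap), len(costmap[0])
    dist, prev = {start: 0}, {}
    pq = [(0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue                      # stale queue entry
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and costmap[nr][nc] is not None:
                nd = d + costmap[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)], prev[(nr, nc)] = nd, node
                    heapq.heappush(pq, (nd, (nr, nc)))
    path, node = [goal], goal             # walk predecessors back to start
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]
```

In the real costmap, cells near obstacles carry inflated costs rather than uniform ones, which is what makes the resulting global plan keep a safe distance.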
7.2 Localization
Robot localization is performed using an Adaptive Monte Carlo Localization approach,
implemented in ROS as part of the navigation package, which provides an estimate of the
robot's pose against a known map. Essentially, it continuously registers the robot's pose on
the map and corrects the odometry errors. In the lightweight version, the localizer uses
the lidar and IMU signals in order to provide the pose estimate, while in the motorized
version, the odometry is also used.
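A single Monte Carlo localization cycle can be sketched as below; this toy one-dimensional version only illustrates the weight-and-resample idea, whereas the actual AMCL implementation also adapts the particle count and fuses full laser scans:

```python
import math
import random

def mcl_update(particles, measured_range, expected_range, sigma=0.2):
    """One Monte Carlo localization cycle (sketch): weight each pose
    hypothesis by how well its predicted range measurement matches the
    observed one (Gaussian sensor model), then resample particles in
    proportion to the weights."""
    weights = [math.exp(-0.5 * ((measured_range - expected_range(p)) / sigma) ** 2)
               for p in particles]
    total = sum(weights)
    weights = [w / total for w in weights]
    return random.choices(particles, weights=weights, k=len(particles))
```

With a wall at x = 10 m and a measured range of 5 m, particles near x = 5 dominate after one update, which is how odometry drift gets corrected against the known map.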
7.3 User Following
This module implements a functionality that enables the robot to follow the user from
the front. The goal is to have the rollator follow the user while walking (i.e. comply
with the motion of the user without any physical interaction, that is, without any force
being applied to the rollator handles), and to remain in close vicinity to the patient in
case of need, e.g. for postural support, cognitive assistance, etc. This means that the user
goal point is not known a priori, as the human can suddenly turn and flank the robot or
move in the opposite direction.
The module has been based on the work presented in a string of papers [27–29], incorporating further functionality such as real-time crossing detection, user intent
identification, and shared control. Specifically, a modified Dynamic Window Approach
was devised that tests for Arc-Line paths in a robot-centered rolling costmap. Using this
technique, distinct routes are detected signifying different directions of motion (Fig. 6).
Furthermore, it enables the on-line detection of undecidable areas, which demand user
input to resolve.
Fig. 6. Detection of distinct routes of motion in two different cases. The first direction (cluster) is shown
in red and the second in green. The robot is depicted as a 2D frame (red-green).
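The forward-simulation and scoring idea behind this modified Dynamic Window Approach can be sketched as follows; the candidate set, admissibility test and scoring function are simplified stand-ins for the actual arc-line evaluation and clustering:

```python
import math

def simulate_arc(v, w, horizon=1.0, steps=10):
    """Forward-simulate a constant (v, w) command from the robot-frame
    origin, returning the swept arc as a list of (x, y) points."""
    x = y = th = 0.0
    dt = horizon / steps
    pts = []
    for _ in range(steps):
        x += v * math.cos(th) * dt
        y += v * math.sin(th) * dt
        th += w * dt
        pts.append((x, y))
    return pts

def best_command(candidates, user_dir, is_free):
    """Score each admissible arc by alignment of its endpoint with the
    inferred user direction (a unit vector), dropping arcs that leave
    free space in the rolling costmap; return the best (v, w)."""
    best, best_score = None, -float("inf")
    for v, w in candidates:
        arc = simulate_arc(v, w)
        if not all(is_free(p) for p in arc):
            continue
        ex, ey = arc[-1]
        score = ex * user_dir[0] + ey * user_dir[1]
        if score > best_score:
            best, best_score = (v, w), score
    return best
```

In the actual module, the admissible arc endpoints are additionally clustered into distinct routes (Fig. 6), and an undecidable area is flagged when more than one cluster survives.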
7.4 Audial Assistance
The audial cognitive assistance module provides audio cues to the user while walking
with the rollator. The module assumes a known map along with a set of nodes that have
audio tokens associated with them. The nodes comprise a directed graph, meaning there
is a traversal order. Each node is a circle with two predefined radii, Rin and Rout. When
the robot enters the Rin circle, an audio token is played. Conversely, when exiting the Rout
circle, another token is played. Exiting a node makes it obsolete, and only forward
nodes are considered afterwards. Work presented in [44] has shown that this navigational assistance
can help cognitively impaired people navigate more easily through indoor environments,
and guide them to places they want to go, for example, from their room to the dining area
in a rehabilitation center.
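The node-traversal logic above can be sketched directly; the class layout and message strings are illustrative, not the i-Walk implementation:

```python
import math

class AudioNode:
    """A waypoint with inner/outer trigger radii and two audio tokens."""
    def __init__(self, x, y, r_in, r_out, enter_msg, exit_msg):
        self.x, self.y = x, y
        self.r_in, self.r_out = r_in, r_out
        self.enter_msg, self.exit_msg = enter_msg, exit_msg

def audial_assist(nodes, trajectory):
    """Replay a robot trajectory against an ordered node list: entering
    a node's inner circle plays its entry token; leaving the outer
    circle plays the exit token and retires the node, so only forward
    nodes are considered afterwards."""
    played, idx, inside = [], 0, False
    for (x, y) in trajectory:
        if idx >= len(nodes):
            break
        n = nodes[idx]
        d = math.hypot(x - n.x, y - n.y)
        if not inside and d <= n.r_in:
            played.append(n.enter_msg)
            inside = True
        elif inside and d >= n.r_out:
            played.append(n.exit_msg)
            idx, inside = idx + 1, False   # node becomes obsolete
    return played
```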
8 Preliminary Experimental Results
This section presents preliminary results from an audial assistance experiment with a
patient in the Diaplasis Rehabilitation Centre. The aim was to perform functional and
user acceptance testing of the assistive module, in order to assess its real-life performance
and deploy it later in a more structured and systematic validation testing of the entire
platform. In the experiment, the patient was placed at a starting position, and was asked
to follow the audio commands of the robot, which guided him through a manually
prescribed path. This path consisted of a loop around the Centre’s reception area, and
comprised two segments. The first segment consisted of six nodes (0:0 to 0:5 in Fig. 7),
and the second of three (1:0 to 1:2).
Fig. 7. View of the audial assistance experiment. The path is defined by nodes (green), and seen
in the thick red line. The actual path traversed by the user is seen in blue. The user started from
node 0:0 and stopped at 1:2. On the left is seen an image from the front-facing camera, mounted
on the rollator, taken from the starting position.
The patient managed to follow all the audio commands of the robot, and went through
all the prescribed nodes, thus traversing the desired path. The entire experiment lasted
152 s in total, during which the patient traveled 49 m at an average velocity of about 0.3 m/s.
Overall, the experiment was conducted without any adverse events, such as the patient
missing a node, not hearing the audio commands or not comprehending them, and was
considered successful. The patient did not report any problems with the module, and
had no trouble following the commands.
9 Conclusion and Future Work
In this paper we have presented an overview of the i-Walk project, pertaining to a platform
providing ambulatory and cognitive assistance to the elderly and the motor-impaired.
The platform offers two solutions, a lightweight one and a heavier motorized one,
aimed at home use and clinical environments respectively. This paper briefly covered
the various modules developed, mainly in the lightweight version, while also presenting
much of the intended functionality of the heavy platform. Further developments on both
rollators, especially regarding user acceptance and clinical validation, are scheduled for
the months to come, and the results will be presented in appropriate venues.
Acknowledgments. This research has been co-financed by the European Union and Greek
national funds through the Operational Program Competitiveness, Entrepreneurship and Innovation, under the call RESEARCH – CREATE – INNOVATE (project code: T1EDK-01248/MIS:
5030856).
References
1. Andriluka, M., et al.: 2D human pose estimation: new benchmark and state of the art analysis.
In: 2014 IEEE Conference on Computer Vision and Pattern Recognition, pp. 3686–3693
(2014)
2. Barrué, C., et al.: The i-Walker: an Intelligent pedestrian mobility aid. In: Bichindaritz, I., et al.
(eds) Computational Intelligence in Healthcare 4: Advanced Methodologies, pp. 103–123.
Springer (2010)
3. Bateni, H., Maki, B.E.: Assistive devices for balance and mobility: benefits, demands, and
adverse consequences. Arch. Phys. Med. Rehabil. 86(1), 134–145 (2005)
4. beactive + e. https://www.my-beactive.de/. Accessed 20 Jan 2020
5. Bennell, K., et al.: Measures of physical performance assessments: Self-Paced Walk Test
(SPWT), Stair Climb Test (SCT), Six-Minute Walk Test (6MWT), Chair Stand Test (CST),
Timed Up & Go (TUG), Sock Test, Lift and Carry Test (LCT), and Car Task. Arthritis Care
Res. 63(Suppl 11), S350–370 (2011)
6. Berg, K., et al.: The Balance Scale: reliability assessment with elderly residents and patients
with an acute stroke. Scand. J. Rehabil. Med. 27(1), 27–36 (1995)
7. Bieber, G., et al.: RoRo: a new robotic rollator concept to assist the elderly and caregivers. In:
Proceedings of the 12th ACM International Conference on PErvasive Technologies Related
to Assistive Environments, pp. 430–434. Rhodes, Greece (2019)
8. Bocklisch, T., et al.: Rasa: Open Source Language Understanding and Dialogue Management.
arXiv:1712.05181 [cs] (2017)
9. Bošnak, M., Škrjanc, I.: Embedded control system for smart walking assistance device. IEEE
Trans. Neural Syst. Rehabil. Eng. 25(3), 205–214 (2017)
10. Cao, Z., et al.: Realtime multi-person 2D pose estimation using part affinity fields. In: 2017
IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 1302–1310
(2017)
11. Chalvatzaki, G., et al.: Augmented human state estimation using interacting multiple model
particle filters with probabilistic data association. IEEE Robot. Autom. Lett. 3(3), 1872–1879
(2018)
12. Chalvatzaki, G., et al.: LSTM-based network for human gait stability prediction in an intelligent robotic rollator. In: 2019 International Conference on Robotics and Automation (ICRA),
pp. 4225–4232 (2019)
13. Chalvatzaki, G., et al.: User-adaptive human-robot formation control for an intelligent robotic
walker using augmented human state estimation and pathological gait characterization.
In: 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS),
pp. 6016–6022 (2018)
14. Donoghue, O.A., et al.: Using timed up and go and usual gait speed to predict incident disability
in daily activities among community-dwelling adults aged 65 and older. Arch. Phys. Med.
Rehabil. 95(10), 1954–1961 (2014)
15. Efthimiou, E., et al.: User centered design in practice: adapting HRI to real user needs. In:
Proceedings of the 12th ACM International Conference on PErvasive Technologies Related
to Assistive Environments, pp. 425–429. Rhodes, Greece (2019)
16. ello. Der elektrische Rollator. https://ello.wmt.team/. Accessed 20 Jan 2020
17. Fountoulakis, K.N., et al.: Mini mental state examination (MMSE): a validation study in
Greece. Am. J. Alzheimer’s Dis. 15(6), 342–345 (2000)
18. Fox, D., et al.: The dynamic window approach to collision avoidance. IEEE Robot. Autom.
Mag. 4(1), 23–33 (1997)
19. Frizera-Neto, A., et al.: Empowering and assisting natural human mobility: the Simbiosis walker. Int. J. Adv. Robot. Syst. 8(3), 29 (2011)
20. Google cloud speech-to-text. https://cloud.google.com/speech-to-text/. Accessed 20 Jan 2020
21. Heikkinen, E., et al.: Disability in Old Age. The Finnish Centre for Interdisciplinary Gerontology, University of Jyväskylä, Finland (2004)
22. Hochreiter, S., Schmidhuber, J.: Long short-term memory. Neural Comput. 9(8), 1735–1780
(1997)
23. IrisTK. Intelligent Real-Time Interactive Systems Toolkit. http://www.iristk.net/overview.
html. Accessed 20 Jan 2020
24. Konolige, K.: A gradient method for realtime robot control. In: Proceedings of the 2000 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2000), vol. 1, pp. 639–646 (2000)
25. Lin, T.-Y., et al.: Microsoft COCO: common objects in context. In: Computer Vision – ECCV 2014, pp. 740–755. Springer, Cham (2014)
26. Martins, M., et al.: A new integrated device to read user intentions when walking with a Smart
Walker. In: 2013 11th IEEE International Conference on Industrial Informatics (INDIN),
pp. 299–304 (2013)
27. Moustris, G.P., et al.: User front-following behaviour for a mobility assistance robot: a
kinematic control approach, pp. 142–149 (2015)
28. Moustris, G.P., Tzafestas, C.S.: Assistive front-following control of an intelligent robotic rollator based on a modified dynamic window planner. In: 2016 6th IEEE International Conference
on Biomedical Robotics and Biomechatronics (BioRob), pp. 588–593 (2016)
29. Moustris, G.P., Tzafestas, C.S.: Intention-based front-following control for an intelligent
robotic rollator in indoor environments. In: 2016 IEEE Symposium Series on Computational
Intelligence (SSCI), pp. 1–7 (2016)
30. Muro-de-la-Herran, A., et al.: Gait analysis methods: an overview of wearable and non-wearable systems, highlighting clinical applications. Sensors 14(2), 3362–3394 (2014)
31. Olenšek, A., et al.: Adaptive dynamic balance training during overground walking with assistive device. In: 2012 4th IEEE RAS EMBS International Conference on Biomedical Robotics
and Biomechatronics (BioRob), pp. 1066–1070 (2012)
32. Palopoli, L., et al.: Navigation assistance and guidance of older adults across complex public
spaces: the DALi approach. Intell. Serv. Robot. 8(2), 77–92 (2015)
33. Paulo, J., et al.: ISR-AIWALKER: robotic walker for intuitive and safe mobility assistance
and gait analysis. IEEE Trans. Hum.-Mach. Syst. 47(6), 1110–1122 (2017). https://doi.org/
10.1109/THMS.2017.2759807
34. RASA. http://rasa.com. Accessed 20 Jan 2020
35. RASA: Open source machine learning tools for developers to build, improve, and deploy text- and voice-based chatbots and assistants. https://github.com/RasaHQ. Accessed 20 Jan 2020
36. Robot Assist Walker RT.2. https://www.rtworks.co.jp/eng/product/rt2.html. Accessed 20 Jan
2020
37. Robot Care Systems – LEA. https://www.robotcaresystems.com/. Accessed 20 Jan 2020
38. Shin, J., et al.: SmartWalker: an intelligent robotic walker. J. Ambient Intell. Smart Environ. 8(4), 383–398 (2016)
39. Sierra, M.S.D., et al.: Human-robot-environment interaction interface for smart walker
assisted gait: AGoRA walker. Sensors 19(13), 2897 (2019)
40. THERA-Trainer e-go. https://www.thera-trainer.de/en/thera-trainer-products/gait/thera-tra
iner-e-go/. Accessed 20 Jan 2020
41. Tinetti, M.E., et al.: Fall risk index for elderly patients based on number of chronic disabilities.
Am. J. Med. 80(3), 429–434 (1986)
42. User-Centered Design: An Introduction. https://usabilitygeek.com/user-centered-design-int
roduction/. Accessed 20 Jan 2020
43. Wachaja, A., et al.: Navigating blind people with walking impairments using a smart walker.
Auton. Robot. 41(3), 555–573 (2017)
44. Werner, C., et al.: User-oriented evaluation of a robotic rollator that provides navigation
assistance in frail older adults with and without cognitive impairment. Gerontology 64(3),
278–290 (2018)
45. Burnfield, M.: Gait analysis: normal and pathological function. J. Sports Sci. Med. 9(2), 353 (2010)