Abstract
This paper describes the development of a control user interface for a wheelchair-mounted manipulator intended for use by severely disabled persons. It explains how the interface was constructed, with tasks used to define the user interface architecture. The prototype robot employed several gesture recognition systems to achieve better usability than other rehabilitation robots available at the time. The use of neural networks and other recognition procedures is evaluated. The paper outlines the experiments used to assess user responses, draws conclusions about the effectiveness of the whole system, and demonstrates that control using a head mouse is feasible.
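The abstract refers to neural-network gesture recognisers feeding a menu-driven control interface. As a rough illustration only, the sketch below shows how a small feed-forward classifier might map head-gesture feature vectors to menu commands; the feature layout, gesture labels, network size, and use of scikit-learn are assumptions made for this sketch, not the system described in the paper.

```python
# Illustrative sketch (not the authors' implementation): a small feed-forward
# network classifying head-gesture feature vectors into command classes.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Assume each gesture is summarised as an 8-value feature vector
# (e.g. mean and peak head-tracker displacement per axis).
N_FEATURES = 8
GESTURES = ["nod", "shake", "tilt_left", "tilt_right"]  # hypothetical labels

# Synthetic training data standing in for recorded user gestures.
X_train = rng.normal(size=(200, N_FEATURES))
y_train = rng.integers(0, len(GESTURES), size=200)

# One small hidden layer suffices for a signal bandwidth of a few commands.
net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
net.fit(X_train, y_train)

# At run time, a recognised gesture index selects a menu command.
sample = rng.normal(size=(1, N_FEATURES))
print("Selected command:", GESTURES[int(net.predict(sample)[0])])
```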
Notes
The potential number is much greater, limited only by the available disk space and acceptable menu-depth.
discrete blocks of code
Signal bandwidth is used here to refer to the number of distinctly different signals that may be generated with a device.
As initial evaluations employed a six-axis manipulator, orientation of the end-effector for Cartesian mode was not possible. Hence, evaluation of this mode of control is not included.
Control using pre-taught positions was employed, allowing more to be ascertained from the evaluation in terms of user interaction.
Examples of interaction errors include intending to select a specific command from the menu but accidentally selecting another, or missing a command in a scanning system.
Acknowledgment
This programme was funded by the charity ASPIRE and the National Advisory Body. European students working on the project were supported under the ERASMUS and SOCRATES programmes.
Cite this article
Gerlich, L., Parsons, B.N., White, A.S. et al. Gesture recognition for control of rehabilitation robots. Cogn Tech Work 9, 189–207 (2007). https://doi.org/10.1007/s10111-007-0062-3