
Gesture recognition for control of rehabilitation robots

  • Original Article
  • Published in Cognition, Technology & Work

Abstract

This paper describes the development of a control user interface for a wheelchair-mounted manipulator intended for use by severely disabled persons. It explains how the interface was constructed, using tasks to define the user-interface architecture. The prototype robot employed several gesture recognition systems to achieve better usability than other rehabilitation robots of the time. The use of neural networks and other recognition procedures is evaluated. The paper outlines the experiments used to assess user responses, draws conclusions about the effectiveness of the whole system, and demonstrates the possibility of control using a head mouse.
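The control scheme the abstract describes, in which a recognised gesture selects an item from a menu-based interface, can be illustrated with a minimal sketch. This is not the paper's implementation; all names and the single-switch scanning scheme shown here are illustrative assumptions.

```python
# Illustrative sketch only: a single-switch scanning menu of the kind used in
# menu-based rehabilitation-robot interfaces, where a recognised head gesture
# acts as the "select" signal. Names and structure are hypothetical.

def scan_select(menu, gesture_frames):
    """Highlight menu items one per frame; a True frame selects the
    currently highlighted item. Returns the selected item, or None
    if no gesture is recognised before the frames run out."""
    for frame, gestured in enumerate(gesture_frames):
        highlighted = menu[frame % len(menu)]  # scanning cursor advances each frame
        if gestured:                           # recognised gesture -> select
            return highlighted
    return None

menu = ["gripper open", "gripper close", "arm up", "arm down"]
# A gesture recognised on the third frame selects the third item.
print(scan_select(menu, [False, False, True]))  # -> arm up
```

A missed selection (note 6's "missing a command from a scanning system") corresponds here to the gesture frame arriving after the desired item has scrolled past.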

[Figures 1–17 are available in the full article.]


Notes

  1. The potential number is much greater, limited only by the available disk space and acceptable menu-depth.

  2. Discrete blocks of code.

  3. Signal bandwidth is used here to refer to the number of distinctly different signals that may be generated with a device.

  4. As initial evaluations employed a six-axis manipulator, orientation of the end-effector for Cartesian mode was not possible. Hence, evaluation of this mode of control is not included.

  5. Control using pre-taught positions was employed, allowing more to be ascertained from the evaluation in terms of user interaction.

  6. Examples of interaction errors are intending to select a specific command from the menu but accidentally selecting another, or missing a command from a scanning system.


Acknowledgment

This programme was funded by the charity ASPIRE and the National Advisory Body. European students working on the project were supported under the ERASMUS and SOCRATES programmes.

Author information


Corresponding author

Correspondence to A. S. White.

About this article

Cite this article

Gerlich, L., Parsons, B.N., White, A.S. et al. Gesture recognition for control of rehabilitation robots. Cogn Tech Work 9, 189–207 (2007). https://doi.org/10.1007/s10111-007-0062-3
