
iSign: An Architecture for Humanoid Assisted Sign Language Tutoring

Chapter in Intelligent Assistive Robots

Abstract

This paper investigates the role of interaction and communication kinesics in human-robot interaction. It is based on a project on Sign Language (SL) tutoring through interaction games with humanoid robots. The aim of the study is to design a computational framework that motivates children with communication problems (i.e., ASD and hearing impairments) to understand and imitate the signs performed by the robot, using basic upper-torso gestures and sound in a turn-taking manner. The framework consists of modular computational components that endow the robot with the capability of perceiving the children's actions, carrying out a game or storytelling task, and tutoring the children in any desired mode, i.e., supervised or semi-supervised. Visual (colored cards), vocal (storytelling, music), touch (tactile sensors on the robot), and motion (recognition and performance of gestures, including signs) cues are proposed for multimodal communication between the robot, the child, and the therapist or parent. We present an empirical and exploratory study investigating the effect of basic non-verbal gestures, consisting of hand movements and body and face gestures, expressed by a humanoid robot; having comprehended the word, the child gives relevant feedback to the robot in SL or visually, according to the context of the game.
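The turn-taking interaction the abstract describes can be sketched in code. This is a minimal, hypothetical illustration of one tutoring round, not the paper's implementation: all class and method names (`Cue`, `TutoringSession`, `run_turn`) are invented for illustration, and the perception module is stubbed out.

```python
# Hypothetical sketch of the modular turn-taking loop: the robot performs
# a sign, a pluggable perception module reads the child's multimodal
# response, and the turn is scored for feedback. Names are illustrative.
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Cue:
    modality: str   # "visual", "vocal", "touch", or "motion"
    payload: str    # e.g. a colored-card label or a recognized sign name


class TutoringSession:
    """One turn-taking round: the robot signs a word, then waits for the
    child's response and checks it against the demonstrated word."""

    def __init__(self, vocabulary: List[str], perceive: Callable[[], Cue]):
        self.vocabulary = vocabulary
        self.perceive = perceive  # swappable perception component

    def run_turn(self, word: str) -> bool:
        # 1. Robot demonstrates the sign (placeholder for motion playback).
        print(f"robot signs: {word}")
        # 2. Perception module returns the child's response cue.
        response = self.perceive()
        # 3. Feedback: the turn succeeds if the child shows the matching
        #    card (visual cue) or imitates the matching sign (motion cue).
        return response.payload == word


# Example with a stubbed perception module that always returns one card.
session = TutoringSession(
    vocabulary=["ball", "flower"],
    perceive=lambda: Cue(modality="visual", payload="ball"),
)
print(session.run_turn("ball"))    # → True
print(session.run_turn("flower"))  # → False
```

In the framework described above, the stubbed `perceive` callable would be replaced by the actual gesture- or card-recognition component, and the boolean result would drive the robot's verbal or gestural feedback to the child.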


Author information

Corresponding author

Correspondence to Hatice Kose.

Copyright information

© 2015 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Kose, H. et al. (2015). iSign: An Architecture for Humanoid Assisted Sign Language Tutoring. In: Mohammed, S., Moreno, J., Kong, K., Amirat, Y. (eds) Intelligent Assistive Robots. Springer Tracts in Advanced Robotics, vol 106. Springer, Cham. https://doi.org/10.1007/978-3-319-12922-8_6

  • DOI: https://doi.org/10.1007/978-3-319-12922-8_6

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-12921-1

  • Online ISBN: 978-3-319-12922-8

  • eBook Packages: Engineering (R0)
