
Dual-mode tongue drive system: using speech and tongue motion to improve computer access for people with disabilities

Published: 23 October 2012 · DOI: 10.1145/2448096.2448102

Abstract

In this paper, we present a new wireless and wearable assistive technology called the dual-mode Tongue Drive System (dTDS), which is designed to allow people with severe disabilities to use computers more effectively, with increased speed, flexibility, usability, and independence, through their tongue motion and speech. The dTDS detects users' tongue motion using a magnetic tracer and an array of magnetic sensors embedded in a compact, ergonomic, and stylish wireless headset. It also captures users' voice wirelessly using a small microphone on the same headset in a highly integrated fashion. Preliminary evaluation results based on 14 able-bodied subjects indicate that the dTDS headset, combined with commercially available speech recognition software, can provide end users with significantly higher performance than either unimodal method, based on tongue motion or speech alone, particularly in completing tasks that require both pointing and text entry.
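The abstract describes routing two independent input streams to one computer: discrete tongue-motion commands for pointing and recognized speech for text entry. The Python sketch below illustrates that dispatch pattern in the abstract's terms. It is purely illustrative and not taken from the paper; TongueCommand, SpeechResult, cursor, and keyboard are hypothetical stand-ins for the sensor decoder, the speech recognizer, and the host operating system's input interfaces.

    # Illustrative sketch of a dual-mode input dispatch loop (not the
    # authors' implementation). Tongue-motion commands move the cursor;
    # speech-recognition results are typed as text.
    from dataclasses import dataclass
    from typing import Iterable, Union

    @dataclass
    class TongueCommand:
        # Discrete pointing command decoded from the magnetic sensor array
        dx: int            # horizontal cursor step, in pixels
        dy: int            # vertical cursor step, in pixels
        click: bool = False

    @dataclass
    class SpeechResult:
        # Utterance recognized from the headset microphone
        text: str

    def dispatch(events: Iterable[Union[TongueCommand, SpeechResult]],
                 cursor, keyboard) -> None:
        # `cursor` and `keyboard` are hypothetical OS input interfaces.
        for event in events:
            if isinstance(event, TongueCommand):
                cursor.move_by(event.dx, event.dy)   # pointing mode
                if event.click:
                    cursor.click()
            elif isinstance(event, SpeechResult):
                keyboard.type_text(event.text)       # text-entry mode

Keeping the two modalities as separate event types lets each handle the interaction it is best suited for, which matches the evaluation's finding that the combination outperforms either modality alone on mixed pointing-and-typing tasks.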




      Published In

      WH '12: Proceedings of the conference on Wireless Health
      October 2012
      117 pages
      ISBN:9781450317603
      DOI:10.1145/2448096

      Sponsors

      • WLSA: Wireless-Life Sciences Alliance


      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Author Tags

      1. assistive technologies
      2. computer access
      3. disability
      4. speech recognition
      5. tongue motion
      6. wireless


      Conference

WH '12: Wireless Health 2012
October 23 - 25, 2012
San Diego, California

      Acceptance Rates

      Overall Acceptance Rate 35 of 139 submissions, 25%


      Cited By

• (2023) TOFI: Designing Intraoral Computer Interfaces for Gamified Myofunctional Therapy. Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems, 1-8. DOI: 10.1145/3544549.3573848
• (2022) MyMove: Facilitating Older Adults to Collect In-Situ Activity Labels on a Smartwatch with Speech. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, 1-21. DOI: 10.1145/3491102.3517457
• (2018) Exploring Accessible Smartwatch Interactions for People with Upper Body Motor Impairments. Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, 1-12. DOI: 10.1145/3173574.3174062
• (2015) Personalized, Wearable Control of a Head-mounted Display for Users with Upper Body Motor Impairments. Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, 221-230. DOI: 10.1145/2702123.2702188
