DOI: 10.1145/2737095.2737109
Research article

Tongue-n-cheek: non-contact tongue gesture recognition

Published: 13 April 2015

Abstract

Tongue gestures are a key modality for augmentative and alternative communication for patients with speech impairments or full-body paralysis. Existing systems for recognizing tongue gestures, however, are highly intrusive: they either rely on magnetic sensors built into dentures or artificial teeth deployed inside a patient's mouth, or require contact with the skin through electromyography (EMG) sensors. Sensors deployed inside a patient's mouth can be uncomfortable for long-term use, and contact-based sensors such as EMG electrodes can cause skin abrasion. To address this problem, we present a novel contactless sensor, called Tongue-n-Cheek, that captures tongue gestures using an array of micro-radars. The array of micro-radars acts as a set of proximity sensors and captures muscle movements when the patient performs a tongue gesture. Tongue-n-Cheek converts these movements into gestures using a novel signal processing algorithm. We demonstrate the efficacy of Tongue-n-Cheek and show that our system can reliably infer gestures with 95% accuracy and low latency.
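The abstract does not spell out the signal processing algorithm, but the pipeline it describes (an array of micro-radars acting as proximity sensors, whose movement signals are converted into a gesture decision) can be sketched roughly as follows. This is an illustrative assumption, not the authors' actual method: the energy-based decision rule, the threshold value, and the function name are all hypothetical.

```python
import numpy as np

def classify_gesture(envelopes, threshold=0.2):
    """Map a window of radar amplitude envelopes to a gesture label.

    envelopes: array of shape (n_radars, n_samples), one amplitude
    envelope per micro-radar in the array.
    Returns the index of the radar channel with the strongest movement
    energy (a stand-in for "which cheek region moved"), or None if no
    channel exceeds the detection threshold.
    """
    # Remove each channel's DC offset (static reflections from the face).
    centered = envelopes - envelopes.mean(axis=1, keepdims=True)
    # RMS energy per radar channel over the gesture window.
    energy = np.sqrt((centered ** 2).mean(axis=1))
    best = int(np.argmax(energy))
    if energy[best] < threshold:
        return None  # no gesture detected in this window
    return best

# Synthetic example: 3 radars, 100 samples; radar 1 sees movement.
rng = np.random.default_rng(0)
win = rng.normal(0.0, 0.01, size=(3, 100))
win[1] += 0.5 * np.sin(np.linspace(0, 4 * np.pi, 100))
print(classify_gesture(win))  # -> 1
```

A per-channel energy comparison like this is only a plausible first stage; the paper's actual algorithm would additionally need to distinguish gesture types and reject non-gesture motion (e.g., chewing or head movement).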




Published In

IPSN '15: Proceedings of the 14th International Conference on Information Processing in Sensor Networks
April 2015, 430 pages
ISBN: 9781450334754
DOI: 10.1145/2737095

Publisher

Association for Computing Machinery, New York, NY, United States



    Author Tags

    1. hardware implementation
    2. micro-radars
    3. paralysis patients
    4. sensor design
    5. tongue gestures


    Acceptance Rates

    Overall Acceptance Rate 143 of 593 submissions, 24%


Cited By

• Whispering Wearables: Multimodal Approach to Silent Speech Recognition with Head-Worn Devices. Proceedings of the 26th International Conference on Multimodal Interaction, pages 214-223. DOI: 10.1145/3678957.3685720. 4 Nov 2024.
• "Can It Be Customized According to My Motor Abilities?": Toward Designing User-Defined Head Gestures for People with Dystonia. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, pages 1-11. DOI: 10.1145/3613904.3642378. 11 May 2024.
• Tongue-Jaw Movement Recognition Through Acoustic Sensing on Smartphones. IEEE Transactions on Mobile Computing, 23(1):879-894. DOI: 10.1109/TMC.2022.3222594. Jan 2024.
• TongueMendous: IR-Based Tongue-Gesture Interface with Tiny Machine Learning. Proceedings of the 8th International Workshop on Sensor-Based Activity Recognition and Artificial Intelligence, pages 1-8. DOI: 10.1145/3615834.3615843. 21 Sep 2023.
• TOFI: Designing Intraoral Computer Interfaces for Gamified Myofunctional Therapy. Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems, pages 1-8. DOI: 10.1145/3544549.3573848. 19 Apr 2023.
• Flexible gesture input with radars: systematic literature review and taxonomy of radar sensing integration in ambient intelligence environments. Journal of Ambient Intelligence and Humanized Computing, 14(6):7967-7981. DOI: 10.1007/s12652-023-04606-9. 10 Apr 2023.
• Exploring user-defined gestures for lingual and palatal interaction. Journal on Multimodal User Interfaces, 17(3):167-185. DOI: 10.1007/s12193-023-00408-7. 10 Aug 2023.
• Radar-based monitoring system for medication tampering using data augmentation and multivariate time series classification. Smart Health, 23:100245. DOI: 10.1016/j.smhl.2021.100245. Mar 2022.
• Gesture-Radar: A Dual Doppler Radar Based System for Robust Recognition and Quantitative Profiling of Human Gestures. IEEE Transactions on Human-Machine Systems, 51(1):32-43. DOI: 10.1109/THMS.2020.3036637. Feb 2021.
• CanalScan: Tongue-Jaw Movement Recognition via Ear Canal Deformation Sensing. IEEE INFOCOM 2021 - IEEE Conference on Computer Communications, pages 1-10. DOI: 10.1109/INFOCOM42981.2021.9488852. 10 May 2021.
