
WR-Hand: Wearable Armband Can Track User's Hand

Published: 14 September 2021

Abstract

This paper presents WR-Hand, a wearable system that tracks the 3D pose of 14 hand skeleton points over time using Electromyography (EMG) and gyroscope data from a commercial armband. The system marks a significant leap in wearable sensing and opens up new applications in medical care, human-computer interaction, and beyond. One challenge is that the armband's EMG sensors inevitably collect mixed EMG signals from multiple forearm muscles, because the sensor positions on the device are fixed, whereas prior bio-medical models for hand pose tracking assume isolated EMG inputs measured at separate forearm spots for different muscles. In this paper, we leverage the recent success of neural networks to enhance the existing bio-medical model so that it works with the armband's EMG data, and we visualize our design to understand why the solution is effective. Moreover, we propose techniques to place the constructed hand pose reliably in a global coordinate system, and we address two practical issues: providing a general plug-and-play version that requires no training for new users, and compensating for differences in where users wear their armbands on the forearm. We implement a prototype with different commercial armbands that is lightweight enough to run on a user's phone in real time. Extensive evaluation shows the efficacy of the WR-Hand design.
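The abstract describes a two-stage pipeline: windowed features from the armband's mixed-muscle EMG channels feed a learned regressor that outputs hand skeleton points, and gyroscope data then places that wrist-relative pose in a global coordinate system. The following is a minimal sketch of those two stages, not the authors' implementation; the channel count, window feature (RMS, a common sEMG amplitude feature), and single-axis yaw rotation are all simplifying assumptions for illustration.

```python
import math

EMG_CHANNELS = 8        # assumed Myo-style armband with 8 EMG sensors
SKELETON_POINTS = 14    # hand skeleton points tracked by WR-Hand

def rms_features(window):
    """Per-channel root-mean-square over one EMG window.

    window: list of samples, each a list of EMG_CHANNELS readings.
    Returns one RMS amplitude per channel; in the real system such
    features (or raw windows) would feed a neural-network regressor
    producing SKELETON_POINTS 3D joint positions.
    """
    n = len(window)
    return [math.sqrt(sum(s[ch] ** 2 for s in window) / n)
            for ch in range(EMG_CHANNELS)]

def place_in_global_frame(point_wrist, yaw_rad):
    """Rotate a wrist-frame (x, y, z) point into a global frame using a
    forearm yaw angle integrated from the gyroscope (simplified here to
    a single vertical-axis rotation)."""
    x, y, z = point_wrist
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    return (c * x - s * y, s * x + c * y, z)
```

For example, a constant 50-sample window of amplitude 2.0 on every channel yields an RMS of 2.0 per channel, and a fingertip at (1, 0, 0) in the wrist frame rotated by a 90-degree forearm yaw lands near (0, 1, 0) in the global frame.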



Published In

Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, Volume 5, Issue 3
September 2021, 1443 pages
EISSN:2474-9567
DOI:10.1145/3486621

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 14 September 2021
Published in IMWUT Volume 5, Issue 3


Author Tags

  1. Deep Learning
  2. Human Hand Pose Construction
  3. Mobile Sensing

Qualifiers

  • Research-article
  • Research
  • Refereed

Funding Sources

  • Research Grants Council of Hong Kong

Article Metrics

  • Downloads (Last 12 months)171
  • Downloads (Last 6 weeks)14
Reflects downloads up to 13 Jan 2025


Cited By

  • (2024) Lipwatch: Enabling Silent Speech Recognition on Smartwatches using Acoustic Sensing. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 8, 2, 1-29. DOI: 10.1145/3659614
  • (2024) Sensing to Hear through Memory. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 8, 2, 1-31. DOI: 10.1145/3659598
  • (2024) UHead. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 8, 1, 1-28. DOI: 10.1145/3643551
  • (2024) UFace. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 8, 1, 1-27. DOI: 10.1145/3643546
  • (2024) EVLeSen: In-Vehicle Sensing with EV-Leaked Signal. Proc. of ACM MobiCom, 679-693. DOI: 10.1145/3636534.3649389
  • (2024) Water Salinity Sensing with UAV-Mounted IR-UWB Radar. ACM Trans. Sens. Netw. 20, 4, 1-37. DOI: 10.1145/3633515
  • (2024) Wi-Cyclops: Room-Scale WiFi Sensing System for Respiration Detection Based on Single-Antenna. ACM Trans. Sens. Netw. 20, 4, 1-24. DOI: 10.1145/3632958
  • (2024) FSS-Tag. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 7, 4, 1-24. DOI: 10.1145/3631457
  • (2024) LiqDetector. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 7, 4, 1-24. DOI: 10.1145/3631443
  • (2024) EchoWrist: Continuous Hand Pose Tracking and Hand-Object Interaction Recognition Using Low-Power Active Acoustic Sensing On a Wristband. Proc. of ACM CHI, 1-21. DOI: 10.1145/3613904.3642910
