Abstract
Hand pose estimation (HPE) is a key technology for many Internet of Things (IoT) applications, such as sign language recognition, smart healthcare, and augmented/virtual reality experiences. Existing HPE methods based on visual or wearable sensing have limitations in accuracy, robustness, privacy, and user-friendliness. To address these limitations, this paper proposes a novel HPE approach built on an efficient sensing network comprising an inertial measurement unit (IMU) and five tensile sensors integrated into a glove prototype. Guided by a kinematic analysis of the human hand, the network is designed to capture the essential features of hand movement. To address the scarcity of publicly available datasets, we devise a technique for synthesizing IMU and tensile sensor data from video sources. A decoupled neural model combining a transformer and an autoencoder is developed to map the sparse sensor data to a complete hand pose. Our method not only achieves superior accuracy and robustness but also preserves user privacy and improves usability, offering a holistic wearable-sensor solution for HPE and broadening the scope of IoT implementations.
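To make the described architecture concrete, the sketch below shows one plausible reading of the decoupled design: a transformer encoder over a window of sparse glove readings (an assumed 6-channel IMU plus five tensile channels) pooled into a latent code, followed by the decoder half of a pose autoencoder that maps the code to a full 21-joint hand pose. The channel counts, window length, joint layout, and all layer sizes are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the paper's implementation): transformer encoder over
# sparse glove sensor windows + autoencoder-style decoder to a full hand pose.
# Sensor dimensionality, window length, and joint count are assumptions.
import torch
import torch.nn as nn


class SensorTransformer(nn.Module):
    """Encodes a window of sparse sensor frames into a latent pose code."""

    def __init__(self, sensor_dim=11, d_model=64, latent_dim=32, n_layers=2):
        super().__init__()
        self.embed = nn.Linear(sensor_dim, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=4, dim_feedforward=128, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.to_latent = nn.Linear(d_model, latent_dim)

    def forward(self, x):                      # x: (batch, window, sensor_dim)
        h = self.encoder(self.embed(x))        # (batch, window, d_model)
        return self.to_latent(h.mean(dim=1))   # temporal pooling -> (batch, latent_dim)


class PoseDecoder(nn.Module):
    """Decoder half of a pose autoencoder: latent code -> 21 joints x 3 coords."""

    def __init__(self, latent_dim=32, n_joints=21):
        super().__init__()
        self.n_joints = n_joints
        self.net = nn.Sequential(
            nn.Linear(latent_dim, 128), nn.ReLU(),
            nn.Linear(128, n_joints * 3),
        )

    def forward(self, z):
        return self.net(z).view(-1, self.n_joints, 3)


if __name__ == "__main__":
    # 6-channel IMU + 5 tensile sensors = 11 channels, 30-frame window (assumed).
    sensors = torch.randn(8, 30, 11)
    encoder, decoder = SensorTransformer(), PoseDecoder()
    pose = decoder(encoder(sensors))
    print(pose.shape)  # torch.Size([8, 21, 3])
```

In this reading, "decoupled" is interpreted as training the pose decoder separately (as part of an autoencoder over hand poses) from the sensor encoder; this interpretation is inferred from the abstract only.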
Acknowledgments
This work was supported by the High-Performance Computing Platform of BUPT.
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Zhou, G., Liu, Y., Xiao, C., Yu, H. (2024). Efficient Sensing Network and Decoupled Neural Model for Hand Pose Estimation. In: Huang, DS., Zhang, C., Guo, J. (eds) Advanced Intelligent Computing Technology and Applications. ICIC 2024. Lecture Notes in Computer Science, vol 14871. Springer, Singapore. https://doi.org/10.1007/978-981-97-5609-4_21
DOI: https://doi.org/10.1007/978-981-97-5609-4_21
Publisher Name: Springer, Singapore
Print ISBN: 978-981-97-5608-7
Online ISBN: 978-981-97-5609-4
eBook Packages: Computer Science, Computer Science (R0)