
GelSplitter: Tactile Reconstruction from Near Infrared and Visible Images

  • Conference paper
  • First Online:
Intelligent Robotics and Applications (ICIRA 2023)

Abstract

The GelSight-like visual tactile (VT) sensor has gained popularity as a high-resolution tactile sensing technology for robots, capable of measuring touch geometry using a single RGB camera. However, the development of multi-modal perception for VT sensors remains a challenge, limited by the single camera. In this paper, we propose GelSplitter, a new framework that realizes a multi-modal VT sensor with synchronized multi-modal cameras, more closely resembling a human tactile receptor. Furthermore, we focus on 3D tactile reconstruction and implement a compact sensor structure that maintains a size comparable to state-of-the-art VT sensors, even with the addition of a prism and a near-infrared (NIR) camera. We also design a photometric fusion stereo neural network (PFSNN), which estimates the surface normals of objects and reconstructs touch geometry from both infrared and visible images. Our results demonstrate that the accuracy of RGB and NIR fusion is higher than that of RGB images alone. Additionally, our GelSplitter framework allows for flexible configurations of different camera sensor combinations, such as RGB and thermal imaging.
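As a rough illustration of the pipeline described in the abstract (a minimal sketch, not the authors' PFSNN implementation), the following Python example fuses an RGB image with an NIR image using a small convolutional network that predicts per-pixel surface normals, then integrates the normals into a depth map with Frankot-Chellappa integration. The network architecture, layer sizes, and names are placeholder assumptions.

# Minimal sketch (assumption, not the published PFSNN): RGB + NIR early fusion
# -> per-pixel surface normals -> depth via Frankot-Chellappa integration.
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class FusionNormalNet(nn.Module):
    """Toy encoder: 4-channel (RGB + NIR) input -> unit surface-normal map."""
    def __init__(self, ch: int = 32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(4, ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(ch, ch, 3, padding=1), nn.ReLU(inplace=True),
        )
        self.head = nn.Conv2d(ch, 3, 3, padding=1)  # outputs (nx, ny, nz)

    def forward(self, rgb: torch.Tensor, nir: torch.Tensor) -> torch.Tensor:
        x = torch.cat([rgb, nir], dim=1)       # early fusion of the two modalities
        n = self.head(self.encoder(x))
        return F.normalize(n, dim=1)           # constrain to unit normals

def normals_to_depth(normals: torch.Tensor) -> torch.Tensor:
    """Integrate a (3, H, W) normal map to depth (Frankot-Chellappa)."""
    nx, ny, nz = normals[0], normals[1], normals[2].clamp(min=1e-6)
    p, q = -nx / nz, -ny / nz                  # surface gradients dz/dx, dz/dy
    H, W = p.shape
    wy = torch.fft.fftfreq(H).reshape(-1, 1) * 2 * math.pi
    wx = torch.fft.fftfreq(W).reshape(1, -1) * 2 * math.pi
    denom = wx ** 2 + wy ** 2
    denom[0, 0] = 1.0                          # avoid division by zero at the DC term
    Z = (-1j * (wx * torch.fft.fft2(p)) - 1j * (wy * torch.fft.fft2(q))) / denom
    return torch.fft.ifft2(Z).real

if __name__ == "__main__":
    rgb = torch.rand(1, 3, 64, 64)             # placeholder visible image
    nir = torch.rand(1, 1, 64, 64)             # placeholder near-infrared image
    normals = FusionNormalNet()(rgb, nir)[0]
    depth = normals_to_depth(normals)
    print(normals.shape, depth.shape)          # (3, 64, 64) and (64, 64)

In practice such a network would be trained on sensor images paired with ground-truth normals, and the fusion of the NIR channel with the RGB channels is what the paper reports as improving reconstruction accuracy over RGB alone.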



Acknowledgments

This work was supported by the Guangdong University Engineering Technology Research Center for Precision Components of Intelligent Terminal of Transportation Tools (Project No. 2021GCZX002), and Guangdong HUST Industrial Technology Research Institute, Guangdong Provincial Key Laboratory of Digital Manufacturing Equipment.

Author information

Corresponding author

Correspondence to Hua Yang.



Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Lin, Y. et al. (2023). GelSplitter: Tactile Reconstruction from Near Infrared and Visible Images. In: Yang, H., et al. (eds.) Intelligent Robotics and Applications. ICIRA 2023. Lecture Notes in Computer Science, vol. 14273. Springer, Singapore. https://doi.org/10.1007/978-981-99-6498-7_2


  • DOI: https://doi.org/10.1007/978-981-99-6498-7_2

  • Published:

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-99-6497-0

  • Online ISBN: 978-981-99-6498-7

  • eBook Packages: Computer Science (R0)
