Abstract
Tactile sensing provides essential information about the state of the world that a robotic system needs to perform successful and robust manipulation. Integrating and exploiting tactile sensation enables robotic systems to perform a wider variety of manipulation tasks in unstructured environments than purely vision-based systems. While slip detection and grip force control have been the focus of much research, the dynamic behaviour of tactile signals conditioned on robot actions has not yet been sufficiently explored. Such an analysis can provide a tactile plant model usable both for control and for slip prediction from tactile signals. In this paper, we present a data-driven approach to learning an efficient tactile dynamics model with different tactile data representations. Evaluation of the trained models shows that action-conditioned tactile behaviour can be predicted over a time horizon long enough in the future to be useful for robot motion control.
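To make the idea of an action-conditioned tactile dynamics model concrete, the following is a minimal sketch, not the authors' implementation: a recurrent network in PyTorch that consumes past tactile readings together with robot actions and rolls forward over a horizon of planned actions to predict future tactile states. The taxel dimension, action dimension, layer sizes, and the choice of an LSTM are assumptions made purely for illustration.

```python
# Sketch of an action-conditioned tactile dynamics model (illustrative only).
# Given past tactile readings and a sequence of planned robot actions, the model
# predicts the resulting future tactile readings over a chosen horizon.
import torch
import torch.nn as nn


class TactileDynamicsModel(nn.Module):
    def __init__(self, tactile_dim=19, action_dim=6, hidden_dim=64):
        super().__init__()
        # Recurrent core over concatenated (tactile, action) inputs.
        self.rnn = nn.LSTM(tactile_dim + action_dim, hidden_dim, batch_first=True)
        # Decoder maps the hidden state back to a predicted tactile reading.
        self.decoder = nn.Linear(hidden_dim, tactile_dim)

    def forward(self, tactile_seq, action_seq):
        # tactile_seq: (batch, T, tactile_dim), action_seq: (batch, T, action_dim)
        x = torch.cat([tactile_seq, action_seq], dim=-1)
        h, _ = self.rnn(x)
        return self.decoder(h)  # one-step-ahead predictions for each time step

    @torch.no_grad()
    def rollout(self, tactile_0, future_actions):
        # Autoregressive multi-step prediction conditioned on planned actions.
        # tactile_0: (batch, tactile_dim), future_actions: (batch, H, action_dim)
        preds, state, t = [], None, tactile_0
        for k in range(future_actions.shape[1]):
            x = torch.cat([t, future_actions[:, k]], dim=-1).unsqueeze(1)
            h, state = self.rnn(x, state)
            t = self.decoder(h[:, -1])
            preds.append(t)
        return torch.stack(preds, dim=1)  # (batch, H, tactile_dim)


if __name__ == "__main__":
    model = TactileDynamicsModel()
    tactile = torch.randn(8, 20, 19)   # 19 taxel values per frame (assumed)
    actions = torch.randn(8, 20, 6)    # e.g. end-effector twist commands (assumed)
    preds = model(tactile, actions)[:, :-1]          # predictions for steps 1..T-1
    loss = nn.functional.mse_loss(preds, tactile[:, 1:])
    loss.backward()
    horizon_preds = model.rollout(tactile[:, -1], torch.randn(8, 10, 6))
    print(horizon_preds.shape)         # torch.Size([8, 10, 19])
```

With such a plant model, a controller or slip predictor can query the rollout for candidate action sequences and select the one whose predicted tactile trajectory best matches a desired contact state; the training and evaluation details in the paper itself may differ.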
Supported by Cancer Research UK.
Copyright information
© 2021 Springer Nature Switzerland AG
About this paper
Cite this paper
Nazari, K., Mandill, W., Hanheide, M., Esfahani, A.G. (2021). Tactile Dynamic Behaviour Prediction Based on Robot Action. In: Fox, C., Gao, J., Ghalamzan Esfahani, A., Saaj, M., Hanheide, M., Parsons, S. (eds) Towards Autonomous Robotic Systems. TAROS 2021. Lecture Notes in Computer Science, vol. 13054. Springer, Cham. https://doi.org/10.1007/978-3-030-89177-0_29
DOI: https://doi.org/10.1007/978-3-030-89177-0_29
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-89176-3
Online ISBN: 978-3-030-89177-0