DOI: 10.1109/ROBIO.2018.8665228
Research Article

Visionless Tele-Exploration of 3D Moving Objects

Published: 12 December 2018

Abstract

This paper presents methods for improved teleoperation in dynamic environments in which the objects to be manipulated are moving, but vision sensing may not meet size, biocompatibility, or maneuverability requirements. In such situations, the object can instead be tracked by non-geometric means, such as heat, radioactivity, or other markers. To safely explore a region, we use an optical time-of-flight pretouch sensor to detect (and range) target objects prior to contact. Information from these sensors is presented to the user via haptic virtual fixtures. This combination of techniques allows the teleoperator to “feel” the object without any actual contact between the robot and the target. It thus provides the perceptual benefits of touch interaction without the negative consequences of the robot contacting unknown geometry: premature contact can damage or displace the target. The authors propose that as the scene geometry transitions from completely unknown to partially explored, haptic virtual fixtures can both prevent collisions and guide the user toward areas of interest, thereby improving exploration speed. Experimental results show that for situations not amenable to vision, haptically presented pretouch sensor information allows operators to explore moving objects more effectively.
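
As a rough illustration of the idea (not the authors' implementation), the Python sketch below converts a single time-of-flight pretouch range reading into a haptic virtual-fixture force: a repulsive forbidden-region term that activates before the probe reaches the sensed surface, plus a weak guidance term pulling the operator toward an unexplored area of interest. All function names, gains, and thresholds here are hypothetical.

    # Illustrative sketch of a pretouch-driven virtual fixture.
    # Parameters and names are assumptions, not the paper's values.
    import numpy as np

    SAFETY_RADIUS = 0.02   # m: forbidden-region boundary around the sensed surface
    STIFFNESS = 400.0      # N/m: spring stiffness of the forbidden-region fixture
    GUIDE_GAIN = 5.0       # N/m: weak attraction toward an unexplored region
    MAX_FORCE = 4.0        # N: saturation limit for haptic-device safety

    def fixture_force(tool_pos, sensor_dir, tof_range, goal_pos):
        """Haptic force computed from one pretouch reading.

        tool_pos   -- 3-vector, probe tip position (m)
        sensor_dir -- unit 3-vector, sensing axis pointing away from the tool
        tof_range  -- time-of-flight distance to the target surface (m),
                      or np.inf when nothing is detected
        goal_pos   -- 3-vector, center of the next unexplored region (m)
        """
        force = np.zeros(3)

        # Forbidden-region term: once the sensed surface falls inside the
        # safety radius, push back along the sensing axis so the fixture
        # stops the operator before any physical contact occurs.
        if np.isfinite(tof_range) and tof_range < SAFETY_RADIUS:
            penetration = SAFETY_RADIUS - tof_range
            force -= STIFFNESS * penetration * sensor_dir

        # Guidance term: gently pull toward the area of interest, so that
        # exploration speeds up as the scene becomes partially known.
        force += GUIDE_GAIN * (goal_pos - tool_pos)

        # Saturate to respect device force limits.
        norm = np.linalg.norm(force)
        if norm > MAX_FORCE:
            force *= MAX_FORCE / norm
        return force

    # Example: surface detected 1.5 cm ahead along +x; goal off to the side.
    f = fixture_force(np.zeros(3), np.array([1.0, 0.0, 0.0]),
                      0.015, np.array([0.0, 0.1, 0.0]))

Because the range reading updates continuously, the forbidden-region boundary tracks a moving target, which is what lets the fixture both prevent premature contact and steer exploration.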


Published In

2018 IEEE International Conference on Robotics and Biomimetics (ROBIO), Dec 2018, 5858 pages.
Publisher: IEEE Press