DOI: 10.1109/ICRA48506.2021.9561688

UAV Target-Selection: 3D Pointing Interface System for Large-Scale Environment

Published: 30 May 2021

Abstract

This paper presents a 3D pointing interface for signaling a UAV's target in a large-scale environment. The system enables a UAV equipped with a monocular camera to determine which window of a building a human user has selected, in large-scale indoor or outdoor environments. The 3D pointing interface consists of three parts: YOLO, OpenPose, and ORB-SLAM. YOLO detects the target objects, e.g., windows; OpenPose extracts the user's pose; and ORB-SLAM builds a scale-dependent 3D map, a set of sparse 3D feature points. To recover the metric scale, the system performs a calibration step with the user standing in front of the UAV at a known distance. We detail how we chose the gesture, how objects are localized and detected, and how points are transformed between coordinate systems. Real-world experiments showed that the 3D pointing interface achieved an average F1-score of 0.73, and an F1-score of 0.58 at the maximum distance of 25 meters between the UAV and the building.
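
To make the pipeline concrete, the following is a minimal geometric sketch, in plain NumPy, of three steps the abstract describes: fixing the metric scale of a monocular SLAM map from a known user-to-UAV calibration distance, mapping a point from the camera frame into the world frame given a SLAM pose, and selecting the window whose 3D center lies closest to a shoulder-to-wrist pointing ray. Every name and coordinate here is an illustrative stand-in, not the authors' implementation: the real system takes its keypoints from OpenPose, its window detections from YOLO, and its map and camera poses from ORB-SLAM.

    import numpy as np

    def metric_scale(known_distance_m, slam_distance):
        # Calibration: the user stands a known distance from the UAV; the
        # ratio between that distance and the same distance measured in
        # SLAM units fixes the scale of the whole monocular map.
        return known_distance_m / slam_distance

    def to_world(R, t, p_cam):
        # Standard rigid-body change of frame: map a 3D point from the
        # camera frame into the world/map frame, given a camera pose (R, t).
        return R @ p_cam + t

    def select_window(shoulder, wrist, window_centers):
        # Anchor a ray at the shoulder, direct it through the wrist (a
        # common approximation of an arm-pointing gesture), and pick the
        # window whose 3D center lies closest to that ray.
        d = wrist - shoulder
        d = d / np.linalg.norm(d)
        offsets = window_centers - shoulder      # (N, 3) vectors to candidates
        along = offsets @ d                      # signed distance along the ray
        nearest = shoulder + np.outer(along, d)  # closest ray point per window
        dist = np.linalg.norm(window_centers - nearest, axis=1)
        dist[along < 0] = np.inf                 # ignore windows behind the user
        return int(np.argmin(dist))

    # Illustrative numbers only: a 2 m ground-truth calibration distance
    # measured as 0.5 SLAM units gives a scale factor of 4.
    scale = metric_scale(2.0, 0.5)
    shoulder = np.array([0.0, 1.5, 0.0])
    wrist = np.array([0.4, 1.6, 0.9])
    windows = scale * np.array([[0.75, 0.625, 1.75],
                                [1.25, 0.625, 1.75],
                                [1.75, 0.625, 1.75]])  # map units -> meters
    print(select_window(shoulder, wrist, windows))     # -> 0

The shoulder-to-wrist ray is one common choice for pointing estimation; eye-to-fingertip rays are another, and which pair of keypoints works best at 25 m range is exactly the kind of design decision the paper's gesture-selection discussion addresses.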

Published In

2021 IEEE International Conference on Robotics and Automation (ICRA)
May 2021
9777 pages

Publisher

IEEE Press
