A High-Precision Hand–Eye Coordination Localization Method under Convex Relaxation Optimization
Abstract
1. Introduction
2. Coordinated Hand–Eye Reverse Gate Operation
2.1. Eye-in-Hand System Structure Design
- (1)
- A six-degree-of-freedom robotic arm provides highly accurate motion control, ensuring the precision of the reverse gate operation and guarding against operation failure and equipment damage. In narrow or difficult-to-access spaces, the arm can flexibly maneuver the reversing lever to open or close the gate blade without requiring the operator to enter the confined or hazardous area.
- (2)
- A depth camera, commonly based on infrared or laser sensing, captures the three-dimensional information of a scene, allowing the distance and shape of objects to be determined more accurately. Its real-time video stream can be used to monitor the progress of the reversing operation and the status of the equipment, helping the operator to pinpoint the location and pose of the reversing equipment for the proper placement of switch levers, handles, or other controls (a minimal sketch showing how a depth reading is back-projected to a 3D point is given after this list).
- (3)
- A purpose-designed gripper jaw provides high-precision control, ensuring the accuracy of the reverse gate operation and reducing the risk of misoperation. It is intended for rotating reverse gate equipment, ensuring that meter buttons are opened or closed correctly, and its rotary motion can be driven by the robotic arm or another device to control the operation precisely. The system adopts the eye-in-hand mounting method, in which the camera is fixed on the end-effector gripper. This arrangement is relatively flexible: the camera moves with the robot during image acquisition, and its distance to target objects of different sizes can be adjusted to reduce measurement error and improve accuracy.
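To illustrate how the depth camera's measurements are turned into 3D coordinates, the following minimal sketch back-projects a pixel and its depth reading through the standard pinhole camera model. It is not the authors' implementation; the intrinsic parameters (fx, fy, cx, cy) and the pixel/depth values are placeholders.

```python
import numpy as np

# Placeholder intrinsics of the depth camera (assumed values, not from the paper)
fx, fy = 615.0, 615.0   # focal lengths in pixels
cx, cy = 320.0, 240.0   # principal point in pixels

def back_project(u, v, depth):
    """Convert a pixel (u, v) with depth z (metres) to a 3D point in the camera frame."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])

# Example: a pixel near the image centre observed at 0.65 m
print(back_project(330, 250, 0.65))
```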
2.2. Description of the Hand–Eye Calibration Problem
3. Dual Quaternions for Solving Hand–Eye Calibration Problems
3.1. Dual Quaternions
3.2. Solving the Calibration Equation Using Dual Quaternions
4. Convex Relaxation Global Optimization Algorithm for Solving Hand–Eye Calibration Equations
4.1. Convex Relaxation Global Optimization Algorithm
4.2. Convex Relaxation Optimization for Solving Hand–Eye Calibration Equations
5. Experiment and Result Analysis
5.1. Experimental Environment Construction
5.2. Data Acquisition
- (1)
- Manually adjust the robotic arm so that the ArUco code moves to the center of the camera’s field of view. Click ‘check starting pose’; if the check is successful, the interface will display ‘0/17’, indicating that the procedure is ready and can be started.
- (2)
- Click ‘Next Pose’, ‘Plan’, and ‘Execute’ in turn; the robotic arm will move to a new pose. If the ArUco code is within the camera’s field of view and is detected successfully, proceed to the next step (a minimal sketch of the marker detection and pose estimation is given after this list).
- (3)
- Click ‘Take Sample’ in Interface 2. If valid information appears in the Samples dialog box, this indicates that the first point calibration is successful.
- (4)
- Repeat steps 2 and 3 until all 17 points are calibrated.
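Each ‘Take Sample’ step relies on detecting the ArUco marker in the camera image and estimating its pose relative to the camera. The sketch below shows one standard way to do this with OpenCV; it is not the authors' code, the intrinsics, marker size, dictionary, and image file are placeholders, and the aruco API differs between OpenCV versions (this uses the pre-4.7 functional interface).

```python
import cv2
import numpy as np

camera_matrix = np.array([[615.0, 0.0, 320.0],
                          [0.0, 615.0, 240.0],
                          [0.0, 0.0, 1.0]])   # placeholder intrinsics
dist_coeffs = np.zeros(5)                     # assume negligible lens distortion
marker_len = 0.05                             # assumed marker side length in metres

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
frame = cv2.imread("frame.png")               # placeholder: one image from the camera
corners, ids, _ = cv2.aruco.detectMarkers(frame, dictionary)

if ids is not None:
    # 3D corner coordinates of the marker in its own frame (z = 0 plane),
    # ordered to match OpenCV's detected-corner order
    half = marker_len / 2.0
    obj_pts = np.array([[-half,  half, 0], [ half,  half, 0],
                        [ half, -half, 0], [-half, -half, 0]], dtype=np.float32)
    ok, rvec, tvec = cv2.solvePnP(obj_pts, corners[0].reshape(4, 2),
                                  camera_matrix, dist_coeffs)
    if ok:
        print("camera->marker rotation (Rodrigues):", rvec.ravel())
        print("camera->marker translation (m):", tvec.ravel())
else:
    print("Marker not in view; move the arm and try again.")
```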
5.3. Experimental Results and Analysis
6. Conclusions and Perspective
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Liu, Z.; Zhao, X.; Sui, J. PTZ control system of indoor rail inspection robot based on neural network prediction model. Procedia Comput. Sci. 2017, 107, 206–211.
- Nandini, V.; Vishal, R.D.; Prakash, C.A. A review on applications of machine vision systems in industries. Indian J. Sci. Technol. 2016, 9, 1–5.
- Chatterjee, A.; Govindu, V.M. Robust relative rotation averaging. IEEE Trans. Pattern Anal. Mach. Intell. 2017, 40, 958–972.
- Shouyin, L.; Qinglin, Q.; Bin, Z. Development of substation equipment inspection robot. Power Syst. Autom. 2006, 30, 94–98.
- Chen, B.; Pan, B. Camera calibration using synthetic random speckle pattern and digital image correlation. Opt. Lasers Eng. 2020, 126, 105919.
- Chen, X.-A.; Xu, F. Robot localization system based on hand-eye stereo vision. Comput. Appl. 2005, 25 (Suppl. S1), 302–304.
- Karaman, S.; Frazzoli, E. Sampling-based algorithms for optimal motion planning. Int. J. Robot. Res. 2011, 30, 846–894.
- He, Z. Design of Industrial Robot Sorting System Based on Machine Vision; Harbin Institute of Technology: Harbin, China, 2016; pp. 2–3.
- Gao, Y. Design and Implementation of Machine Vision-Based Workpiece Localization Identification Sorting System; Xiamen University: Xiamen, China, 2019.
- Tsai, R.Y.; Lenz, R.K. A new technique for fully autonomous and efficient 3D robotics hand/eye calibration. IEEE Trans. Robot. Autom. 1989, 5, 345–358.
- Shiu, Y.C.; Ahmad, S. Calibration of wrist-mounted robotic sensors by solving homogeneous transform equations of the form AX = XB. IEEE Trans. Robot. Autom. 1989, 5, 16–29.
- Park, F.C.; Martin, B.J. Robot sensor calibration: Solving AX = XB on the Euclidean group. IEEE Trans. Robot. Autom. 1994, 10, 717–721.
- Daniilidis, K. Hand-eye calibration using dual quaternions. Int. J. Robot. Res. 1999, 18, 286–298.
- Cui, H.; Sun, R.; Fang, Z. A novel flexible two-step method for eye-to-hand calibration for robot assembly system. Meas. Control 2020, 53, 2020–2029.
- Andreff, N.; Horaud, R.; Espiau, B. On-line hand-eye calibration. In Proceedings of the Second International Conference on 3-D Digital Imaging and Modeling, Ottawa, ON, Canada, 8 October 1999; pp. 430–436.
- Wang, J.; Wang, T.; Yang, Y. Nonlinear optimal robot hand-eye calibration. J. Xi’an Jiaotong Univ. 2011, 45, 15–20.
- Wang, J.; Duan, F.; Wang, R. Accurate calibration of hand-eye relationship for articulated arm visual inspection system. Comput. Eng. Appl. 2015, 51, 225–229.
- Zhang, Z.; Zhang, X.; Zheng, Z. A robot hand-eye calibration method fusing rotational translation information. J. Instrum. 2015, 36, 2443–2450.
- Wang, L.; Min, H. Dual quaternion hand-eye calibration algorithm based on LMI optimization. Mach. Tool Hydraul. 2021, 49, 8–14.
- Li, W.; Lyu, N.; Dong, M. Robot hand-eye calibration by convex relaxation global optimization. J. Comput. Appl. 2017, 37, 1451.
- Putinar, M. Positive polynomials on compact semi-algebraic sets. Indiana Univ. Math. J. 1993, 42, 969–984.
- Triggs, B. Autocalibration from planar scenes. In Proceedings of the Computer Vision—ECCV’98: 5th European Conference on Computer Vision, Freiburg, Germany, 2–6 June 1998; Springer: Berlin/Heidelberg, Germany, 1998; pp. 89–105.
- Lasserre, J.B. Moments, Positive Polynomials and Their Applications; World Scientific: Singapore, 2009; pp. 81–91.
Groups | Rotation x | Rotation y | Rotation z | Rotation w | Translation x | Translation y | Translation z |
---|---|---|---|---|---|---|---|
1 | −0.24 | 0.4 | 0.55 | 0.05 | 0.04 | −0.01 | 0.65 |
2 | 0.05 | 0.03 | 0.39 | 0.74 | 0.66 | −0.14 | 0.07 |
3 | −0.45 | 0.42 | 0.55 | 0.06 | 0.06 | 0.22 | 0.84 |
4 | −0.04 | 0.11 | 0.44 | 0.7 | 0.59 | −0.11 | 0.39 |
5 | −0.34 | 0.42 | 0.31 | −0.17 | −0.06 | 0.01 | 0.92 |
6 | −0.26 | 0.35 | 0.51 | 0.19 | 0.27 | −0.03 | 0.81 |
7 | −0.08 | 0.15 | 0.45 | −0.66 | −0.64 | 0.12 | 0.39 |
8 | −0.25 | 0.35 | 0.53 | 0.2 | 0.17 | −0.03 | 0.78 |
9 | −0.33 | 0.49 | 0.55 | 0.07 | 0.07 | −0.01 | 0.68 |
10 | 0.02 | 0.05 | 0.41 | 0.72 | 0.65 | −0.13 | 0.19 |
11 | 0.12 | 0.08 | 0.43 | 0.16 | 0.58 | −0.23 | 0.14 |
12 | −0.23 | 0.42 | 0.39 | −0.12 | 0.14 | −0.32 | 0.44 |
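For reference, each tabulated rotation (quaternion x, y, z, w) and translation (x, y, z) can be assembled into a homogeneous transformation matrix of the kind used in the AX = XB calibration equation. The sketch below shows one standard way to do this with NumPy; it is illustrative only, the quaternion is normalised before conversion, and the sample values are taken from group 1 of the table.

```python
import numpy as np

def quat_to_rot(x, y, z, w):
    """Convert an (x, y, z, w) quaternion to a 3x3 rotation matrix (normalised first)."""
    q = np.array([x, y, z, w], dtype=float)
    x, y, z, w = q / np.linalg.norm(q)
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - z*w),     2*(x*z + y*w)],
        [2*(x*y + z*w),     1 - 2*(x*x + z*z), 2*(y*z - x*w)],
        [2*(x*z - y*w),     2*(y*z + x*w),     1 - 2*(x*x + y*y)],
    ])

def to_homogeneous(quat, trans):
    """Build a 4x4 homogeneous transform from a quaternion rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = quat_to_rot(*quat)
    T[:3, 3] = trans
    return T

# Group 1 from the table: rotation (x, y, z, w) and translation (x, y, z)
T1 = to_homogeneous((-0.24, 0.4, 0.55, 0.05), (0.04, -0.01, 0.65))
print(T1)
```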