1. Introduction
Needle biopsy is a well-established procedure that allows for the examination of abnormal tissue within the body. For example, percutaneous needle biopsy of suspected primary bone neoplasms is well established in specialist centers [1]. Fine needle biopsy has long been established as an accurate and safe procedure for tissue diagnosis of breast masses [2,3]. Amniocentesis is a technique for withdrawing amniotic fluid from the uterine cavity using a needle [4,5,6,7]. Often, these procedures are performed under image guidance. Although some needle biopsy procedures can be guided using imaging modalities such as fluoroscopy, CT, MRI, single-photon emission computed tomography (SPECT), positron emission tomography (PET), and optical imaging, procedures such as amniocentesis require continuous ultrasound (US) guidance when the safety of the mother and the baby is taken into consideration. US is regarded as one of the most common imaging modalities for needle biopsy guidance, as it is relatively cheap, readily available, and uses no ionizing radiation.
US-guided needle biopsies are often accomplished with handheld or stereotactic procedures, which are operator dependent. Moreover, such procedures require extensive training, are difficult to standardize, and are more challenging to perform when the target lesions are small. Consequently, handheld US-guided biopsies do not always yield ideal results.
To address these challenges, one proposed approach is to integrate a robotic system with US imaging [3,8]. In such a robot-assisted, US-guided needle biopsy system, it is essential to calibrate the US probe and to perform hand-eye calibration of the robot in order to establish a link between intra-operatively acquired US images and robot-assisted needle insertion. Based on a high-precision optical tracking system, novel methods for US probe and robot hand-eye calibration are proposed. Specifically, we first fix optically trackable markers to the US probe and to the robot, respectively. We then design a five-wire phantom to calibrate the US probe. Finally, an effective hand-eye calibration method is proposed that takes advantage of the steady movement of the robot without the need to solve the classical $AX = XB$ equation. After calibration, our system allows for in situ definition of target lesions and aiming trajectories from intra-operatively acquired US images in order to align the robot for precise needle biopsy. The contributions of our paper can be summarized as follows:
We design a five-wire phantom and, based on this phantom, propose a novel method for US probe calibration.
We propose an effective method for hand-eye calibration which, unlike previous work, requires neither solving the $AX = XB$ equation nor a calibration frame.
Comprehensive experiments are conducted to evaluate the efficacy of the proposed calibration methods as well as the overall system accuracy.
2. Overview of Our Robot-Assisted Ultrasound-Guided Needle Biopsy System
Our robot-assisted US-guided needle biopsy system consists of a master computer equipped with a frame grabber (DVI2USB 3.0, Epiphan Systems Inc., Ottawa, ON, Canada), a US machine (ACUSON OXANA2, Siemens Healthcare GmbH, Marburg, Germany) with a 45-mm 9L4 linear array transducer (Siemens Medical Solutions USA Inc., Malvern, PA, USA), an optical tracking camera (Polaris Vega XT, Northern Digital Inc., Waterloo, ON, Canada), and a robot arm (UR5e, Universal Robots A/S, Odense, Denmark) with a biopsy guide. Via the frame grabber, the master computer grabs real-time US images at a frequency of 10 Hz. It also communicates with the tracking camera to obtain the poses of the different tracking frames and with the remote controller of the UR robot to realize steady movement and to receive feedback information, such as robot poses.
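As an illustration of this data flow on the master computer, the following minimal Python sketch shows one way such an acquisition loop could be organized. It is not the system's actual software: `TrackerClient` and `RobotClient` are hypothetical wrappers standing in for the vendor interfaces, and the frame grabber is assumed to be exposed as a standard video device readable by OpenCV.

```python
# Minimal sketch of the acquisition loop on the master computer (illustrative only).
# Assumptions: the frame grabber appears as a standard video device readable by
# OpenCV; TrackerClient and RobotClient are hypothetical wrappers around the
# vendor interfaces of the tracking camera and the UR controller.
import time
import cv2


class TrackerClient:
    """Hypothetical wrapper around the optical tracking camera interface."""

    def get_pose(self, frame_name: str):
        # Would return the 4x4 homogeneous pose of the named tracking frame
        # with respect to the camera COS {C}.
        raise NotImplementedError


class RobotClient:
    """Hypothetical wrapper around the robot remote-control interface."""

    def get_flange_pose(self):
        # Would return the 4x4 pose of the flange {F} with respect to the base {B}.
        raise NotImplementedError


def acquisition_loop(tracker: TrackerClient, robot: RobotClient, period_s: float = 0.1):
    cap = cv2.VideoCapture(0)              # frame grabber exposed as a video device
    try:
        while True:
            ok, us_image = cap.read()      # real-time US frame (about 10 Hz)
            if not ok:
                break
            probe_pose = tracker.get_pose("us_probe_frame")       # pose of {U} in {C}
            marker_pose = tracker.get_pose("end_effector_frame")  # pose of {E} in {C}
            flange_pose = robot.get_flange_pose()                 # pose of {F} in {B}
            # ... pass the synchronized image and poses to planning/visualization ...
            time.sleep(period_s)
    finally:
        cap.release()
```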
During a needle biopsy procedure, the target lesion and the aiming trajectory are planned in the US image grabbed by the master computer. The pose of the guide is then adjusted to align with the planned biopsy trajectory. Thus, it is essential to determine the spatial transformation from the two-dimensional (2D) US imaging space to the three-dimensional (3D) robot space, as shown in Figure 1. This transformation can be obtained via three different calibration procedures: US probe calibration, hand-eye calibration, and TCP (Tool Center Point) calibration.
A robot-assisted ultrasound-guided needle biopsy procedure involves the following coordinate systems (COS), as shown in Figure 1. The 3D COS of the optical tracking camera is represented by $\{C\}$; the 3D COS of the reference frame on the end effector by $\{E\}$; the 3D COS of the robotic flange by $\{F\}$; the 3D COS of the guiding tube by $\{G\}$; the 3D COS of the robot base by $\{B\}$; the 2D COS of the US image by $\{I\}$; the 3D COS of the plane where the US image is located by $\{P\}$; the 3D COS of the reference frame attached to the US probe by $\{U\}$; and the 3D COS of the reference frame attached to the patient/phantom by $\{R\}$. At any time, the poses of the different tracking frames with respect to the tracking camera, such as ${}^{C}T_{E}$, ${}^{C}T_{U}$, and ${}^{C}T_{R}$, are known. At the same time, the pose of the robotic flange with respect to the robot base, ${}^{B}T_{F}$, is known. This transformation information can be retrieved from the API (Application Programming Interface) of the associated devices.
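In practice, each of these poses can be handled as a 4 × 4 homogeneous matrix. The short numpy sketch below (illustrative, not part of the system code) shows how such a transformation is assembled and inverted in closed form:

```python
import numpy as np


def make_transform(R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transformation from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T


def invert_transform(T: np.ndarray) -> np.ndarray:
    """Closed-form inverse of a rigid transformation: [R t]^-1 = [R^T, -R^T t]."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti


# Example: if C_T_E is the tracked pose of {E} in {C}, then
# invert_transform(C_T_E) gives the pose of {C} expressed in {E}.
```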
A biopsy trajectory can be defined from an intra-operatively acquired US image by a target point $p_I$ and a unit vector $\vec{v}_I$ that indicates the direction of the trajectory. To simplify the derivation and expression, the planned trajectory in the image COS $\{I\}$ is written in the format of a $4 \times 2$ matrix, with the target point and the direction in homogeneous coordinates as its two columns:

$$L_I = \begin{bmatrix} p_I & \vec{v}_I \end{bmatrix} = \begin{bmatrix} x_I & v_x \\ y_I & v_y \\ 0 & 0 \\ 1 & 0 \end{bmatrix}.$$
The planned trajectory in the robot-base COS is represented by $L_B$, which is obtained by the following chain of transformations:

$$L_B = {}^{B}T_{C}\;{}^{C}T_{U}\;{}^{U}T_{P}\;{}^{P}T_{I}\;L_I,$$
where ${}^{P}T_{I}$ maps 2D pixel coordinates in $\{I\}$ into the 3D image-plane COS $\{P\}$ using the pixel scale factors, and ${}^{B}T_{C}$ represents the homogeneous transformation of the tracking camera COS $\{C\}$ relative to the robot-base COS $\{B\}$, which is determined by:

$${}^{B}T_{C} = {}^{B}T_{F}\;{}^{F}T_{E}\;\left({}^{C}T_{E}\right)^{-1},$$

where ${}^{B}T_{F}$ represents the homogeneous transformation of the flange COS relative to the robot-base COS, ${}^{F}T_{E}$ is the hand-eye transformation of the end-effector reference frame COS $\{E\}$ relative to the flange COS $\{F\}$, and $\left({}^{C}T_{E}\right)^{-1}$ is the inverse of ${}^{C}T_{E}$, which is the homogeneous transformation of the COS of the reference frame on the end effector relative to the tracking camera COS $\{C\}$.
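As a concrete illustration, this chain can be evaluated with a few lines of numpy. The sketch assumes that all transformations are available as 4 × 4 homogeneous matrices and that the pixel scale factors entering ${}^{P}T_{I}$ are known from the US probe calibration; variable names are illustrative:

```python
import numpy as np


def image_to_plane(sx_mm_per_px: float, sy_mm_per_px: float) -> np.ndarray:
    """P_T_I: embed 2D pixel coordinates of {I} into the 3D image-plane COS {P} (z = 0)."""
    T = np.eye(4)
    T[0, 0] = sx_mm_per_px
    T[1, 1] = sy_mm_per_px
    return T


def trajectory_in_base(L_I, P_T_I, U_T_P, C_T_U, B_T_F, F_T_E, C_T_E):
    """Map the planned trajectory L_I (4x2 matrix: homogeneous target point and
    direction) from the image COS {I} to the robot-base COS {B}."""
    B_T_C = B_T_F @ F_T_E @ np.linalg.inv(C_T_E)    # camera pose in the base COS
    L_B = B_T_C @ C_T_U @ U_T_P @ P_T_I @ L_I
    L_B[:, 1] /= np.linalg.norm(L_B[:3, 1])         # keep the direction a unit vector
    return L_B
```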
Similar to the definition of the planned trajectory, the pose of the center line of the guiding tube in the robot-base COS can be represented by $L_G$, which is defined by two end points of the center line, $q_1$ and $q_2$, expressed in $\{B\}$:

$$L_G = \begin{bmatrix} q_1 & \dfrac{q_2 - q_1}{\left\| q_2 - q_1 \right\|} \end{bmatrix}.$$
To realize the robotic assistance for needle biopsy, the robot is controlled to provide a corresponding pose so that the center axis of the guiding tube is aligned with the planned trajectory, which can be modeled as:

$$\vec{v}_G = \vec{v}_B, \qquad \left(p_B - q_1\right) \times \vec{v}_B = \mathbf{0},$$

where $p_B$ and $\vec{v}_B$ are the 3D components of the target point and direction of $L_B$, and $\vec{v}_G = (q_2 - q_1)/\left\| q_2 - q_1 \right\|$ is the direction of the guiding-tube axis; that is, the tube direction coincides with the planned direction and the target point lies on the tube axis.
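The residual alignment can be quantified, for instance, by the angle between the tube axis and the planned direction and by the distance from the target point to the tube axis. A minimal numpy check under the representations used above (illustrative only):

```python
import numpy as np


def alignment_error(L_B: np.ndarray, q1: np.ndarray, q2: np.ndarray):
    """Angle (deg) between the guiding-tube axis (end points q1, q2 in {B}) and the
    planned direction, and the distance (in the input units) from the target point
    to the tube axis."""
    p_B, v_B = L_B[:3, 0], L_B[:3, 1]
    v_G = q2[:3] - q1[:3]
    v_G = v_G / np.linalg.norm(v_G)
    angle_deg = np.degrees(np.arccos(np.clip(abs(v_G @ v_B), 0.0, 1.0)))
    dist = np.linalg.norm(np.cross(p_B - q1[:3], v_G))   # point-to-line distance
    return angle_deg, dist
```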
The complete system requires knowing three spatial transformations, i.e., ${}^{U}T_{P}$, ${}^{F}T_{E}$, and ${}^{F}T_{G}$ (the pose of the guiding-tube COS $\{G\}$ relative to the flange COS $\{F\}$, which yields $q_1$ and $q_2$ in $\{B\}$ via ${}^{B}T_{F}\,{}^{F}T_{G}$), of which ${}^{U}T_{P}$ is obtained by US probe calibration, ${}^{F}T_{E}$ by hand-eye calibration, and ${}^{F}T_{G}$ by TCP calibration. The accuracy of these spatial calibrations directly affects the biopsy accuracy. Below, we present the details of these three calibration procedures.
6. Discussion
Previous studies of needle biopsy have emphasized the applications of fluoroscopy and CT as imaging modalities [32,33]. Compared with these modalities, US has the major advantage of being free of ionizing radiation for both the patient and the staff. In addition, robot systems have the advantage of ensuring stability and accuracy [30,34]. Taking advantage of an ultrasound system and a robot arm, we developed and validated a robot-assisted system for safe needle biopsy.
Three spatial calibration methods, namely US probe calibration, hand-eye calibration, and TCP calibration, were developed for the robot-assisted biopsy system to realize a rapid patient-image-robot registration. We validated the US probe calibration by reconstruction analysis of wire phantoms. Our method also achieved higher accuracy than previously reported results [13,15,16,35]. Different from previous works [10,12,17], our US probe calibration does not depend on precisely known geometric parameters, which makes the calibration phantom easier to manufacture. We further investigated the combination of hand-eye calibration and TCP calibration through drilling experiments.
The proposed hand-eye calibration method is worth discussing. Our method does not need to solve the equation $AX = XB$ as required by previously introduced hand-eye calibration methods [36]. In comparison with methods depending on iterative solutions [24,25] or probabilistic models [22,37], our method is much faster. Our method also eliminates the requirement of an additional calibration frame as in [19,20]. Our hand-eye calibration transformation is derived from the movement trajectories of the reference frame attached to the end effector, taking advantage of the steady movement of the robot.
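For context, the conventional formulation that is avoided here collects pairs of robot and tracker poses at stations $i$ and $j$ and solves for the fixed unknown $X = {}^{F}T_{E}$ (standard textbook form, stated only for comparison):

$$A_{ij}\,X = X\,B_{ij}, \qquad A_{ij} = \left({}^{B}T_{F,j}\right)^{-1}{}^{B}T_{F,i}, \qquad B_{ij} = \left({}^{C}T_{E,j}\right)^{-1}{}^{C}T_{E,i}.$$

Solving this system typically requires decoupling rotation and translation or running an iterative or probabilistic estimator over many pose pairs, which is precisely the overhead the trajectory-based formulation avoids.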
There are limitations to our study. First, we did not consider the influence of respiratory motion, which may degrade the performance of the proposed system. Second, the accuracy of the proposed system was affected by the elastic deformation and friction of the target object, which is consistent with the findings reported in [31]. Nonetheless, results from our comprehensive experiments demonstrated that the proposed robot-assisted system can achieve sub-millimeter accuracy.