Medical Imaging 2006: Visualization, Image-Guided Procedures, and Display, 2006
To completely remove a tumor from a diseased kidney, while minimizing the resection of healthy tissue, the surgeon must be able to accurately determine the tumor's location, size, and shape. Currently, the surgeon mentally estimates these parameters by examining pre-operative Computed Tomography (CT) images of the patient's anatomy. However, these images do not reflect the state of the abdomen or organ
The purpose of this article is to compare transrectal ultrasound (TRUS) biopsy accuracies of operators with different levels of prostate MRI experience using cognitive registration versus MRI-TRUS fusion, to assess the preferred method of TRUS prostate biopsy for MRI-identified lesions. SUBJECTS AND METHODS: One hundred patients from a prospective prostate MRI-TRUS fusion biopsy study were reviewed to identify all patients with clinically significant prostate adenocarcinoma (PCA) detected on MRI-targeted biopsy. Twenty-five PCA tumors were incorporated into a validated TRUS prostate biopsy simulator. Three prostate biopsy experts, each with different levels of experience in prostate MRI and MRI-TRUS fusion biopsy, performed a total of 225 simulated targeted biopsies on the MRI lesions as well as regional biopsy targets. Simulated biopsies performed using cognitive registration with 2D TRUS and 3D TRUS were compared with biopsies performed under MRI-TRUS fusion. Two-dimensional and 3D TRUS sampled only 48% and 45% of clinically significant PCA MRI lesions, respectively, compared with 100% with MRI-TRUS fusion. Lesion sampling accuracy did not statistically significantly vary according to operator experience or tumor volume. MRI-TRUS fusion-naïve operators showed consistent errors in targeting of the apex, midgland, and anterior targets, suggesting that there is biased error in cognitive registration. The MRI-TRUS fusion expert correctly targeted the prostate apex; however, his midgland and anterior mistargeting was similar to that of the less-experienced operators. MRI-targeted TRUS-guided prostate biopsy using cognitive registration appears to be inferior to MRI-TRUS fusion, with fewer than 50% of clinically significant PCA lesions successfully sampled.
No statistically significant difference in biopsy accuracy was seen according to operator experience with prostate MRI or MRI-TRUS fusion.
Prostate motion due to transrectal ultrasound (TRUS) probe pressure and patient movement causes target misalignments during 3D TRUS-guided biopsy. Several solutions have been proposed to perform 2D-3D registration for motion compensation. To improve registration accuracy and robustness, we developed and evaluated a registration algorithm whose optimization is based on learned prostate motion characteristics relative to different tracked probe positions and prostate sizes. We performed a principal component analysis of previously observed motions and utilized the principal directions to initialize Powell's direction set method during optimization. Compared with the standard initialization, our approach improved target registration error to 2.53 +/- 1.25 mm after registration. Multiple initializations along the major principal directions improved the robustness of the method at the cost of an additional execution time of 1.5 s. With a total execution time of 3.2 s to perform motion compensation, this method is amenable to integration into a clinical 3D TRUS-guided prostate biopsy workflow.
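The PCA-based initialization described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the cost function, the synthetic motion samples, and all names are hypothetical stand-ins. The only point demonstrated is seeding Powell's direction set (SciPy's `direc` option) with the principal directions of previously observed motions rather than the default coordinate axes.

```python
import numpy as np
from scipy.optimize import minimize

# Toy "registration cost": squared distance from a hypothetical optimum,
# standing in for an image-similarity metric (illustrative only).
optimum = np.array([2.0, -1.0, 0.5])

def registration_cost(params):
    return float(np.sum((params - optimum) ** 2))

# Hypothetical previously observed prostate displacements (rows = samples).
# In practice these would come from tracked biopsy procedures.
rng = np.random.default_rng(0)
observed_motions = rng.normal(size=(50, 3)) * np.array([3.0, 1.0, 0.2])

# Principal component analysis of the observed motions via SVD.
centered = observed_motions - observed_motions.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
principal_directions = vt  # rows are principal axes, largest variance first

# Initialize Powell's direction set with the learned principal directions.
result = minimize(registration_cost, x0=np.zeros(3), method="Powell",
                  options={"direc": principal_directions})
```

Because the dominant motion directions are searched first, fewer line searches are spent on directions along which the prostate rarely moves.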
Biopsy of the prostate using ultrasound guidance is the clinical gold standard for diagnosis of prostate adenocarcinoma. The current prostate biopsy procedure is limited to using 2D transrectal ultrasound (TRUS) images to target biopsy sites and record biopsy core locations for postbiopsy confirmation. Localization of the 2D image in its actual 3D position is ambiguous and limits procedural accuracy and reproducibility. We have developed a 3D TRUS prostate biopsy system that provides 3D intrabiopsy information for needle guidance and biopsy location recording. The system conforms to the workflow and imaging technology of the current biopsy procedure, making it easier for clinical integration. In this paper, we describe the system design and validate the system accuracy by performing mock biopsies on US/CT multimodal patient-specific prostate phantoms. Our biopsy system generated 3D patient-specific models of the prostate with volume errors less than 3.5% and mean boundary errors of less than 1 mm. Using the 3D biopsy system, needles were guided to within 2.3 +/- 1.0 mm of 3D targets and with a high probability of biopsying clinically significant tumors. The positions of the actual biopsy sites were accurately localized to within 1.5 +/- 0.8 mm.
Magnetic resonance imaging (MRI)-targeted, 3D transrectal ultrasound (TRUS)-guided "fusion" prostate biopsy intends to reduce the ∼23% false negative rate of clinical two-dimensional TRUS-guided sextant biopsy. Although it has been reported to double the positive yield, MRI-targeted biopsies continue to yield false negatives. Therefore, the authors propose to investigate how biopsy system needle delivery error affects the probability of sampling each tumor, by accounting for uncertainties due to guidance system error, image registration error, and irregular tumor shapes. T2-weighted, dynamic contrast-enhanced T1-weighted, and diffusion-weighted prostate MRI and 3D TRUS images were obtained from 49 patients. A radiologist and radiology resident contoured 81 suspicious regions, yielding 3D tumor surfaces that were registered to the 3D TRUS images using an iterative closest point prostate surface-based method to yield 3D binary images of the suspicious regions in the TRUS con...
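The iterative closest point (ICP) surface-based registration mentioned above can be sketched in a few lines. This is a generic rigid ICP with a Kabsch solve at each iteration, run on a synthetic point cloud; it is not the authors' prostate-surface implementation, and the surface points and applied transform are hypothetical.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp_rigid(source, target, iterations=20):
    """Rigid ICP: repeatedly match each source point to its nearest
    target point, then solve the best rotation/translation (Kabsch)."""
    src = source.copy()
    tree = cKDTree(target)
    R_total, t_total = np.eye(3), np.zeros(3)
    for _ in range(iterations):
        _, idx = tree.query(src)              # closest-point correspondences
        matched = target[idx]
        src_c, tgt_c = src.mean(axis=0), matched.mean(axis=0)
        H = (src - src_c).T @ (matched - tgt_c)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R = Vt.T @ D @ U.T                    # proper rotation (det = +1)
        t = tgt_c - R @ src_c
        src = src @ R.T + t                   # apply this iteration's motion
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total, src

# Hypothetical "prostate surface" points and a small known rigid motion.
rng = np.random.default_rng(1)
surface = rng.normal(size=(200, 3))
angle = np.deg2rad(3.0)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
moved = surface @ R_true.T + np.array([0.1, -0.05, 0.05])

R_est, t_est, aligned = icp_rigid(surface, moved)
```

Once the surfaces are aligned, the same rigid transform can be applied to the MRI tumor contours to map them into TRUS coordinates.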
In targeted 3D transrectal ultrasound (TRUS)-guided biopsy, patient and prostate movement during the procedure can cause target misalignments that hinder accurate sampling of preplanned suspicious tissue locations. Multiple solutions have been proposed for displacement compensation via registration of intraprocedural TRUS images to a baseline 3D TRUS image acquired at the beginning of the biopsy procedure. While 2D TRUS images are widely used for intraprocedural guidance, some solutions utilize richer intraprocedural images such as bi- or multiplanar TRUS or 3D TRUS, acquired by specialized probes. In this work, the impact of such richer intraprocedural imaging on displacement compensation accuracy was measured to evaluate the tradeoff between cost and complexity of intraprocedural imaging versus improved displacement compensation. Baseline and intraprocedural 3D TRUS images were acquired from 29 patients at standard sextant-template biopsy locations. Planes extracted from 3D TRUS images acquired at sextant positions were used to simulate 2D and 3D intraprocedural information available in different potential clinically relevant scenarios for co-registration with the baseline 3D TRUS image. In practice, intraprocedural 3D information can be acquired either via the use of specialized ultrasound probes (e.g., multiplanar or 3D probes) or via axial rotation of a tracked 2D TRUS probe. Registration accuracy was evaluated by calculating the target registration error (TRE) using manually identified homologous intrinsic fiducial markers (microcalcifications). The TRE was analyzed separately at the base, mid-gland and apex regions of the prostate.
The results indicate that TRE improved gradually as the number of intraprocedural imaging planes used in registration was increased, implying that 3D TRUS information assisted the registration algorithm to robustly converge to more accurate solutions. The acquisition of a partial volume up to the angle of rotation supported more accurate displacement compensation than acquiring biplane configurations. Additional intraprocedural 3D TRUS image information was more beneficial to registration accuracy in the base and apex regions as compared with the mid-gland region. While the majority of the registrations using 2D TRUS images provided a clinically desired level of accuracy, intraprocedural 3D imaging helped improve the overall registration accuracy and robustness, especially in the base and apex regions of the prostate. These results are helpful for devising image-based registration methods for displacement compensation when designing 3D TRUS-guided biopsy systems.
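Target registration error as used in this evaluation is simply the distance between each transformed fiducial and its homologous counterpart in the baseline image. A minimal sketch with hypothetical fiducial coordinates and an assumed (deliberately imperfect) rigid-transform estimate:

```python
import numpy as np

def target_registration_error(moving_fids, fixed_fids, R, t):
    """Per-marker TRE: distance between each transformed moving fiducial
    and its homologous fixed fiducial (e.g., a microcalcification)."""
    transformed = moving_fids @ R.T + t
    return np.linalg.norm(transformed - fixed_fids, axis=1)

# Hypothetical homologous fiducial pairs (mm).
moving = np.array([[10.0, 20.0, 30.0],
                   [12.0, 18.0, 28.0],
                   [ 9.0, 22.0, 31.0]])
fixed = moving + np.array([0.5, -0.3, 0.2])       # true displacement

# Estimated registration that misses the 0.2 mm out-of-plane component.
R_est, t_est = np.eye(3), np.array([0.5, -0.3, 0.0])

tre = target_registration_error(moving, fixed, R_est, t_est)  # 0.2 mm each
```

Because the fiducials are intrinsic and manually identified, the TRE measures how well the registration compensates displacement at real tissue landmarks rather than at the prostate surface alone.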
Prostate biopsy procedures are generally limited to 2D transrectal ultrasound (TRUS) imaging for biopsy needle guidance. This limitation results in needle position ambiguity and an insufficient record of biopsy core locations in cases of prostate re-biopsy. We have developed a multi-jointed mechanical device that supports a commercially available TRUS probe with an integrated needle guide for precision prostate biopsy. The device is fixed at the base, allowing the joints to be manually manipulated while fully supporting its weight throughout its full range of motion. Means are provided to track the needle trajectory and display this trajectory on a corresponding TRUS image. This allows the physician to aim the needle-guide at predefined targets within the prostate, providing true 3D navigation. The tracker has been designed for use with several end-fired transducers that can be rotated about the longitudinal axis of the probe to generate 3D images. The tracker reduces the variabilit...
Three-dimensional (3D) prostate image segmentation is useful for cancer diagnosis and therapy guidance, but can be time-consuming to perform manually and involves varying levels of difficulty and interoperator variability within the prostatic base, midgland (MG), and apex. In this study, the authors measured accuracy and interobserver variability in the segmentation of the prostate on T2-weighted endorectal magnetic resonance (MR) imaging within the whole gland (WG), and separately within the apex, midgland, and base regions. The authors collected MR images from 42 prostate cancer patients. Prostate border delineation was performed manually by one observer on all images and by two other observers on a subset of ten images. The authors used complementary boundary-, region-, and volume-based metrics [mean absolute distance (MAD), Dice similarity coefficient (DSC), recall rate, precision rate, and volume difference (ΔV)] to elucidate the different types of segmentation errors that they observed. Evaluation for expert manual and semiautomatic segmentation approaches was carried out. Compared to manual segmentation, the authors' semiautomatic approach reduces the necessary user interaction by only requiring an indication of the anteroposterior orientation of the prostate and the selection of prostate center points on the apex, base, and midgland slices. Based on these inputs, the algorithm identifies candidate prostate boundary points using learned boundary appearance characteristics and performs regularization based on learned prostate shape information. The semiautomated algorithm required an average of 30 s of user interaction time (measured for nine operators) for each 3D prostate segmentation.
The authors compared the segmentations from this method to manual segmentations in a single-operator (mean whole gland MAD = 2.0 mm, DSC = 82%, recall = 77%, precision = 88%, and ΔV = -4.6 cm³) and multioperator study (mean whole gland MAD = 2.2 mm, DSC = 77%, recall = 72%, precision = 86%, and ΔV = -4.0 cm³). These results compared favorably with observed differences between manual segmentations and a simultaneous truth and performance level estimation reference for this data set (whole gland differences as high as MAD = 3.1 mm, DSC = 78%, recall = 66%, precision = 77%, and ΔV = 15.5 cm³). The authors found that overall, midgland segmentation was more accurate and repeatable than the segmentation of the apex and base, with the base posing the greatest challenge. The main conclusions of this study were that (1) the semiautomated approach reduced interobserver segmentation variability; (2) the segmentation accuracy of the semiautomated approach, as well as the accuracies of recently published methods from other groups, were within the range of observed expert variability in manual prostate segmentation; and (3) further efforts in the development of computer-assisted segmentation would be most productive if focused on improvement of segmentation accuracy and reduction of variability within the prostatic apex and base.
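The region- and volume-based metrics reported above (DSC, recall, precision, ΔV) are straightforward to compute from binary masks. A minimal sketch on toy 3D masks, where the masks, shapes, and voxel size are hypothetical (the boundary-based MAD additionally requires surface-distance computation and is omitted here):

```python
import numpy as np

def overlap_metrics(seg, ref, voxel_volume=1.0):
    """Overlap/volume metrics between a binary segmentation and a
    reference: Dice similarity coefficient (DSC), recall, precision,
    and signed volume difference (ΔV = V_seg - V_ref)."""
    seg, ref = seg.astype(bool), ref.astype(bool)
    inter = np.logical_and(seg, ref).sum()
    dsc = 2.0 * inter / (seg.sum() + ref.sum())
    recall = inter / ref.sum()       # fraction of the reference covered
    precision = inter / seg.sum()    # fraction of the segmentation correct
    dv = (int(seg.sum()) - int(ref.sum())) * voxel_volume
    return dsc, recall, precision, dv

# Toy masks: the reference is an 8x8x8 cube; the segmentation is the same
# cube shifted by two voxels along one axis, so they overlap in a 6x8x8 block.
ref = np.zeros((16, 16, 16), dtype=bool)
ref[4:12, 4:12, 4:12] = True
seg = np.zeros_like(ref)
seg[6:14, 4:12, 4:12] = True

dsc, recall, precision, dv = overlap_metrics(seg, ref)  # all overlaps = 0.75
```

Note that DSC, recall, and precision are insensitive to *where* on the boundary errors occur, which is why the study pairs them with the boundary-based MAD and reports each metric separately for the apex, midgland, and base.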
Papers by Derek Cool