Abstract
Recently, nnU-Net has achieved excellent performance on many medical image segmentation tasks. However, it also has some notable limitations: it supports only fully supervised training, and its resource consumption during prediction is excessive. In the FLARE23 abdominal multi-organ challenge, only partially labeled data were provided, and the images were too large for the original nnU-Net to run efficiently. We therefore designed a framework that combines generated pseudo labels with two-stage segmentation for fast and effective prediction. Specifically, we trained three nnU-Net models: one to generate high-quality pseudo labels for the unlabeled data, one to produce a coarse segmentation that guides cropping, and one to perform the final fine segmentation. Our method achieved average DSC scores of 88.87% and 38.00% for organs and lesions, respectively, on the validation set; the average running time and area under the GPU memory-time curve were 45 s and 3000 MB, respectively.
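The coarse-to-fine pipeline described above can be sketched in a few lines: a coarse model localizes the abdominal region on the full volume, a bounding box (plus margin) is derived from the coarse mask, and the fine model segments only the cropped region, which is what keeps inference time and GPU memory low. This is an illustrative sketch, not the authors' implementation; the model callables and the `margin` parameter are assumptions.

```python
import numpy as np

def roi_from_coarse_mask(mask, margin=5):
    """Bounding box (with a safety margin, in voxels) around all
    foreground voxels of the coarse segmentation mask."""
    coords = np.argwhere(mask > 0)
    lo = np.maximum(coords.min(axis=0) - margin, 0)
    hi = np.minimum(coords.max(axis=0) + 1 + margin, mask.shape)
    return tuple(slice(int(l), int(h)) for l, h in zip(lo, hi))

def two_stage_segment(volume, coarse_model, fine_model, margin=5):
    """Stage 1: run the coarse model on the whole volume to locate the
    abdomen. Stage 2: run the fine model only on the cropped ROI, then
    paste the result back into a full-size label map."""
    coarse_mask = coarse_model(volume)
    roi = roi_from_coarse_mask(coarse_mask, margin)
    fine_mask_roi = fine_model(volume[roi])
    full_mask = np.zeros(volume.shape, dtype=fine_mask_roi.dtype)
    full_mask[roi] = fine_mask_roi
    return full_mask
```

In the same spirit, the pseudo-labeling step amounts to running a model trained on the fully labeled subset over the unlabeled scans and treating its predictions as training targets for the next round.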
References
Badrinarayanan, V., Kendall, A., Cipolla, R.: SegNet: a deep convolutional encoder-decoder architecture for image segmentation. IEEE Trans. Pattern Anal. Mach. Intell. 39(12), 2481–2495 (2017)
Bilic, P., et al.: The liver tumor segmentation benchmark (LiTS). Med. Image Anal. 84, 102680 (2023)
Cao, H., et al.: Swin-Unet: Unet-like pure transformer for medical image segmentation. In: Karlinsky, L., Michaeli, T., Nishino, K. (eds.) ECCV 2022. LNCS, vol. 13803, pp. 205–218. Springer, Cham (2022). https://doi.org/10.1007/978-3-031-25066-8_9
Clark, K., et al.: The cancer imaging archive (TCIA): maintaining and operating a public information repository. J. Digit. Imaging 26(6), 1045–1057 (2013)
Gatidis, S., et al.: The AutoPET challenge: towards fully automated lesion segmentation in oncologic PET/CT imaging. Preprint at Research Square (Nature Portfolio) (2023). https://doi.org/10.21203/rs.3.rs-2572595/v1
Gatidis, S., et al.: A whole-body FDG-PET/CT dataset with manually annotated tumor lesions. Sci. Data 9(1), 601 (2022)
Heller, N., et al.: The state of the art in kidney and kidney tumor segmentation in contrast-enhanced CT imaging: results of the KiTS19 challenge. Med. Image Anal. 67, 101821 (2021)
Heller, N., et al.: An international challenge to use artificial intelligence to define the state-of-the-art in kidney and kidney tumor segmentation in CT imaging. Proc. Am. Soc. Clin. Oncol. 38(6), 626 (2020)
Huang, Z., et al.: Revisiting nnU-net for iterative pseudo labeling and efficient sliding window inference. In: Ma, J., Wang, B. (eds.) FLARE 2022. LNCS, vol. 13816, pp. 178–189. Springer, Cham (2022). https://doi.org/10.1007/978-3-031-23911-3_16
Isensee, F., Jaeger, P.F., Kohl, S.A., Petersen, J., Maier-Hein, K.H.: nnU-net: a self-configuring method for deep learning-based biomedical image segmentation. Nat. Methods 18(2), 203–211 (2021)
Ma, J., He, Y., Li, F., Han, L., You, C., Wang, B.: Segment anything in medical images. Nat. Commun. 15, 654 (2024)
Ma, J., et al.: Fast and low-GPU-memory abdomen CT organ segmentation: the flare challenge. Med. Image Anal. 82, 102616 (2022)
Ma, J., et al.: Unleashing the strengths of unlabeled data in pan-cancer abdominal organ quantification: the FLARE22 challenge. arXiv preprint arXiv:2308.05862 (2023)
Ma, J., et al.: AbdomenCT-1k: is abdominal organ segmentation a solved problem? IEEE Trans. Pattern Anal. Mach. Intell. 44(10), 6695–6714 (2022)
Pavao, A., et al.: CodaLab competitions: an open source platform to organize scientific challenges. J. Mach. Learn. Res. 24(198), 1–6 (2023)
Ronneberger, O., Fischer, P., Brox, T.: U-net: convolutional networks for biomedical image segmentation. In: Navab, N., Hornegger, J., Wells, W.M., Frangi, A.F. (eds.) MICCAI 2015. LNCS, vol. 9351, pp. 234–241. Springer, Cham (2015). https://doi.org/10.1007/978-3-319-24574-4_28
Simpson, A.L., et al.: A large annotated medical image dataset for the development and evaluation of segmentation algorithms. arXiv preprint arXiv:1902.09063 (2019)
Wang, E., Zhao, Y., Wu, Y.: Cascade dual-decoders network for abdominal organs segmentation. In: Ma, J., Wang, B. (eds.) FLARE 2022. LNCS, vol. 13816, pp. 202–213. Springer, Cham (2022). https://doi.org/10.1007/978-3-031-23911-3_18
Wasserthal, J., et al.: TotalSegmentator: robust segmentation of 104 anatomic structures in CT images. Radiol.: Artif. Intell. 5(5), e230024 (2023)
Wu, M., Ding, W., Yang, M., Huang, L.: Multi-depth boundary-aware left atrial scar segmentation network. In: Zhuang, X., Li, L., Wang, S., Wu, F. (eds.) LAScarQS 2022. LNCS, vol. 13586, pp. 16–23. Springer, Cham (2022). https://doi.org/10.1007/978-3-031-31778-1_2
You, C., Xiang, J., Su, K., Zhang, X., Dong, S., Onofrey, J., Staib, L., Duncan, J.S.: Incremental learning meets transfer learning: application to multi-site prostate MRI segmentation. In: Albarqouni, S., et al. (eds.) DeCaF FAIR 2022. LNCS, vol. 13573, pp. 3–16. Springer, Cham (2022). https://doi.org/10.1007/978-3-031-18523-6_1
Yushkevich, P.A., Gao, Y., Gerig, G.: ITK-SNAP: an interactive tool for semi-automatic segmentation of multi-modality biomedical images. In: Annual International Conference of the IEEE Engineering in Medicine and Biology Society, pp. 3342–3345 (2016)
Zhang, X., Yang, X., Huang, L., Huang, L.: Two stage of histogram matching augmentation for domain generalization: application to left atrial segmentation. In: Zhuang, X., Li, L., Wang, S., Wu, F. (eds.) LAScarQS 2022. LNCS, vol. 13586, pp. 60–68. Springer, Cham (2022). https://doi.org/10.1007/978-3-031-31778-1_6
Acknowledgements
The authors of this paper declare that the segmentation method they implemented for participation in the FLARE 2023 challenge has not used any pre-trained models nor additional datasets other than those provided by the organizers. The proposed solution is fully automatic without any manual intervention. We thank all the data owners for making the CT scans publicly available and CodaLab [15] for hosting the challenge platform.
This work was supported by the National Natural Science Foundation of China (62271149) and the Fujian Provincial Natural Science Foundation (2021J02019, 2021J01578).
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Yang, X., Zhang, X., Yan, X., Ding, W., Chen, H., Huang, L. (2024). Abdomen Multi-organ Segmentation Using Pseudo Labels and Two-Stage. In: Ma, J., Wang, B. (eds) Fast, Low-resource, and Accurate Organ and Pan-cancer Segmentation in Abdomen CT. FLARE 2023. Lecture Notes in Computer Science, vol 14544. Springer, Cham. https://doi.org/10.1007/978-3-031-58776-4_4
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-58775-7
Online ISBN: 978-3-031-58776-4
eBook Packages: Computer Science, Computer Science (R0)