
On the Effectiveness of 3D Vision Transformers for the Prediction of Prostate Cancer Aggressiveness

  • Conference paper
  • First Online:
Image Analysis and Processing. ICIAP 2022 Workshops (ICIAP 2022)

Abstract

Prostate cancer is the most frequent neoplasm in European men. To date, the gold standard for determining the aggressiveness of this tumor is biopsy, an invasive and uncomfortable procedure. Before the biopsy, physicians recommend an investigation by multiparametric magnetic resonance imaging, which allows the radiologist to make an initial assessment of the tumor. The study presented in this work investigates the role of Vision Transformers in predicting prostate cancer aggressiveness from imaging data alone. We designed a 3D Vision Transformer able to process volumetric scans and optimized it on the ProstateX-2 challenge dataset by training it from scratch. For comparison, we also designed a 3D Convolutional Neural Network and optimized it in a similar fashion. Our preliminary results show that Vision Transformers, even without extensive optimization and customization, can outperform Convolutional Neural Networks and may be comparable with other, more finely tuned solutions.
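
As an illustration of the kind of model the abstract describes, the sketch below shows how a 3D Vision Transformer for volumetric scans can be assembled in PyTorch: the volume is split into non-overlapping 3D patches, the patch tokens plus a class token pass through a standard transformer encoder, and the class token is mapped to the output classes. This is a minimal sketch under stated assumptions; all names (PatchEmbed3D, ViT3D), the patch and volume sizes, the two-class output head, and the hyperparameters are illustrative choices, not the configuration reported in the paper.

    # Minimal 3D Vision Transformer sketch (illustrative; not the paper's exact setup).
    import torch
    import torch.nn as nn


    class PatchEmbed3D(nn.Module):
        """Split a volumetric scan into non-overlapping 3D patches and embed them."""

        def __init__(self, in_channels=1, embed_dim=128, patch_size=(4, 16, 16)):
            super().__init__()
            # A Conv3d with stride equal to its kernel size implements 3D patch embedding.
            self.proj = nn.Conv3d(in_channels, embed_dim,
                                  kernel_size=patch_size, stride=patch_size)

        def forward(self, x):                      # x: (B, C, D, H, W)
            x = self.proj(x)                       # (B, E, D', H', W')
            return x.flatten(2).transpose(1, 2)    # (B, N, E) token sequence


    class ViT3D(nn.Module):
        """3D ViT classifier: patch tokens + [CLS] token + transformer encoder."""

        def __init__(self, num_classes=2, in_channels=1, embed_dim=128,
                     depth=6, num_heads=8, num_patches=6 * 8 * 8):
            super().__init__()
            self.patch_embed = PatchEmbed3D(in_channels, embed_dim)
            self.cls_token = nn.Parameter(torch.zeros(1, 1, embed_dim))
            self.pos_embed = nn.Parameter(torch.zeros(1, num_patches + 1, embed_dim))
            encoder_layer = nn.TransformerEncoderLayer(
                d_model=embed_dim, nhead=num_heads, dim_feedforward=4 * embed_dim,
                batch_first=True, norm_first=True)
            self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=depth)
            self.head = nn.Linear(embed_dim, num_classes)

        def forward(self, x):
            tokens = self.patch_embed(x)                           # (B, N, E)
            cls = self.cls_token.expand(tokens.size(0), -1, -1)    # (B, 1, E)
            tokens = torch.cat([cls, tokens], dim=1) + self.pos_embed
            tokens = self.encoder(tokens)
            return self.head(tokens[:, 0])                         # classify the [CLS] token


    if __name__ == "__main__":
        # Example: two single-channel volumes of 24 x 128 x 128 voxels, which with
        # (4, 16, 16) patches yields 6 x 8 x 8 = 384 tokens (matching num_patches above).
        model = ViT3D(num_classes=2)
        logits = model(torch.randn(2, 1, 24, 128, 128))
        print(logits.shape)  # torch.Size([2, 2])

A comparable 3D CNN baseline would replace the patch embedding and encoder with stacked Conv3d/pooling blocks followed by a fully connected classifier; both families of models consume the same volumetric input.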



Acknowledgements

The research leading to these results has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 952159 (ProCAncer-I) and from the Regional Project PAR FAS Tuscany - NAVIGATOR. The funders had no role in the design of the study, collection, analysis and interpretation of data, or writing the manuscript.

Author information


Corresponding author

Correspondence to Eva Pachetti.



Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Pachetti, E., Colantonio, S., Pascali, M.A. (2022). On the Effectiveness of 3D Vision Transformers for the Prediction of Prostate Cancer Aggressiveness. In: Mazzeo, P.L., Frontoni, E., Sclaroff, S., Distante, C. (eds) Image Analysis and Processing. ICIAP 2022 Workshops. ICIAP 2022. Lecture Notes in Computer Science, vol 13374. Springer, Cham. https://doi.org/10.1007/978-3-031-13324-4_27

Download citation

  • DOI: https://doi.org/10.1007/978-3-031-13324-4_27

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-13323-7

  • Online ISBN: 978-3-031-13324-4

  • eBook Packages: Computer Science, Computer Science (R0)
