Abstract
Although the use of convolutional neural networks (CNNs) for computer-aided diagnosis (CAD) has made tremendous progress in the last few years, small medical datasets remain the major bottleneck in this area. To address this problem, researchers have started looking for information beyond the medical datasets themselves. Previous efforts mainly leveraged information from natural images via transfer learning. More recent work focuses on incorporating knowledge from medical practitioners, either by making networks resemble how practitioners are trained or how they view images, or by using extra annotations. In this paper, we propose a scheme named Domain Guided-CNN (DG-CNN) to incorporate margin information, a feature described in the consensus guidelines that radiologists use to diagnose cancer in breast ultrasound (BUS) images. In DG-CNN, attention maps that highlight the margin areas of tumors are first generated and then incorporated into the networks via different approaches. We have tested the performance of DG-CNN on our own dataset (including 1485 ultrasound images) and on a public dataset. The results show that DG-CNN can be applied to different network structures, such as VGG and ResNet, to improve their performance. For example, experimental results on our dataset show that with a certain integration mode, DG-CNN improves over the baseline ResNet18 by 2.17% in accuracy, 1.69% in sensitivity, 2.64% in specificity, and 2.57% in AUC (Area Under Curve). To the best of our knowledge, this is the first time that margin information has been utilized to improve the performance of deep neural networks in diagnosing breast cancer in BUS images.
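To make the idea concrete, here is a minimal sketch (in NumPy) of two plausible ways a margin attention map could be incorporated into a network's input. The function names, the blending weight, and the specific fusion modes are illustrative assumptions for exposition, not the paper's exact integration modes:

```python
import numpy as np

def fuse_by_weighting(image, attention, alpha=0.5):
    """Blend the image with its attention-weighted version.

    image:     (H, W) grayscale BUS image, values in [0, 1]
    attention: (H, W) margin attention map, values in [0, 1]
    alpha:     mixing weight for the margin-highlighted component
    """
    return (1.0 - alpha) * image + alpha * image * attention

def fuse_by_channel(image, attention):
    """Stack the attention map as an extra input channel, yielding a
    (2, H, W) array that a CNN's first conv layer (in_channels=2)
    could consume directly."""
    return np.stack([image, attention], axis=0)

# Toy example: a 4x4 image with attention concentrated on a border
# "margin" ring around the tumor region.
img = np.random.rand(4, 4)
att = np.zeros((4, 4))
att[0, :] = att[-1, :] = att[:, 0] = att[:, -1] = 1.0

weighted = fuse_by_weighting(img, att)   # same shape as img: (4, 4)
stacked = fuse_by_channel(img, att)      # extra channel: (2, 4, 4)
```

In the first mode the attention map reweights pixel intensities so margin regions dominate the signal; in the second, the network is left to learn how to combine the raw image and the margin cue on its own.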
Xie, XZ., Niu, JW., Liu, XF. et al. DG-CNN: Introducing Margin Information into Convolutional Neural Networks for Breast Cancer Diagnosis in Ultrasound Images. J. Comput. Sci. Technol. 37, 277–294 (2022). https://doi.org/10.1007/s11390-020-0192-0