Combined Input Deep Learning Pipeline for Embryo Selection for In Vitro Fertilization Using Light Microscopic Images and Additional Features
Abstract
1. Introduction
2. Materials and Methods
2.1. Dataset
2.2. Embryo Pregnancy Prediction
2.3. Preprocessing
2.4. Backbone Architecture Selection
2.5. Pseudo-Features
2.6. Custom Weight Using Simple Framework for Contrastive Learning of Visual Representations (SimCLR)
2.7. Hyperparameter Optimization
3. Results
- Selecting a baseline model: Each baseline model was trained on the dataset. The backbone of the best-performing model was selected for the next experiment and used for generating pseudo-features.
- Evaluating the integration of pseudo-features: Pseudo-features were generated using the optimal model from the previous experiment and then combined with the baseline model to form the pipeline shown in Figure 3. The same pipeline was developed in parallel using the real Istanbul grading features and compared against the pseudo-feature version. In addition, the patient age was standardized to a mean of zero and a standard deviation of one.
- Optimization using Optuna: The best model from Experiment 2 was optimized using Optuna [29], and its performance before and after optimization was compared.
- Comparison of the best baseline with the custom weight: A custom-weight model was developed using the same method as in Experiments 2 and 3 and was compared with the optimized model from Experiment 3 to identify the best overall model.
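The combined-input idea behind Experiment 2 can be sketched as follows. This is a minimal illustration, not the authors' exact architecture: the class name `CombinedInputModel`, the hidden-layer size, and the head layout are assumptions, since the details of Figure 3 are not reproduced here. The backbone's image embedding is concatenated with the tabular inputs (the three grading features plus the standardized age) before a small classification head.

```python
import torch
import torch.nn as nn

class CombinedInputModel(nn.Module):
    """Sketch of a combined image + tabular-feature classifier (assumed layout)."""

    def __init__(self, backbone: nn.Module, n_image_features: int, n_tabular: int = 4):
        super().__init__()
        # Backbone maps an image batch to a (B, n_image_features) embedding,
        # e.g., EfficientNet-B0 with its classifier head removed.
        self.backbone = backbone
        # Hidden size of 64 is an illustrative choice, not the paper's value.
        self.head = nn.Sequential(
            nn.Linear(n_image_features + n_tabular, 64),
            nn.ReLU(),
            nn.Linear(64, 1),  # single logit for the binary pregnancy outcome
        )

    def forward(self, image: torch.Tensor, tabular: torch.Tensor) -> torch.Tensor:
        img_feat = self.backbone(image)                      # (B, n_image_features)
        return self.head(torch.cat([img_feat, tabular], dim=1))

def standardize(x: torch.Tensor, mean: float, std: float) -> torch.Tensor:
    # Age standardization: zero mean, unit standard deviation.
    return (x - mean) / std
```

In practice the backbone would be EfficientNet-B0 with its classifier removed (e.g., created via `timm` with `num_classes=0`), and the tabular vector would hold the stage/ICM/TE encodings alongside the standardized age.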
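The Optuna search in Experiment 3 can be illustrated with a minimal objective function. The search space below (learning rate, dropout, batch size) and the toy `train_and_evaluate` surrogate are placeholders for illustration, not the paper's actual configuration or ranges:

```python
import optuna

def train_and_evaluate(lr: float, dropout: float, batch_size: int) -> float:
    # Stand-in for the real training loop: in the actual pipeline this would
    # train the combined-input model and return its validation F1-score or AUC.
    # Here, a simple analytic surrogate keeps the sketch runnable.
    return 1.0 - abs(lr - 1e-3) * 100 - dropout * 0.1

def objective(trial: optuna.Trial) -> float:
    # Hypothetical hyperparameter ranges.
    lr = trial.suggest_float("lr", 1e-5, 1e-2, log=True)
    dropout = trial.suggest_float("dropout", 0.0, 0.5)
    batch_size = trial.suggest_categorical("batch_size", [16, 32, 64])
    return train_and_evaluate(lr, dropout, batch_size)

study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=20)
best = study.best_params  # e.g., {"lr": ..., "dropout": ..., "batch_size": ...}
```

Optuna's tree-structured Parzen estimator sampler (the default) steers successive trials toward promising regions of the search space, which is why the optimized model in Table 4 can outperform the hand-tuned one.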
3.1. Evaluation Metrics
3.2. Baseline Model Selection
3.3. Pseudo-Features Integration
3.4. Hyperparameter Optimization Using Optuna and Custom Weights
4. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Centers for Disease Control and Prevention. Assisted Reproductive Technology (ART) Clinic Information. U.S. Department of Health and Human Services. Available online: https://nccd.cdc.gov/drh_art/rdPage.aspx?rdReport=DRH_ART.ClinicInfo&rdRequestForward=True&ClinicId=9999&ShowNational=1#rdTabPanel-tab3 (accessed on 12 October 2024).
- Suebthawinkul, C.; Numchaisrika, P.; Chaengsawang, A.; Pilaisangsuree, V.; Summat, S.; Sereepapong, W. Determining Factors Influencing the Successful Embryo Transfer and Pregnancy during the Frozen Cycle of In Vitro Fertilization: A Retrospective Cohort Study. Int. J. Fertil. Steril. 2024, 18, 352–361. [Google Scholar] [CrossRef] [PubMed]
- Alpha Scientists in Reproductive Medicine and ESHRE Special Interest Group of Embryology. The Istanbul consensus workshop on embryo assessment: Proceedings of an expert meeting. Hum. Reprod. 2011, 26, 1270–1283. [Google Scholar] [CrossRef] [PubMed]
- Gardner, D.; Lane, M.; Stevens, J.; Schlenker, T.; Schoolcraft, W. Reprint of: Blastocyst score affects implantation and pregnancy outcome: Towards a single blastocyst transfer. Fertil. Steril. 2019, 112, e81–e84. [Google Scholar] [CrossRef] [PubMed]
- Luke, B.; Brown, M.B.; Stern, J.E.; Jindal, S.K.; Racowsky, C.; Ball, G.D. Using the society for assisted reproductive technology clinic outcome system morphological measures to predict live birth after assisted reproductive technology. Fertil. Steril. 2014, 102, 1338–1344. Available online: https://www.sciencedirect.com/science/article/pii/S0015028214018834 (accessed on 13 July 2024). [CrossRef]
- Zhan, Q.; Sierra, E.; Malmsten, J.; Ye, Z.; Rosenwaks, Z.; Zaninovic, N. Blastocyst score, a blastocyst quality ranking tool, is a predictor of blastocyst ploidy and implantation potential. F&S Rep. 2020, 1, 133–141. Available online: https://www.sciencedirect.com/science/article/pii/S2666334120300155 (accessed on 13 July 2024).
- Ibrahim, H.A.; Thamilvanan, M.N.; Zaian, A.; Supriyanto, E. Fertility Assessment Model for Embryo Grading Using Convolutional Neural Network (CNN). In Proceedings of the 2022 International Conference on Healthcare Engineering (ICHE), Johor, Malaysia, 23–25 September 2022; pp. 1–4. [Google Scholar] [CrossRef]
- Berntsen, J.; Rimestad, J.; Lassen, J.T.; Tran, D.; Kragh, M.F. Robust and generalizable embryo selection based on artificial intelligence and time-lapse image sequences. PLoS ONE 2022, 17, e0262661. [Google Scholar] [CrossRef]
- Chen, T.-J.; Zheng, W.-L.; Liu, C.-H.; Huang, I.; Lai, H.-H.; Liu, M. Using deep learning with large dataset of microscope images to develop an automated embryo grading system. Fertil. Reprod. 2019, 1, 51–56. [Google Scholar] [CrossRef]
- Tran, D.; Cooke, S.; Illingworth, P.J.; Gardner, D.K. Deep learning as a predictive tool for fetal heart pregnancy following time-lapse incubation and blastocyst transfer. Hum. Reprod. 2019, 34, 1011–1018. [Google Scholar] [CrossRef]
- Sappakit, T.; Onthuam, K.; Limsila, T.; Chaichaowarat, R.; Suebthawinkul, C. Oocyte Microscopic Image Fertilization Prediction based on First Polar Body Morphology using YOLOv8. In Proceedings of the 2024 46th Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), Orlando, FL, USA, 15–19 July 2024. [Google Scholar]
- Kan-Tor, Y.; Zabari, N.; Erlich, I.; Szeskin, A.; Amitai, T.; Richter, D.; Or, Y.; Shoham, Z.; Hurwitz, A.; Har-Vardi, I.; et al. Automated evaluation of human embryo blastulation and implantation potential using deep-learning. Adv. Intell. Syst. 2020, 2, 2000080. [Google Scholar] [CrossRef]
- Suebthawinkul, C.; Babayev, E.; Zhou, L.T.; Lee, H.C.; Duncan, F.E. Quantitative morphokinetic parameters identify novel dynamics of oocyte meiotic maturation and cumulus expansion. Biol. Reprod. 2022, 107, 1097–1112. [Google Scholar] [CrossRef]
- Suebthawinkul, C.; Babayev, E.; Lee, H.C.; Duncan, F.E. Morphokinetic parameters of mouse oocyte meiotic maturation and cumulus expansion are not affected by reproductive age or ploidy status. J. Assist. Reprod. Genet. 2023, 40, 1197–1213. [Google Scholar] [CrossRef] [PubMed]
- Berman, A.; Anteby, R.; Efros, O.; Klang, E.; Soffer, S. Deep learning for embryo evaluation using time-lapse: A systematic review of diagnostic test accuracy. Am. J. Obstet. Gynecol. 2023, 229, 490–501. [Google Scholar] [CrossRef] [PubMed]
- Thirumalaraju, P.; Kanakasabapathy, M.K.; Bormann, C.L.; Gupta, R.; Pooniwala, R.; Kandula, H.; Souter, I.; Dimitriadis, I.; Sha, H. Evaluation of deep convolutional neural networks in classifying human embryo images based on their morphological quality. Heliyon 2021, 7, e06298. [Google Scholar] [CrossRef] [PubMed]
- Charnpinyo, N.; Suthicharoenpanich, K.; Onthuam, K.; Engphaiboon, S.; Chaichaowarat, R.; Suebthawinkul, C.; Siricharoen, P. Embryo selection for IVF using machine learning techniques based on light microscopic images of embryo and additional factors. In Proceedings of the 2023 45th Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), Sydney, Australia, 24–27 July 2023. [Google Scholar]
- Goodfellow, I.J.; Pouget-Abadie, J.; Mirza, M.; Xu, B.; Warde-Farley, D.; Ozair, S.; Courville, A.; Bengio, Y. Generative adversarial networks. Commun. ACM 2020, 63, 139–144. [Google Scholar] [CrossRef]
- Dirvanauskas, D.; Maskeliūnas, R.; Raudonis, V.; Damaševičius, R.; Scherer, R. HEMIGEN: Human Embryo Image Generator Based on Generative Adversarial Networks. Sensors 2019, 19, 3578. [Google Scholar] [CrossRef]
- Shen, C.; Lamba, A.; Zhu, M.; Zhang, R.; Zernicka-Goetz, M.; Yang, C. Stain-free detection of embryo polarization using deep learning. Sci. Rep. 2022, 12, 2404. [Google Scholar] [CrossRef]
- Wu, C.; Yan, W.; Li, H.; Li, J.; Wang, H.; Chang, S.; Yu, T.; Jin, Y.; Ma, C.; Luo, Y.; et al. A classification system of day 3 human embryos using deep learning. Biomed. Signal Process. Control 2021, 70, 102943. [Google Scholar] [CrossRef]
- Liao, Q.; Zhang, Q.; Feng, X.; Huang, H.; Xu, H.; Tian, B.; Liu, J.; Yu, Q.; Guo, N.; Liu, Q.; et al. Development of deep learning algorithms for predicting blastocyst formation and quality by time-lapse monitoring. Commun. Biol. 2021, 4, 415. [Google Scholar] [CrossRef]
- VerMilyea, M.; Hall, J.M.; Diakiw, S.M.; Johnston, A.; Nguyen, T.; Perugini, D.; Miller, A.; Picou, A.; Murphy, A.P.; Perugini, M.; et al. Development of an artificial intelligence-based assessment model for prediction of embryo viability using static images captured by optical light microscopy during IVF. Hum. Reprod. 2020, 35, 770–784. [Google Scholar] [CrossRef]
- Liu, H.; Zhang, Z.; Gu, Y.; Dai, C.; Shan, G.; Song, H.; Li, D.; Chen, W.; Lin, G.; Sun, Y. Development and evaluation of a live birth prediction model for evaluating human blastocysts: A retrospective study. Elife 2023, 12, e83662. [Google Scholar] [CrossRef]
- Miyagi, Y.; Habara, T.; Hirata, R.; Hayashi, N. Predicting a live birth by artificial intelligence incorporating both the blastocyst image and conventional embryo evaluation parameters. Artif. Intell. Med. Imaging 2020, 1, 94–107. [Google Scholar] [CrossRef]
- Thompson, S.M.; Onwubalili, N.; Brown, K.; Jindal, S.K.; McGovern, P.G. Blastocyst expansion score and trophectoderm morphology strongly predict successful clinical pregnancy and live birth following elective single embryo blastocyst transfer (eSET): A national study. J. Assist. Reprod. Genet. 2013, 30, 1577–1581. [Google Scholar] [CrossRef] [PubMed] [PubMed Central]
- Bakkensen, J.B.; Brady, P.; Carusi, D.; Romanski, P.; Thomas, A.M.; Racowsky, C. Association between blastocyst morphology and pregnancy and perinatal outcomes following fresh and cryopreserved embryo transfer. J. Assist. Reprod. Genet. 2019, 36, 2315–2324. [Google Scholar] [CrossRef] [PubMed] [PubMed Central]
- Bradski, G. The OpenCV Library. Dr. Dobb’s J. Softw. Tools 2000, 120, 122–125. [Google Scholar]
- He, K.; Zhang, X.; Ren, S.; Sun, J. Deep residual learning for image recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016. [Google Scholar]
- Tan, M.; Le, Q.V. Efficientnet: Rethinking model scaling for convolutional neural networks. In Proceedings of the 36th International Conference on Machine Learning, Long Beach, CA, USA, 9–15 June 2019; Available online: http://arxiv.org/abs/1905.11946 (accessed on 13 July 2024).
- Dai, Z.; Liu, H.; Le, Q.V.; Tan, M. Coatnet: Marrying convolution and attention for all data sizes. Adv. Neural Inf. Process. Syst. 2021, 34, 3965–3977. [Google Scholar]
- Chollet, F. Xception: Deep learning with depthwise separable convolutions. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017. [Google Scholar]
- Dosovitskiy, A.; Beyer, L.; Kolesnikov, A.; Weissenborn, D.; Zhai, X.; Unterthiner, T.; Dehghani, M.; Minderer, M.; Heigold, G.; Gelly, S.; et al. An image is worth 16x16 words: Transformers for image recognition at scale. arXiv 2020, arXiv:2010.11929. [Google Scholar]
- Mirza, M.; Osindero, S. Conditional Generative Adversarial Nets. arXiv 2014, arXiv:1411.1784. [Google Scholar]
- Arjovsky, M.; Chintala, S.; Bottou, L. Wasserstein GAN. arXiv 2017, arXiv:1701.07875. [Google Scholar]
- Chen, T.; Kornblith, S.; Norouzi, M.; Hinton, G. A simple framework for contrastive learning of visual representations. In Proceedings of the 37th International Conference on Machine Learning, Virtual Event, 13–18 July 2020. [Google Scholar]
- Akiba, T.; Sano, S.; Yanase, T.; Ohta, T.; Koyama, M. Optuna: A next-generation hyperparameter optimization framework. In Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, Anchorage, AK, USA, 4–8 August 2019. [Google Scholar]
| Factors | All, N (%) | Pregnant, N (%) | Non-Pregnant, N (%) |
|---|---|---|---|
| Age (years, mean ± SD) | 37.8 ± 4.0 | 36.8 ± 3.7 | 38.2 ± 4.0 |
| Stage of Embryo | | | |
| Cavitate/Early Cavitate | 379 (31.74) | 77 (20.75) | 302 (36.70) |
| Early Blastocyst | 44 (3.66) | 18 (4.85) | 26 (3.16) |
| Blastocyst | 74 (6.20) | 26 (7.00) | 48 (5.83) |
| Expanded | 476 (39.87) | 151 (40.70) | 325 (39.49) |
| Hatching/Hatched | 221 (18.51) | 99 (26.68) | 122 (14.82) |
| Inner Cell Mass (ICM) | | | |
| Good | 366 (30.65) | 133 (35.85) | 233 (28.31) |
| Fair | 295 (24.71) | 111 (29.92) | 184 (22.36) |
| Poor | 110 (9.21) | 32 (8.63) | 78 (9.48) |
| No Label | 423 (35.43) | 95 (25.61) | 328 (39.85) |
| Trophectoderm (TE) | | | |
| Good | 205 (17.17) | 91 (24.53) | 114 (13.85) |
| Fair | 395 (33.08) | 146 (39.35) | 249 (30.26) |
| Poor | 171 (14.32) | 39 (10.51) | 132 (16.04) |
| No Label | 423 (35.43) | 95 (25.61) | 328 (39.85) |
| Model | F1-Score | Sensitivity | Specificity | Accuracy | AUC * |
|---|---|---|---|---|---|
| ResNet-34 | 51.03% | 57.14% | 50.00% | 52.27% | 60.0% |
| EfficientNet-B0 | 61.45% | 70.00% | 59.33% | 62.73% | 67.2% |
| CoAtNet-2 | 60.34% | 60.00% | 64.00% | 62.73% | 65.0% |
| Xception | 56.55% | 50.00% | 64.67% | 60.00% | 62.3% |
| ViT-Tiny-S16 | 56.56% | 44.59% | 69.33% | 61.36% | 64.1% |
| Model | F1-Score | Sensitivity | Specificity | Accuracy | AUC * |
|---|---|---|---|---|---|
| EfficientNet-B0 | 61.45% | 70.00% | 59.33% | 62.73% | 67.2% |
| EfficientNet-B0 + Three Labelled Features (Stage, ICM *, TE *) + Age | 59.84% | 55.41% | 66.67% | 63.18% | 64.33% |
| EfficientNet-B0 + Three Pseudo Features (Stage, ICM, TE) + Age | 61.72% | 54.05% | 70.91% | 65.59% | 63.01% |
| Model | F1-Score | Sensitivity | Specificity | Accuracy | AUC * |
|---|---|---|---|---|---|
| Pseudo Features | 61.72% | 54.05% | 70.91% | 65.69% | 63.01% |
| Pseudo Features + Optuna | 65.02% | 56.76% | 74.55% | 69.04% | 66.98% |
| Custom-weight model trained using GAN * images | 62.54% | 52.70% | 73.33% | 66.95% | 64.78% |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Onthuam, K.; Charnpinyo, N.; Suthicharoenpanich, K.; Engphaiboon, S.; Siricharoen, P.; Chaichaowarat, R.; Suebthawinkul, C. Combined Input Deep Learning Pipeline for Embryo Selection for In Vitro Fertilization Using Light Microscopic Images and Additional Features. J. Imaging 2025, 11, 13. https://doi.org/10.3390/jimaging11010013