Computing infrastructure construction and optimization for high-performance computing and artificial intelligence

  • Review Paper
  • Published:
CCF Transactions on High Performance Computing

Abstract

The emergence of supercomputers has brought rapid development to human life and scientific research. Today, the new wave of artificial intelligence (AI) not only brings convenience to people's lives but also changes engineering and scientific high-performance computation. AI technologies provide more efficient and accurate computing methods for many fields. These ongoing changes pose new challenges to the design of computing infrastructures, which this survey addresses in detail. The survey first describes the notable progress made by combining AI and high-performance computing (HPC) in scientific computation, analyzes several typical scenarios, and summarizes the characteristics of the corresponding computing-resource requirements. On this basis, it lists four general methods for integrating AI computing with conventional HPC, together with their key features and application scenarios. Finally, it introduces the design strategy of the Peng Cheng Cloud Brain II Supercomputing Center for improving AI computing capability and cluster communication efficiency, which helped it win first place in the IO500 and AIPerf rankings.



Availability of data and material

Not applicable.

Code availability

Not applicable.


Funding

Not applicable.

Author information

Authors and Affiliations

Authors

Corresponding author

Correspondence to Bin Zhou.

Ethics declarations

Conflict of interest

The authors declare no conflict of interest.


About this article


Cite this article

Su, Y., Zhou, J., Ying, J. et al. Computing infrastructure construction and optimization for high-performance computing and artificial intelligence. CCF Trans. HPC 3, 331–343 (2021). https://doi.org/10.1007/s42514-021-00080-x
