DOI: 10.1145/3635035.3635040
Research Article | Open Access

Parallelized Remapping Algorithms for km-scale Global Weather and Climate Simulations with Icosahedral Grid System

Published: 19 January 2024

Editorial Notes

A corrigendum was issued for this paper on February 13, 2024. You can download the corrigendum from the Supplemental Material section of this citation page.

Abstract

In weather and climate research, latitude–longitude grid data are typically used for analysis and visualization, and remapping from model-native grids to latitude–longitude grids often requires a significant amount of time. Here, we developed a series of parallelized remapping algorithms for NICAM, a global weather and climate model with an icosahedral grid system, and demonstrated their performance with global 14–0.87-km mesh model data on the supercomputer Fugaku. The original remapping tool in NICAM parallelizes only the reading and interpolation of data. In our proposed algorithms, data writing is also parallelized, either by separating the output into multiple files or by using the MPI-IO library; both approaches enable us to remap 0.87-km mesh data with 670 million horizontal grid points and 94 vertical levels. A benchmark with 14-km mesh data shows that the developed algorithms significantly outperform the original algorithm in elapsed time (by 7.4–8.7 times) and memory usage (by 2.8–5.0 times). Among the proposed algorithms, separating the output files, together with a reduced MPI communication size, gives better elapsed time and scalability, whereas the MPI-IO library gives lower memory usage. The number of remapped years per wall-clock day, assuming a six-hourly output interval, reaches 0.56 with 3.5-km mesh data, demonstrating that global cloud-resolving climate simulation data can be handled in a practical time. This study demonstrates the importance of I/O performance, including MPI-IO, in accelerating weather and climate research on future supercomputers.
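The write-side parallelization described above, in which each rank writes its portion of the remapped latitude–longitude field through the MPI-IO library, can be illustrated with a minimal sketch. This is not the authors' implementation: the grid dimensions, the row-based decomposition, and the output file name below are assumptions chosen only to show how a collective shared-file write works; the alternative file-separation algorithm would instead have each rank write its own file.

```c
/* Minimal sketch (not the NICAM remapping tool itself): every MPI rank
 * writes its latitude band of a remapped lat-lon field into one shared
 * binary file with a collective MPI-IO write. Grid sizes and the file
 * name are illustrative assumptions. */
#include <mpi.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank, nprocs;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

    /* Hypothetical output grid, decomposed into contiguous latitude rows;
     * assume nlat is divisible by the number of ranks for simplicity. */
    const int nlon = 256, nlat = 128;
    const int rows = nlat / nprocs;
    const int first_row = rank * rows;

    /* Fill this rank's slice with dummy "remapped" values. */
    double *slice = malloc((size_t)rows * nlon * sizeof *slice);
    for (int i = 0; i < rows * nlon; i++)
        slice[i] = (double)rank;

    /* All ranks open the same file and write at disjoint byte offsets. */
    MPI_File fh;
    MPI_File_open(MPI_COMM_WORLD, "latlon_field.bin",
                  MPI_MODE_CREATE | MPI_MODE_WRONLY, MPI_INFO_NULL, &fh);

    MPI_Offset offset = (MPI_Offset)first_row * nlon * sizeof(double);
    MPI_File_write_at_all(fh, offset, slice, rows * nlon, MPI_DOUBLE,
                          MPI_STATUS_IGNORE);

    MPI_File_close(&fh);
    free(slice);
    MPI_Finalize();
    return 0;
}
```

Compile with mpicc and launch with mpirun; MPI_File_write_at_all is a standard MPI-IO collective, so the MPI library can aggregate the per-rank slices into larger contiguous writes to the shared file.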

Supplementary Material

PDF File (3635040-corrigendum.pdf)
Corrigendum to "Parallelized Remapping Algorithms for km-scale Global Weather and Climate Simulations with Icosahedral Grid System" by Kodama et al., Proceedings of the International Conference on High Performance Computing in Asia-Pacific Region (HPCAsia '24).

Published In

HPCAsia '24: Proceedings of the International Conference on High Performance Computing in Asia-Pacific Region
January 2024
185 pages
ISBN:9798400708893
DOI:10.1145/3635035
This work is licensed under a Creative Commons Attribution 4.0 International License.

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. Icosahedral grid system
  2. MPI
  3. MPI-IO
  4. Remapping
  5. Weather and climate simulation

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Funding Sources

  • MEXT
  • JSPS KAKENHI

Conference

HPCAsia 2024

Acceptance Rates

Overall Acceptance Rate 69 of 143 submissions, 48%
