Abstract
Scientific data- and computing-intensive applications are becoming increasingly widespread. Different computing solutions use different protocols and architectures, so they must be chosen carefully when designing the computing projects of large scientific communities. In a modern world of diverse computing resources, such as grids, clouds, and supercomputers, the choice can be difficult. Software developed to integrate various computing and storage resources into a single infrastructure, the so-called interware, makes this choice easier. The DIRAC interware is one such product. It has proved to be an effective solution for many experiments in High Energy Physics and other areas of science, providing seamless access to distributed computing and storage resources. The DIRAC interware was deployed at the Joint Institute for Nuclear Research (JINR) to serve the needs of different scientific groups by providing a single interface to a variety of computing resources: the grid cluster, the computing cloud, the Govorun supercomputer, and the disk and tape storage systems. A DIRAC-based solution was proposed for the currently operating Baryonic Matter at Nuclotron (BM@N) experiment, as well as for the future Multi-Purpose Detector (MPD) experiment at the Nuclotron-based Ion Collider fAcility (NICA). Both experiments have requirements that make the use of heterogeneous computing resources necessary. A set of tests was run to demonstrate the performance of the JINR distributed computing system.
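The core idea described above, one submission interface hiding several heterogeneous back-ends, can be sketched in a few lines of Python. This is a minimal illustrative toy, not the DIRAC API: all class names, resource names, and methods here are invented for the sketch.

```python
# Toy sketch of the "interware" idea: a single entry point that routes
# jobs to heterogeneous back-ends (grid, cloud, HPC). Illustrative only;
# none of these names come from DIRAC itself.

class Resource:
    def __init__(self, name, kind):
        self.name = name
        self.kind = kind  # e.g. "grid", "cloud", "hpc"

    def run(self, payload):
        # A real back-end would translate the job into its own protocol
        # (grid CE submission, cloud VM, batch system); here we just echo.
        return f"{payload} on {self.name} ({self.kind})"


class Interware:
    """Single submission interface hiding resource-specific protocols."""

    def __init__(self):
        self.resources = {}

    def register(self, resource):
        self.resources[resource.kind] = resource

    def submit(self, payload, kind):
        # Route the job to a resource of the requested type.
        return self.resources[kind].run(payload)


iw = Interware()
iw.register(Resource("Tier1/Tier2 grid cluster", "grid"))
iw.register(Resource("JINR cloud", "cloud"))
iw.register(Resource("Govorun", "hpc"))

print(iw.submit("simulation job", "hpc"))
# → simulation job on Govorun (hpc)
```

In the real system, the routing decision is made by DIRAC's workload management from job requirements rather than an explicit `kind` argument, but the structural point is the same: users see one interface regardless of where the job lands.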
Copyright information
© 2020 Springer Nature Switzerland AG
About this paper
Cite this paper
Korenkov, V., Pelevanyuk, I., Tsaregorodtsev, A. (2020). Integration of the JINR Hybrid Computing Resources with the DIRAC Interware for Data Intensive Applications. In: Elizarov, A., Novikov, B., Stupnikov, S. (eds) Data Analytics and Management in Data Intensive Domains. DAMDID/RCDL 2019. Communications in Computer and Information Science, vol 1223. Springer, Cham. https://doi.org/10.1007/978-3-030-51913-1_3
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-51912-4
Online ISBN: 978-3-030-51913-1