


20th PP 2022: Seattle, WA, USA
- Xiaoye S. Li, Keita Teranishi: Proceedings of the 2022 SIAM Conference on Parallel Processing for Scientific Computing, PP 2022, Seattle, WA, USA, February 23-26, 2022. SIAM 2022, ISBN 978-1-61197-714-1
- Xinran Zhu, Yang Liu, Pieter Ghysels, David Bindel, Xiaoye S. Li: GPTuneBand: Multi-task and Multi-fidelity Autotuning for Large-scale High Performance Computing Applications. 1-13
- Yannick Funk, Markus Götz, Hartwig Anzt: Prediction of Optimal Solvers for Sparse Linear Systems Using Deep Learning. 14-24
- Alice Gatti, Zhixiong Hu, Tess E. Smidt, Esmond G. Ng, Pieter Ghysels: Deep Learning and Spectral Embedding for Graph Partitioning. 25-36
- Malachi Phillips, Stefan Kerkemeier, Paul F. Fischer: Tuning Spectral Element Preconditioners for Parallel Scalability on GPUs. 37-48
- Shelby Lockhart, David J. Gardner, Carol S. Woodward, Stephen Thomas, Luke N. Olson: Performance of Low Synchronization Orthogonalization Methods in Anderson Accelerated Fixed Point Solvers. 49-59
- François Pacaud, Michel Schanen, Daniel Adrian Maldonado, Alexis Montoison, Valentin Churavy, Julian Samaroo, Mihai Anitescu: Batched Second-Order Adjoint Sensitivity for Reduced Space Methods. 60-71
- Tim Baer, Raghavendra Kanakagiri, Edgar Solomonik: Parallel Minimum Spanning Forest Computation using Sparse Matrix Kernels. 72-83
