FreshGNN: Reducing Memory Access via Stable Historical Embeddings for Graph Neural Network Training
Published In
Proceedings of the VLDB Endowment
Publisher
VLDB Endowment
Qualifiers
- Research-article