DynaHB: A Communication-Avoiding Asynchronous Distributed Framework with Hybrid Batches for Dynamic GNN Training
Published by the VLDB Endowment.