Graph Representation Learning

Sparsity-Aware Communication for Distributed Graph Neural Network Training

Graph Neural Networks (GNNs) offer a computationally efficient way to learn embeddings and classifications on graph data. However, GNN training has low computational intensity, which makes communication costs the bottleneck for scalability. Sparse-matrix …
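
As a point of reference, the aggregation step that dominates GNN training is a sparse-times-dense matrix product (SpMM). The sketch below shows a single GCN-style layer written this way; the graph size, feature width, and layer shapes are illustrative assumptions, not figures from the paper.

import numpy as np
import scipy.sparse as sp

# Illustrative sizes only: number of nodes, input features, hidden width.
n_nodes, n_feats, n_hidden = 1_000, 64, 32

A = sp.random(n_nodes, n_nodes, density=0.001, format="csr")  # sparse adjacency
H = np.random.rand(n_nodes, n_feats)                          # dense node features
W = np.random.rand(n_feats, n_hidden)                         # dense layer weights

# Neighborhood aggregation is SpMM (sparse x dense): few flops per byte moved,
# so when A and H are partitioned across processes this step is communication-bound.
Z = A @ H
# Feature transformation is an ordinary dense GEMM followed by a nonlinearity.
H_next = np.maximum(Z @ W, 0.0)
print(H_next.shape)  # (1000, 32)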

Communication-Avoiding Algorithms for Full-Batch and Mini-Batch GNN Training

Distributed Matrix-Based Sampling for Graph Neural Network Training

Graph Neural Networks (GNNs) offer a compact and computationally efficient way to learn embeddings and classifications on graph data. GNN models are frequently large, making distributed minibatch training necessary. The primary contribution of this …
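
To make the matrix-based sampling idea concrete, here is a minimal single-process sketch in which one hop of minibatch neighborhood expansion is written as a sparse-sparse matrix product; the sizes and the absence of fanout limits are assumptions for illustration, not the distributed algorithm from the paper.

import numpy as np
import scipy.sparse as sp

n_nodes, batch_size = 1_000, 32                               # illustrative sizes
A = sp.random(n_nodes, n_nodes, density=0.001, format="csr")  # adjacency matrix

# Selection matrix Q: one row per minibatch vertex with a single 1 in its column.
batch = np.random.choice(n_nodes, size=batch_size, replace=False)
Q = sp.csr_matrix(
    (np.ones(batch_size), (np.arange(batch_size), batch)),
    shape=(batch_size, n_nodes),
)

# One hop of expansion as SpGEMM: row i of Q @ A holds the neighbors of batch
# vertex i. Repeating this per GNN layer yields the sampled multi-hop subgraph.
frontier = Q @ A
neighbors = np.unique(frontier.indices)
print(batch_size, "batch vertices expand to", neighbors.size, "unique neighbors")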

Reducing Communication in Graph Neural Network Training

Graph Neural Networks (GNNs) are powerful and flexible neural networks that use the naturally sparse connectivity information of the data. GNNs represent this connectivity as sparse matrices, which have lower arithmetic intensity and thus higher …
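
A rough back-of-the-envelope comparison of the arithmetic intensity of SpMM versus an equivalently shaped dense GEMM illustrates the gap the abstract refers to. The problem sizes and the read-everything-once memory model below are assumptions, not measured numbers.

# Assumed problem sizes: nodes, feature width, average degree of 30.
n, f = 100_000, 128
nnz = 30 * n
bytes_per_val = bytes_per_idx = 8

# SpMM A(n x n, nnz nonzeros) @ H(n x f): 2 flops per nonzero per feature column,
# reading each nonzero (value + index) and H once and writing the output once.
spmm_flops = 2 * nnz * f
spmm_bytes = nnz * (bytes_per_val + bytes_per_idx) + 2 * n * f * bytes_per_val

# Dense GEMM of the same logical shape: 2*n*n*f flops over n*n + 2*n*f values.
gemm_flops = 2 * n * n * f
gemm_bytes = (n * n + 2 * n * f) * bytes_per_val

print(f"SpMM arithmetic intensity ~ {spmm_flops / spmm_bytes:.1f} flops/byte")
print(f"GEMM arithmetic intensity ~ {gemm_flops / gemm_bytes:.1f} flops/byte")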