Is there a GraphSAINT example for distributed GNN training?

Hi, I’m trying to reproduce GraphSAINT in a distributed training scenario, based on the GraphSAINT and distributed training example code.

However, I found that GraphSAINT is implemented in an offline mode, which makes it hard to incorporate with dgl.DistDataLoader. Could anyone give me some suggestions?

Besides, are there examples implementing classical sampling strategies such as FastGCN, VR-GCN, and Cluster-GCN in distributed scenarios?

Looking forward to your reply.

Currently we do not have plans to support other sampling strategies in distributed training. Can your graph structure (without node/edge features) fit on a single machine? If so, you could try duplicating the graph structure on every machine.
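To see why duplicating only the structure is often feasible, here is a rough back-of-the-envelope estimate (the graph size and feature dimension below are hypothetical placeholders, not from this thread): the edge list is usually far smaller than the node features, so the structure can be replicated while features stay partitioned.

```python
# Hypothetical sizes for illustration only.
num_nodes = 10_000_000
num_edges = 100_000_000
feat_dim = 256

# Structure: two int64 endpoints per edge (COO layout), in GiB.
struct_gib = num_edges * 2 * 8 / 2**30

# Features: one float32 vector per node, in GiB.
feat_gib = num_nodes * feat_dim * 4 / 2**30

print(f"structure: {struct_gib:.2f} GiB, features: {feat_gib:.2f} GiB")
```

With these numbers the structure is around 1.5 GiB while the features are nearly 10 GiB, so replicating the structure on each machine is cheap relative to the feature storage that dgl.distributed partitions.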

cc @classicsong

@BarclayII Thanks for the prompt reply.

Yes, the graph structure can be stored in a single machine.

So I wonder if I can run the program this way: 1) pre-load the whole graph structure on each server and generate several sampled subgraphs offline with GraphSAINT; 2) use dgl.distributed to load the sampled subgraphs on each server and run the program in a distributed fashion.
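For step 1, a minimal framework-free sketch of GraphSAINT's node sampler (degree-proportional node sampling followed by subgraph induction) might look like the following. The function name, toy graph, and budget are illustrative assumptions, not DGL APIs; a real pipeline would use the sampler from the GraphSAINT example and serialize each subgraph for the partitioning step.

```python
import random

def saint_node_sample(edges, num_nodes, budget, rng=random):
    """Sketch of GraphSAINT-style node sampling on a plain edge list."""
    # Compute node degrees from the (undirected) edge list.
    deg = [0] * num_nodes
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    # Draw `budget` nodes with probability proportional to degree
    # (+1 smoothing so isolated nodes remain sampleable).
    nodes = set(rng.choices(range(num_nodes),
                            weights=[d + 1 for d in deg],
                            k=budget))
    # Induce the subgraph: keep edges with both endpoints sampled.
    sub_edges = [(u, v) for u, v in edges if u in nodes and v in nodes]
    return sorted(nodes), sub_edges

# Offline step: pre-generate several sampled subgraphs before training.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (1, 3)]
subgraphs = [saint_node_sample(edges, num_nodes=4, budget=3)
             for _ in range(5)]
```

Each pre-generated subgraph could then be saved to disk and fed to dgl.distributed's partitioning/loading machinery in step 2, assuming that machinery accepts the serialized subgraphs.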