Has DGL implemented FastGCN or LADIES?
In the GitHub issue "Any example with the layer wise sampling strategies like FastGCN, AdaptiveSampling?" (dmlc/dgl#578), I found a method for implementing FastGCN using dgl.contrib.sampling.LayerSampler. However, I couldn't find this class in the latest version of DGL. Is dgl.dataloading.LaborSampler the same thing? Additionally, it seems that LaborSampler cannot be applied to distributed node classification.
Are there any examples of implementing distributed FastGCN and LADIES?
DGL currently does not have a FastGCN or LADIES implementation. LABOR sampling is indeed a kind of layer sampling, but we have no plans for distributed LABOR sampling either. Why would you like to use layer sampling instead of neighbor/subgraph sampling?
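To make the distinction concrete, here is a minimal, self-contained sketch of FastGCN-style layer-wise sampling (illustrative only, not a DGL API): each layer draws a fixed-size node set for the whole batch, weighted by an importance proxy, so the sampled subgraph cannot grow exponentially the way node-wise neighbor sampling can. The function name and the degree-based weighting are assumptions for the sketch; FastGCN proper weights nodes by squared adjacency column norms.

```python
import random

def layerwise_sample(adj, seed_nodes, fanout, num_layers, rng):
    """Toy FastGCN-style sampler.

    adj: dict mapping node -> list of in-neighbors.
    Returns one node set per layer, input layer first, seeds last.
    """
    layers = [set(seed_nodes)]
    for _ in range(num_layers):
        frontier = layers[-1]
        # Candidate pool: union of in-neighbors of the current frontier.
        candidates = sorted({u for v in frontier for u in adj.get(v, [])})
        if not candidates:
            layers.append(set())
            continue
        # FastGCN weights candidates by squared adjacency column norms;
        # on an unweighted toy graph we approximate with degree + 1.
        weights = [len(adj.get(u, [])) + 1 for u in candidates]
        k = min(fanout, len(candidates))
        # Draw at most `fanout` nodes for the WHOLE layer (not per node).
        layers.append(set(rng.choices(candidates, weights=weights, k=k)))
    return layers[::-1]

rng = random.Random(0)
adj = {0: [1, 2], 1: [2, 3], 2: [3], 3: [0], 4: [0, 1]}
layers = layerwise_sample(adj, seed_nodes=[0, 4], fanout=2, num_layers=2, rng=rng)
# Every intermediate layer holds at most `fanout` nodes, no matter how
# many neighbors the frontier has -- the key contrast with neighbor sampling.
```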
From my understanding, it is a matter of writing wrappers around the existing dgl.sampling.sample_labors inside python/dgl/distributed/graph_services.py (on the master branch of dmlc/dgl on GitHub), similar to how it is done for sample_neighbors. LaborSampler in DGL implements this paper: "Layer-Neighbor Sampling -- Defusing Neighborhood Explosion in GNNs" (arXiv:2210.13339).
I want to use FastGCN as my baseline, so how can I do layer-wise sampling in DGL?