GraphSAGE unsupervised loss function based on random walks

The paper “Inductive Representation Learning on Large Graphs” (GraphSAGE) describes an unsupervised loss function based on random walks (Sec. 3.2), similar to DeepWalk and node2vec, but I couldn’t find an implementation of it anywhere.
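For reference, this is the loss from Sec. 3.2 (Eq. 1) as I read it, where `v` is a node that co-occurs with `u` on a fixed-length random walk, `P_n` is a negative-sampling distribution, and `Q` is the number of negative samples:

```latex
J_{\mathcal{G}}(\mathbf{z}_u) = -\log\bigl(\sigma(\mathbf{z}_u^\top \mathbf{z}_v)\bigr)
  - Q \cdot \mathbb{E}_{v_n \sim P_n(v)} \log\bigl(\sigma(-\mathbf{z}_u^\top \mathbf{z}_{v_n})\bigr)
```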

Is there an implementation of this loss function in DGL, or a plan to implement it?
Thanks.

I don’t know whether the team plans to add this to the source, but in case you want to implement it yourself:

I didn’t find any example implementations of random walks, but I found this function in the source. The other conceptual piece of the loss function is negative sampling, which should be relatively straightforward to implement after evaluating a batch. The rest of the loss function is just sigmoids, dot products, etc. I also found this PR, which seems highly relevant to implementing GraphSAGE with negative sampling. Hope this helps.
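To make the “sigmoids and dot products” part concrete, here is a minimal PyTorch sketch of the random-walk loss. The function name and tensor shapes are my own conventions, not a DGL API; it assumes you have already computed embeddings for anchor nodes, their random-walk co-occurring nodes, and `Q` negative samples per anchor:

```python
import torch
import torch.nn.functional as F

def unsupervised_loss(z_u, z_pos, z_neg):
    """Sketch of the GraphSAGE random-walk loss (Eq. 1 of the paper).

    z_u:   (B, d)    embeddings of anchor nodes
    z_pos: (B, d)    embeddings of nodes co-occurring on a random walk
    z_neg: (B, Q, d) embeddings of Q negative samples per anchor
    """
    # Dot product between each anchor and its positive: (B,)
    pos_score = (z_u * z_pos).sum(dim=-1)
    # Dot products between each anchor and its Q negatives: (B, Q)
    neg_score = torch.bmm(z_neg, z_u.unsqueeze(-1)).squeeze(-1)
    # -log sigma(z_u . z_v) for positives
    pos_loss = -F.logsigmoid(pos_score)
    # -sum over Q of log sigma(-z_u . z_vn) for negatives
    neg_loss = -F.logsigmoid(-neg_score).sum(dim=-1)
    return (pos_loss + neg_loss).mean()
```

In practice you would draw `z_neg` from a degree-biased distribution (the paper follows word2vec’s unigram^0.75 heuristic) and feed all three tensors through the same GraphSAGE encoder.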


@mohamadmahdi3 @zjost We do plan to implement an unsupervised GraphSAGE example. This is one of the motivations driving our design of the new sampler APIs (RFC here). We have finished core functionality such as random walks and neighbor sampling. Please stay tuned; we will push the example soon.
