GraphSage unsupervised loss function

The paper “Inductive Representation Learning on Large Graphs” (GraphSAGE) describes an unsupervised loss function based on random walks (Section 3.2) that is similar to DeepWalk and node2vec, but I couldn’t find an implementation of it anywhere.

Is there an implementation of this loss function in DGL, or a plan to implement it?

For instance, for a binary cross-entropy loss you can just use torch.nn.functional.binary_cross_entropy.
Is there a similar function for the authors’ loss from Section 3.2?

Thank you all!

Is this example helpful?

Not really… That example does not implement the loss function I was talking about from the authors’ research paper (Section 3.2; see the image below).

It seems that I have to code it myself.
But in another post (GraphSage unsupervised loss function based on randomwalks - Questions - Deep Graph Library (dgl.ai)) you said that an example would be implemented. Did you do it?

Thanks a lot for your fast reply! It is not what I wanted, but it can still be useful for implementing the loss function I was talking about.

Hi @minjie, did you find something to help me, or has no one coded this loss function yet?

Thank you.

I think https://github.com/dmlc/dgl/blob/master/examples/pytorch/graphsage/advanced/train_lightning_unsupervised.py#L66 is very close to the equation you mentioned above, if not exactly the same. You just minimize the negative log-likelihood for positive edges and maximize it for negative edges.
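For reference, the Section 3.2 loss in the paper is J(z_u) = -log σ(z_u⊤ z_v) - Q · E_{v_n ~ P_n(v)} [log σ(-z_u⊤ z_{v_n})], where v co-occurs with u on a random walk and the v_n are Q negative samples. A minimal standalone sketch in PyTorch could look like the following (the function name and the tensor shapes are my own assumptions, not DGL API; it uses the identity that -log σ(x) is binary cross-entropy with logits against a target of 1, and -log σ(-x) against a target of 0):

```python
import torch
import torch.nn.functional as F

def graphsage_unsup_loss(z_u, z_pos, z_neg):
    """Sketch of the GraphSAGE Section 3.2 random-walk loss.

    z_u:   (B, d)    embeddings of source nodes
    z_pos: (B, d)    embeddings of nodes co-occurring with each source on a walk
    z_neg: (B, Q, d) embeddings of Q negative samples per source node
    """
    # Dot-product scores for positive pairs: (B,)
    pos_score = (z_u * z_pos).sum(dim=-1)
    # Dot-product scores for negative pairs: (B, Q)
    neg_score = torch.bmm(z_neg, z_u.unsqueeze(-1)).squeeze(-1)
    # -log sigma(score) == BCE-with-logits against target 1
    pos_loss = F.binary_cross_entropy_with_logits(
        pos_score, torch.ones_like(pos_score))
    # -log sigma(-score) == BCE-with-logits against target 0
    # (averaging over Q negatives rather than summing; scale by Q if you
    #  want the paper's exact weighting)
    neg_loss = F.binary_cross_entropy_with_logits(
        neg_score, torch.zeros_like(neg_score))
    return pos_loss + neg_loss
```

You would feed it embeddings produced by your GraphSAGE encoder, with positive pairs drawn from fixed-length random walks and negatives from a unigram-style noise distribution, as in the paper.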

