Dropout during inference in the GraphSAGE unsupervised example

Hi all,

I’m trying to understand the GraphSAGE inference mechanism, running the version on the master branch, and I ran into a problem:

I’ve saved the trained model and am trying to get the embedding of one of the nodes in the RedditDataset, node 0. I’ve verified that this node is always the same by looking at its features and neighbors.

I’m running this code:

import torch as th
from dgl.data import RedditDataset

model = th.load('....')   # the saved trained model
data = RedditDataset(self_loop=False)
g = data[0]
pred = model.inference(g, g.ndata['feat'], 2, 'cpu')   # embeddings for all nodes
print(pred[0])

And every time I run it, I get a different answer:

tensor([-0.0519,  0.0181,  0.2280,  0.1237, -0.0898,  0.2659, -0.0409,  0.1525,
        -0.1326,  0.1273, -0.2317,  0.1212, -0.1560,  0.1527,  0.0564, -0.2442],

tensor([-0.0705,  0.0263,  0.2385,  0.1227, -0.0850,  0.2589, -0.0523,  0.1527,
        -0.1339,  0.1320, -0.2215,  0.1180, -0.1632,  0.1539,  0.0366, -0.2489],

tensor([-0.0587,  0.0167,  0.2268,  0.1195, -0.0910,  0.2601, -0.0573,  0.1521,
        -0.1338,  0.1367, -0.2224,  0.1124, -0.1642,  0.1414,  0.0500, -0.2448],

From my understanding, the output should not change at all, since the features and the model are fixed. What am I missing?

Thanks!
Ben


So, it seems that the problem is actually the dropout applied during inference, found here:

I can’t find it in the original GraphSAGE paper. They do reference a paper that uses dropout to achieve better results, but I’m not sure that dropout is applied during inference there as well.
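Here is a minimal sketch in plain PyTorch (not the DGL example itself) of what a dropout module does in training versus evaluation mode; in training mode every forward pass draws a new random mask, which would explain the nondeterministic embeddings above:

import torch as th
import torch.nn as nn

drop = nn.Dropout(p=0.5)
x = th.ones(8)

drop.train()       # training mode: a new random mask is drawn on every call
print(drop(x))     # different elements are zeroed (and the rest scaled by 2) each run
print(drop(x))

drop.eval()        # evaluation mode: dropout becomes the identity
print(drop(x))     # always returns x unchanged
print(drop(x))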

Let me know what you think, I’ll be happy to submit a PR to fix it.

Thanks,
Ben

You’ll need to call model.eval() before the inference statement. That switches the underlying dropout modules (and batch normalization, if used) to evaluation mode, which is deterministic.
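For example, something like the following (a sketch reusing the snippet from your first post, with the same elided model path) should return the same embedding on every run:

import torch as th
from dgl.data import RedditDataset

model = th.load('....')                                  # same saved model as above
model.eval()                                             # put dropout (and batch norm) into eval mode
data = RedditDataset(self_loop=False)
g = data[0]
with th.no_grad():                                       # gradients aren't needed for inference
    pred = model.inference(g, g.ndata['feat'], 2, 'cpu')
print(pred[0])                                           # identical across runs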

The code in train_sampling_unsupervised.py already wraps the inference() call with model.eval(), so it should work fine there.

Thank you very much, it works 🙂
