How to Implement the Renormalization Trick

Hi, thanks for making this awesome graph library.
I’m trying to implement a GCN and reproduce the results in Semi-Supervised Classification with Graph Convolutional Networks.
With the help of the Graph Convolutional Network tutorial, I was able to build a simple GCN and get reasonable accuracy.
But my implementation of the adjacency-matrix normalization did not go well.
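As I understand it, the renormalization trick in the paper replaces A with

\tilde{A} = A + I_N, \quad \hat{A} = \tilde{D}^{-1/2} \tilde{A} \tilde{D}^{-1/2}, \quad H^{(l+1)} = \sigma(\hat{A} H^{(l)} W^{(l)})

where \tilde{D} is the degree matrix of \tilde{A}, so each side of the aggregation gets scaled by deg^{-1/2}.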
I tried

import torch.nn as nn
import dgl.function as fn
# NodeApplyModule (linear layer + activation) is taken from the DGL GCN tutorial

class GCN(nn.Module):
    def __init__(self, in_feats, out_feats, activation):
        super(GCN, self).__init__()
        self.apply_mod = NodeApplyModule(in_feats, out_feats, activation)

    def forward(self, g, feature):
        g.ndata['h'] = feature
        # --- normalization ---
        # scale features by 1 / sqrt(deg) before sending messages
        g.ndata['h'] = g.ndata['h'] / g.ndata['sqrt_deg']
        g.update_all(fn.copy_src(src='h', out='m'),
                     fn.sum(msg='m', out='m_sum'))
        # scale the aggregated messages by 1 / sqrt(deg) again
        g.ndata['h'] = g.ndata['m_sum'] / g.ndata['sqrt_deg']
        # ---------------------
        g.apply_nodes(func=self.apply_mod)
        return g.ndata.pop('h')

with

import torch as th
from dgl import DGLGraph

g = DGLGraph(data.graph)  # `data` is the citation-graph dataset object (Cora)

# per-node out-degree, reshaped to a column vector (Cora has 2708 nodes)
g.ndata['deg'] = g.out_degrees(g.nodes()).float()
g.ndata['deg'] = g.ndata['deg'].view(2708, 1)
g.ndata['sqrt_deg'] = th.sqrt(g.ndata['deg'])

But this approach gave poor accuracy…
Could you give me some advice?

We have one implementation of GCN with normalization, which you can find here.

You need to do the normalization in both the message and the reduce procedures; see Line 50 ~ Line 56 of the code mentioned above.
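In case it helps other readers, here is a rough sketch of that idea with user-defined message and reduce functions (the 'norm' field holds deg^{-1/2}; the names are illustrative, not the exact code from the example):

def gcn_msg(edges):
    # scale each message by the source node's 1/sqrt(deg)
    return {'m': edges.src['h'] * edges.src['norm']}

def gcn_reduce(nodes):
    # sum incoming messages, then scale by the destination node's 1/sqrt(deg)
    return {'h': nodes.mailbox['m'].sum(dim=1) * nodes.data['norm']}

# inside the layer's forward():
#     g.ndata['h'] = feature
#     g.update_all(gcn_msg, gcn_reduce)
#     h = g.ndata.pop('h')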


To me, your implementation actually looks correct. As @VoVAllen said, starting from our example code may be a good idea. There are several tricks besides the renormalization, such as adding self-loops to the graph and dropout. The dataset can also be a concern if it is not the typical Cora/Citeseer. Let us know if you cannot reproduce the accuracy using our examples.
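For example, the self-loop and normalization setup could look roughly like this (just a sketch; the 'norm' field name and exact calls are illustrative, not copied verbatim from the example):

import torch as th

# add a self-loop on every node so each node also keeps its own feature
g.add_edges(g.nodes(), g.nodes())

# precompute D^{-1/2}, guarding against isolated (degree-0) nodes
degs = g.in_degrees(g.nodes()).float()
norm = th.pow(degs, -0.5)
norm[th.isinf(norm)] = 0
g.ndata['norm'] = norm.unsqueeze(1)

# dropout is then just an nn.Dropout between the graph conv layers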

Thanks for the replies!
Referring to that implementation, I got almost exactly the expected results.