# Is my understanding of the GCN implementation right?

```python
import torch
import torch.nn as nn

def gcn_message(edges):
    return {'msg': edges.src['h']}

def gcn_reduce(nodes):
    return {'h': torch.sum(nodes.mailbox['msg'], dim=1)}

class GCNLayer(nn.Module):
    def __init__(self, in_feats, out_feats):
        super(GCNLayer, self).__init__()
        self.linear = nn.Linear(in_feats, out_feats)

    def forward(self, g, inputs):
        # set node features, trigger message passing, then apply the linear transform
        g.ndata['h'] = inputs
        g.send(g.edges(), gcn_message)
        g.recv(g.nodes(), gcn_reduce)
        h = g.ndata.pop('h')
        return self.linear(h)
```
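For intuition, the sum-aggregation this layer performs (message = source node's features, reduce = sum over incoming edges) is equivalent to multiplying the feature matrix by the transposed adjacency matrix. Here is a minimal NumPy sketch of that equivalence; the toy graph and the weight matrix `W` (standing in for `self.linear`, bias omitted) are made up for illustration:

```python
import numpy as np

num_nodes, in_feats, out_feats = 4, 3, 2
# hypothetical edge list (src, dst)
edges = [(0, 1), (0, 2), (1, 2), (2, 3)]

# adjacency matrix: A[src, dst] = 1
A = np.zeros((num_nodes, num_nodes))
for src, dst in edges:
    A[src, dst] = 1.0

h = np.random.rand(num_nodes, in_feats)   # input node features
W = np.random.rand(in_feats, out_feats)   # stands in for self.linear (no bias)

# each node sums the features of its in-neighbors, then a linear transform
h_agg = A.T @ h          # row i = sum of h[src] over edges (src, i)
out = h_agg @ W
print(out.shape)         # (4, 2)
```

Node 2 has incoming edges from nodes 0 and 1, so row 2 of `h_agg` is `h[0] + h[1]`; a node with no incoming edges aggregates to zeros.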

My question is about `gcn_message` and `gcn_reduce`.

If we pass all the edges into `gcn_message`, does it mean that each edge will send a message from its source node?

If we pass all the nodes into `gcn_reduce`, does it mean that each node will receive the messages from all of its incoming edges?

If I edit `gcn_reduce` to

```python
def gcn_reduce(nodes):
    recv_msg = nodes.mailbox['msg']          # shape: [1, 9, 34]
    recv_msg2 = torch.sum(recv_msg, dim=1)   # shape: [1, 34]
    return {'h': recv_msg2}
```

and stop at it in the debugger the first time it is called, what does the shape `[1, 9, 34]` mean in the code above?

Thank you.

DGL does degree bucketing to accelerate the computation: nodes with the same in-degree are batched together and reduced in one call. For example, if both node 1 and node 2 have 3 incoming edges, their incoming messages are batched together, so the shape of `nodes.mailbox['msg']` would be `[2, 3, feat_size]`, i.e. [2 (bucket size: two nodes share the same in-degree), 3 (each node's in-degree), feat_size].
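To illustrate that bucketing, here is a minimal NumPy sketch (not DGL's actual implementation; the toy graph and feature size are invented for the example). Messages are collected per destination node, destinations are grouped by in-degree, and each group's mailbox is stacked into a `[bucket_size, degree, feat_size]` tensor:

```python
import numpy as np

feat_size = 4
# hypothetical edge list (src, dst) on 5 nodes
edges = [(0, 3), (1, 3), (2, 3), (0, 4), (1, 4), (2, 4), (3, 0)]
h = np.arange(5 * feat_size, dtype=float).reshape(5, feat_size)  # node features

# message phase: each edge carries its source node's feature to its destination
msgs = {}
for src, dst in edges:
    msgs.setdefault(dst, []).append(h[src])

# degree bucketing: group destination nodes by in-degree
buckets = {}
for dst, mlist in msgs.items():
    buckets.setdefault(len(mlist), []).append(mlist)

for degree, mlists in sorted(buckets.items()):
    mailbox = np.stack([np.stack(mlist) for mlist in mlists])
    # mailbox.shape == (bucket_size, degree, feat_size)
    reduced = mailbox.sum(axis=1)  # what gcn_reduce computes for this bucket
    print(degree, mailbox.shape, reduced.shape)
```

Here nodes 3 and 4 both have in-degree 3, so they share one bucket with mailbox shape `(2, 3, 4)`, while node 0 (in-degree 1) sits alone in a `(1, 1, 4)` bucket. A shape of `[1, 9, 34]` therefore means: a bucket of 1 node, with in-degree 9, and 34-dimensional messages.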

Thank you!

If we pass all the edges into `gcn_message`, does it mean that each edge will send a message from its source node?

If we pass all the nodes into `gcn_reduce`, does it mean that each node will receive the messages from all of its incoming edges?

Am I right?

I am sorry, that should read:

"both node 1 and node 2 have 3 incoming edges"