RGCN and skip connection in hidden layer


I am getting confused by the message_func of the RGCN example provided in the documentation.
As I understand it, in the case of a hidden layer you multiply the weights by the output of the previous layer h, but then you multiply that result again by norm, which, as I understand it, corresponds to the initial values of the network edges.

def message_func(edges):
    # pick the relation-specific weight matrix for each edge
    w = weight[edges.data['rel_type']]
    # batched matrix product: transform the source node's hidden state h
    # (squeeze(1) rather than squeeze() so a single-edge batch keeps its shape)
    msg = torch.bmm(edges.src['h'].unsqueeze(1), w).squeeze(1)
    # scale by the per-edge normalization constant
    msg = msg * edges.data['norm']
    return {'msg': msg}

What am I missing?

I don’t fully get the question here.

When you aggregate messages on nodes, you usually want a weighted summation to keep things stable; otherwise the features of nodes with many edges will gradually explode. The weight here is the normalization. This normalization constant (i.e. edges.data['norm']) is determined purely by the graph structure: it is the reciprocal of the number of incoming edges of the same relation type at the destination node (the 1/c_{i,r} term in the RGCN paper).
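To make that concrete, here is a minimal sketch (using a hypothetical toy edge list in plain PyTorch, not the DGL API) of how such a per-edge norm can be derived from the graph structure alone:

```python
import torch

num_nodes = 3
num_rels = 2
# each row is one edge: (src, dst, rel_type) -- a made-up toy graph
edges = torch.tensor([[0, 2, 0],
                      [1, 2, 0],
                      [0, 2, 1]])
dst, rel = edges[:, 1], edges[:, 2]

# count incoming edges per (destination node, relation type) pair
counts = torch.zeros(num_nodes, num_rels)
counts.index_put_((dst, rel), torch.ones(len(edges)), accumulate=True)

# per-edge norm: reciprocal of that count (1 / c_{i,r})
norm = 1.0 / counts[dst, rel]
print(norm)  # node 2 has two rel-0 in-edges -> 0.5, 0.5; one rel-1 -> 1.0
```

Multiplying each message by this norm before summing is equivalent to taking a per-relation mean over incoming messages, which is what keeps high-degree nodes from blowing up.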

Thank you for your feedback. Indeed, that question was a bit naive and reflected my understanding at the time. I thought the model integrated not only the number of connections but also their values.
My reputation was too low to delete the post when I realized it was not a constructive question.

@YohanObadia Don’t worry. People have questions here and there. I won’t delete your post as others may have similar questions as yours.