About add_self_loop and edge weights during message passing

Hi @mufeili,
I want to ask a question about the add_self_loop function.
Here is a toy example:
g = dgl.graph((torch.tensor([0, 1, 2, 0]), torch.tensor([1, 2, 0, 2])))

g.ndata['feat'] = torch.tensor([1, 2, 3])

g.edata['w'] = torch.tensor([1, 2, 3, 4])

When I call this function, g = dgl.add_self_loop(g),

the edge data becomes {'w': tensor([1, 2, 3, 4, 0, 0, 0])}, i.e. a 0 is appended for each new self-loop edge.

Then, if I apply edge weights during message passing with a GCN layer, the weight of each self-loop is 0, so the add_self_loop call doesn't change the message passing result, right?
Edge weight code from GraphConv:
graph.edata['_edge_weight'] = edge_weight
aggregate_fn = fn.u_mul_e('h', '_edge_weight', 'm')
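A quick way to see why a zero self-loop weight makes the added edges a no-op here: u_mul_e multiplies each source node feature by its edge weight before aggregation, so a weight of 0 zeroes out the self-loop message. A plain-torch sketch of that semantics, using the toy graph above (this is an illustration, not the DGL implementation):

```python
import torch

# edges after add_self_loop: original 4 edges, then one self-loop per node
src = torch.tensor([0, 1, 2, 0, 0, 1, 2])
dst = torch.tensor([1, 2, 0, 2, 0, 1, 2])
w = torch.tensor([1., 2., 3., 4., 0., 0., 0.])  # self-loop weights are 0
h = torch.tensor([1., 2., 3.])                  # node features

# u_mul_e: the message on each edge is source feature * edge weight
m = h[src] * w

# sum aggregation per destination node
out = torch.zeros(3).index_add_(0, dst, m)
# the self-loop messages are h * 0 = 0, so they contribute nothing
```

With weight 0 the output is identical to the graph without self-loops; with weight 1 each node would additionally receive its own feature.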

Your understanding is correct.

So should I change the edge weight from 0 to 1?

And here is a small suggestion about dgl.to_bidirected(g, copy_ndata=True):

this function only copies the node data, but sometimes the edge data needs to be copied as well.

I have constructed a unidirectional graph, but neither to_bidirected nor add_self_loop meets my needs: I want to copy both ndata and edata, and then add self-loops so I can apply edge weights in message passing.
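One manual workaround for keeping edge data, sketched under the assumption that every edge should simply get a reverse copy with the same weight (note this does not deduplicate edges the way to_bidirected does):

```python
import torch

# the original unidirectional edges and their weights
src = torch.tensor([0, 1, 2, 0])
dst = torch.tensor([1, 2, 0, 2])
w = torch.tensor([1., 2., 3., 4.])

# append each edge in the reverse direction and duplicate its weight
src2 = torch.cat([src, dst])
dst2 = torch.cat([dst, src])
w2 = torch.cat([w, w])

# then rebuild the graph and reattach the data, e.g.:
# g = dgl.graph((src2, dst2)); g.edata['w'] = w2
# g = dgl.add_self_loop(g)
```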


Hi @mufeili, I have another question: if my node features are randomly initialized, should I add them to the optimizer? Thanks.
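For reference, the usual pattern for making randomly initialized node features trainable is to register them as an nn.Embedding and pass its parameters to the optimizer alongside the model's. A minimal sketch (the Linear layer is a stand-in for a GNN layer; all names here are illustrative):

```python
import torch

# learnable, randomly initialized node features
num_nodes, feat_dim = 3, 4
node_embed = torch.nn.Embedding(num_nodes, feat_dim)
model = torch.nn.Linear(feat_dim, 2)  # stand-in for a GNN layer

# include the embedding parameters so they are updated during training
optimizer = torch.optim.Adam(
    list(model.parameters()) + list(node_embed.parameters()), lr=0.01
)
```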

  1. You should change the edge weight from 0 to 1.
  2. Thanks for the suggestions on improving to_bidirected and add_self_loop.
  3. Yes, you should add the randomly initialized node embedding to the optimizer.
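A hedged sketch of point 1, assuming add_self_loop appends exactly one self-loop per node at the end of the edge list (as in the toy example above):

```python
import torch

# edge weights after dgl.add_self_loop: zeros appended for the self-loops
w = torch.tensor([1., 2., 3., 4., 0., 0., 0.])
num_nodes = 3

# overwrite the appended self-loop weights with 1 before message passing
w[-num_nodes:] = 1.0
# g.edata['w'] = w  # write the fixed weights back to the graph
```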

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.