add_self_loop on a batched graph loses batch information

Hi,

If I use
g = dgl.add_self_loop(g)
on a batched graph, the resulting graph is no longer batched (g.batch_num_nodes() now returns a single entry instead of one entry per graph). Is this a bug or intended behavior?
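
A minimal repro sketch of what I mean, assuming two small hand-built graphs (the exact graphs are just for illustration):

```python
import dgl

# Two small graphs, batched together.
g1 = dgl.graph(([0, 1], [1, 2]))   # 3 nodes, 2 edges
g2 = dgl.graph(([0], [1]))         # 2 nodes, 1 edge
bg = dgl.batch([g1, g2])

print(bg.batch_num_nodes())        # tensor([3, 2]) -- per-graph counts

bg = dgl.add_self_loop(bg)
print(bg.batch_num_nodes())        # single entry now -- batch info is gone
```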

Hi, this is intended behavior: any graph transformation discards the batch information. As a workaround, you can record the batch sizes before the transform and restore them afterwards with set_batch_num_nodes and set_batch_num_edges, as in the sketch below.
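
A minimal sketch of the workaround, assuming add_self_loop adds exactly one self-loop per node (its default), so each graph gains as many edges as it has nodes:

```python
import dgl

g1 = dgl.graph(([0, 1], [1, 2]))
g2 = dgl.graph(([0], [1]))
bg = dgl.batch([g1, g2])

# Save the batch information before the transform.
num_nodes = bg.batch_num_nodes()
num_edges = bg.batch_num_edges()

bg = dgl.add_self_loop(bg)

# Restore it afterwards. Each graph gained one self-loop edge per node,
# so its edge count grows by its node count.
bg.set_batch_num_nodes(num_nodes)
bg.set_batch_num_edges(num_edges + num_nodes)

print(bg.batch_num_nodes())  # tensor([3, 2]) again
print(bg.batch_num_edges())  # tensor([5, 3])
```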
