About batch training

When I return the sum of all edge features from my GNN model's forward function, batching doesn't work. Is there an alternative to the sum function?
my code looks like

def forward(self, g, h):
    g.edata['h'] = h
    h = self.nn_layers(h)  # (E, edge_out_feats), where E is the number of edges
    return h.sum(dim=0)

In my opinion, the problem is that a batched graph in DGL is one large graph, so the summed return value can no longer be separated back into per-graph results.

I assume you want to sum over all the edge features for each graph individually? If so, try dgl.sum_edges.
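Conceptually, `dgl.sum_edges(g, 'h')` performs a segment sum over the batched graph: the stacked edge features are summed separately for each component graph, using the per-graph edge counts. Here is a minimal plain-NumPy sketch of that idea (the function name `sum_edges_per_graph` is made up for illustration; it is not DGL's API):

```python
import numpy as np

def sum_edges_per_graph(edge_feats, batch_num_edges):
    """Segment-sum sketch of what dgl.sum_edges does conceptually.

    edge_feats: (E_total, D) stacked edge features of a batched graph.
    batch_num_edges: list with the number of edges in each component graph.
    Returns a (num_graphs, D) array with one summed feature row per graph.
    """
    out = []
    start = 0
    for n in batch_num_edges:
        out.append(edge_feats[start:start + n].sum(axis=0))
        start += n
    return np.stack(out)

# Two graphs batched together: 2 edges and 3 edges, feature dim 2.
feats = np.array([[1., 0.], [2., 0.],             # graph 0
                  [0., 1.], [0., 2.], [0., 3.]])  # graph 1
print(sum_edges_per_graph(feats, [2, 3]))
# [[3. 0.]
#  [0. 6.]]
```

With real DGL, you would simply call `dgl.sum_edges(g, 'h')` on the batched graph and get a `(batch_size, D)` tensor back, instead of the single collapsed vector that a plain `.sum()` produces.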


Wow! It worked! Thanks a lot!
Do I have to read the entire DGL documentation to find the functions I need, or should I just be imaginative and observant: notice how functions are named and trust that a graph-processing library ships with such functions? :thinking:

What you are trying to do is quite similar to graph classification, which is essentially what GIN does. I would say the examples and tutorials (e.g. Training a GNN for Graph Classification — DGL 0.7.1 documentation) are a good starting point.

You're right. Actually, I have also tried GIN on my own datasets.

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.