Understanding Graph Classification with GraphConv

Dear experts, please explain it to me like I am 5, because I am trying to learn Graph Classification from this and I don't understand what I should pass to forward in the feat parameter :frowning:

To be exact, I have a problem with the h parameter - I don't know what I should pass here.

From the GraphConv documentation I can read:

If a torch.Tensor is given, it represents the input feature of shape (N, D_in), where D_in is the size of the input feature and N is the number of nodes. If a pair of torch.Tensor is given, which is the case for a bipartite graph, the pair must contain two tensors of shape (N_in, D_in_src) and (N_out, D_in_dst).

https://docs.dgl.ai/api/python/nn.pytorch.html#dgl.nn.pytorch.conv.GraphConv.forward

I don’t understand it at all :frowning:
This is my simple raw example DGL graph. Could anyone explain, using this example, what I should pass to this parameter to perform graph classification?

Graph:
Graph(num_nodes=8, num_edges=28,
      ndata_schemes={'confidence': Scheme(shape=(), dtype=torch.float64), 'name_id': Scheme(shape=(), dtype=torch.int64), 'area': Scheme(shape=(), dtype=torch.float64)}
      edata_schemes={'distance': Scheme(shape=(), dtype=torch.float64)})
Node features:
{'confidence': tensor([0.7001, 0.8522, 0.5990, 0.5557, 0.5872, 0.5647, 0.5773, 0.7529],
       dtype=torch.float64), 'name_id': tensor([19, 19, 19, 19, 19, 19, 19, 19]), 'area': tensor([ 680., 1075.,  875., 1116.,  504.,  460.,  484.,  576.],
       dtype=torch.float64)}

g in def forward(g, h) is a batched graph, which is usually generated by dgl.dataloading.GraphDataLoader. h is a batched tensor that contains all the node features in the batched graph, so message passing runs on the batched graph with h as the input features.
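Here is a minimal sketch of what that could look like, assuming a toy dataset of (graph, label) pairs built to mimic the graph in the question; the random graphs, label values, and sizes below are placeholders, not code from this thread:

import dgl
import torch
from dgl.dataloading import GraphDataLoader

# Toy stand-in dataset: random graphs carrying the same node features as the
# example graph above (confidence, name_id, area), plus a binary graph label.
dataset = []
for _ in range(8):
    g = dgl.rand_graph(num_nodes=8, num_edges=28)
    g.ndata['confidence'] = torch.rand(8, dtype=torch.float64)
    g.ndata['name_id'] = torch.full((8,), 19)
    g.ndata['area'] = torch.rand(8, dtype=torch.float64) * 1000
    dataset.append((g, torch.randint(0, 2, (1,)).item()))

# GraphDataLoader merges several small graphs into one batched graph per step.
dataloader = GraphDataLoader(dataset, batch_size=4, shuffle=True)

for batched_graph, labels in dataloader:
    # h stacks the per-node scalars column-wise: one row per node in the batch.
    h = torch.stack([
        batched_graph.ndata['confidence'].float(),
        batched_graph.ndata['name_id'].float(),
        batched_graph.ndata['area'].float(),
    ], dim=1)                                  # shape: (total_nodes_in_batch, 3)
    print(batched_graph.num_nodes(), h.shape, labels.shape)

Each iteration yields one batched graph plus one label per original graph, and h lines up row-for-row with the nodes of that batched graph.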


Thank you! Now I understand the batched graph in the g parameter.

About the h parameter: should I create a tensor like this with only the node features? I have 3 features, so do I need to combine all of them into one tensor like this?

tensor([[0.7001,  19.,  680.],
        [0.8522,  19., 1075.],
        [0.5990,  19.,  875.],
        [0.5557,  19., 1116.],
        [0.5872,  19.,  504.],
        [0.5647,  19.,  460.],
        [0.5773,  19.,  484.],
        [0.7529,  19.,  576.]])

Or should I also somehow add the edge features to this tensor? Or not?

Yes. In the example, GraphConv is used, so h.shape[0] should be the number of nodes.
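For completeness, here is a rough sketch of how these pieces could fit together for graph classification. The model architecture, hidden size, number of classes, and the random stand-in graph are my own placeholders, not something prescribed by DGL:

import dgl
import torch
import torch.nn as nn
import torch.nn.functional as F
from dgl.nn.pytorch import GraphConv

class GraphClassifier(nn.Module):
    """Two GraphConv layers, a mean readout, and a linear classifier."""
    def __init__(self, in_feats=3, hidden_feats=16, num_classes=2):
        super().__init__()
        self.conv1 = GraphConv(in_feats, hidden_feats)
        self.conv2 = GraphConv(hidden_feats, hidden_feats)
        self.classify = nn.Linear(hidden_feats, num_classes)

    def forward(self, g, h):
        # h has shape (num_nodes, in_feats): one row per node, as noted above.
        h = F.relu(self.conv1(g, h))
        h = F.relu(self.conv2(g, h))
        with g.local_scope():
            g.ndata['h'] = h
            hg = dgl.mean_nodes(g, 'h')   # one pooled vector per graph
            return self.classify(hg)      # graph-level logits

# Stand-in for the 8-node graph in the post, with the same feature names.
g = dgl.rand_graph(num_nodes=8, num_edges=28)
g.ndata['confidence'] = torch.rand(8, dtype=torch.float64)
g.ndata['name_id'] = torch.full((8,), 19)
g.ndata['area'] = torch.rand(8, dtype=torch.float64) * 1000
g = dgl.add_self_loop(g)   # GraphConv rejects graphs with 0-in-degree nodes

# Combine the three per-node scalars into h of shape (8, 3).
h = torch.stack([
    g.ndata['confidence'].float(),
    g.ndata['name_id'].float(),
    g.ndata['area'].float(),
], dim=1)

model = GraphClassifier(in_feats=3, num_classes=2)
logits = model(g, h)          # shape: (1, num_classes) for this single graph

Note that edge features such as 'distance' are not consumed in this sketch; only the node feature matrix h is passed to GraphConv.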

