What is the best way to implement a Tree Convolution?

I was reading the TreeGen paper (https://arxiv.org/pdf/1911.09983.pdf) and wanted to implement its TreeConvolution:

[screenshot: the tree convolution equation from the paper]

I was wondering whether this can be done with PyTorch Geometric or DGL (or something similar), or whether the best way is simply to generate the adjacency matrix, multiply my feature vectors by it, and then apply a normal convolution?
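For the adjacency-matrix route, here is a minimal NumPy sketch (with a made-up toy tree given as parent pointers, nothing from the paper) of building the adjacency matrix and mixing node features with it:

```python
import numpy as np

# Toy tree as parent pointers: node i's parent is parent[i] (-1 = root).
parent = [-1, 0, 0, 1, 1]          # hypothetical 5-node tree
n = len(parent)

# Adjacency matrix with undirected parent <-> child edges.
A = np.zeros((n, n))
for child, p in enumerate(parent):
    if p >= 0:
        A[child, p] = A[p, child] = 1.0

# Node features, one row per node.
H = np.random.randn(n, 8)

# One "convolution" step: each node aggregates its neighbours' features.
H_agg = A @ H
print(H_agg.shape)  # (5, 8)
```

After this aggregation you would apply a learned linear map and nonlinearity, like a normal convolution over the mixed features.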

cross-posted:

The formula seems similar to SIGN, where you have the concatenation of [Y, YM, …, YM^k]. Maybe you could check the sign example?
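For reference, the SIGN-style precomputation just stacks repeated applications of the propagation matrix to the features and concatenates them. A rough NumPy sketch, with Y and M standing for the feature and propagation matrices from the notation above (M is an identity matrix here just to keep the toy example simple):

```python
import numpy as np

n, d, k = 5, 4, 3
Y = np.random.randn(n, d)   # node features, one row per node
M = np.eye(n)               # propagation matrix (e.g. normalized adjacency); identity for this toy

# Build [Y, MY, M^2 Y, ..., M^k Y] and concatenate along the feature axis.
blocks = [Y]
for _ in range(k):
    blocks.append(M @ blocks[-1])
Z = np.concatenate(blocks, axis=1)
print(Z.shape)  # (5, 16) -> d * (k + 1) features per node
```

The concatenated features can then be fed to an ordinary feedforward model, which is the appeal of SIGN: the graph propagation is precomputed.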


will check it out for sure! Thanks for the pointer.

On a more general note, is there a way to insert a (any) graph NN layer into a feedforward model I already have in PyTorch, using DGL?

e.g.

imagine I wanted to implement a sequential model with a GCN:

from collections import OrderedDict
import torch.nn as nn
import dgl.nn as dglnn

f = nn.Sequential(OrderedDict([
    ('f1', nn.Linear(Din, Dout)),
    ('out', nn.ReLU()),
    ('gcn1', dglnn.GraphConv(Dout, Dout))  # a GraphConv also expects the graph, not just features
]))
y: Tensor = f(x)

Something like that. Do you know if that is possible? @BarclayII

I don’t think it’s possible for nn.Sequential because nn.Sequential expects the modules to take in a single argument. This is not the case for DGL NN modules.

Apologies, I should have been clearer. I didn't really expect it to go into nn.Sequential blindly; that is why the parameters were passed in the example, since obviously the layer will need the adjacency matrix. My pain point is whether I can easily use the DGL layers in a stack of layers in a model, just plugging them in as long as I supply the right inputs (e.g. the adjacency matrix), since it looks like a GCN layer is just:

H^{l+1} = GCN^l(H^l, A) = ReLU( A @ H^l @ W^l )

where @ is matrix multiplication.
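To make the formula concrete, here is a one-function NumPy sketch of that single layer (the toy adjacency and dimensions are made up, and the usual normalization of A is omitted):

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN layer: ReLU(A @ H @ W)."""
    return np.maximum(A @ H @ W, 0.0)

n, Din, Dout = 5, 8, 4
A = np.eye(n)                    # toy adjacency (self-loops only)
H = np.random.randn(n, Din)      # node features
W = np.random.randn(Din, Dout)   # layer weights

H_next = gcn_layer(A, H, W)
print(H_next.shape)  # (5, 4)
```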

Is this what you are looking for? dgl.nn.pytorch.utils.Sequential
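For anyone curious what that utility does conceptually, here is a rough pure-Python sketch of the idea (not DGL's actual implementation): a Sequential-like container that threads the graph through the layers that need it. The class, the needs_graph flag, and the toy "layers" are all hypothetical.

```python
class GraphSequential:
    """Sketch of a Sequential that threads a graph argument through.

    Layers flagged as graph-aware are called as layer(graph, feat);
    plain layers are called as layer(feat).
    """

    def __init__(self, *layers):
        self.layers = layers  # sequence of (layer, needs_graph) pairs

    def __call__(self, graph, feat):
        for layer, needs_graph in self.layers:
            feat = layer(graph, feat) if needs_graph else layer(feat)
        return feat


# Toy usage with plain functions standing in for nn modules:
double = lambda x: [v * 2 for v in x]                   # "feedforward" layer
add_degree = lambda g, x: [v + g["degree"] for v in x]  # fake graph-aware layer

f = GraphSequential((double, False), (add_degree, True))
out = f({"degree": 1}, [1, 2, 3])
print(out)  # [3, 5, 7]
```

The real DGL utility handles nn.Module layers; this just illustrates why a plain nn.Sequential cannot work when some layers need an extra graph argument.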

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.