Applying RelGraphConv with edge weights

Hi guys

I’m looking to apply the Relational Graph Convolution layer (RelGraphConv) in my model and was wondering about the “norm” argument of its forward function.

Does this mean that, when a tensor of shape (|E|, 1) is provided, the message aggregation will use its entries as edge weights, “upgrading” the formula from:

h_i^{(l+1)} = \sigma\left( \sum_{r \in \mathcal{R}} \sum_{j \in \mathcal{N}^r_i } \frac{1}{c_{i,r}} W_r^{(l)} h_j^{(l)} + W_0^{(l)} h_i^{(l)} \right)

to

h_i^{(l+1)} = \sigma\left( \sum_{r \in \mathcal{R}} \sum_{j \in \mathcal{N}^r_i } \frac{e_{ij}}{c_{i,r}} W_r^{(l)} h_j^{(l)} + W_0^{(l)} h_i^{(l)} \right)

(in this scenario, the weights e_{ij} would be contained in the tensor passed as the norm argument).

Thanks for your answer! 🙂

If norm is None (the default value), the formula is:

h_i^{(l+1)} = \sigma\left( \sum_{r \in \mathcal{R}} \sum_{j \in \mathcal{N}^r_i } W_r^{(l)} h_j^{(l)} + W_0^{(l)} h_i^{(l)} \right)

If norm is passed a tensor of shape (|E|, 1) whose elements are e_{ji}, the formula is:

h_i^{(l+1)} = \sigma\left( \sum_{r \in \mathcal{R}} \sum_{j \in \mathcal{N}^r_i } e_{ji} W_r^{(l)} h_j^{(l)} + W_0^{(l)} h_i^{(l)} \right)
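
For reference, a minimal sketch of what this could look like in code (assuming dgl.nn.RelGraphConv and the norm keyword of its forward method as discussed above; the toy graph, relation ids, feature sizes, and weight tensor below are made up for illustration):

```python
import torch
import dgl
from dgl.nn import RelGraphConv

# Toy graph with 4 nodes, 5 edges, and 2 relation types (hypothetical example).
src = torch.tensor([0, 1, 2, 3, 0])
dst = torch.tensor([1, 2, 3, 0, 2])
g = dgl.graph((src, dst), num_nodes=4)

etypes = torch.tensor([0, 1, 0, 1, 0])        # one relation id per edge
feat = torch.randn(4, 8)                      # node features, shape (|V|, in_feat)
edge_weight = torch.rand(g.num_edges(), 1)    # e_{ji} per edge, shape (|E|, 1)

conv = RelGraphConv(in_feat=8, out_feat=16, num_rels=2)

# Without norm: plain (unnormalized) sum over neighbors per relation.
h_plain = conv(g, feat, etypes)

# With norm: each message W_r h_j is scaled by the corresponding entry of norm,
# so the tensor can carry per-edge weights.
h_weighted = conv(g, feat, etypes, norm=edge_weight)
```

Note that, per the formulas above, the norm tensor replaces the coefficient entirely, so if you also want the 1/c_{i,r} normalization from the original R-GCN formula you would fold it into the tensor you pass as norm (e.g., pass e_{ji}/c_{i,r}).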

Thanks for your answer!

@MrRyschkov I’m sorry, my previous comment was not correct. Please check the updated one.

