# Applying Rel-conv with edge weights

Hi guys

I’m looking to apply the Relational Graph convolution layer in my model and was wondering about the “norm” argument of the forward function.

Does this mean that, when provided with a tensor of shape (|E|, 1), the message aggregation will use these values as edge weights, “upgrading” the formula from:

h_i^{(l+1)} = \sigma\left( \sum_{r \in \mathcal{R}} \sum_{j \in \mathcal{N}^r_i } \frac{1}{c_{i,r}} W_r^{(l)} h_j^{(l)} + W_0^{(l)} h_i^{(l)} \right)

to

h_i^{(l+1)} = \sigma\left( \sum_{r \in \mathcal{R}} \sum_{j \in \mathcal{N}^r_i } \frac{e_{ij}}{c_{i,r}} W_r^{(l)} h_j^{(l)} + W_0^{(l)} h_i^{(l)} \right)

(in this scenario, the weights e_{ij} would be contained in the tensor passed as the norm argument).

To summarise my understanding: if norm is None (the default value), the formula will be

h_i^{(l+1)} = \sigma\left( \sum_{r \in \mathcal{R}} \sum_{j \in \mathcal{N}^r_i } W_r^{(l)} h_j^{(l)} + W_0^{(l)} h_i^{(l)} \right)

and if norm is passed a tensor of shape (|E|, 1) whose elements are the e_{ij}, the formula will be

h_i^{(l+1)} = \sigma\left( \sum_{r \in \mathcal{R}} \sum_{j \in \mathcal{N}^r_i } e_{ij} W_r^{(l)} h_j^{(l)} + W_0^{(l)} h_i^{(l)} \right)
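In case it helps to make the question concrete, here is how I picture the two cases behaving. This is just a plain-numpy sketch of the formulas above as I understand them (function and variable names are mine, not the actual layer implementation), where norm=None is treated as all edge weights equal to 1:

```python
import numpy as np

def rgcn_layer(h, edges, etypes, W_rel, W_self, norm=None):
    """One relational graph conv step, per my reading of the formula.

    h:      (N, d_in) node features
    edges:  list of (j, i) pairs, message flows j -> i
    etypes: relation id r for each edge
    W_rel:  (num_rels, d_in, d_out) relation weights W_r
    W_self: (d_in, d_out) self-loop weight W_0
    norm:   optional (|E|, 1) per-edge weights e_ij; None -> all ones
    """
    # self-loop term: W_0 h_i
    out = h @ W_self
    # per-edge weights: 1 everywhere if norm is None
    w = np.ones((len(edges), 1)) if norm is None else norm
    # aggregate weighted, relation-specific messages: e_ij * W_r h_j
    for k, (j, i) in enumerate(edges):
        out[i] += w[k, 0] * (h[j] @ W_rel[etypes[k]])
    # \sigma: ReLU as an example nonlinearity
    return np.maximum(out, 0)
```

With this reading, passing norm=np.ones((E, 1)) should give the same result as norm=None, and scaling an entry of norm should scale that edge's contribution accordingly.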