What is the difference between norm 'both' and 'right' in the GraphConv module?

Hello,
A quick question. What is the difference between norm 'both' and 'right' in the GraphConv module in DGL 0.5?
I guess that both 'both' and 'right' amount to averaging the received messages.
Thanks in advance.

Let’s consider vanilla message passing, where I omit the learnable weight and bias:

h'_{v} = \sum_{u\in \mathcal{N}(v)}h_u,

where \mathcal{N}(v) is the set of (in-)neighbors of v and h_u is the original representation of node u.
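A minimal sketch of this un-normalized aggregation with DGL's message passing API (the tiny graph and random features below are just made-up examples):

```python
import torch
import dgl
import dgl.function as fn

# Made-up tiny directed graph: edges 0->1, 1->2, 2->0, 0->2.
g = dgl.graph(([0, 1, 2, 0], [1, 2, 0, 2]))
g.ndata['h'] = torch.randn(3, 4)

# Plain sum aggregation: each node sums the features of its in-neighbors.
g.update_all(fn.copy_u('h', 'm'), fn.sum('m', 'h_new'))
print(g.ndata['h_new'])
```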

The “both” normalization gives:

h'_{v} = \sum_{u\in \mathcal{N}(v)}\frac{1}{\sqrt{d_u \hat{d}_v}}h_u,

where d_u is the out-degree of node u and \hat{d}_v is the in-degree of node v.
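If it helps, here is a rough sketch of how you could reproduce the "both" normalization by hand and compare it against GraphConv with the weight and bias disabled (the toy graph, feature size, and tolerance are my own assumptions, not anything specific in DGL):

```python
import torch
import dgl
import dgl.function as fn
from dgl.nn import GraphConv

# Toy graph where every node has nonzero in- and out-degree.
g = dgl.graph(([0, 1, 2, 0], [1, 2, 0, 2]))
h = torch.randn(3, 4)

# Manual "both" normalization: scale by 1/sqrt(out-degree of u) before sending,
# sum the messages, then scale by 1/sqrt(in-degree of v) after receiving.
g.ndata['h'] = h * g.out_degrees().float().clamp(min=1).rsqrt().unsqueeze(-1)
g.update_all(fn.copy_u('h', 'm'), fn.sum('m', 'agg'))
h_both = g.ndata['agg'] * g.in_degrees().float().clamp(min=1).rsqrt().unsqueeze(-1)

# GraphConv with weight/bias turned off should match the manual computation.
conv = GraphConv(4, 4, norm='both', weight=False, bias=False)
print(torch.allclose(conv(g, h), h_both, atol=1e-6))  # expected: True
```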

The “right” normalization gives:

h'_{v} = \sum_{u\in \mathcal{N}(v)}\frac{1}{\hat{d}_v}h_u.
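And the same kind of hand check for "right" (again a sketch on a made-up toy graph):

```python
import torch
import dgl
import dgl.function as fn
from dgl.nn import GraphConv

g = dgl.graph(([0, 1, 2, 0], [1, 2, 0, 2]))
h = torch.randn(3, 4)

# Manual "right" normalization: sum the incoming features, then divide by the
# in-degree, i.e. take the mean of the received messages.
g.ndata['h'] = h
g.update_all(fn.copy_u('h', 'm'), fn.sum('m', 'agg'))
h_right = g.ndata['agg'] / g.in_degrees().float().clamp(min=1).unsqueeze(-1)

conv = GraphConv(4, 4, norm='right', weight=False, bias=False)
print(torch.allclose(conv(g, h), h_right, atol=1e-6))  # expected: True
```

So "right" is exactly the average of the received messages, while "both" is the symmetric normalization from the GCN paper, which only coincides with averaging when the relevant degrees are equal.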

Thanks mufeili, very clear.