How to retrieve attention weights from GAT?

Hi,

I saw the GAT implementation and I wanted to retrieve the attention weights (equation 3) from
https://docs.dgl.ai/en/latest/tutorials/models/1_gnn/9_gat.html

However, the values of alpha = F.softmax(nodes.mailbox['e'], dim=1) do not appear to be right (they're uniformly distributed when I print them).

How can I get these values?

Thanks!

Hi,

You are right, these values are actually distributed uniformly. However, if a node has only one in-edge, the attention weight on that edge will be 1 after the softmax operation. That's why you see diverse attention weights in the visualization: it's mainly due to the degree distribution.
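To illustrate the point above, here is a minimal NumPy sketch of the per-destination softmax that GAT's equation 3 performs (the same normalization as F.softmax(nodes.mailbox['e'], dim=1) in the tutorial). The graph and logit values are made up for illustration; note how a node with a single in-edge gets attention weight exactly 1, regardless of its logit.

```python
import numpy as np

def edge_softmax(logits, dst):
    # Normalize edge attention logits separately for each destination
    # node -- the same operation as GAT's equation 3.
    alpha = np.zeros_like(logits)
    for node in np.unique(dst):
        mask = dst == node
        e = np.exp(logits[mask] - logits[mask].max())  # numerically stable softmax
        alpha[mask] = e / e.sum()
    return alpha

# Hypothetical toy graph: node 0 has two in-edges, node 1 has one.
logits = np.array([0.5, 1.5, 2.0])
dst = np.array([0, 0, 1])

alpha = edge_softmax(logits, dst)
# Node 0's two edges share attention (non-uniform because logits differ);
# node 1's single edge gets weight 1 no matter what its logit is.
```

If most nodes in a graph have similar logits on their in-edges, the weights look near-uniform; the "diverse" weights in the visualization come largely from low-degree nodes like node 1 here.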

One of our team members, @mufeili, is doing research on this; you can find the paper here. Based on his research, the PPI dataset shows significantly diverse attention weights, unlike Cora and other datasets. A more detailed version will be released soon, so please stay tuned.


Is there any code example of extracting the attention weight for each edge?

@bean Does this help?


Thanks, that helps. I can do it now.