Gradients passing through dgl.edge_subgraph

Hi all,

Firstly, thank you for the excellent package. I have a question about whether gradients can back-propagate through subgraph functions such as edge_subgraph(). I am training a model that samples edges based on certain probabilities, and the essential requirement is for gradients to pass through edge_subgraph() so the model can be trained effectively. Could you clarify whether this is feasible? If not, do you have any suggestions for enabling it?

Thanks in advance

If you are referring to backpropagating through edge features in the subgraph, then the answer is yes, the gradient can pass through it.
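For a quick sanity check, here is a minimal sketch (assuming DGL's PyTorch backend; the toy 4-edge cycle graph and the feature name `'w'` are made up for illustration) showing that a loss computed on a subgraph's edge features produces gradients on the parent graph's feature tensor:

```python
import dgl
import torch

# Toy graph: a 4-node directed cycle with one learnable feature per edge.
g = dgl.graph(([0, 1, 2, 3], [1, 2, 3, 0]))
w = torch.randn(g.num_edges(), requires_grad=True)
g.edata['w'] = w

# Take the subgraph induced by edges 0 and 2; edge features are carried over.
sg = dgl.edge_subgraph(g, torch.tensor([0, 2]))

# A loss on the subgraph's features backpropagates to the parent tensor.
sg.edata['w'].sum().backward()
print(w.grad)  # non-zero at positions 0 and 2, zero elsewhere
```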

But it seems that you are trying to sample edge subgraphs based on probabilities. Do you intend to backpropagate through the probabilities? If so, since sampling is a non-differentiable operation, you will likely need something like policy gradients (see, e.g., the torch.distributions module in the PyTorch documentation).
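As a rough illustration of that workaround, here is a minimal REINFORCE-style sketch; the `logits` vector, the toy reward, and the optimizer settings are placeholders for whatever your model actually produces. The sampling step itself carries no gradient, so the edge probabilities are trained through the score-function (log-prob) term instead:

```python
import dgl
import torch
from torch.distributions import Bernoulli

g = dgl.graph(([0, 1, 2, 3], [1, 2, 3, 0]))
logits = torch.zeros(g.num_edges(), requires_grad=True)  # hypothetical learnable parameters
opt = torch.optim.SGD([logits], lr=0.1)

dist = Bernoulli(logits=logits)
mask = dist.sample()                    # 0/1 per edge; no gradient through this step
sg = dgl.edge_subgraph(g, mask.bool())  # sample an edge subgraph via a boolean mask

reward = -float(sg.num_edges())               # toy reward: prefer smaller subgraphs
loss = -reward * dist.log_prob(mask).sum()    # REINFORCE surrogate loss
opt.zero_grad()
loss.backward()                               # gradient flows into `logits`
opt.step()
```

In practice, a baseline term is usually subtracted from the reward to reduce the variance of this estimator.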

