GNNExplainer with minibatch training

Hi @mufeili, does GNNExplainer work with minibatch training?

GNNExplainer can be applied to any trained GNN model. However, your model needs to satisfy the requirements described in the GNNExplainer documentation.

Can you elaborate?

For example, I have a model trained with minibatching:

import dgl
import dgl.nn
import torch.nn as nn
import torch.nn.functional as F

class TwoLayerGCN(nn.Module):
    def __init__(self, in_features, hidden_features, out_features):
        super().__init__()
        self.conv1 = dgl.nn.GraphConv(in_features, hidden_features)
        self.conv2 = dgl.nn.GraphConv(hidden_features, out_features)

    def forward(self, blocks, x):
        # Minibatch forward: each layer consumes one sampled block (MFG).
        x = F.relu(self.conv1(blocks[0], x))
        x = F.relu(self.conv2(blocks[1], x))
        return x

If I want to apply GNNExplainer to this model, do I need to convert the blocks to a DGLGraph? For example, the model has two blocks, so do I need to merge the two blocks into a single graph like this?

import dgl.nn as dglnn
import torch.nn as nn
import torch.nn.functional as F

class NewTwoLayerGCN(nn.Module):
    def __init__(self, in_features, hidden_features, out_features):
        super().__init__()
        self.conv1 = dglnn.GraphConv(in_features, hidden_features)
        self.conv2 = dglnn.GraphConv(hidden_features, out_features)

    def forward(self, graph, x):
        # Full-graph forward: both layers run on the same DGLGraph.
        x = F.relu(self.conv1(graph, x))
        x = F.relu(self.conv2(graph, x))
        return x

Is this the right way? Thanks for your reply. @mufeili

You can give NewTwoLayerGCN two forward functions: say, a forward_block method that takes blocks, which you use during training, and a forward method that takes a DGLGraph, which you use for the explanation.
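
Here is a minimal sketch of that split, assuming a node-classification setting. The eweight argument and the GNNExplainer calls in the comments follow my reading of the DGL API (the explainer expects the explanation forward to accept an edge-weight argument it can mask); double-check the exact signatures against the documentation for your DGL version.

import dgl.nn as dglnn
import torch.nn as nn
import torch.nn.functional as F

class NewTwoLayerGCN(nn.Module):
    def __init__(self, in_features, hidden_features, out_features):
        super().__init__()
        self.conv1 = dglnn.GraphConv(in_features, hidden_features)
        self.conv2 = dglnn.GraphConv(hidden_features, out_features)

    def forward(self, graph, feat, eweight=None):
        # Full-graph forward used by GNNExplainer; eweight lets the
        # explainer down-weight messages on masked edges.
        h = F.relu(self.conv1(graph, feat, edge_weight=eweight))
        h = F.relu(self.conv2(graph, h, edge_weight=eweight))
        return h

    def forward_block(self, blocks, feat):
        # Minibatch forward used for training with sampled blocks (MFGs).
        h = F.relu(self.conv1(blocks[0], feat))
        h = F.relu(self.conv2(blocks[1], h))
        return h

# After training with model.forward_block(...), you would explain a node
# on the full graph, roughly along these lines:
#
#   from dgl.nn import GNNExplainer
#   explainer = GNNExplainer(model, num_hops=2)
#   new_center, sg, feat_mask, edge_mask = explainer.explain_node(node_id, g, features)

This way the training loop keeps using the sampled blocks, while GNNExplainer only ever sees the full-graph forward with the signature it expects.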
