I have a linear layer with an output size of 128 dimensions.
I also have a graph that I have passed through a GCN, projecting the node features to 128 dimensions as well.
The problem is that I want to feed the outputs of the linear layer into the zero in-degree nodes of my graph. Is that possible?
Here is a sample model (with `return out` fixed to return the variable that actually exists):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from dgl.nn import GraphConv

class gnn(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(4, 128)
        self.gcn = GraphConv(128, 128, allow_zero_in_degree=True)

    def forward(self, x, dgl_graph, feat, weight):
        x = torch.flatten(x, 1)
        x = F.relu(self.fc1(x))
        graph_out = self.gcn(dgl_graph, feat=feat, edge_weight=weight)
        # x is currently unused -- this is where I want to connect it
        return graph_out
```